Name: NEETHU ABRAHAM
Mob: 9742748401
Personal Email: mailtonabraham@gmail.com
IT Professional with strong DW experience
Key Skills: Oracle, ETL (Informatica), Unix shell scripting, PL/SQL, Informatica MDM
• Overall 9 years of IT experience in ETL architecture, analysis, design, development, testing, implementation, maintenance, and support of enterprise-level Data Integration, Data Warehouse (EDW), and Business Intelligence (BI) solutions using Operational Data Store (ODS), Data Warehouse (DW)/Data Mart (DM), ETL, and OLAP on the Unix platform.
• 8+ years of experience in data warehousing ETL design and development using Informatica 7/8/9.
• Good understanding of dimensional data modeling using Star and Snowflake schemas, denormalization, normalization, and aggregations.
• Overall 8 years of experience working on large-scale data warehouses using databases such as Oracle 9i/10g/11g and SQL Server 2000, as well as flat files.
• More than 1 year of experience using PL/SQL: stored procedures, functions, triggers, and packages.
• Has written complex SQL joins, correlated sub-queries, aggregate and analytic functions, materialized views, indexing, and partitioning, and performance-tuned them using explain plans.
• Good understanding of ETL/Informatica standards and best practices, conformed dimensions, and Slowly Changing Dimensions (SCD1, SCD2, SCD3).
• Experience in writing UNIX Korn shell scripts.
• Basic knowledge of the Informatica MDM Hub console; worked on operational support, including load monitoring and defect fixes.
• Experience in test coordination, writing test cases, executing test scripts, and logging defects in an Excel-based defect tracker.
• Version control using StarTeam and Microsoft TFS.
• Maintenance and support during Integration Test, QA, UAT, and Production for issues/bug fixes/defects.
• Involved in low-level design and preparation of mapping specification documents.
• Experience in all phases of the Waterfall life cycle (SDLC) and the incremental model.
• Excellent team player with very good communication skills and leadership qualities.
• Performance-tuned slow-running Informatica mappings.
• A quick learner and self-starter who swiftly adapts to new challenges.
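As a brief illustration of the SCD handling mentioned above: the core of a Type 2 dimension load expires the current row when a tracked attribute changes and inserts a new current version. The sketch below is illustrative only, with SQLite standing in for Oracle and hypothetical table and column names (`dim_customer`, `cust_id`, `city`); it is not code from any project listed later.

```python
import sqlite3

# Hypothetical dimension; names are illustrative stand-ins, not project code.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        cust_id TEXT,                          -- natural key
        city TEXT,                             -- tracked attribute
        eff_date TEXT, end_date TEXT, is_current INTEGER)""")

def scd2_upsert(cust_id, city, load_date):
    """Type 2 logic: expire the current row if the attribute changed,
    then insert a new current row; do nothing if unchanged."""
    row = conn.execute(
        "SELECT sk, city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if row and row[1] == city:
        return                                  # unchanged: keep current row
    if row:                                     # changed: close the old version
        conn.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 WHERE sk=?",
            (load_date, row[0]))
    conn.execute(                               # new or changed: open new version
        "INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current)"
        " VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, city, load_date))

scd2_upsert("C1", "Cochin", "2013-01-01")       # initial load
scd2_upsert("C1", "Bangalore", "2013-06-01")    # change: history row + new row
rows = conn.execute(
    "SELECT city, end_date, is_current FROM dim_customer ORDER BY sk").fetchall()
```

By contrast, SCD1 would simply overwrite `city` in place, and SCD3 would keep the prior value in an extra column.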
SUMMARY
Presently working as Senior Consultant at Ciber Sites India Pvt Ltd (Nov 2013 - till date)
Consultant at Headstrong/Genpact (May 2011 - Oct 2013)
Associate at Cognizant (Dec 2006 - Apr 2011)
Professional Certifications
• SCJP certification
• Informatica 7.1 mapping designer certification
TECHNICAL EXPERTISE
ETL Tools: Informatica PowerCenter 7.x/8.1.1/8.5.1/8.6.1/9.1 (Primary Skill)
MDM Tools: Informatica MDM Hub console (Secondary Skill)
RDBMS: Oracle 9i/10g (Primary Skill); SQL Server 2000/2008, Teradata, DB2, Netezza (Secondary Skill)
GUI Tools: Toad, Aqua Data Studio, Teradata SQL Assistant
Version Control: TFS, StarTeam (internal)
Programming Languages: Unix Korn shell scripting (Secondary Skill); SQL, PL/SQL (Secondary Skill)
Platforms: MS Windows, Unix
Domain: Healthcare, retail, and hospitality
PROFESSIONAL EXPERTISE

PROJECTS EXECUTED at CIBER Sites India (Nov 2013 - Till Date)

End Client: Wyndham Vacation Ownership
Position Title: ETL Architect/Team Lead
Duration: Nov 2013 - Till Date
Project: Voyager Release 1, 2, 3; EDM
Team Size: 5
Description:
Wyndham Vacation Ownership is a US-based client that provides vacation and resort packages at discounted rates, along with special allowances or points for each booking and early booking for festive seasons. The resorts offer many products that provide special facilities and fun activities; the resorts and their features are categorized under the Product Hub module.
The Customer Hub module deals with the contract agreement data for members and owners. This information is loaded from CSS mainframe files into a staging area, from which a CDC process extracts the daily loads into the central repository (CR). An IDQ process is in place to verify address, email, and phone details. From the CR layer, data is loaded into the MLZ; an MDM process standardizes the data in the MLZ layer, and the golden records are loaded back into these layers.
EDM is an enterprise data management project focused on building a centralized data warehouse for various reporting purposes. It involves creating and loading dimensions and facts from source tables on Oracle and SQL Server. Each party and its related contracts, along with membership information, are processed from the MDM tables and loaded into dimension tables. An aggregated fact table with accumulating-snapshot information is also loaded for the party.
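The daily CDC extraction described above can be sketched minimally as a comparison of the current extract against the previous snapshot; `contract_id` and the row layout below are hypothetical stand-ins for the actual CSS staging structures.

```python
# Minimal change-data-capture sketch; 'contract_id' and the columns are
# hypothetical stand-ins for the CSS staging data described above.
def cdc_delta(previous, current, key="contract_id"):
    """Return (inserts, updates): rows in `current` that are new or changed
    relative to `previous`. Both inputs are lists of dicts."""
    prev_by_key = {row[key]: row for row in previous}
    inserts, updates = [], []
    for row in current:
        old = prev_by_key.get(row[key])
        if old is None:
            inserts.append(row)        # never seen: new contract
        elif old != row:
            updates.append(row)        # seen before, but attributes changed
    return inserts, updates

prev = [{"contract_id": 1, "owner": "A"}, {"contract_id": 2, "owner": "B"}]
curr = [{"contract_id": 1, "owner": "A"}, {"contract_id": 2, "owner": "C"},
        {"contract_id": 3, "owner": "D"}]
ins, upd = cdc_delta(prev, curr)
```

In the actual pipeline this comparison would typically be done with Informatica lookups or checksum columns against the CR tables; the set logic is the same.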
Role:
• Played the role of team lead.
• Interacted with the onshore team for requirement gathering.
• Participated in daily status meetings.
• Drove the offshore team and monitored task completion.
• Enhancements and defect fixes for R2 Account Management and R3.
• Analysis of the architecture and business data flow.
• Fixed defects for Day 0 and incremental runs.
• Implemented performance tuning techniques.
• Unit testing.
• Provided guidance to junior teammates.
Environment:
Tools: Informatica 9.1.5
DB: Oracle 10g
OS: Unix
PROJECTS EXECUTED at HEADSTRONG/GENPACT (May 2011 - Oct 2013)

End Client: Davita
Position Title: Senior ETL Developer
Duration: May 2012 - Oct 2013
Project: Clinical Insights
Team Size: 4
Description:
Clinical Insights users, mainly healthcare providers, want to see reports on their facilities' performance on a weekly or monthly basis. This project loads the main report table after implementing all the business requirements through Informatica. The report contains summary and detail information about patients, facilities, regions, divisions, and villages. It is populated with each patient's information (assigned facility, modality type/treatment type, access used, physician, etc.) and the score calculated for each lab test the patient underwent in the reporting period. The major modules in this project are patient assignment, lab bucketing, and scoring.
Role:
• Involved in requirement analysis and initial design.
• Sole ownership of the development of ETL mappings and workflows for all modules in this project.
• Code review.
• Unit testing.
• Provided guidance to junior teammates.
Environment:
Tools: Informatica 9.1.0
DB: Oracle 10g
OS: Unix
End Client: Davita
Position Title: Senior ETL Developer
Duration: April, 2012 - May, 2012
Project: Crownweb
Team Size: 3
Description:
CROWNWeb is a web application for Medicare's End-Stage Renal Disease clinical management processes. The existing system used Windows batch scripts to generate XML files with patient details from the SQL Server tables; the source tables were already built with XML-equivalent tags through a stored procedure. This project automated the CROWNWeb submission process through ETL.
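As an illustration of the XML shaping the ETL had to reproduce: SQL Server's FOR XML EXPLICIT turns flat result-set rows into nested elements. The Python sketch below mimics that step; the tag and field names are hypothetical, not the actual CROWNWeb schema.

```python
import xml.etree.ElementTree as ET

def rows_to_xml(rows):
    """Shape flat result-set rows into a patient XML document, roughly what
    a FOR XML EXPLICIT query returns. Tag names here are illustrative."""
    root = ET.Element("Patients")
    for row in rows:
        patient = ET.SubElement(root, "Patient", id=str(row["patient_id"]))
        for field in ("name", "modality"):
            ET.SubElement(patient, field).text = row[field]
    return ET.tostring(root, encoding="unicode")

xml_doc = rows_to_xml([
    {"patient_id": 101, "name": "J. Doe", "modality": "Hemodialysis"},
])
```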
Role:
• Requirement analysis with end-user clients.
• ETL code development to automate XML file generation from the SQL Server source tables using the FOR XML EXPLICIT clause.
• Prepared mapping specification documents.
• Code review.
• Unit testing.
• Prepared project handover documents.
Environment:
Tools: Informatica 9.1.0
DB: Oracle 10g, SQL Server 2008
OS: Unix
End Client: Davita
Position Title: Senior ETL Developer
Duration: March, 2012 - April, 2012
Project: Esig-Data quality fixes
Team Size: 2
Description:
Fixing the data issues with the existing clinical data mart system
Role:
• Involved in meetings with clients.
• Development of ETL mappings and SQL scripts to fix the issues.
• Code review.
• Unit testing.
Environment:
Tools: Informatica 9.1.0
DB: Oracle 10g
OS: Unix
End Client: Davita
Position Title: ETL Developer
Duration: September, 2011 - March, 2012
Project: Cwiz
Team Size: 6
Description:
Building a clinical data mart from the Snappy system.
Role:
• Created ETL mappings with SCD Type 2 logic for dimension and fact table loads.
• Interacted with clients to discuss the mapping specification documents.
• Code review.
• Unit testing.
Environment:
Tools: Informatica 9.1.0
DB: Oracle 10g
OS: Unix
End Client: Davita
Position Title: ETL Developer
Duration: July, 2011 - September, 2011
Project: Snapshot Phase-II
Team Size: 1
Description: Automating snapshot cube builds using Informatica.
Role:
• Automated cube builds using Informatica through an SSH connection to the Cognos server where the cubes need to be built.
• Tracked the load-control date in the snapshot control table for the next cube build.
• Created mappings, sessions, worklets, and workflows per Davita standards.
• Unit testing.
• Code review.
Environment:
Tools: Informatica 9.1.0
DB: Oracle 10g
OS: Unix
PROJECTS EXECUTED at Cognizant (Dec 2006 - Apr 2011)
End Client: Wyndham Hotel Group
Position Title: ETL Developer
Duration: February, 2011 - April, 2011
Project: WHG-POI/POR
Description:
As part of the migration of data from the GPS system database to the SPE system database, all existing ETL in the EDW and data mart required modification to source data from SPE instead of GPS. This project implemented the required changes for that migration.
Role:
• Team lead of a 2-member project.
• Change request gathering from the onsite team coordinator.
• Impact analysis on different modules.
• Prepared the impact analysis sheet and design document for the ETL process.
• Modified existing mappings and workflows for the new change request.
• Prepared unit test cases and executed them to ensure proper functionality.
• Captured the test logs.
• Code migration from development to QA.
• Supported the QA team during testing.
• Coordinated with the onsite team on issues/clarifications.
Environment:
Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio
Database: Oracle 9i, DB2, SQL Server 2000
OS: Unix/Windows 2000
End Client: Wyndham Hotel Group
Position Title: ETL Developer
Duration: December, 2010 - February, 2011
Project: WHG-Chain Code Integration
Description:
Chain Code Integration is an enhancement project that minimizes the manual effort required to integrate new chain codes into the existing system. When new chain codes come into the system, integration is made easier by parameterizing the chain codes. As part of this project, the existing billing processes for different chain codes were also combined into a single ETL process.
Role:
• Team lead of the 2-member project.
• Change requirement gathering and impact analysis.
• Designed ETL process details.
• Developed Informatica mappings, lookups, mapplets, reusable transformations, sessions, workflows, etc. per the design document/communication.
• Developed Unix shell scripts for FTP, source file verification, etc.
• Performance tuning of the existing ETL process.
• Prepared test cases and tested the ETL code for the module.
• Involved in reviewing test cases written by peers.
• Fixed issues found in unit testing and code reviews.
• Communicated with the onsite team coordinator to get project-related information.
Environment:
Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio
Database: Oracle 9i, DB2, SQL Server 2000
OS: Unix/Windows 2000
End Client: Wyndham Hotel Group
Position Title: ETL Developer
Duration: November, 2009 - December, 2010
Project: WHG – Cpay
Description:
Cpay is a commission payment system designed for Wyndham Hotel Group. When a travel agent books a hotel for a customer, the agent is paid a commission by that hotel (property). The commission calculation is done by a third-party vendor called Pegasus: stay information, along with control-table information (the rules for calculating commission), is sent to Pegasus, and Pegasus sends back invoices and payment details after calculating the commission. These invoices are then sent to the property, which uses them to pay Pegasus.
Role:
• Team member of a six-member project.
• Requirement gathering and analysis.
• Designed ETL process details.
• Developed Informatica mappings, lookups, mapplets, reusable components, sessions, workflows, etc. (on the ETL side) per the design documents/communication.
• Involved in reviewing test cases written by peers.
• Wrote DB2 and Oracle stored procedures for control table operations.
• Fixed issues found in unit testing and code reviews.
• Communicated with the onsite team to get project-related information.
• Prepared test plans, strategies, and test cases; built and tested the Informatica mappings.
Environment:
Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio
Database: Oracle 9i, DB2, SQL Server 2000
OS: Unix/Windows 2000
End Client: Wellpoint
Position Title: ETL Developer
Duration: August, 2008 - October, 2009
Project: Direct sourcing Provider- Phase 3(CSA-EDW)
Description:
The client is one of the world's largest healthcare insurance providers.
The objective of the EDL R3 Direct Sourcing Provider Phase 3 (CSA-EDW) project is to load provider-specific information from CSA tables into EDW tables using BTEQ scripts, loading only the changed or new source data from the various CSA tables. During this load, a coupled code is derived for each source row by lookup on translation tables.
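The changed-or-new-record identification described above can be sketched as a single set-based query; the example below uses SQLite to stand in for Teradata, and all table and column names (`csa_provider`, `edw_provider`, `specialty`) are hypothetical.

```python
import sqlite3

# SQLite standing in for Teradata; tables and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE csa_provider (prov_id INTEGER, specialty TEXT);
    CREATE TABLE edw_provider (prov_id INTEGER, specialty TEXT);
    INSERT INTO csa_provider VALUES (1,'CARD'), (2,'ORTHO'), (3,'NEURO');
    INSERT INTO edw_provider VALUES (1,'CARD'), (2,'DERM');
""")

# New or changed rows: the same outer-join pattern a BTEQ script would run.
delta = conn.execute("""
    SELECT s.prov_id, s.specialty
    FROM csa_provider s
    LEFT JOIN edw_provider t ON t.prov_id = s.prov_id
    WHERE t.prov_id IS NULL            -- new provider
       OR t.specialty <> s.specialty   -- changed attribute
    ORDER BY s.prov_id
""").fetchall()
```

A BTEQ script would wrap essentially this SELECT (typically as an INSERT ... SELECT into a work table) between .LOGON and .LOGOFF commands.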
Role:
• Team lead of a two-member team.
• Technical guidance to team members.
• Requirement gathering and analysis.
• Identified changed or new records using Teradata BTEQ scripts.
• Applied business logic to source data using Teradata.
• Derived the source coupled code through lookups on translation tables.
• Load balancing using BTEQ scripts.
• Code reviews.
• Prepared workflows to run the BTEQ scripts using Informatica.
• Analyzed and provided solutions for issues raised by the development team.
• Preparation of test plans and strategies.
Environment:
Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1
Database: Teradata, Oracle 9i
OS: Unix/Windows 2000
End Client: Wellpoint
Position Title: ETL Developer
Duration: September, 2007 - August, 2008
Project: Direct Sourcing COA-FACETS Phase-3
Description:
The objective of the 'IM EDL-EDW Direct Sourcing to EDL R2.3' project is to develop a detailed approach for sourcing data from different sources (Core FACETS, ACES, and NASCO) directly into the EDW (Enterprise Data Warehouse) base model. Our concern was loading the Core FACETS COA data into the target EDW environment. COA history data is loaded from the Core FACETS Product tables, Core FACETS Extension tables, and Finance IT tables of the Core FACETS database. The ongoing plan is to load only new or changed records into the EDW base model, capturing the changed data in the Core FACETS Product, Extension, and Finance tables.
COA history data and the incremental data are first loaded into the Conformed Staging Area (CSA), and most of the business transformations are applied to the data before it is loaded into the CSA.
Role:
• Requirement gathering and analysis.
• Came up with multiple design approaches and recommended the best possible solution considering all the constraints.
• Extracted, transformed, and loaded the COA FACETS data from source flat files to the EDW database.
• Wrote complex Teradata SQL queries to derive COA-related chartfields for the source records.
• Load balancing using BTEQ scripts.
• Performance tuning of the existing ETL process and Teradata SQL.
• Code reviews.
• Analyzed and provided solutions for issues raised by the development team.
• Prepared test plans, strategies, and test cases; built and tested the Informatica mappings.
Environment:
Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1
Database: Teradata, Oracle 9i
OS: Unix/Windows 2000
End Client: Wellpoint
Position Title: ETL Developer
Duration: January 2007 - September 2007
Project: Direct sourcing Provider Phase-1
Description:
The purpose of the EDW - Direct Sourcing of WGS 2.0 project is to source WGS 2.0 data directly into the EDW base model.
The Provider subject area has two source systems:
• WGS
• CS90
Within WGS, provider data is sourced from three different systems:
• EPDS
• NMS
• WMS
The EPDS system will be the single source of information for institutional providers. Professional data will be loaded directly from the WMS and NMS systems until that data is migrated into the EPDS system.
Data is extracted from the WMS/NMS operational source systems and moved into landing zone Teradata tables, then mapped from the landing zone to the CSA tables. All transformation logic is handled in the landing-zone-to-CSA feed. There are two loads: one for history data and another for incremental data.
The WMS/NMS and EPDS data are loaded into the EDW on a daily basis.
Role:
• Prepared high-level and low-level design documents for ETL applications, as well as source-to-target mapping documents.
• Worked with heterogeneous sources: mainframes and RDBMS.
• Developed mappings and workflows to load source data from COBOL files into the landing zone area.
• Loaded data from the landing zone to staging and from staging to the EDW using Teradata BTEQ scripts.
• Technical guidance to team members.
• Load balancing using BTEQ scripts.
Environment:
Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1
Database: Teradata, Oracle 9i
OS: Unix/Windows 2000
EDUCATION

B.Tech in Information Technology
College of Engineering Kallooppara, Cochin University, Kerala [79%, Distinction]
Years attended: 2002-2006

Higher Secondary: Model Technical Higher Secondary School, Kaloor, Cochin, Kerala [77%, Distinction]
Years attended: 2000-2002

SSLC: St. Paul's G.H.S., Mutholapuram, Kerala [93%, Distinction]
Year: 2000

Weitere ähnliche Inhalte

Was ist angesagt?

shibindas_Plsql2year
shibindas_Plsql2yearshibindas_Plsql2year
shibindas_Plsql2yearshibindas pk
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh Kumar
 
Resume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_expResume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_exprajarao marisa
 
Mohamed sakr Senior ETL Developer
Mohamed sakr   Senior ETL Developer Mohamed sakr   Senior ETL Developer
Mohamed sakr Senior ETL Developer Mohamed Sakr
 
Shane_O'Neill_CV_slim
Shane_O'Neill_CV_slimShane_O'Neill_CV_slim
Shane_O'Neill_CV_slimShane O'Neill
 
Informatica_Rajesh-CV 28_03_16
Informatica_Rajesh-CV 28_03_16Informatica_Rajesh-CV 28_03_16
Informatica_Rajesh-CV 28_03_16Rajesh Dheeti
 
Resume_Md ZakirHussain
Resume_Md ZakirHussainResume_Md ZakirHussain
Resume_Md ZakirHussainzakir hussain
 
Oracle Data Integration Presentation
Oracle Data Integration PresentationOracle Data Integration Presentation
Oracle Data Integration Presentationkgissandaner
 
Farooq_Oracle_DBA_Updated.
Farooq_Oracle_DBA_Updated.Farooq_Oracle_DBA_Updated.
Farooq_Oracle_DBA_Updated.Farooq Omer
 
ChandraShekhar_5.11_yrs_exp
ChandraShekhar_5.11_yrs_expChandraShekhar_5.11_yrs_exp
ChandraShekhar_5.11_yrs_expChandra Shekhar
 

Was ist angesagt? (20)

shibindas_Plsql2year
shibindas_Plsql2yearshibindas_Plsql2year
shibindas_Plsql2year
 
PG_resume (2)
PG_resume (2)PG_resume (2)
PG_resume (2)
 
Saroj_Mahanta
Saroj_MahantaSaroj_Mahanta
Saroj_Mahanta
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resume
 
resume_abdul_up
resume_abdul_upresume_abdul_up
resume_abdul_up
 
smrutikanta jena
smrutikanta jenasmrutikanta jena
smrutikanta jena
 
NaliniProfile
NaliniProfileNaliniProfile
NaliniProfile
 
Resume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_expResume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_exp
 
Mohamed sakr Senior ETL Developer
Mohamed sakr   Senior ETL Developer Mohamed sakr   Senior ETL Developer
Mohamed sakr Senior ETL Developer
 
azhar_Mohammed_INF
azhar_Mohammed_INFazhar_Mohammed_INF
azhar_Mohammed_INF
 
Shane_O'Neill_CV_slim
Shane_O'Neill_CV_slimShane_O'Neill_CV_slim
Shane_O'Neill_CV_slim
 
Informatica_Rajesh-CV 28_03_16
Informatica_Rajesh-CV 28_03_16Informatica_Rajesh-CV 28_03_16
Informatica_Rajesh-CV 28_03_16
 
Resume_Raj Ganesh Subramanian
Resume_Raj Ganesh SubramanianResume_Raj Ganesh Subramanian
Resume_Raj Ganesh Subramanian
 
Ashutosh_Resume
Ashutosh_Resume Ashutosh_Resume
Ashutosh_Resume
 
Resume_Md ZakirHussain
Resume_Md ZakirHussainResume_Md ZakirHussain
Resume_Md ZakirHussain
 
ZakirHussain
ZakirHussainZakirHussain
ZakirHussain
 
Ganesh CV
Ganesh CVGanesh CV
Ganesh CV
 
Oracle Data Integration Presentation
Oracle Data Integration PresentationOracle Data Integration Presentation
Oracle Data Integration Presentation
 
Farooq_Oracle_DBA_Updated.
Farooq_Oracle_DBA_Updated.Farooq_Oracle_DBA_Updated.
Farooq_Oracle_DBA_Updated.
 
ChandraShekhar_5.11_yrs_exp
ChandraShekhar_5.11_yrs_expChandraShekhar_5.11_yrs_exp
ChandraShekhar_5.11_yrs_exp
 

Ähnlich wie Neethu_Abraham

Ähnlich wie Neethu_Abraham (20)

Mohd_Shaukath_5_Exp_Datastage
Mohd_Shaukath_5_Exp_DatastageMohd_Shaukath_5_Exp_Datastage
Mohd_Shaukath_5_Exp_Datastage
 
Ashish updated cv 17 dec 2015
Ashish updated cv 17 dec 2015Ashish updated cv 17 dec 2015
Ashish updated cv 17 dec 2015
 
Resume ratna rao updated
Resume ratna rao updatedResume ratna rao updated
Resume ratna rao updated
 
Resume_Ratna Rao updated
Resume_Ratna Rao updatedResume_Ratna Rao updated
Resume_Ratna Rao updated
 
Shivaprasada_Kodoth
Shivaprasada_KodothShivaprasada_Kodoth
Shivaprasada_Kodoth
 
Renu_Resume
Renu_ResumeRenu_Resume
Renu_Resume
 
SURENDRANATH GANDLA4
SURENDRANATH GANDLA4SURENDRANATH GANDLA4
SURENDRANATH GANDLA4
 
ananth_resume
ananth_resumeananth_resume
ananth_resume
 
Resume_kallesh_latest
Resume_kallesh_latestResume_kallesh_latest
Resume_kallesh_latest
 
Resume
ResumeResume
Resume
 
Resume_sukanta_updated
Resume_sukanta_updatedResume_sukanta_updated
Resume_sukanta_updated
 
Resume_APRIL_updated
Resume_APRIL_updatedResume_APRIL_updated
Resume_APRIL_updated
 
Resume april updated
Resume april updatedResume april updated
Resume april updated
 
Anil_Kumar_Andra_ETL
Anil_Kumar_Andra_ETLAnil_Kumar_Andra_ETL
Anil_Kumar_Andra_ETL
 
Resume
ResumeResume
Resume
 
Richa_Profile
Richa_ProfileRicha_Profile
Richa_Profile
 
Resume
ResumeResume
Resume
 
PradeepDWH
PradeepDWHPradeepDWH
PradeepDWH
 
Arun-Kumar-OEDQ-Developer
Arun-Kumar-OEDQ-DeveloperArun-Kumar-OEDQ-Developer
Arun-Kumar-OEDQ-Developer
 
Informatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQLInformatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQL
 

Neethu_Abraham

  • 1. Name: NEETHU ABRAHAM Mob: 9742748401 Personal Email: mailtonabraham@gmail.com IT Professional with strong DW experience Key Skills (Oracle,ETL(Informatica),Unix shell scripting, Pl/Sql, Informatica MDM) • Overall 9 years of IT experience in ETL Architecture, Analysis, design, development, testing ,implementation, maintenance and supporting of Enterprise level Data Integration, Data Warehouse (EDW) Business Intelligence (BI) solutions using Operational Data Store(ODS)Data Warehouse (DW)/Data Mart (DM), ETL, OLAP on Unix platform. • 8+ years of experience in data warehousing ETL design and development using Informatica7/8/9 • Good understanding on Dimensional Data Modeling using Star & Snow Flake schema, De normalization, Normalization, and Aggregations. • Overall 8 years of experience working in large scale data warehouse using Databases like Oracle 9i/10g/11g/,SQL Server 2000 and Flat files. • More than 1 year of experience in using PL/SQL, Stored procedures/functions, triggers and packages. • Has performed complex sql joins, correlated sub-queries, aggregate functions analytic functions, materialized views, indexing, partitioning and performance tuning the same using explain plan. • Have Good understanding of ETL/Informatica standards and best practices ,Confirmed Dimensions, Slowly Changing Dimensions (SCD1,SCD2,SCD3) • Experience in writing UNIX Korn shell scripting • Basic Knowledge on Informatica MDM Hub console and worked on operation support work including load monitoring, defect fixes etc.. • Experience in testing coordination ,writing test cases and executing test scripts ,And logged defects in Defect tracker(excel based), • Version control using StarTeam and Microsoft TFS. 
• Maintaining and Supporting during Integration Test ,QA,UAT, and Production for Issues/Bug fixes/Defects • Have involved in the low level design and preparation of mapping specification documents • Experience in all phases of Water fall Life Cycle(SDLC) and incremental model • Excellent team player with very good communication skills and leadership qualities. • Have worked on performance tuning of Informatica slow running mappings. • A quick learner who can swiftly adapt to new challenges, Self-starter. Page 1 of 11 SUMMARY
  • 2. Presently working as senior consultant at Ciber Sites India Pvt Ltd(Nov 2013 –Till Date) Consultant at Headstrong/Genpact(May 2011-Oct 2013) Associate at Cognizant(Dec 2006-Apr 2011) Professional Certifications  SCJP certification  Informatica 7.1 mapping designer certification ETL Tools: Informatica Power Center 7.x/8.1.1/8.5.1/8.6.1/9.1(Primary Skill), MDM Tools Informatica MDM Hub console(Secondary skill) RDBMS: Oracle 9i/10g(Primary Skill), SQL server 2000/2008,Teradata,DB2,Netezza(Secondary skill) GUI Tools: Toad, Aqua data studio, SQL assistant Version Control TFS, Star team(Internal) Programming Languages: Unix Korn Shell scripting (Secondary skill), SQL,PL/SQL(Secondary skill) Platforms: MS Windows, Unix Domain Health care, retail and hospitality End Client: Wyndham Vacations Ownership Position Title: ETL Architect/Team Lead Duration: Nov, 2013 - To Date Project: Voyager Release 1,2,3 EDM Team Size: 5 Page 2 of 11 PROJECTS EXECUTED at CIBER Sites India (2013 Nov - Till Date) TECHNICAL EXPERTISE PROFESSIONAL EXPERTISE
  • 3. Description: : Wyndham vacations ownership is a US based client. They provide packages for vacations and resorts in cheaper rates. Also they provide special allowances or points for each booking. Allows early booking for festive seasons. The resorts have many products which provide special facilities and fun activities. The resorts and their features are categorized under Product Hub Module. The Customer Hub module deals with the contract agreement data for members and owners. This information is loaded from CSS main frame files to staging area. From there a CDC process is implemented to extract the daily loads to central repository.IDQ process is in place for address, email, phone details verification. From CR layer data is loaded to MLZ.MDM process is used to standardize the data in MLZ layer and the golden records are loaded back to these layers again. EDM is an enterprise data management project which is focused on building the centralized data ware house for various reporting purposes. This involves creating and loading dimensions and facts from source tables on oracle and sql server. Each party and the related contracts signed,along with their membership information is processed from MDM tables and loaded to dimension tables. An aggregated fact table is also loaded for the party with accumulating snapshot information. Role: • Played role of a team lead • Interacted with onshore team for requirement gathering • Participated daily status meetings • Driven offshore team and monitored the tasks completion. • Enhancements and defect fixing for R2 Account management and R3. • Analysis of the architecture and business data flow. • Fixing defects for Day0 and incremental runs. • Implementing performance tuning techniques. • Unit testing. • Provide guidance to junior team mates. 
Environment: Tools:Informatica 9.1.5 DB:Oracle 10G OS:Unix End Client: Davita Position Title: Senior ETL developer Duration: May, 2012 - Oct-2013 Project: Clinical Insights Team Size: 4 Page 3 of 11 PROJECTS EXECUTED at HEADSTRONG/GENPACT (2011 May -2013 Oct)
  • 4. Description: : Clinical insight users mainly the healthcare providers would like to see the report about their facilities performance on a weekly or monthly basis. This projects helps to load the main report table after implementing all the business requirements through informatica. This report contains the summary and detail information about patients ,facilities,region,division and village. Patient information with their assigned facility, Modality type(treatment type) ,access used, physician etc and the score calculation for each lab test he/she has undergone in the reporting period will be populated. Major modules in this projects are patient assignment,lab bucketing and scoring. Role: • Involved in requirement analysis and initial design. • Sole ownership for the development of ETL mappings, workflows for the entire modules in thisproject. • Code review. • Unt testing. • Provide guidance to junior team mates. Environment: Tools:Informatica 9.1.0 DB:Oracle 10G OS:Unix End Client: Davita Position Title: Senior ETL developer Duration: April, 2012 - May, 2012 Project: Crownweb Team Size: 3 Description: CROWNWeb is a web application for Medicare’s End-Stage Renal Disease clinical management processes. The existing system used windows batch scripts to generate the xml files with patient details from the sql server tables. The sql server source tables were already build with xml equivalent tags through stored procedure. Automating this crownweb submission process through ETL Role: • Requirement analysis with end user clients • ETL code development to automate the xml file generation from the sql server source tables using ‘for xml explicit’ property. • Preparing mapping specification documents. • Code review. • Unt testing. • Preparing project hand over documents. Environment: Tools:Informatica 9.1.0 DB:Oracle 10G,Sql server 2008 Page 4 of 11
  • 5. OS:Unix End Client: Davita Position Title: Senior ETL developer Duration: March, 2012 - April, 2012 Project: Esig-Data quality fixes Team Size: 2 Description: Fixing the data issues with the existing clinical data mart system Role: • Involved in meetings with clients. • Development of ETL mappings and sql scripts to fix the issues , • Code review. • Unt testing. Environment: Tools:Informatica 9.1.0 DB:Oracle 10G OS:Unix End Client: Davita Position Title: ETL developer Duration: September, 2011 - March, 2012 Project: Cwiz Team Size: 6 Description: Building clinical data mart from snappy system Role: • Create etl mappings with scd type 2 logic for dimension and fact table loads. • Interacting with clients to discuss about the mapping specification documents. • Code review. • Unt testing. Environment: Tools:Informatica 9.1.0 DB:Oracle 10G OS:Unix Page 5 of 11
  • 6. End Client: Davita Position Title: ETL developer Duration: July, 2011 - September, 2011 Project: Snapshot Phase-II Team Size: 1 Description: Automating snapshot cube builds using informatica Role: • Automating cube builds using informatica through an ssh connection to the cognos server were the cubes need to be build. • Tracking the load control date in the snapshot control table for the next cube build • Creating mappings,sessions,worklets,workflows as per Davita standard. • Unit testing. • Code review. Environment: Tools:Informatica 9.1.0 DB:Oracle 10g OS:Unix End Client: Wyndham Hotel Group Position Title: ETL Developer Duration: February, 2011 - April, 2011 Project: WHG-POI/POR Description: As part of the migration of data from GPS system database to SPE system database all the existing ETL in EDW and Datamart required modification to source the data from SPE instead of GPS.This project aims to implement the required change as part of this migration. Role: • The team lead of a 2 member project • Change request gathering from onsite team cordinator • Impact analysis on different modules • Preparing the impact analysis sheet and design document for the ETL process • Modifying existing mappings and workflows for the new change request • Preparing unit test cases and executing them to ensure the proper functionality • Capturinng the test logs Page 6 of 11 PROJECTS EXECUTED at Cognizant (2006 Dec – 2011 Apr)
• Migrated code from development to QA.
• Supported the QA team during testing.
• Coordinated with the onsite team on issues and clarifications.
Environment: Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio Database: Oracle 9i, DB2, SQL Server 2000 OS: Unix/Windows 2000

End Client: Wyndham Hotel Group
Position Title: ETL Developer
Duration: December 2010 - February 2011
Project: WHG - Chain Code Integration
Description: Chain Code Integration is an enhancement project that minimizes the manual effort required to integrate new chain codes into the existing system. When new chain codes arrive, integration is made easier by parameterizing the chain codes. The existing billing processes for the different chain codes were also combined into a single ETL process as part of this project.
Role:
• Team lead of the two-member project.
• Gathered change requirements and performed impact analysis.
• Designed the ETL process details.
• Developed Informatica mappings, lookups, mapplets, reusable transformations, sessions, workflows, etc., as per the design document/communication.
• Developed Unix shell scripts for FTP, source file verification, etc.
• Performance tuning of the existing ETL process.
• Prepared test cases and tested the ETL code for the module.
• Reviewed test cases written by peers.
• Fixed issues arising from unit testing and code reviews.
• Communicated with the onsite team coordinator to get project-related information.
Environment: Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio Database: Oracle 9i, DB2, SQL Server 2000 OS: Unix/Windows 2000

End Client: Wyndham Hotel Group
Position Title: ETL Developer
Duration: November 2009 - December 2010
Project: WHG - Cpay
Description:
Cpay is a commission payment system designed for Wyndham Hotel Group. When a travel agent books a hotel for a customer, the agent is paid a certain amount of commission by that hotel (property). The entire commission calculation is done by a third-party vendor called Pegasus. The stay information, along with the control table information (the rules for calculating commission), is sent to Pegasus, and Pegasus sends back invoices and payment details after calculating the commission. These invoices are then sent to the property, which uses them to pay Pegasus.
Role:
• Team member of a six-member project.
• Requirement gathering and analysis.
• Designed the ETL process details.
• Developed Informatica mappings, lookups, mapplets, reusable components, sessions, workflows, etc. (on the ETL side) as per the design documents/communication.
• Reviewed test cases written by peers.
• Wrote DB2 and Oracle stored procedures for control table operations.
• Fixed issues arising from unit testing and code reviews.
• Communicated with the onsite team to get project-related information.
• Prepared test plans, strategies, and test cases; built and tested the Informatica mappings.
Environment: Tools: Informatica 8.6.1, Toad 7.6, Aqua Data Studio Database: Oracle 9i, DB2, SQL Server 2000 OS: Unix/Windows 2000

End Client: WellPoint
Position Title: ETL Developer
Duration: August 2008 - October 2009
Project: Direct Sourcing Provider - Phase 3 (CSA-EDW)
Description: The client is one of the world's largest healthcare insurance providers. The objective of the EDLR3 Direct Sourcing Provider - Phase 3 (CSA-EDW) project is to load provider-specific information from CSA tables to EDW tables using BTEQ scripts. The aim is to load the changed or new source data from the various CSA tables into the EDW tables. During this load, the coupled code for each source row is derived through lookups on translation tables.
Role:
• Team lead of a two-member team.
• Provided technical guidance to the team members.
• Requirement gathering and analysis.
• Identified the changed or new records using Teradata BTEQ scripts.
• Applied business logic to the source data using Teradata.
• Derived the source coupled code through lookups on translation tables.
• Load balancing using BTEQ scripts.
• Code reviews.
• Prepared workflows to run the BTEQ scripts using Informatica.
• Analyzed and provided solutions for issues raised by the development team.
• Prepared test plans and strategies.
Environment: Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1 Database: Teradata, Oracle 9i OS: Unix/Windows 2000

End Client: WellPoint
Position Title: ETL Developer
Duration: September 2007 - August 2008
Project: Direct Sourcing COA-FACETS Phase 3
Description: The objective of the 'IM EDL-EDW Direct Sourcing to EDL R2.3' project is to develop a detailed approach to source data from different sources (Core FACETS, ACES, and NASCO) directly into the EDW (Enterprise Data Warehouse) base model. Our concern was to load the Core FACETS COA data into the target EDW environment. COA history data is loaded from the Core FACETS Product tables, Core FACETS Extension tables, and Finance IT tables of the Core FACETS database. The ongoing plan is to load only the new or changed records into the EDW base model to capture the changed data in the Core FACETS Product, Extension, and Finance tables. The COA history data and the incremental data are first loaded into the Conformed Staging Area (CSA), and most of the business transformations are applied to the data before it is loaded into the CSA.
Role:
• Requirement gathering and analysis.
• Came up with multiple design approaches and recommended the best possible solution considering all the constraints.
• Extracted, transformed, and loaded the COA FACETS data from source flat files to the EDW database.
• Wrote complex Teradata SQL queries to derive COA-related chartfields for the source records.
• Load balancing using BTEQ scripts.
• Performance tuning of the existing ETL process and Teradata SQL.
• Code reviews.
• Analyzed and provided solutions for issues raised by the development team.
• Prepared test plans, strategies, and test cases; built and tested the Informatica mappings.
Environment: Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1 Database: Teradata, Oracle 9i OS: Unix/Windows 2000

End Client: WellPoint
Position Title: ETL Developer
Duration: January 2007 - September 2007
Project: Direct Sourcing Provider Phase 1
Description: The purpose of the EDW Direct Sourcing of WGS 2.0 project is to source WGS 2.0 data directly into the EDW base model. Provider data comes from two source systems:
> WGS
> CS90
In WGS, provider data is sourced from three different systems:
> EPDS
> NMS
> WMS
The EPDS system will be the single source of information for institutional providers. The professional data will be loaded directly from the WMS and NMS systems until this data is migrated into the EPDS system. Data is extracted from the WMS/NMS operational source systems and moved to landing zone Teradata tables. The data is then mapped from the landing zone to the CSA tables. All transformation logic is handled in the landing zone to CSA feed. There are two loads: one for history data and another for incremental data. The WMS/NMS and EPDS data is loaded to the EDW on a daily basis.
Role:
• Prepared high-level and low-level design documents for the ETL applications, and source-to-target mapping documents.
• Worked with heterogeneous sources: mainframes and RDBMS.
• Developed mappings and workflows to load source data from COBOL files to the landing zone area.
• Loaded data from the landing zone to staging, and from staging to the EDW, using Teradata BTEQ scripts.
• Provided technical guidance to team members.
• Load balancing using BTEQ scripts.
Environment: Tools: Informatica 8.1.1, Teradata SQL Assistant 7.1 Database: Teradata, Oracle 9i OS: Unix/Windows 2000

EDUCATION
Professional: B.Tech in Information Technology, College of Engineering Kallooppara, Cochin University, Kerala [79%, Distinction] Years Attended: 2002-2006
Higher Secondary: Model Technical Higher Secondary School, Kaloor, Cochin, Kerala [77%, Distinction] Years Attended: 2000-2002
SSLC: St. Paul's G.H.S., Mutholapuram, Kerala [93%, Distinction] Year Attended: 2000