Pranabesh Ghosh
Professional Profile
Technology Consultant,
HP BAS, India
Strengths
Good Client Interfacing skills
Good Interpersonal & Team Skills
Excellent Communication Skills
Quick Learner
Strong analytical skills
Proactive in taking up additional
roles and responsibilities
Skills
OS
Windows NT/XP/200x, UNIX
ETL
Informatica 8.6/9.1, DataStage, Cast Iron
Database
Oracle 8.x/9.x/10g, Teradata 13.1.0, Vertica
Languages
PL/SQL, SQL, UNIX shell programming
Tools
SQL Navigator, TOAD, Control-M scheduler, ERwin, WinSCP, PuTTY
Reporting tools
Business Objects XI, Cognos
Certifications
Trainer certified in Oracle Apps
Diploma in Advanced Software Engineering
Pranabesh Ghosh works as a Senior Software
Engineer in the HP Application Services Delivery Unit,
with over 11 years of software development experience.
His professional experience in information
technology spans the analysis, design, development,
enhancement and maintenance of Banking and Financial
Services (BFS), Retail and Healthcare applications.
He joined Hewlett-Packard in May 2010 as a
Technical Consultant.
Strengths
More than 11 years of professional experience
in software engineering and technology
consulting. Successfully played the roles
of Project Lead, Solution Architect, Informatica
Administrator and Technical Lead in various
assignments.
Data modelling and data warehousing/business
intelligence architecture, metadata, master data
management and ETL design.
Adapts quickly in fast-paced environments,
handling concurrent responsibilities and
multiple competing priorities to deliver
quality work on time and earn customer confidence.
Excellent interpersonal, leadership,
customer/multi-vendor interfacing and
collaboration skills.
Professional Summary
Proficient at designing and developing ETL and
reporting solutions with Business Objects and
Cognos tools. Awarded for completing and
delivering an end-to-end data warehousing project.
Trained in Oracle Apps, with experience in
implementing ETL and Oracle Apps projects.
Proficient in leading teams with multiple
responsibilities. Handled both ETL and reporting
artefacts, with expertise across multiple modules.
Designed the building blocks to promote data
from staging tables to target tables using shell
scripts / Control-M and a parallel-job method.
Designed mappings and wrote UNIX scripts to
schedule the various ETL objects; designed and
scheduled reports using reporting tools such as
Business Objects. Involved in end-to-end data
warehousing projects.
Designed and developed reusable utilities to
increase productivity and delivery quality.
Expertise in handling various Informatica
administration responsibilities.
Competent in writing PL/SQL stored procedures,
functions, packages and database triggers, and
in data migration.
Experienced in working in the onsite-offshore
delivery model.
Consistently exceptional performance, with
excellent recognition from customers and
business units.
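The staging-to-target promotion building block mentioned above can be sketched as a small shell function of the kind a Control-M job would invoke. This is a hedged illustration, not any project's actual script: flat files stand in for the staging and target tables, and the pipe-delimited format and key-presence rule are assumed.

```shell
#!/bin/sh
# Illustrative promotion step: flat files stand in for staging/target tables.
# The delimiter and the "key must be present" rule are assumed examples.
promote_stage_to_target() {
    stage_file=$1
    target_file=$2
    # Promote only rows whose key (first pipe-delimited field) is present;
    # rejected rows would normally be diverted to an error file for reprocessing.
    awk -F'|' 'NF >= 2 && $1 != ""' "$stage_file" >> "$target_file"
}
```

A Control-M job definition would then invoke this step once the upstream extract completes, with the parallel-job method running one such step per partition.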
CDW Sunset Project
Client AHOLD (USA)
Role Lead Developer
Duration Sept 2014 – till date
Company Hewlett Packard
Project Description:
The Customer Data Warehouse (CDW) is one of several data warehouses that Ahold uses on a
daily basis for analyzing the market. CDW Sunset Phase 2 mainly covers the requirements
necessary to build out the remaining CDW non-mainframe feeds and processes in EDW. This
translates to creating the necessary table structures in EDW, building load jobs and
processes, creating/modifying Autosys jobs and schedules, loading historical data, and
modifying downstream extracts and feeds. This phase also requires changing some views in
the EDW_SSPR schema that currently point to SSPR as a result of Phase 1A.
Responsibilities –
Understanding the existing Autosys jobs and scripts that implement the business logic,
and converting them to work in the EDW environment.
Understanding the complex business transformation logic and implementing it in the ETL
design and development.
Regular interaction with the client to review the design.
Analyzing the design patterns of the various staging, dimension and fact tables.
Performing performance tuning at the load level to improve performance.
Performing object deployments on versioned repositories from QA to production.
Environment:
Informatica, UNIX, Oracle, Autosys
TSA Project
Client Aurora Energy Pty LTD (Australia)
Role ETL Developer/ Informatica Administrator
Duration Sept 2013 – till date
Company Hewlett Packard
Project Description:
This project relates to the retail energy business and has been divided into two phases.
The first phase involves the migration of the components of the existing ETL
processing, databases and reports required to support TSA reporting to the new
technology platform. In Phase 2, implementation of the remodelled Aurora target
architecture will commence for the in-scope areas of the TSA. HP and Aurora have jointly
recognized that the re-architecting of the migrated solution needs to be prioritized to focus
on the highly shared data, in order to balance the mitigation of data integrity risks against
timeframe risks.
Responsibilities –
Understanding the existing BODI ETL business logic and converting it into
Informatica mapping logic in Phase 1.
Understanding the complex business transformation logic and implementing it in the ETL
design and development.
Designing the ETL artifacts according to the business logic and implementing them
with minimum complexity.
Regular interaction with the client to review the design.
Analyzing the design patterns of the various staging, dimension and fact tables.
Performing performance tuning at the ETL level to improve performance.
Performing object deployments on versioned repositories from QA to production.
Setting up and administering PowerCenter security.
Setting up and configuring the PowerCenter domain and services.
Installing PowerCenter software and administering repositories.
Creating production environments, including object migrations.
Creating and maintaining Informatica users and privileges.
Writing SQL queries against the repository DB to find deviations from the
company's ETL standards in user-created objects such as sources, targets,
transformations, log files, mappings, sessions and workflows.
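The repository-standards check above can be illustrated with a small sketch. In the project this was SQL against the Informatica repository database; here the same kind of check is shown over a plain `type|name` listing so it runs anywhere, and the `m_`/`s_`/`wf_` prefix convention is an assumed example rather than any company's actual standard.

```shell
#!/bin/sh
# Illustrative standards check over a "type|name" object listing.
# The prefix rules (m_ mappings, s_ sessions, wf_ workflows) are assumptions.
check_etl_standards() {
    # Print every object name that violates its type's naming prefix.
    awk -F'|' '
        $1 == "mapping"  && $2 !~ /^m_/  { print $2 }
        $1 == "session"  && $2 !~ /^s_/  { print $2 }
        $1 == "workflow" && $2 !~ /^wf_/ { print $2 }
    ' "$1"
}
```

The same rules, written as SQL, would filter object-name columns of the repository views with `NOT LIKE` predicates.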
Environment:
Vertica, Informatica, MicroStrategy
Aetna On-Demand Projects
Client CVS Caremark
Role Technical Lead
Duration Apr 2011 – Sept 2013
Company Hewlett Packard
Project Description:
CVS Caremark is undertaking an enterprise-wide initiative to create an enterprise data
warehouse using an industry-standard healthcare model. The existing source system,
RxClaim, is adding three new feeds in the EDW2 Teradata environment, and these feeds
carry information from RxClaim into EDW2. The main objective of this project is to
populate the RxClaim-related data into the EDW2 warehouse using these three feeds.
Data is loaded into the Oracle environment and relevant dimension data is built
based on claims; the same is cascaded to the Teradata environment for report processing.
Responsibilities –
Designing the solutions for individual projects such as Aetna Patient Override,
RxClaim Gaps, CDD Full and Prompt Pay Phase III FEDB.
Planning, analysing, implementing and documenting strategies related to ETL
development and design.
Performing design and code reviews and conducting knowledge transfers with the team.
Analyzing the design patterns of the various staging and dimension tables.
Designed the ETL using shell scripts, Informatica, PL/SQL, Summary Management, the DB
Sync tool and Teradata BTEQ to load data from files to Teradata via Oracle.
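The file-to-Teradata-via-Oracle load mentioned above ended in a BTEQ import step; the following sketch generates a BTEQ script of that general shape. The table, columns, delimiter and logon file are placeholder assumptions, not the project's real settings.

```shell
#!/bin/sh
# Generates an illustrative BTEQ import script. Table/column names, the '|'
# delimiter and logon.btq are placeholder assumptions for the sketch.
gen_bteq_import() {
    table=$1
    datafile=$2
    cat <<EOF
.RUN FILE = logon.btq
.IMPORT VARTEXT '|' FILE = $datafile
.REPEAT *
USING (claim_id VARCHAR(18), claim_amt VARCHAR(12))
INSERT INTO $table (claim_id, claim_amt)
VALUES (:claim_id, :claim_amt);
.QUIT
EOF
}
```

A driver shell script would write this out per feed and submit it with `bteq < script.btq`, checking the return code before the next Control-M/Autosys step.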
Achievements:
Completed all projects on time while coordinating multiple projects in parallel. Used
several reusable components, reducing development time and improving performance.
Implemented project execution standards for better project management and tracking.
Environment:
Oracle, Informatica, Teradata and Shell Scripting
EOMS Staging Implementation
Client CVS Caremark
Role Technical Lead
Duration Apr 2010 – Apr 2011
Company Hewlett Packard
Project Description:
CVS Caremark is undertaking an enterprise-wide initiative to create an enterprise data
warehouse using an industry-standard healthcare model. As part of this, the EOMS
(Enterprise Opportunity Management System) application will transition to use this
model. The model is mainly used to migrate the data produced by the EOMS application
to the new model structure on EDW1, and also to load data from an XML file into the
corresponding six (exact number TBD) staging tables in the EOMSDMA schema. In a later
phase of this project, the data from the various EOMSDMA schemas will be used to load
denorm tables according to various business functions. The data is migrated using
Informatica as the ETL tool together with UNIX shell scripting.
Responsibilities –
Analyzing the design patterns of the various staging and denorm tables.
Enhancing the shell scripts to automate the processing of ETL objects.
Understanding the business functionality in the low-level design and creating a new
design document for the denorm tables based on the requirements.
Analyzing the data model of the denorm tables at mapping level and preparing the HLD
and LLD for each mapping.
Conducting design workshops with the business team.
Preparing UT and QAT test cases.
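The shell automation of ETL object runs described above typically wraps Informatica's `pmcmd` CLI. The sketch below builds the command as a string so it can be inspected without a PowerCenter server; the service, domain and credential-variable names are placeholders.

```shell
#!/bin/sh
# Builds (does not execute) an illustrative pmcmd invocation. INT_SVC,
# DOM_DEV and the PMUSER/PMPASS environment-variable names are placeholders.
build_workflow_cmd() {
    folder=$1
    workflow=$2
    echo "pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv PMUSER -pv PMPASS -f $folder -wait $workflow"
}
```

A scheduler-facing wrapper would `eval` the built command, then propagate the pmcmd exit status so the job chain stops on failure.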
Achievements:
Developed complex ETL objects with improved performance for loading the
dimension and fact tables.
Environment:
Oracle, Informatica 8.6 and Shell Scripting
ASSET MANAGEMENT INVESTMENTS DATA DRIVEN DOLLARS (AMI D3) Implementation
Client Bank of New York Mellon (BNYM UK)
Role Technical Lead
Duration May-2009 – April-2010
Company Cognizant Technology Solutions India Pvt. Ltd
Project Description:
The Bank of New York Mellon provides a broad range of products and services in the areas of
Asset Management, Asset Servicing, Wealth Management, Issuer Services, and Treasury
Services. The project aimed at maintaining and enhancing detailed sales and holdings
information from various source systems, delivered to the business in the form of reports
for Asset Management. The reports mainly relate to current and historical holdings and
sales information.
Sales and holdings information for different geographical locations is pulled via
Informatica, with the data cleansed in the staging area; after this phase the data is
populated into the dimension and fact tables respectively. The data in the dimensions
and facts is used to create reports.
Responsibilities –
Analyzing the existing campaign data mart.
Analyzing and understanding the business needs, and effectively transferring knowledge
among the team members.
Creating innovative value-adds that ultimately reduce cost, and encouraging team
members to do the same.
Working on quality assurance, using quality tools to improve the quality of the code.
Using industry best practices to deliver the best code to the client.
Involved in the design and enhancement of Informatica mappings.
Involved in bug fixing, reverting to the client within minimum SLA time.
Tested the integrity of the data by running queries in TOAD.
Involved in performance tuning of mappings.
Created documents such as UTP and UAT test cases and test logs.
Involved in creating the performance metrics tracker report.
Worked with E-Tracker for defect tracking and creating metrics.
Effectively distributed work among team members and enforced peer reviews.
Worked on project proposals and estimations.
Environment:
Oracle, Informatica, Teradata and Shell Scripting
NAM SALES DATA MART Implementation
Client News America Marketing
Role Technical Lead
Duration Nov-2008 to May 2009
Company Cognizant Technology Solutions India Pvt. Ltd
Project Description:
News America Marketing is a subsidiary of News Corporation, one of the world's largest
media and entertainment companies. The client offers advertisers a broad portfolio of
in-store, home-delivered and online media that helps brands stand out in the marketplace.
We built a data warehouse that would initially include their sales information. Three
source systems, FSI, INSTORE and SIEBEL, all built on Oracle databases, serve as sources
for this warehouse. The warehouse itself is built on Teradata. We created mappings and
sessions that pull the data from the source systems and load it into the Teradata
warehouse.
Responsibilities –
Analyzing the existing campaign data mart.
Analyzing and understanding the business needs.
Involved in the design and development of Informatica mappings.
Developed sessions and workflows and scheduled them to run.
Tested the integrity of the data by running queries in TOAD.
Documented the mappings, explaining their data flow.
Involved in performance tuning of mappings.
Created documents such as UTP and UAT test cases and test logs.
Involved in creating the performance metrics tracker report.
Achievements:
Worked as Informatica Administrator; handled more than 500 objects and migrated ETL
objects between two repositories using the Informatica deployment tool.
Environment:
Oracle, Informatica, Teradata, Business Objects and Shell Scripting.
Fixed Income Characteristics Implementation
Client Franklin Templeton
Role Module Lead
Duration March-2008 to Nov 2008
Company Cognizant Technology Solutions India Pvt. Ltd
Project Description:
Franklin Templeton Investments (“FTI”) has embarked on a program to store and calculate
fixed-income characteristics for fixed income funds provided by PMA, and to provide a
user interface for the marketing team on fixed-income portfolio data. This data will be
further used to perform analysis by creating reports. This project addresses the following:
Sourcing security-level fixed-income characteristics from Lehman Point.
Storing and calculating fixed-income characteristics for fixed income products using the
security-level characteristics.
Distributing the fixed-income characteristic calculations to downstream systems such as
the Enterprise Warehouse and the Global Fact Sheet.
Responsibility:
As a Module Lead, my responsibilities included:
Mainly involved in ETL design and coding.
Analyzing the sources and targets, transforming the data, mapping the data and
populating it into the targets using Informatica Designer, Workflow Manager and
Workflow Monitor.
Creating and optimizing the mappings, sessions and workflows to improve load
performance.
Creating sessions, workflows and parameter files using UNIX for the corresponding
mappings.
Involved in preparing TSD documents.
Prepared low-level design documents and unit test plans for the ETL jobs.
Reviewing documents and mappings and giving necessary comments to team members.
Achievements:
Awarded for best performance as a Module Lead; handled multiple source systems
effectively, delivered to the client successfully and received client appreciation for
great work.
Environment:
Oracle, Informatica, Teradata, Business Objects and Shell Scripting.
Product Data Load Implementation
Client Franklin Templeton
Role Developer
Duration October 2006 to March 2008
Company Cognizant Technology Solutions India Pvt. Ltd
Project Description:
Franklin Templeton Investments (“FTI”) has embarked on a program to implement and
consolidate its current and historical enterprise product information into a single data
source (EW), based on an industry-standard product hierarchy. FTI has requested Cognizant
to provide technical resources and expertise to assist FTI in evaluating and selecting the
best solution to fit its requirements and support its business case.
The Product Data Load (PDL) project has been chartered to make the Enterprise Warehouse (EW) and
Global Integrated Data store (GID) as the go-to sources of reference for historical and current retail
and institutional enterprise product data. The key objective of the project is to present rationalized
and “clean” enterprise product data that can be leveraged by applications downstream.
Listed below are the business drivers for the PDL initiative.
Currently the products and the data elements are not synchronized between GID and EW.
Multiple flows exist between the source systems and GID and EW, introducing redundancies
and inconsistencies in the data between GID, EW and other downstream systems, which
creates data quality challenges.
The multiple flows also create maintenance redundancies.
The current GID model is based on the off-the-shelf Street Books data model, which uses a
name-value-pairs design pattern that adds overhead to query retrieval.
Currently the websites that are the primary consumers of the data from GID have limited
search and navigation capabilities for analysis, owing to the lack of a consistent and
coherent product categorization.
Historical data retention is not consistent across products, leading to inconsistencies.
Responsibility:
As a Developer, my responsibilities included:
Mainly involved in ETL design and coding.
Analyzing the sources and targets, transforming the data, mapping the data and populating
it into the targets using Informatica Designer, Workflow Manager and Workflow
Monitor.
Creating and optimizing the mappings, sessions and workflows to improve load
performance.
Creating sessions, workflows and parameter files using UNIX for the corresponding
mappings.
Involved in preparing TSD documents.
Prepared Low Level Design documents and Unit Test Plans for the ETL jobs.
Achievements:
Awarded for best performance as a module lead; handled multiple source systems
effectively, delivered to the client successfully and received client appreciation for
great work.
Environment:
Oracle, Informatica, Teradata, Business Objects and Shell Scripting.
GTSS 11i Global Data Warehouse Project Implementation
Client Motorola U.S.A
Role Developer
Duration Sept-2005 – Sept-2006
Company L&T InfoTech
Project Description:
Global Telecom Solutions Sector (GTSS) is a Motorola project to extend its ERP business
from a local to a global sector; Motorola is a leader in the telecom sector worldwide.
The GTSS 11i Global Data Warehouse project is part of the GTSS program. Here, data
warehousing is used to extract data from two instances, Oracle Apps 11.0.3 and Oracle
Apps 11.5.10 (11i). The scope of this project is to synchronize and upgrade the existing
Oracle Apps 11.0.3 instance to the 11i instance.
Responsibilities –
Involved in requirement analysis of the data in the Oracle Apps 11.0.3 instance down to
the column data type level, including analysis of the replaced columns and tables
compared with 11i.
Extensively involved in the detailed design of the data in the 11i Oracle Apps instance,
mainly covering the replaced column and table information along with the new data types
used in 11i.
Used Informatica as the ETL tool to extract data from the 11i and 11.0.3 Oracle
instances into the staging area, which was used to consolidate the data to be populated
into the data warehouse area.
Analyzed the mappings based on their functionality in 11.0.3 and created a design
document comparing that functionality with 11i; this design document lists the tables,
columns and data types in both instances.
Analyzed the existing 11.0.3 mapping code and generated code for the new 11i mappings.
Involved in populating data into the staging area.
Prepared test cases for the unit test document and tested every condition and new column
in the 11i mappings.
Analyzed the target tables in the staging area for the extract mappings and reflected the
target changes in the dependent load mappings in the data warehouse area.
Environment:
Oracle, Informatica and Shell Scripting.
Business Loan Information System Implementation
Client Respfree Loan Services, Atlanta
Role Developer
Duration March 2005 to Sept 2005
Company Adrian Technologies
Project Description:
The project is in banking and deals with the disbursal of loan amounts for various purposes,
such as personal loans, vehicle loans, housing loans and consumer durable loans. The company
requires different levels of analysis regarding loan amounts, types of customers, types of
payment schedules, interest rates (variable or fixed), defaulter lists, penal interest
calculations, etc. The data mart captures data from their transactional database, maintained
under a client/server architecture.
Responsibility:
Used the Source Analyzer to load data from OLTP systems into the data mart.
Involved in data mart design and implementation.
Involved in extraction, transformation and loading of data.
Developed mapplets and reusable transformations to populate the data.
Used a star schema to populate data into the cube, using dimension and fact tables.
Created reports using Business Objects.
Created universes and queried data using Business Objects Designer.
Worked in slice-and-dice mode and drill mode to analyze report data.
Environment:
Oracle, Informatica, Business Objects and Shell Scripting.
Data Warehouse for Fleet Services Implementation
Client First Vehicle Services, Malaysia
Role Developer
Duration Sept-2004 – Feb-2005
Company Adrian Technologies
Project Description:
First Vehicle Services manages and maintains more than 30,000 vehicles for local
governments, school districts, airports, utilities and telecommunications companies. It
provides comprehensive preventive maintenance, repair and asset management services for a
complete range of vehicle and equipment types.
Responsibilities –
Extensively involved in data extraction, transformation and loading (the ETL process) from
source to target systems using Informatica PowerCenter.
Created the data mart and extracted data using Informatica PowerCenter.
Imported source/target tables from the respective databases and created reusable
transformations using the Informatica Designer tool set.
Created transformations, mapplets and mappings using the Informatica Designer tool set.
Used various transformations such as Expression, Filter, Joiner and Lookup to migrate
clean and consistent data.
Scheduled sessions to update the target data using the Informatica Server Manager.
Environment:
Oracle, Informatica and Shell Scripting.
Retail Business Intelligence System Implementation
Client 7ELEVEN, MALAYSIA
Role Developer
Duration Jan-2004 to Aug 2004
Company Adrian Technologies
Project Description:
This project is for a retail business covering consumer products, especially FMCG products.
The data warehouse was designed for sales analysis, helping decision-makers analyze
sales patterns using a reporting tool. Performed ETL operations using Informatica to load
data into the data warehouse, and developed reports using Cognos for multidimensional
analysis.
Responsibility:
Created Informatica mappings to implement the business rules for loading data. Most of
the common transformations were used, such as Source Qualifier, Aggregator, Lookup,
Filter and Sequence.
Used a star schema to populate data into the cube, using dimension and fact tables.
Worked on filters, conditions and formatting of Impromptu reports.
Created HotFiles from Impromptu reports.
Arranged dimensions and measures in the cube as per requirements.
Environment:
Oracle, Informatica, Cognos and Shell Scripting.
Educational Background:
Masters:
Name of Board/University: Andhra University
Name of Degree: Master of Computer Applications
Branch: Computer Applications
Class Obtained: First
Year of Passing: 2003
Bachelors:
Name of College: MPR Degree College
Name of Board/University: Andhra University
Name of Degree: B.Sc. in Electronics
Class Obtained: First
Year of Passing: 2000