Resume_of_Vasudevan - Hadoop

V A S U D E V A N V E N K A T R A M A N
Bellandur Outer Ring Road, Bangalore – 560103. India.
Voice: +91-98809 38525  Email: vasudevan.venkatraman@gmail.com
PROFILE SUMMARY
Vasudevan Venkatraman has been working in the Information Technology industry for the past 11+
years, which include 7+ years in Oracle PL/SQL and data warehousing, 3+ years in performance
consulting and applications DBA work, and 2 years in Big Data technologies. He has been designing
and developing applications using Oracle PL/SQL and Hadoop, and has rich experience in
understanding, analyzing and implementing business process requirements.
PROFESSIONAL SKILLS
 Good knowledge of the Hadoop framework, its architecture and Big Data concepts.
 Worked on data warehousing projects using Oracle and Hadoop.
 Experience in creating databases, tables and views using HiveQL, Impala and Pig.
 Very good knowledge of Oracle memory architecture and data warehouse architecture.
 In-depth knowledge of constructing triggers, packages, collections, functions and procedures.
 Worked on data loading using SQL*Loader, Data Pump, external tables and Sqoop.
 Worked on materialized views, partitioning, bucketing, parallel execution and job scheduling.
 Exposure to ASM, RAC, and disk and file storage systems.
 Created and monitored tablespaces, including user, undo and temporary tablespaces.
 Performance tuning using AWR, EXPLAIN PLAN, TKPROF and Autotrace.
SKILL SET
Software Development : SQL, PL/SQL, Core Java
Performance Tools : AWR, ASH, TKPROF, Autotrace, Explain Plan, iostat, vmstat, topas
Big data/Hadoop : HDFS, MapReduce, Hive, Pig, HBase and Sqoop
RDBMS : Oracle 11g, HP Neoview
BIDW : Business Intelligence, Data Warehousing and ETL concepts
Domain : CPG – Retail, Banking
EDUCATION
Master of Computer Applications, Madurai Kamaraj University, 1999 – 2002
Bachelor of Science, Gandhigram Rural University, 1995 – 1998
PROFESSIONAL EXPERIENCE
Assistant Consultant
Feb 2015 – Present, TCS
 Offshore Tech Lead for
1. UK-based investment bank
• Working on a data warehousing application.
• Implemented proofs of concept on the Hadoop stack and various big data analytics tools, including migration from an Oracle database to Hadoop.
• Loaded and transformed large sets of structured, semi-structured and unstructured data using Hadoop ecosystem components.
• Experience in working with different data sources such as flat files, XML files and databases.
• Worked on partitioning and bucketing in Hive to optimize performance (see the sketch after this list).
• Preprocessed data sets using Pig.
• Transferred data between Oracle and HDFS using Sqoop.
• Implemented Oracle slowly changing dimension (SCD) techniques for new requirements.
• Tuned batch processes and time- and CPU-intensive SQL queries.
• Developed automated unit test stubs using utPLSQL to test transformations through continuous integration.
• Used explain plans, Oracle hints and new indexes to improve the performance of SQL statements.
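A minimal HiveQL sketch of the partitioning and bucketing pattern referred to above; the table and column names are hypothetical and chosen only for illustration.

    -- Partitioned, bucketed ORC table; names are illustrative only.
    CREATE TABLE trade_fact (
        trade_id     BIGINT,
        account_id   BIGINT,
        trade_amount DECIMAL(18,2)
    )
    PARTITIONED BY (trade_date STRING)
    CLUSTERED BY (account_id) INTO 32 BUCKETS
    STORED AS ORC;

    -- Dynamic-partition load from a staging table (e.g. one populated via Sqoop).
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE trade_fact PARTITION (trade_date)
    SELECT trade_id, account_id, trade_amount, trade_date
    FROM trade_stage;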
Technology Lead
Jan 2011 – Jul 2014, Infosys Limited
 PL/SQL Developer for
1. Auchan Retail, France
Responsibilities included:
• Held discussions with the Auchan IT team to clarify requirements.
• Performed source system analysis (SSA) to identify the source data that needed to be moved into the target tables.
• Conducted performance reviews of queries.
 Performance Consultant for:
1. Mifel Bank, Mexico
2. National Commercial Bank, Jamaica
Both projects were onsite engagements for performance monitoring and analysis of the banks' production servers. Responsibilities included:
• Preliminary discussions with the bank, preparing the Statement of Work (SOW) and project plan.
• Validating the OS (AIX 6.3), application (Finacle) and database (Oracle 11g) level parameters based on the current load profile of the bank's systems.
• Performance tuning and resolution of specific issues raised by the bank.
 Table Partitioning activity for:
1. National Commercial Bank, Jamaica
2. Bank of Baroda, Mumbai
Both projects were onsite engagements for table partitioning. Responsibilities included:
• Preliminary discussions with the bank, preparing the Statement of Work (SOW) and project plan.
• Arriving at a strategy and methodology for partitioning tables and related indexes (see the sketch after this list).
• Redesigning tables with partitioned tables and partitioned indexes to make the database faster and easier to maintain.
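A minimal Oracle sketch of the table and index partitioning pattern mentioned above; the table, column and partition names are hypothetical and the range boundaries are illustrative only.

    -- Range-partitioned history table; names and boundaries are illustrative.
    CREATE TABLE txn_history (
        txn_id   NUMBER,
        acct_id  NUMBER,
        txn_date DATE,
        amount   NUMBER(18,2)
    )
    PARTITION BY RANGE (txn_date) (
        PARTITION p_2013 VALUES LESS THAN (TO_DATE('01-01-2014','DD-MM-YYYY')),
        PARTITION p_2014 VALUES LESS THAN (TO_DATE('01-01-2015','DD-MM-YYYY')),
        PARTITION p_max  VALUES LESS THAN (MAXVALUE)
    );

    -- Local index, so each index partition is maintained with its table partition.
    CREATE INDEX idx_txn_acct ON txn_history (acct_id) LOCAL;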
Senior Software Engineer, Hewlett Packard
Mar 2008 – Jan 2011
 Implementation of an Enterprise-Wide Data Warehouse for Bank of Baroda
Bank of Baroda is embarking on a significant business transformation aimed at enhancing
efficiency, productivity and competitiveness by adopting the latest business processes and
technology; towards this end, it is implementing its Technology-Enabled Business and IT
Strategy project.
As part of the transformation process, the Bank requires a powerful and comprehensive
performance and risk management and executive information system that will enable decision
support and profitability evaluation across multiple dimensions. In addition, the Bank is looking
at solution components to enable budgeting and planning, and market and liquidity risk
measurement, while paving the way for an objective decision-making process.
This requires the implementation of an enterprise-wide data warehouse with a proven data
model that can consolidate data across the Bank for strategic and tactical decision making,
together with advanced analytical engines that enable micro-level customer, account and
aggregate analysis.
• Worked on the logical and physical design of the data warehouse application.
• Worked on advanced Oracle ETL concepts.
• Loaded and processed bulk data using SQL*Loader.
• SQL and PL/SQL tuning.
• Worked on large data sets.
• Used bulk collections for better performance and easier retrieval of data by reducing context switching between the SQL and PL/SQL engines (see the sketch after this list).
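A minimal PL/SQL sketch of the bulk-collection pattern referred to above; the source and target table names (src_accounts, dw_accounts) are hypothetical and assumed to share the same row structure.

    -- Batched BULK COLLECT plus FORALL to reduce SQL/PL-SQL context switches.
    DECLARE
        TYPE t_rows IS TABLE OF src_accounts%ROWTYPE;
        l_rows t_rows;
        CURSOR c_src IS SELECT * FROM src_accounts;
    BEGIN
        OPEN c_src;
        LOOP
            FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
            EXIT WHEN l_rows.COUNT = 0;
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO dw_accounts VALUES l_rows(i);
            COMMIT;
        END LOOP;
        CLOSE c_src;
    END;
    /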
 Migration from Oracle to HP Neoview for Canon, Singapore
The Canon migration project involved replacing the existing Oracle data warehouse with the HP
Neoview system. It consisted of migrating the Oracle procedures, the corresponding DDLs, other core
database objects, the BO reports and the data loads to the Neoview environment, and
redesigning/developing Neoview Java stored procedures equivalent in functionality to the
existing Oracle stored procedures.
• Redesigned/developed Neoview Java stored procedures and applications.
• Performance tuning of the SQL written in the Java stored procedures.
• Reviewed other core database conversions.
• Interacted with users to arrive at better solutions.
• Supported users during the UAT phase.
 Graphical interfaces in Supplier Performance Management Portal
The AL SPM graphical enhancement project was executed for Ashok Leyland and implemented
graphical interfaces to analyze captured supplier metrics data and enable the AL sourcing team
to make the right business decisions. The graphical interface included Pareto charts to analyze
the share of the top 20% of suppliers in overall supplier spend, spend performance analysis to
compare spend against supplier performance, and linear trend analysis to project the future
trend of supplier metrics from past data.
To support this, the database was designed to hold business data in de-normalized form for
better performance, and the business logic to render the graphs was written in SQL stored
procedures. The supplier data available from the AL ERP database was parsed and analyzed, and
performance data were calculated. This involved writing a few SQL functions to calculate the
cumulative spend value and cumulative spend percentage for generating the Pareto charts, as
sketched below.
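A minimal SQL sketch of the cumulative-spend calculation described above, using analytic functions; the table and column names are hypothetical.

    -- Cumulative spend and cumulative spend percentage per supplier for a Pareto chart.
    SELECT supplier_id,
           spend,
           SUM(spend) OVER (ORDER BY spend DESC) AS cum_spend,
           ROUND(100 * SUM(spend) OVER (ORDER BY spend DESC)
                     / SUM(spend) OVER (), 2)    AS cum_spend_pct
    FROM supplier_spend
    ORDER BY spend DESC;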
• Performed database design related to changes in the existing Portal.
• Performed PL/SQL coding in the application
• Deployed the application and provided support during testing
• Prepared unit and system integration test documents and performed the testing.
Software Engineer, Fidelity Investments
Jan 2007 – Nov 2007
 Portfolio Engine – Data Maintenance for Bloomberg
The Portfolio Engine was built to replace the rules-based Trade Review Engine (TRE). TRE is
limited to instruments contained in specific targets, whereas PE performs portfolio optimization
for its Private Portfolio Services (PPS) accounts based on the most appropriate selections. SAI
Data Maintenance functionality was created to support the Portfolio Engine requirements.
• Analyzed the proposed web front-end screen design.
• Performed database design related to changes in the web front-end screen (using Java).
• Performed PL/SQL coding in the application.
• Deployed the application and provided support during testing.
• Prepared and performed unit and system integration testing.
Associate IT-Consultant, ITC Infotech
May 2005 – Sep 2006
 Support Consultant for Custom Data Warehouse Product V3 for British American Tobacco
The aim of V3 is to deliver to BAT a maintainable, integrated, scalable Customer Relationship
Management (CRM) solution supporting best-practice processes in Distribution, Trade
Marketing and Account Management. V3 not only supports the field activities associated with
field sales representatives, it also provides extensive support for the Operational Planning
process that underpins these activities. The V3 Common Functional Base provides the central
elements required by any market implementing V3, irrespective of whether it is a
Trade Marketing or Distribution-focused market.
• Worked on the nightly refresh mechanism to populate data into the data warehouse.
• Worked on triggers and materialized views to refresh data (see the sketch after this list).
• Involved in development as well as support.
• Worked on more than five release versions of the product.
• Fixed issues raised by the client within the stipulated SLA.
• Extensively used bulk collection in PL/SQL objects to improve performance.
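A minimal Oracle sketch of a materialized view of the kind refreshed by such a nightly mechanism; the view, table and column names are hypothetical.

    -- Aggregate materialized view refreshed on demand by the nightly job.
    CREATE MATERIALIZED VIEW mv_daily_sales
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT outlet_id, trade_date, SUM(sales_value) AS total_sales
    FROM sales_fact
    GROUP BY outlet_id, trade_date;

    -- Complete refresh, typically invoked from the scheduled nightly job.
    BEGIN
        DBMS_MVIEW.REFRESH('MV_DAILY_SALES', method => 'C');
    END;
    /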
Trainee Programmer, Cherrysoft Technologies
May 2004 – Feb 2005
 Developing an ERP Package for SPIC Pharmaceuticals, Chennai
Solara is an integrated materials management package addressing the needs of the manufacturing
process in a pharmaceutical company. All the major functions, from product requests to material
dispatch and from internal transactions to accounting, are taken care of in this package.
Seamless integration with the Internet is possible with this package.
• Involved in coding and support for the system.
• Prepared and executed unit test cases for the system.
• Coordinated with the onsite team.