Amit Kumar
Phone: +91 7411902268 | Email: amitk.jss@gmail.com
CAREER OBJECTIVE
To obtain a position in a highly esteemed company that gives me a platform to use my
expertise and skills for the mutual growth and benefit of the company and myself.
CAREER PROFILE
An accomplished technical professional with 3 years 9 months of experience across the
entire software development life cycle: requirements gathering and analysis, design,
development and testing. Deep, proven experience in Spark, MapReduce, Core Java, the
Apache Hadoop framework and various other tools.
Hands-on programming skills in Scala, Apache Spark, MapReduce and Java. Created a Data
Ingestion Framework for ingesting data into AWS S3 (Hive) using Spark, Scala, Sqoop
and AWS EMR.
Hands-on experience writing object-oriented, maintainable and scalable Model-View-Controller
code that incorporates generics and collections, multithreading and exception handling.
Experience with Java profilers and memory analysis.
Good knowledge of verification and validation, the DO-178B standard (safety-critical
aerospace process) and the Agile (Scrum) methodology.
Experienced in configuration management.
Expertise in problem solving and defect tracking using tools such as IBM ClearQuest and Rally.
Experience with GitHub, Bitbucket, IBM Rational ClearCase and IBM Rational DOORS.
ACADEMIC PROFILE
Examination | Board/University | Name of School/College               | Year of passing | Percentage
B.E         | VTU              | JSS Academy of Technical Education   | 2012            | 70.10%
XII         | CBSE             | St Paul Secondary School, Bihar      | 2007            | 74.80%
X           | CBSE             | Indian Public School, Hajipur, Bihar | 2005            | 88.00%
WORK EXPERIENCE
Project Engineer, Wipro Technologies, Bangalore (28 January 2013 – 15 April 2016)
Associate, Cognizant Technology Solutions (18 April 2016 – Till Date)
TECHNICAL SKILLS
Languages: Spark, Scala, Core Java, Hadoop, MapReduce, YARN, Hive, Sqoop, JavaScript,
SQL, Page Description Language (PDL), CSS
IDE: Eclipse, Source Insight, IntelliJ
Operating System Environment: Windows family, Linux, Ubuntu
Source Control Tools: Bitbucket, Git, IBM Rational ClearCase, ClearQuest, DOORS
Web Server: Apache Tomcat
PROJECT DETAILS
1. CUSTOMER360
Client: La Quinta
Organization: Cognizant Technology Solutions
Duration: May 2016 – Till Date
Technologies: EMR4.7.2, Apache Spark, Scala, Apache Hadoop 2.7 Framework
Role: Developer
Synopsis: Customer360 involves data ingestion from five different sources
(Clairvoyix, Medallia, Epsilon, LQDWH, NVCorp) into AWS S3, where Hive external
tables are created so business users can run queries on them. It also involves creating
a Golden Record for each customer and storing it in another Hive table. Spark is used
as the processing engine. Every day, at a scheduled time, AWS Lambda triggers an
EMR cluster and AWS Steps are executed for the data ingestion and Golden Record
creation.
Responsibilities:
Created a Data Quality Framework in Spark and Scala for checking the quality
(header, null, blank, integer and date-format checks) of each record in a file.
Created a Spark job for building golden records from the ingested data and
storing them in a separate Hive table.
Created a Scala-based tool that reads columns and data types from an Excel sheet and
creates a properties file containing the Hive DDL/DML schema, which is used in
Spark jobs for creating Hive tables and reading data from the existing tables.
Involved in writing AWS Lambda functions that trigger a time-based EMR
cluster for ingesting the data; on successful data ingestion the EMR cluster is
terminated.
Involved in writing various shell scripts used by AWS Steps for triggering
Oozie jobs for data ingestion.
Created an Oozie workflow that triggers shell, Spark, Sqoop and Hive
actions for data ingestion.
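The record-level quality checks named above (null, blank, integer, date format) can be sketched in plain Java. The production framework ran these checks per record inside Spark/Scala, so this is only an illustrative stand-in; the field names and the yyyy-MM-dd date format are assumptions.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

// Illustrative record-level data quality checks. A record passes only if
// every field-level check passes; failing records would be routed to a
// reject file in the real pipeline.
public class RecordValidator {
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd"); // assumed date format

    public static boolean isNonBlank(String v) {
        return v != null && !v.trim().isEmpty();
    }

    public static boolean isInteger(String v) {
        if (!isNonBlank(v)) return false;
        try { Integer.parseInt(v.trim()); return true; }
        catch (NumberFormatException e) { return false; }
    }

    public static boolean isDate(String v) {
        if (!isNonBlank(v)) return false;
        try { LocalDate.parse(v.trim(), FMT); return true; }
        catch (DateTimeParseException e) { return false; }
    }

    // Hypothetical customer-record fields: id, visit count, check-in date.
    public static boolean isValid(String id, String visits, String checkIn) {
        return isNonBlank(id) && isInteger(visits) && isDate(checkIn);
    }

    public static void main(String[] args) {
        System.out.println(isValid("C001", "3", "2016-05-01"));   // well-formed record
        System.out.println(isValid("C002", "abc", "2016-05-01")); // bad integer field
    }
}
```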
2. Fill the Lake (FTL)
Client: CapitalOne
Organization: Wipro Technologies
Duration: March 2015 – 15th April 2016
Technologies: CDH5.4, Apache Hadoop 2.6 Framework
Role: Developer
Synopsis: FTL (Fill the Lake) involves data migration from Teradata to Hadoop so that
business users can run queries on it (Hive, Impala, etc.). FTL comprises many
individual frameworks which perform data transformations and data filtering based on
application schemas.
Responsibilities:
Worked on the development of multiple frameworks involving FB (File
Broker), DQ (Data Quality) checks, the Integration Keys Framework, the Split
Framework and BDQ validations.
Developed MapReduce jobs in Java using the Eclipse IDE.
Integrated the Parquet model with MapReduce for the Integration Keys
Framework.
Integrated the in-memory model and the Avro-Parquet model in the MapReduce framework.
Worked on evolving the MR frameworks to handle a wide range of data.
Worked on improving MR jobs to achieve optimized use of cluster resources.
Worked on the Data Registry for data-instance registration on the network.
Worked with both classic MapReduce and the YARN architecture.
Worked on different aspects of the MapReduce architecture and proposed the best
possible solution for the case study.
Developed Pig and Hive scripts for business data validations and for piping the
data to Hive based on the standardized schema.
Used Parquet and RC file formats for better performance in Hive and in storage.
Worked on Apache Spark and Scala.
Set up Hadoop in pseudo-distributed mode on a Linux machine.
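The MapReduce jobs described above all follow the same map, shuffle/group-by-key, reduce pattern. A minimal in-memory Java sketch of that pattern (not actual Hadoop API code, which extends Mapper/Reducer and runs on a cluster; word count is used purely as a stand-in example):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// In-memory illustration of the map -> shuffle -> reduce pattern that a
// Hadoop MapReduce job implements at cluster scale.
public class MiniMapReduce {
    public static Map<String, Long> wordCount(String... lines) {
        return Arrays.stream(lines)
                // map phase: emit one key per word in each input line
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // shuffle + reduce: group identical keys, then sum their counts
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(wordCount("hive impala hive", "spark hive"));
    }
}
```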
3. Graph Conversion
Client: CapitalOne
Organization: Wipro Technologies
Duration: August 2015 – Till Date
Technologies: CDH5.4, Apache Hadoop 2.6 Framework,Cascading
Role: Developer
Synopsis: Graph Conversion involves converting Ab Initio graphs into XML and Java
code. The XML and Java code are run by Cascading to produce the same flow as the
Ab Initio graphs. This project mainly focuses on replicating the Ab Initio data flow
on the Hadoop ecosystem, using the data brought into Hadoop by FTL.
Responsibilities:
Responsible for converting the Ab Initio graphs into XML and Java code using the
BITWISE tool.
Responsible for replicating the Ab Initio logic in the Java code produced by the
BITWISE tool.
Responsible for validating the data produced.
4. CIDS_A350
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: September 2013 – Feb 2015
Technologies: Core Java, PDL, CITRIX connectivity, Eclipse, IBM Rational
Clear Quest, IBM Rational Clear Case(UCM), Doors
Role: Developer
Synopsis: The Cabin Intercommunication Data System (CIDS) is a microprocessor-based
system used to accomplish the functional control, operation, data transmission, testing and
monitoring of various cabin systems. The Flight Attendant Panel (FAP) is intended to be the
central front end for systems in the aircraft that do not have their own interface. The
FAP provides the flight attendants with status information and allows them to control the
connected systems. The FAP is equipped with an LCD touch screen, additional hard keys and card
slots.
Responsibilities:
Understood and gained a good amount of knowledge of the business requirements.
Analyzed and implemented major Problem Reports in the FAP software.
Executed the HSI test cases of the FAP software.
Analyzed and improved the HSI test cases for better verification and validation of
the FAP software.
Prepared the release notes and delivery-related artifacts (SQAR, SCMR, SVCP, etc.).
5. CIDS_A350_PANEL_IMPROVEMENT_PROJECT
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: 5 Months
Technologies: Core Java, PDL, CITRIX connectivity, Eclipse, IBM Rational
Clear Quest, IBM Rational Clear Case(UCM), Doors
Role: Developer
Synopsis: The Cabin Intercommunication Data System (CIDS) is a microprocessor-based
system used to accomplish the functional control, operation, data transmission, testing and
monitoring of various cabin systems. The Flight Attendant Panel (FAP) is intended to be the
central front end for systems in the aircraft that do not have their own interface. The
FAP provides the flight attendants with status information and allows them to control the
connected systems. The FAP is equipped with an LCD touch screen, additional hard keys and card
slots. This project involved increasing the performance of the widgets used in the FAP
software.
Responsibilities:
Understood and gained a good amount of knowledge of the business requirements.
Analyzed the existing implementation of the widgets (tactile ScrollBar, ComboBox,
NumericalKeyPad, FolderButton, etc.) and reimplemented them in an optimized way,
thereby increasing the overall performance of the FAP.
Developed test cases for measuring the performance of the widgets.
Used a Java profiler tool (VisualVM) for measuring the performance of the
widgets (CPU usage, heap memory usage, timestamps).
Executed the test cases and generated test results for both the old and new
implementations of the widgets.
Prepared the release notes and delivery-related artifacts (SQAR, SCMR, SVCP, etc.).
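Heap and timing figures of the kind gathered with VisualVM can also be read programmatically via java.lang.management, which is one way to build repeatable performance test cases. A small sketch; the workload below is a hypothetical placeholder for rendering a widget, not the actual FAP code:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Reads the same kinds of numbers (heap usage, elapsed time) that a
// profiler such as VisualVM displays interactively.
public class PerfProbe {
    public static long heapUsedBytes() {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        return mem.getHeapMemoryUsage().getUsed();
    }

    public static long timeMillis(Runnable workload) {
        long start = System.nanoTime();
        workload.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long before = heapUsedBytes();
        long elapsed = timeMillis(() -> {
            // placeholder workload standing in for a widget redraw
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 100_000; i++) sb.append('x');
        });
        System.out.println("heap used: " + before + " bytes, elapsed: " + elapsed + " ms");
    }
}
```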
6. CCC+ tool – A318, A340 and A330F
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: 4 Months
Technologies: Core Java, CITRIX connectivity, Eclipse, IBM Rational
Clear Quest, IBM Rational Clear Case(UCM), Doors,
Microsoft SQL Server 2000
Role: Developer
Synopsis: CCC+ is a tool used for generating the CAM (Cabin Assignment
Module) for A318, A340 and A330F aircraft. CCC+ is a Swing-based desktop application
developed and maintained by Wipro. It uses Microsoft SQL Server 2000 as the database
for storing the CAM data, and it generates the binary that is an input to the CAM
module of the CIDS software. The tool also uses a tool called "Family Creator", which
creates a new aircraft family in the database of the Central Sites from an existing
aircraft and its families.
Responsibilities:
Analyzed and implemented Change Requests in the CAM of the aircraft.
Created the input files (offset.txt, .cdx and .cmx files) for the Family Creator.
Achievements
Formally recognized multiple times by management, through the "Feather In My Cap" and
"PES Shining Star" awards, for the ownership and commitment shown in delivering
project outcomes.
Received client appreciation for resolving issues quickly.
Training and Internal Certifications
Completed Wipro Technologies' three-month Java/J2EE training program.
Earned the Wipro Technologies Java-J2EE 1.1 and Core Java 2.1 certifications.
PERSONAL INFORMATION
Date of Birth: 27-03-1990
Nationality: Indian
Languages known: English & Hindi
DECLARATION:
I hereby declare that all particulars are true and correct to the best of my knowledge and belief.
(Amit Kumar)