Amit Kumar
Phone (Mob): +91 7411902268 | Email: amitk.jss@gmail.com
CAREER OBJECTIVE
To obtain a position in an esteemed company that gives me a platform to use my
expertise and skills for the mutual growth and benefit of the company and myself.
CAREER PROFILE
 An accomplished technical professional with 3 years 9 months of experience, having worked
throughout the entire software development life cycle: requirements gathering/analysis,
design, development and testing. Deep, proven experience in Spark, MapReduce, Core Java,
the Apache Hadoop framework and various other tools.
 Hands-on programming skills in Scala, Apache Spark, MapReduce and Java. Created a Data
Ingestion Framework for ingesting data into AWS S3 (Hive) using Spark, Scala, Sqoop
and AWS EMR.
 Hands-on experience writing object-oriented, maintainable and scalable code in the
Model View Controller pattern, incorporating generics and collections, multithreading
and exception handling.
 Experience with Java profilers and memory analysis.
 Good knowledge of verification and validation, the DO-178B standard (safety-critical
aerospace process) and the Agile (Scrum) methodology.
 Experienced in configuration management.
 Expertise in problem solving and defect tracking using the defect-tracking tools
IBM ClearQuest and Rally.
 Experience with GitHub, Bitbucket, IBM Rational ClearCase and IBM Rational DOORS.
ACADEMIC PROFILE
Examination   Board/University   Name of School/College                 Year of passing   Percentage
B.E.          VTU                JSS Academy of Technical Education     2012              70.10%
XII           CBSE               St. Paul Secondary School, Bihar       2007              74.80%
X             CBSE               Indian Public School, Hajipur, Bihar   2005              88.00%
WORK EXPERIENCE
Project Engineer, Wipro Technologies, Bangalore (28th January 2013 – 15th April 2016)
Associate, Cognizant Technology Solutions (18th April 2016 – till date)
TECHNICAL SKILLS
Languages: Spark, Scala, Core Java, Hadoop, MapReduce, YARN, Hive, Sqoop, JavaScript, SQL, Page Description Language (PDL), CSS
IDEs: Eclipse, Source Insight, IntelliJ
Operating Systems: Windows family, Linux, Ubuntu
Source Control Tools: Bitbucket, Git, IBM Rational ClearCase, ClearQuest, DOORS
Web Server: Apache Tomcat
PROJECT DETAILS
1. CUSTOMER360
Client: LaQuinta
Organization: Cognizant Technology Solutions
Duration: May 2016 – Till Date
Technologies: EMR 4.7.2, Apache Spark, Scala, Apache Hadoop 2.7 framework
Role: Developer
Synopsis: Customer360 involves data ingestion from five different sources
(Clairvoyix, Medallia, Epsilon, LQDWH, NVCorp) into AWS S3, where Hive external
tables are created so that business users can run queries on the data. It also involves
creating a Golden Record for each customer and storing it in another Hive table. Spark
is used as the processing engine. Every day at a scheduled time, an AWS Lambda function
launches an EMR cluster, and EMR steps are executed for the data ingestion and Golden
Record creation; a sketch of this trigger follows.
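The daily trigger described above can be illustrated with a short sketch. This is not the project's actual code: the bucket, JAR path, class name, instance types and cluster sizing below are hypothetical placeholders, and the sketch assumes the AWS SDK for Java v1 (aws-java-sdk-emr) and aws-lambda-java-core on the classpath.

```scala
import com.amazonaws.services.lambda.runtime.{Context, RequestHandler}
import com.amazonaws.services.elasticmapreduce.AmazonElasticMapReduceClientBuilder
import com.amazonaws.services.elasticmapreduce.model._

// Hypothetical Lambda handler: launches a transient EMR cluster whose single
// step spark-submits the ingestion job; the cluster terminates once steps finish.
class IngestionTrigger extends RequestHandler[java.util.Map[String, String], String] {

  override def handleRequest(event: java.util.Map[String, String], ctx: Context): String = {
    val emr = AmazonElasticMapReduceClientBuilder.defaultClient()

    val sparkStep = new StepConfig()
      .withName("customer360-ingestion")
      .withActionOnFailure(ActionOnFailure.TERMINATE_CLUSTER)
      .withHadoopJarStep(new HadoopJarStepConfig()
        .withJar("command-runner.jar") // EMR's generic step runner
        .withArgs("spark-submit", "--class", "com.example.Ingestion",
                  "s3://example-bucket/jars/ingestion.jar"))

    val request = new RunJobFlowRequest()
      .withName("customer360-daily")
      .withReleaseLabel("emr-4.7.2")
      .withApplications(new Application().withName("Spark"))
      .withServiceRole("EMR_DefaultRole")
      .withJobFlowRole("EMR_EC2_DefaultRole")
      .withInstances(new JobFlowInstancesConfig()
        .withMasterInstanceType("m4.xlarge")
        .withSlaveInstanceType("m4.xlarge")
        .withInstanceCount(3)
        // false => the cluster shuts down as soon as all steps complete
        .withKeepJobFlowAliveWhenNoSteps(false))
      .withSteps(sparkStep)

    emr.runJobFlow(request).getJobFlowId
  }
}
```

A scheduled rule (the "scheduled time" above) would invoke this handler once a day.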
Responsibilities:
 Created a Data Quality Framework in Spark and Scala for checking the data quality
(header, null, blank, integer and date-format checks) of each record in a file;
a sketch of this kind of check appears after this list.
 Created a Spark job that builds the Golden Records from the ingested data and
stores them in a separate Hive table.
 Created a Scala-based tool that reads column names and data types from an Excel
sheet and generates a properties file containing the Hive DDL/DML schemas, which
Spark jobs use to create Hive tables and to read data from existing tables.
 Involved in writing AWS Lambda functions that launch a schedule-driven EMR
cluster for ingesting the data; on successful data ingestion the EMR cluster is
terminated.
 Involved in writing the shell scripts used by EMR steps to trigger the various
Oozie jobs for data ingestion.
 Created an Oozie workflow that triggers shell, Spark, Sqoop and Hive actions
for data ingestion.
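A minimal sketch of the kind of record-level check the Data Quality Framework performs, using RDD-style Spark (in line with the EMR 4.7.2-era Spark API). The paths, the pipe delimiter and the four-column schema are hypothetical, not the project's real feed layout.

```scala
import java.text.SimpleDateFormat
import scala.util.Try
import org.apache.spark.{SparkConf, SparkContext}

object DataQualitySketch {
  // Field-level checks: non-blank, parseable integer, strict date format.
  def nonBlank(s: String): Boolean = s != null && s.trim.nonEmpty
  def isInt(s: String): Boolean = Try(s.trim.toInt).isSuccess
  def isDate(s: String): Boolean = Try {
    val fmt = new SimpleDateFormat("yyyy-MM-dd")
    fmt.setLenient(false) // reject impossible dates like 2016-13-45
    fmt.parse(s.trim)
  }.isSuccess

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("dq-check"))

    // Hypothetical pipe-delimited feed: id|name|visits|checkin_date
    val lines   = sc.textFile("s3://example-bucket/raw/source1/*.dat")
    val header  = lines.first() // header check: drop the header line
    val records = lines.filter(_ != header).map(_.split("\\|", -1))

    // A record passes only if every field-level rule holds.
    def valid(f: Array[String]): Boolean =
      f.length == 4 && nonBlank(f(0)) && nonBlank(f(1)) && isInt(f(2)) && isDate(f(3))

    records.filter(valid).map(_.mkString("|"))
      .saveAsTextFile("s3://example-bucket/clean/source1")
    records.filter(f => !valid(f)).map(_.mkString("|"))
      .saveAsTextFile("s3://example-bucket/rejects/source1")

    sc.stop()
  }
}
```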
2. Fill the Lake (FTL)
Client: CapitalOne
Organization: Wipro Technologies
Duration: March 2015 – 15th April 2016
Technologies: CDH 5.4, Apache Hadoop 2.6 framework
Role: Developer
Synopsis: FTL (Fill the Lake) involves data migration from Teradata to Hadoop so that
business users can run queries on the data (via Hive, Impala, etc.). FTL comprises many
individual frameworks that perform data transformations and data filtering on the basis
of application schemas.
Responsibilities:
 Worked on the development of multiple frameworks covering FB (File Broker),
DQ (Data Quality) checks, the Integration Keys Framework, the Split Framework
and BDQ validations.
 Developed MapReduce jobs in Java using the Eclipse IDE; a minimal sketch of
such a job appears after this list.
 Integrated the Parquet model with MapReduce for the Integration Keys Framework.
 Integrated the In-Memory Model and the Avro-Parquet Model in the MapReduce framework.
 Worked on evolving the MR frameworks to handle a wide range of data.
 Worked on improving MR jobs to achieve optimized use of cluster resources.
 Worked on the Data Registry for data-instance registration on the network.
 Worked with both classic MapReduce and the YARN architecture.
 Worked on different aspects of the MapReduce architecture and proposed the best
possible solution for each case study.
 Developed Pig and Hive scripts for business data validations and for piping data
into Hive based on the standardized schema.
 Used Parquet and RC file formats for better performance and storage in Hive.
 Worked on Apache Spark and Scala.
 Set up Hadoop in pseudo-distributed mode on a Linux machine.
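The FTL jobs themselves were written in Java; purely to keep the sketches in this document in one language, below is a minimal map-only job against the same Hadoop 2.x MapReduce API, in Scala, showing the DQ/split style of filtering. The expected field count, paths and counter names are illustrative assumptions, not the project's real values.

```scala
import org.apache.hadoop.conf.Configured
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{LongWritable, NullWritable, Text}
import org.apache.hadoop.mapreduce.{Job, Mapper}
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import org.apache.hadoop.util.{Tool, ToolRunner}

// Map-only DQ filter: emit records with the expected field count,
// count rejects in a custom counter instead of emitting them.
class FilterMapper extends Mapper[LongWritable, Text, Text, NullWritable] {
  private val ExpectedFields = 12 // hypothetical schema width

  override def map(key: LongWritable, value: Text,
                   ctx: Mapper[LongWritable, Text, Text, NullWritable]#Context): Unit =
    if (value.toString.split("\\|", -1).length == ExpectedFields)
      ctx.write(value, NullWritable.get)
    else
      ctx.getCounter("DQ", "REJECTED").increment(1L)
}

object FilterJob extends Configured with Tool {
  override def run(args: Array[String]): Int = {
    val job = Job.getInstance(getConf, "dq-filter")
    job.setJarByClass(getClass)
    job.setMapperClass(classOf[FilterMapper])
    job.setNumReduceTasks(0) // map-only job
    job.setOutputKeyClass(classOf[Text])
    job.setOutputValueClass(classOf[NullWritable])
    FileInputFormat.addInputPath(job, new Path(args(0)))
    FileOutputFormat.setOutputPath(job, new Path(args(1)))
    if (job.waitForCompletion(true)) 0 else 1
  }

  def main(args: Array[String]): Unit =
    System.exit(ToolRunner.run(FilterJob, args))
}
```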
3. Graph Conversion
Client: CapitalOne
Organization: Wipro Technologies
Duration: August 2015 – Till Date
Technologies: CDH 5.4, Apache Hadoop 2.6 framework, Cascading
Role: Developer
Synopsis: Graph Conversion involves converting Ab-Initio graphs into XML and Java code,
which Cascading then runs to reproduce the same data flow as the original Ab-Initio
graphs. The project focuses on replicating the Ab-Initio data flow on the Hadoop
ecosystem, using the data brought into Hadoop by FTL.
Responsibilities:
 Responsible for converting the Ab-Initio graphs into XML and Java code using the
BITWISE tool; a minimal Cascading flow is sketched after this list.
 Responsible for replicating the Ab-Initio logic in the Java code produced by the
BITWISE tool.
 Responsible for validating the data produced.
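Cascading drives the converted graphs; the shape of such a flow can be sketched as below on the Cascading 2.x Hadoop API. This is a hypothetical straight copy from source to sink, standing in for the generated graph logic, with placeholder HDFS paths.

```scala
import java.util.Properties
import cascading.flow.FlowDef
import cascading.flow.hadoop.HadoopFlowConnector
import cascading.pipe.Pipe
import cascading.property.AppProps
import cascading.scheme.hadoop.TextLine
import cascading.tap.hadoop.Hfs

object CopyFlow {
  def main(args: Array[String]): Unit = {
    // Source and sink taps over HDFS text files (paths are placeholders).
    val source = new Hfs(new TextLine(), "hdfs:///data/ftl/input")
    val sink   = new Hfs(new TextLine(), "hdfs:///data/ftl/output")

    // A generated graph would chain Each/GroupBy/Every operations here;
    // this sketch just moves tuples from source to sink unchanged.
    val pipe = new Pipe("copy")

    val flowDef = FlowDef.flowDef()
      .addSource(pipe, source)
      .addTailSink(pipe, sink)

    val properties = new Properties()
    AppProps.setApplicationJarClass(properties, CopyFlow.getClass)
    new HadoopFlowConnector(properties).connect(flowDef).complete()
  }
}
```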
4. CIDS_A350
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: September 2013 – Feb 2015
Technologies: Core Java, PDL, CITRIX connectivity, Eclipse, IBM Rational
ClearQuest, IBM Rational ClearCase (UCM), DOORS
Role: Developer
Synopsis: The Cabin Intercommunication Data System (CIDS) is a microprocessor-based
system used to accomplish the functional control, operation, data transmission, testing and
monitoring of various cabin systems. The Flight Attendant Panel (FAP) is intended to be the
central front end for systems in the aircraft that do not have their own interface. The
FAP provides the flight attendants with status information and allows them to control the
connected systems. The FAP is equipped with an LCD touch screen, additional hard keys and
card slots.
Responsibilities:
 Understood and gained a good amount of knowledge of the business requirements.
 Analyzed and implemented major Problem Reports in the FAP software.
 Executed the HSI test cases of the FAP software.
 Analyzed and improved the HSI test cases for better verification and validation of
the FAP software.
 Prepared the release notes and delivery-related artifacts (SQAR, SCMR, SVCP, etc.).
5. CIDS_A350_PANEL_IMPROVEMENT_PROJECT
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: 5 Months
Technologies: Core Java, PDL, CITRIX connectivity, Eclipse, IBM Rational
ClearQuest, IBM Rational ClearCase (UCM), DOORS
Role: Developer
Synopsis: The Cabin Intercommunication Data System (CIDS) is a microprocessor-based
system used to accomplish the functional control, operation, data transmission, testing and
monitoring of various cabin systems. The Flight Attendant Panel (FAP) is intended to be the
central front end for systems in the aircraft that do not have their own interface. The
FAP provides the flight attendants with status information and allows them to control the
connected systems. The FAP is equipped with an LCD touch screen, additional hard keys and
card slots. This project involved increasing the performance of the widgets used in the
FAP software.
Responsibilities:
 Understood and gained a good amount of knowledge of the business requirements.
 Analyzed the existing implementation of the widgets (Tactile ScrollBar, ComboBox,
NumericalKeyPad, FolderButton, etc.) and reimplemented them in a new, optimized way,
thereby increasing the overall performance of the FAP.
 Developed test cases for measuring the performance of the widgets.
 Used a Java profiler (VisualVM) to measure the performance of the widgets
(CPU usage, heap memory usage, timestamps); a small measurement-harness sketch
appears after this list.
 Executed the test cases and generated the test results for both the old and the new
implementation of the widgets.
 Prepared the release notes and delivery-related artifacts (SQAR, SCMR, SVCP, etc.).
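VisualVM itself is a GUI profiler, but the metrics the widget comparison relied on (CPU time, heap usage, timestamps) can also be sampled in code through the standard java.lang.management API. Below is a small hypothetical harness in Scala (the FAP code itself was Java); the workload is a placeholder standing in for a widget repaint or layout call.

```scala
import java.lang.management.ManagementFactory

object WidgetPerfHarness {
  // Run `op` n times and report wall-clock time, thread CPU time and heap delta.
  def measure(label: String, n: Int)(op: => Unit): Unit = {
    val threads = ManagementFactory.getThreadMXBean
    val memory  = ManagementFactory.getMemoryMXBean

    val heapBefore = memory.getHeapMemoryUsage.getUsed
    val cpuBefore  = threads.getCurrentThreadCpuTime // nanoseconds
    val wallBefore = System.nanoTime()

    (1 to n).foreach(_ => op)

    val wallMs = (System.nanoTime() - wallBefore) / 1e6
    val cpuMs  = (threads.getCurrentThreadCpuTime - cpuBefore) / 1e6
    val heapKb = (memory.getHeapMemoryUsage.getUsed - heapBefore) / 1024

    println(f"$label%-20s wall=$wallMs%.1f ms  cpu=$cpuMs%.1f ms  heap delta=$heapKb KB")
  }

  def main(args: Array[String]): Unit =
    // Placeholder workload standing in for a widget repaint/layout call.
    measure("scrollbar-redraw", 1000) {
      val sb = new StringBuilder
      (1 to 500).foreach(i => sb.append(i))
    }
}
```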
6. CCC+ tool - A318, A340 and A330F
Client: AIRBUS, Germany
Organization: Wipro Technologies
Duration: 4 Months
Technologies: Core Java, CITRIX connectivity, Eclipse, IBM Rational
ClearQuest, IBM Rational ClearCase (UCM), DOORS,
Microsoft SQL Server 2000
Role: Developer
Synopsis: CCC+ is a tool used for generating the CAM (Cabin Assignment Module) for the
A318, A340 and A330F aircraft. CCC+ is a Swing-based desktop application developed and
maintained by Wipro. The tool uses Microsoft SQL Server 2000 as the database for storing
the CAM data and generates the binary that serves as input to the CAM module of the CIDS
software. It also uses a tool called "Family Creator", which creates a new aircraft family
in the database of the Central Sites from an existing aircraft and its families.
Responsibilities:
 Analyzed and implemented the Change Requests in the CAM of the aircraft.
 Created the input files (offset.txt, .cdx and .cmx files) for the Family Creator.
Achievements
 Formally recognized multiple times by management, through the "Feather In My Cap" and
"PES Shining Star" awards, for the ownership and commitment shown in delivering
project outcomes.
 Received client appreciation for resolving issues quickly.
Training and Internal Certifications
 Completed Wipro Technologies' three-month Java/J2EE training program.
 Earned the Wipro Technologies Java-J2EE 1.1 and Core-Java 2.1 internal certifications.
PERSONAL INFORMATION
Date of Birth: 27-03-1990
Nationality: Indian
Languages known: English & Hindi
DECLARATION:
I hereby declare that all particulars are true and correct to the best of my knowledge and belief.
(Amit Kumar)