Hassan Qureshi
Hadoop Lead Developer
hhquresh@gmail.com
PROFESSIONAL SUMMARY:
• Certified Java programmer with 9+ years of IT experience, including several years in Big Data technologies.
• Currently a researcher, developer, and technical lead of a data engineering team that works with data scientists to develop insights.
• Good exposure to production-environment processes such as change management, incident management, and escalation management.
• Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, and Flume, and knowledge of the Mapper/Reducer/HDFS framework.
• Hands-on experience installing, configuring, maintaining, monitoring, performance-tuning, and troubleshooting Hadoop clusters in development, test, and production environments.
• Defined file system layouts and data set permissions.
• Monitored local file system disk space usage and log files, and cleaned log files with automated scripts.
• Extensive knowledge of front-end technologies such as HTML, CSS, and JavaScript.
• Good working knowledge of OOA and OOD using UML and of designing use cases.
• Good communication skills, strong work ethic, and the ability to work efficiently in a team, with good leadership skills.
TECHNICAL SKILLS:
Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, HBase, MongoDB, Flume, ZooKeeper, Oozie
Operating Systems: Windows, Ubuntu, Red Hat Linux, Linux, UNIX
Java Technologies: Java, J2EE, JDBC, JavaScript, SQL, PL/SQL
Programming/Scripting Languages: Java, SQL, UNIX shell scripting, C, Python
Databases: MS SQL Server, MySQL, Oracle, MS Access
Middleware: WebSphere, TIBCO
IDEs & Utilities: Eclipse, JCreator, NetBeans
Protocols: TCP/IP, HTTP, HTTPS
Testing: Quality Center, WinRunner, LoadRunner, QTP
Frameworks: Hadoop, PySpark, Cassandra
PROFESSIONAL EXPERIENCE
Fortinet, New York, NY Nov 2013 – Present
Hadoop Lead
Project Details:
Fortinet is a company that provides networking software, including IP communicator phone software. The IP enhancement project is a post-sales project. Fortinet devices, especially routers and switches, send XML files to a centralized server daily; the XML files contain information about each device's location and the day's activities. All XML files must be processed to derive device health and usage. XML makes up about 20 percent of the overall data, with the remainder coming from RDBMS sources and flat files.
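For illustration, a minimal MapReduce mapper for this kind of XML parsing might look like the following Java sketch; the class name and the XML tag names (deviceId, health) are assumptions, not the project's actual code:

  // Illustrative sketch only: parses one XML document per input record
  // and emits (deviceId, health); tag names are assumed.
  import java.io.IOException;
  import java.io.StringReader;
  import javax.xml.parsers.DocumentBuilderFactory;
  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.w3c.dom.Document;
  import org.xml.sax.InputSource;

  public class DeviceXmlMapper extends Mapper<LongWritable, Text, Text, Text> {
      @Override
      protected void map(LongWritable key, Text value, Context context)
              throws IOException, InterruptedException {
          try {
              Document doc = DocumentBuilderFactory.newInstance()
                      .newDocumentBuilder()
                      .parse(new InputSource(new StringReader(value.toString())));
              String deviceId = doc.getElementsByTagName("deviceId").item(0).getTextContent();
              String health = doc.getElementsByTagName("health").item(0).getTextContent();
              context.write(new Text(deviceId), new Text(health));
          } catch (Exception e) {
              // Count and skip malformed records rather than failing the job
              context.getCounter("xml", "malformed").increment(1);
          }
      }
  }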
Frameworks and tools used:
• HDP 2.3 distribution for the development cluster
• Hadoop ecosystem components (Hive, MapReduce) to process data
Contribution:
• Wrote MapReduce jobs to process XML and flat files
• Provided production support for cluster maintenance
• Commissioned and decommissioned nodes as needed
• Worked on a 10-node Hortonworks Data Platform cluster with 550 GB of RAM, 10 TB of SSD storage, and 8 cores
• Designed a star schema with fact and dimension tables
• Analyzed the Hadoop stack and various big data analytics tools, including Pig, Hive, the HBase database, and Sqoop
• Conducted training sessions for new joiners on the project
• Triggered workflows based on time or data availability using the Oozie Coordinator Engine, as sketched below
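A minimal Oozie coordinator definition of the kind described in the last bullet might look like this; the app name, HDFS paths, and dates are illustrative assumptions:

  <coordinator-app name="device-xml-ingest" frequency="${coord:days(1)}"
                   start="2014-01-01T00:00Z" end="2015-01-01T00:00Z" timezone="UTC"
                   xmlns="uri:oozie:coordinator:0.4">
    <datasets>
      <dataset name="device-xml" frequency="${coord:days(1)}"
               initial-instance="2014-01-01T00:00Z" timezone="UTC">
        <uri-template>hdfs:///data/devices/${YEAR}${MONTH}${DAY}</uri-template>
      </dataset>
    </datasets>
    <!-- the action fires only when the day's dataset has landed -->
    <input-events>
      <data-in name="input" dataset="device-xml">
        <instance>${coord:current(0)}</instance>
      </data-in>
    </input-events>
    <action>
      <workflow>
        <app-path>hdfs:///apps/device-xml-workflow</app-path>
      </workflow>
    </action>
  </coordinator-app>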
Emerson Climate Technologies, Louisville, KY Jan 2011 – Oct 2013
Hadoop Lead
Project Details:
Emerson Climate provides efficient HVAC systems that minimize the energy costs of buildings. The HVAC controller project controls HVAC speed based on floor occupancy, determined by analyzing employees' access card data; HVAC systems otherwise run at constant speeds irrespective of building occupancy. We developed a system that collects data on our own employees across many locations at 10-minute intervals and recommends HVAC speeds based on floor occupancy. MapReduce and Spark were used to clean and format the data, and the jobs ran every 10 minutes via crontab to generate the HVAC controller report, as in the illustrative entry below.
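An illustrative crontab entry for that 10-minute schedule; the script and log paths are assumptions:

  # Run the occupancy ETL job every 10 minutes and append its output to a log
  */10 * * * * /opt/hvac/bin/run_occupancy_job.sh >> /var/log/hvac/etl.log 2>&1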
Frameworks and tools used:
• HDP 2.0 distribution for the development cluster
• Datasets were loaded daily from two sources, Oracle and MySQL, into HDFS and Hive respectively
• The data warehouse received an average of 80 GB of data per day; a 12-node cluster was used to process it
• Loaded data from the UNIX file system into HDFS
• Hadoop ecosystem components (Hive, MapReduce, PySpark) to process data
• Implemented the Capacity Scheduler to share cluster resources (see the configuration sketch after this list) and performed Hadoop admin responsibilities as needed
• Wrote MapReduce and PySpark jobs for cleansing data and applying algorithms
• Used the Cassandra database in transforming queries to Hadoop HDFS
• Designed scalable big data cluster solutions
• Monitored job status through emails from cluster health monitoring tools
• Responsible for managing data coming from different sources
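A minimal capacity-scheduler.xml fragment of the kind the Capacity Scheduler bullet describes; the queue names and capacity split are illustrative assumptions:

  <!-- Define two queues under root and split cluster capacity between them -->
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>etl,adhoc</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.etl.capacity</name>
    <value>70</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.adhoc.capacity</name>
    <value>30</value>
  </property>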
BMO Harris Bank, Buffalo Grove, IL Aug 2010 – Dec 2011
Hadoop Lead
BMO is a financial services company that helps customers with their financial needs, including credit cards, banking, and loans.
The BMO Credit Card project was designed to extract raw data from different sources into the Hadoop ecosystem and to create and populate the necessary Hive tables. The main aim of the project is to centralize the source of data for report generation using a historical database; these reports were otherwise generated from multiple sources.
Responsibilities:
• Imported and exported data into HDFS in the financial sector
• Involved in team reviews of functional and non-functional requirements for debit processing at the Atlanta location
• Implemented Oozie workflows to perform ingestion and merging of data in MapReduce jobs for credit card fraud detection
• Extracted files from the Cassandra database through Sqoop, placed them in HDFS, and processed them
• Hands-on experience creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs to administer transactions (a minimal sketch follows this section's environment list)
• Developed a custom file system plug-in for Hadoop so it can access files on the Data Platform
• This plug-in allows Hadoop MapReduce programs, HBase, Pig, and Hive to work unmodified and access files directly
• Expertise in server-side and J2EE technologies, including Java, J2SE, JSP, Servlets, XML, Hibernate, Struts, Struts2, JDBC, and JavaScript development
• Designed the GUI using Model-View architecture (Struts framework)
• Extracted feeds from social media sites such as Facebook and Twitter using Python scripts
Environment: Hadoop 1.x, Hive, Pig, HBase, Sqoop, Flume, Spring, jQuery, Java, J2EE, HTML, JavaScript, Hibernate
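A minimal Hive sketch of the table creation, loading, and querying described above; the table name, columns, and paths are illustrative assumptions:

  -- External table over transaction files landed in HDFS
  CREATE EXTERNAL TABLE transactions (
    txn_id  STRING,
    card_id STRING,
    amount  DOUBLE
  )
  PARTITIONED BY (txn_date STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/data/creditcard/transactions';

  -- Load a day's staged data into its partition
  LOAD DATA INPATH '/staging/transactions/2011-09-01'
  INTO TABLE transactions PARTITION (txn_date = '2011-09-01');

  -- Aggregation query; Hive compiles this to MapReduce jobs
  SELECT card_id, COUNT(*) AS txn_count, SUM(amount) AS total_amount
  FROM transactions
  WHERE txn_date = '2011-09-01'
  GROUP BY card_id;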
Lowe’s, Mooresville, NC Feb 2005 – July 2010
Sr. Java Developer
Worked on the Lowes.com redesign and the assembly/haul-away project. Worked with various services based on Service-Oriented Architecture (SOA), as well as standalone projects that use UNIX shell scripts to execute Java programs. The REST and SOAP services included the catalog service, façade service, pricing service, purchase history service, mylowes service, and seo-redirect service. Also worked on a REST API automation project using the Rest Assured framework. Involved in updating small batch-script-based Java projects that produce CSV, Excel, and TXT files and send them via SFTP or email.
Responsibilities:
• Developed new DAO methods using Hibernate 4.3 as the ORM for the application
• Used a DOM parser to parse XML 1.1 data from files
• Used JAXB 2.0 annotations to convert Java objects to/from XML 1.1 files (a minimal sketch follows this list)
• Created a SOAP 1.2 web service and generated its WSDL 2.0
• Created a web service client and invoked the web service using the client
• Developed a REST-based service that reads a JSON file and passes it as an argument to the controller, which handles multiple HTML 5.1 UI files
• Used the Struts MVC framework for user authentication via a Ping Federate server for single sign-on (SSO)
• Used SAML so that signing in to the system for one service grants access to many services
• Involved in coding the front end using Swing, HTML, JSP, JSF, and the Struts framework
• Designed and developed Spring service classes and JSF pages
• Involved in all software development lifecycle phases: development, unit testing, regression testing, performance testing, and deployment
• Responsible for developing, configuring, and modifying REST and SOAP web services using technologies such as JAX-RS, JAX-WS, Jersey, and Spring MVC
• Used Spring JDBC as the data layer to query the DB2 and Cassandra databases
• Worked on UNIX batch applications that generate product feeds and XML files
• Worked with REST API automation using Rest Assured and a testing framework (see the sketch after the environment list below)
• Participated in scrum meetings, daily stand-ups, and grooming sessions
• Used technologies including Spring, REST, JAX-RS, Jersey, JSON, JUnit, Mockito, EasyMock, Rest Assured, Ehcache, Maven, DB2, JDBC, batch scripting, WebSphere Commerce, and WebSphere
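A minimal JAXB sketch of the object/XML conversion mentioned above; the Product type and its fields are illustrative assumptions, not the project's actual model:

  // Illustrative JAXB round-trip: marshal a Java object to XML and back
  import java.io.File;
  import javax.xml.bind.JAXBContext;
  import javax.xml.bind.Marshaller;
  import javax.xml.bind.Unmarshaller;
  import javax.xml.bind.annotation.XmlElement;
  import javax.xml.bind.annotation.XmlRootElement;

  @XmlRootElement(name = "product")
  public class Product {
      private String sku;
      private double price;

      @XmlElement
      public String getSku() { return sku; }
      public void setSku(String sku) { this.sku = sku; }

      @XmlElement
      public double getPrice() { return price; }
      public void setPrice(double price) { this.price = price; }

      public static void main(String[] args) throws Exception {
          JAXBContext ctx = JAXBContext.newInstance(Product.class);

          // Java object -> XML file
          Product p = new Product();
          p.setSku("ABC-123");
          p.setPrice(19.99);
          Marshaller m = ctx.createMarshaller();
          m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
          m.marshal(p, new File("product.xml"));

          // XML file -> Java object
          Unmarshaller u = ctx.createUnmarshaller();
          Product read = (Product) u.unmarshal(new File("product.xml"));
          System.out.println(read.getSku());
      }
  }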
Environment: Java, J2EE, JSP, ExtJS, Servlets, Struts, JDBC, JavaScript, Liferay, Google Web Toolkit, Spring, EJB (SSB, MDB), Ajax, WebSphere 6.1
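A minimal Rest Assured test sketch of the REST API automation described above; the base URI, resource path, and response fields are illustrative assumptions (Rest Assured 3+ package names shown):

  import static io.restassured.RestAssured.given;
  import static org.hamcrest.Matchers.equalTo;
  import org.junit.Test;

  public class CatalogServiceTest {
      @Test
      public void itemEndpointReturnsExpectedSku() {
          given()
              .baseUri("https://api.example.com")   // assumed base URI
              .accept("application/json")
          .when()
              .get("/catalog/items/123")            // assumed resource path
          .then()
              .statusCode(200)
              .body("sku", equalTo("ABC-123"));     // assumed response field
      }
  }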
Education: BE in IT, Guelph University, 2005
