Hassan Qureshi
Hadoop Lead Developer
Certified Java programmer with 9+ years of extensive IT experience, including several years in
Big Data technologies.
Currently technical lead of a data engineering team that works with data scientists to
develop insights
Good exposure to production processes such as change management, incident management
and escalation handling
Hands-on experience with major Hadoop ecosystem components including Hive, HBase,
HBase-Hive integration, Pig, Sqoop and Flume, with knowledge of MapReduce and HDFS
Hands-on experience installing, configuring, maintaining, monitoring, performance-tuning
and troubleshooting Hadoop clusters in different environments such as development, test
and production
Defined file system layout and data set permissions
Monitored local file system disk usage and log files; automated log cleanup with scripts
Extensive knowledge of front-end technologies such as HTML, CSS and JavaScript.
Good working knowledge of OOA/OOD using UML and use-case design.
Good communication, work ethic, teamwork and leadership skills.
Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, HBase, MongoDB, Flume, ZooKeeper, Oozie
Operating Systems: Windows, Ubuntu, Red Hat Linux, Linux, UNIX
Programming/Scripting: Java, SQL, Unix Shell Scripting, C, Python
Databases: MS-SQL, MySQL, Oracle, MS-Access
Middleware: WebSphere, TIBCO
IDEs & Utilities: Eclipse, JCreator, NetBeans
Protocols: TCP/IP, HTTP, HTTPS
Testing: Quality Center, WinRunner, LoadRunner, QTP
Fortinet, New York, NY    Nov 2013 - Present
Project Details:
Fortinet is a company that provides networking software, including IP communicator phone
software. The IP enhancement project is a post-sales project. All Fortinet devices, especially
routers and switches, send XML files to a centralized server on a daily basis. The XML files
contain information about each device's location and the activities performed that day. All XML
files must be processed to derive device health and usage; XML accounts for about 20 percent of
the overall data, with the remainder coming from RDBMS sources and flat files.
Frameworks and tools used:
HDP 2.3 distribution for development Cluster
Hadoop ecosystem components Hive and MapReduce to process data
Contribution:
Wrote MapReduce jobs to process XML and flat files
Provided production support for cluster maintenance
Commissioned and decommissioned nodes as needed
The cluster had 10 nodes running Hortonworks Data Platform, with 550 GB RAM and 10 TB of SSD storage
A star schema was designed with fact and dimension tables
Analyzed the Hadoop stack and different big data analytics tools, including Pig, Hive,
the HBase database and Sqoop
Conducted training for new joiners on the project
Triggered workflows based on time or availability of data using the Oozie Coordinator
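The XML processing described above can be sketched as a mapper/reducer pair in plain Python. This is a minimal illustration only: the sample document, device IDs and the `usage_mb` attribute are hypothetical, since the actual Fortinet schema is not shown in this résumé.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical device report; the real Fortinet XML schema is not shown here.
SAMPLE_XML = """
<report>
  <device id="rtr-01" location="NY"><activity usage_mb="120"/><activity usage_mb="80"/></device>
  <device id="sw-02" location="NJ"><activity usage_mb="40"/></device>
</report>
"""

def map_usage(xml_text):
    """Mapper-style step: emit (device_id, usage) pairs from one XML file."""
    root = ET.fromstring(xml_text)
    for device in root.iter("device"):
        for activity in device.iter("activity"):
            yield device.get("id"), int(activity.get("usage_mb"))

def reduce_usage(pairs):
    """Reducer-style step: sum usage per device to gauge daily health/usage."""
    totals = defaultdict(int)
    for device_id, usage in pairs:
        totals[device_id] += usage
    return dict(totals)

if __name__ == "__main__":
    print(reduce_usage(map_usage(SAMPLE_XML)))  # per-device usage totals
```

In the real pipeline these two steps would run as a Hadoop MapReduce job over many files in HDFS rather than over one in-memory string.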
Emerson Climate Technologies, Louisville, KY    Jan 2011 - October 2013
Emerson Climate provides efficient HVAC systems that minimize the energy cost of buildings.
The HVAC controller project adjusts HVAC speed based on floor occupancy, determined by
analyzing employees' access-card data; HVAC systems otherwise run at constant speeds
irrespective of occupancy. We built a pipeline that collects data on our own employees across
many locations at 10-minute intervals and recommends HVAC speeds based on the occupancy of
each floor. MapReduce and Spark were used to clean and format the data; jobs run every
10 minutes via crontab to generate the HVAC controller report.
Frameworks and tools used:
HDP 2.0 distribution for development Cluster
Datasets were loaded daily from two sources, Oracle and MySQL, into HDFS and
Hive respectively
The data warehouse received an average of 80 GB of data per day.
We used a 12-node cluster to process the data
Involved in loading data from the UNIX file system to HDFS
Hadoop ecosystem components Hive, MapReduce and PySpark to process data
Implemented the Capacity Scheduler to share cluster resources and performed Hadoop
admin responsibilities as needed
Wrote MapReduce and PySpark jobs for data cleansing and applying algorithms
The Cassandra database was used as a source; query results were moved to Hadoop HDFS
Designed scalable big data cluster solutions
Monitored job status through email received from cluster health monitoring tools
Responsible for managing data coming from different sources.
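The occupancy-to-speed recommendation described above can be sketched in a few lines of Python. The floor IDs, badge counts and speed thresholds below are hypothetical illustrations, not Emerson's actual parameters.

```python
# Minimal sketch of the occupancy-based HVAC recommendation; the thresholds
# (25% / 75%) and the sample floors are assumptions for illustration only.
def recommend_speed(occupancy, capacity):
    """Map a floor's occupancy ratio to an HVAC fan speed setting."""
    ratio = occupancy / capacity if capacity else 0.0
    if ratio >= 0.75:
        return "high"
    if ratio >= 0.25:
        return "medium"
    return "low"

def floor_report(badge_counts, capacities):
    """badge_counts: {floor_id: count} from 10-minute access-card snapshots;
    capacities: {floor_id: max occupancy}. Returns speed per floor."""
    return {floor: recommend_speed(badge_counts.get(floor, 0), cap)
            for floor, cap in capacities.items()}

if __name__ == "__main__":
    print(floor_report({"F1": 90, "F2": 10}, {"F1": 100, "F2": 100}))
```

In production this logic would run inside the 10-minute MapReduce/PySpark jobs triggered by crontab, consuming the cleaned access-card data rather than hard-coded counts.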
BMO Harris Bank, Buffalo Grove, IL Aug 2010 – Dec 2011
BMO is a financial services department that helps customers with their financial needs,
including credit. The BMO credit card project was designed to extract raw data from different
sources into the Hadoop ecosystem to create and populate the necessary Hive tables. The main
aim of the project is to centralize the source of data for report generation using a
historical database.
Worked on Importing and exporting data into HDFS in financial sector
Involved as part of a team in reviewing functional and non-functional requirements for
debit processing at the Atlanta location.
Implemented Oozie workflows to perform Ingestion & Merging of data in the MapReduce
jobs for credit card fraud detection.
Extracted files from the Cassandra database through Sqoop, placed them in HDFS and processed them.
Hands-on experience creating Hive tables, loading them with data and writing Hive queries
that run internally as MapReduce jobs to administer transactions.
Developed a custom File system plug in for Hadoop so it can access files on Data Platform.
This plug-in allows Hadoop MapReduce programs, HBase, Pig and Hive to work unmodified
and access files directly.
Expertise in server-side and J2EE technologies including Java, J2SE, JSP, Servlets and XML;
GUI design using the Model-View architecture (Struts framework).
Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
Environment: Hadoop 1.x, Hive, Pig, HBase, Sqoop, Flume, Spring, jQuery, Java, J2EE, HTML
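The ingestion-and-merge step driven by the Oozie workflows above can be sketched in Python. The record layout (an `id` key and a `ts` timestamp) is a hypothetical example, not the bank's actual schema.

```python
# Sketch of a daily ingest-and-merge pass, as a MapReduce merge job would do:
# keep the newest version of each record by key. Field names are illustrative.
def merge_records(existing, incoming):
    """Merge the daily increment into the base set, keeping the record with
    the highest 'ts' for each 'id'; return rows sorted by id."""
    latest = {}
    for rec in list(existing) + list(incoming):
        key = rec["id"]
        if key not in latest or rec["ts"] > latest[key]["ts"]:
            latest[key] = rec
    return sorted(latest.values(), key=lambda r: r["id"])

base = [{"id": 1, "ts": 1, "amt": 10}, {"id": 2, "ts": 1, "amt": 20}]
delta = [{"id": 2, "ts": 2, "amt": 25}, {"id": 3, "ts": 2, "amt": 30}]
print(merge_records(base, delta))
```

The same keep-latest-by-key pattern is what the Oozie-scheduled merge applies at scale over HDFS files before the results land in the centralized Hive tables.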
Lowe’s, Mooresville, NC Feb 2005 – July 2010
Sr. Java Developer
Lowes.com redesign and assembly/haul-away. Worked on services based on Service-Oriented
Architecture (SOA) as well as standalone projects that use UNIX shell scripts to execute
Java programs. The REST and SOAP services include: catalog service, façade service, pricing
service, purchase history service, MyLowes service and SEO-redirect service. Also worked on a
REST API automation project using the Rest-Assured framework. Involved in updating small
batch-script-based Java projects that produce CSV, Excel and txt files and send them
via SFTP or email.
Developed new DAO methods using Hibernate 4.3 as the ORM for the application.
Used a DOM parser to parse XML 1.1 data from files.
Used JAXB 2.0 annotations to convert Java objects to/from XML 1.1 files.
Created a SOAP 1.2 web service and generated its WSDL 2.0 definition.
Created a web service client and invoked the web service through it.
Developed a REST-based service that reads a JSON file and passes it as an argument
to the controller handling the multiple HTML 5.1 UI files.
Used the Struts MVC framework for user authentication, with Ping Federate server for
single sign-on (SSO)
Used SAML so that signing in once grants access to many services
Involved in coding the front end using Swing, HTML, JSP, JSF and the Struts framework
Design and development of Spring service classes and JSF pages
Involved in all software development lifecycle phases, such as development, unit testing
and regression testing
Responsible for developing, configuring or modifying REST and SOAP web services
Used Spring JDBC as the data layer to query the DB2 and Cassandra databases.
Worked on UNIX batch applications that generate product feeds and XML files.
Tools used: Rest-Assured, Ehcache, Maven, DB2, JDBC, batch scripting, WebSphere Commerce, WebSphere.
Environment: Java, J2EE, JSP, ExtJS, Servlets, Struts, JDBC, JavaScript, Liferay, Google Web
Toolkit, Spring, EJB (SSB, MDB), Ajax, WebSphere 6.1
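The small batch jobs described above, which produce CSV files for SFTP or email delivery, can be sketched in Python (the résumé's actual jobs are Java; this illustration, with made-up column names and rows, shows only the CSV-rendering step).

```python
import csv
import io

# Illustrative sketch of a batch job's CSV-rendering step before SFTP upload;
# the "sku"/"price" columns and sample row are assumptions, not Lowe's data.
def write_feed(rows, fieldnames):
    """Render a list of record dicts as CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

if __name__ == "__main__":
    print(write_feed([{"sku": "A1", "price": "9.99"}], ["sku", "price"]))
```

In the described setup the resulting text would be written to a file and handed to an SFTP or email step rather than printed.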
Education: BE in IT, Guelph University, 2005