Nickesha Nixon NICKESHANIXON@YAHOO.COM
SUMMARY
Over five years of experience in software testing, development, and quality assurance of Client/Server and Web-based
applications using WinRunner, LoadRunner, TestDirector, Quality Center, QuickTest Pro, and manual testing.
Proficient in manual and automated testing of the GUI and functional aspects of Client/Server and Web-based
applications across all phases of the SDLC and Software Testing Life Cycle (STLC).
Experience in writing automated scripts, designing test procedures and manual test cases,
and preparing quality feedback for the QA team and manager.
Proficient in testing methodologies, test matrices, and traceability matrices.
Extensive experience in Functional, Integration, Regression, GUI, Back-end, Browser Compatibility, Ad-hoc,
Black Box, White Box, System, Build Verification, and User Acceptance testing.
Performed the full testing life cycle across the various phases of the application. Involved in converting manual test
cases into automated scripts using TSL in Mercury WinRunner and QTP.
Experienced in analyzing Functional Requirement Specifications (FRS) and conversant with System
Design Specifications (SDS).
Involved in the entire QA life cycle, including design, development, and implementation of the
complete QA process for relational database, Web, Client/Server, and IBM mainframe applications.
Developed test cases for manual testing and automated them using WinRunner, SilkTest, LoadRunner,
SilkPerformer, and QTP.
Familiar with Agile methodology (Scrum), the V-Model, and the Waterfall model.
Excellent working knowledge of designing and implementing QA test strategies and plans for both manual and
automated testing, as demanded by the application under test (AUT).
Extensively used LoadRunner for performance and load testing, analyzing average CPU usage, response time,
and transactions per second (TPS) for each scenario.
Extensively used TestDirector and Quality Center to write test cases and report results; all scripts are
maintained in TestDirector and Quality Center.
Uploaded test cases from MS Excel and MS Word into TestDirector and Quality Center.
Experienced in bug tracking systems and processes.
Well versed in scripting and markup languages such as JavaScript, VBScript, HTML, DHTML, and XML.
Strong in manual testing and in automated testing with Visual Basic and the various protocols supported by automation tools.
Experience in testing applications on the .NET and Windows platforms.
Proficient in database testing (Oracle, SQL Server) using advanced SQL and PL/SQL.
Versatile team player with strong communication and problem-solving skills at all management levels.
High proficiency in scheduling and in managing resources and activities in QA teams.
Set up and maintained the QA test lab for testing and maintenance.
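The load-test metrics called out above (average response time and TPS) reduce to simple arithmetic over raw request timings. A minimal illustrative sketch in Python, with invented sample numbers (not figures from any actual test run):

```python
# Illustrative only: computes the average response time and transactions
# per second (TPS) that a load-testing tool such as LoadRunner reports.
# The timing values and test duration below are made-up sample data.
timings = [0.12, 0.20, 0.15, 0.30, 0.18]  # per-request response times (s)
test_duration = 1.0                        # total wall-clock run time (s)

avg_response = sum(timings) / len(timings)  # mean response time
tps = len(timings) / test_duration          # completed transactions per second

print(round(avg_response, 2), tps)  # 0.19 5.0
```

A real tool also tracks percentiles and per-scenario CPU counters; the arithmetic above covers only the two averages named in the summary.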
SOFTWARE SKILLS
Operating Systems/Languages - Windows 98/2003/XP, UNIX, C, C++, JAVA, SQL, VB 6.0
Quality Standards – Implementation knowledge of ISO 9001 & ISMS (Info. Security Management System)
Testing Tools - Mercury Interactive WinRunner 7.6/8.0, LoadRunner 7.8/8.0, QuickTest Pro 6.5/10.0, TestDirector
7.6/8.2, Quality Center 9.0.
Methodologies - Agile Methodology, Waterfall Methodology, Rational Unified Process (RUP), Software
Development Life Cycle (SDLC)
Conversant with Web Technologies - HTML, ASP, XML, DHTML, VBScript, JavaScript.
Documentation Tools - MS-Office, MS Project.
EDUCATION:
Master's in Information Technology.
EXPERIENCE
American Eagle Outfitters Inc., Atlanta, GA May 2013 – Present
Sr. QA Test Analyst
Description: With its market presence exploding over the last three years, American Eagle Outfitters (AE) needed to
extend its retail momentum to the online marketplace. But its outmoded commerce platform didn't scale or integrate
customer touch points, putting that objective far out of reach. The target was centralized management not only of its
Website but of its call center and order management systems as well. Completed implementation of NetSuite's integrated
CRM/ERP system, which included: defining the implementation strategy; converting the company's legacy financial system,
Point-Of-Sale (POS) system, and Internet store to NetSuite; training office, sales, and warehousing personnel on the
software; and redesigning Internet, Warehouse, and Order Fulfillment and Shipping operations. As a member of the
team of QA testers, was involved in designing test cases, writing test cases, executing test cases, and documenting the
findings.
Responsibilities:
Prepared test plans for each release, wrote test cases, and executed them as part of functional testing.
Prepared test reports and deliverables and submitted them for version releases.
Throughout the project, provided clarifications on domain and product functionality for the team; regular
interaction with the core developers helped fix defects in less time.
Played a key role in defining test automation procedures and standards, creating WinRunner and
QuickTest Professional scripts for all the modules, which drastically reduced the regression cycle and improved
the testing effort for daily builds.
Performed Black Box testing and conducted Functionality and Regression testing on various phases of the
Management software
Worked on Rational Quality Manager (RQM), managing test plans, test cases, and results, with Rational Functional
Tester doing the heavy lifting of actually scripting and executing the tests; also used RQM to manage and run
automated test scripts created with other test tools.
Managed and executed the test process using Agile methodology.
Working knowledge of Team Foundation Server; worked on an automated framework for regression testing of the
POS system.
Tested GUI applications and backend database functionality using QTP.
Created automated Load test scripts using Load Runner.
Gathered requirements for the integration of POS system with the supply chain system
Conducted GUI and functionality testing using QTP.
Conducted data driven testing using QTP to conduct backend testing.
Performed White Box testing using the Xpediter tool and updated some of the PROCs, JCLs, and SAS
programs in the test region. Submitted batch JCL jobs to create output datasets for verification.
Used TestDirector and Mercury Quality Center to update the status of all test cases and test scripts
executed during the testing process.
Performed Load and Stress Testing using Load Runner.
Automated confidence tests that run on new builds on a regular basis.
Involved in setting up different configuration environments for compatibility testing and manual testing.
Upgraded the existing Test Scripts and created new scripts for client application to be able to work for new
versions and patches, which improved product quality.
Evaluated testing results for each potential release build using TestDirector, Quality Center, and Bugzilla
reports, listed summarized bug information in priority sequence, and recommended the viability of each release
for production.
Involved in preparing the traceability matrix used to design test cases.
Prepared weekly action reports and QA feedback for the QA team and manager.
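The data-driven testing mentioned above (in QTP, rows of test data drive repeated runs of a single script) can be sketched in plain Python. The field names and the line-total function below are hypothetical stand-ins for the POS application under test:

```python
# Sketch of data-driven testing: each row of test data drives one run of
# the same check. Field names and the function under test are hypothetical,
# not taken from any actual POS system.
test_data = [
    {"sku": "1001", "qty": 2, "unit_price": 9.99, "expected_total": 19.98},
    {"sku": "1002", "qty": 1, "unit_price": 4.50, "expected_total": 4.50},
    {"sku": "1003", "qty": 3, "unit_price": 2.00, "expected_total": 6.00},
]

def line_total(qty, unit_price):
    # Stand-in for the POS line-item calculation being verified.
    return round(qty * unit_price, 2)

# One pass/fail result per data row, keyed by SKU.
results = {
    row["sku"]: line_total(row["qty"], row["unit_price"]) == row["expected_total"]
    for row in test_data
}

print(results)
```

Adding a test case then means adding a data row, not writing a new script, which is the point of the data-driven approach.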
Environment: WebLogic, Java, QuickTest Pro 8.2, Mercury Quality Center 9.0, UNIX, LoadRunner 8.2, Sun Java,
MSJVM, Oracle 8i/9i/10g, SQL Server 2000, DB2, MS Project 2000/2003.
Target Corporation, Minneapolis, MN Oct 2011 – Mar 2013
QA Test Analyst
Target Corporation has over 1,400 stores in 47 states and is among the biggest players in the US retail sector. The
client needed a secure, robust, and flexible platform to support all of their POS devices. The solution was required to
deliver cost savings, improved sales tools, easy integration with mobile devices, smooth integration between disparate
systems, and electronic audit trails.
Responsibilities:
Worked closely with the loan officers (SMEs) to obtain detailed knowledge of the loan life cycle process,
risk analysis, etc.
Coordinated constantly between the legal and marketing departments and the PM regarding the various compliance
requirements that needed to be in place before anything could be finalized.
Developed functional specifications for modifications of the POS system for multiple chains, including integrating a
new chain into existing systems, which encompassed gap analysis, specific brand requirements, and full POS
integration.
Gathered the business requirements.
Performed requirement analysis and gap analysis.
Documented the business requirements for the developers.
Developed test cases for manual testing and automated them using WinRunner, SilkTest, LoadRunner,
SilkPerformer, and QTP.
Maintained and tracked stakeholders' requests for enhancements and changes using Rational ClearQuest.
Manually verified the validity of the failed test cases. Performed Positive, Negative and Black box testing on the
application.
Participated in the system design exercise with the Application development team and utilized Agile
methodology for development and implementation.
Translated Business processes into Informatica Mappings for building Data marts.
Created customized reports for business use using OLAP tools such as Crystal Reports.
Wrote Test Cases in Mercury Quality center that would test various Test scenarios.
Used UML notations for Object Oriented Design and Documentation.
Interfaced with the development team utilizing ASP, JavaScript and Visual Basic and produced detailed user-
interface prototypes and performed usability testing.
Manually tested the GTS modules using Quality Center.
Wrote Test plan and Test cases for the Integration testing and system testing.
Performed Integration, GUI, Smoke, Sanity, and Acceptance testing on new builds for basic functionality
checking.
Provided implementation assessment, strategy, and mentoring services for Rational Rose, UML, and RUP.
Served as a resource for analytical services utilizing SQL Server, TOAD/Oracle, and SQL Developer.
Developed and maintained the project plan using MS Project.
Designed and implemented SQL queries for QA testing.
Assisted the PM in setting realistic project expectations, evaluating the impact of changes on the project,
and conducting project-related presentations.
Prepared the user manual and performed user acceptance testing.
Played a key role in planning and testing the system enhancements and conversions.
Environment: LoadRunner, Windows NT/2000, SQL Server 2005, Sybase, MS Word, MS Excel, MS PowerPoint,
MS Visio, Cognos BI, MS Project, PowerBuilder, ClearQuest, Rational Test Manager, GIS Tools, Lotus Notes R6
client, HTML, XML, SAS.
Lowe’s Companies Inc., Mooresville, NC Jan 2010 - Sep 2011
QA Test Analyst
POS application
Lowe’s Companies, Inc., together with its subsidiaries, operates as a home improvement retailer in the United States
and Canada. The company offers a range of products for home decorating, maintenance, repair, remodeling, and
property maintenance. The project was based on testing the basic functionalities of POS 360 System, which included
price management, item management and other management applications.
Responsibilities:
Documented test cases using Quality Center corresponding to business rules and other operating procedures.
Prepared Test Cases and Test Plan for the different modules of the application
Worked on POS Self-Checkout and POS QA testing along with the related integration components.
The Point of Sale (POS) application was customized and tested to generate statements of regular shipping to
Lowe's different retail sales outlets nationwide.
Designed the performance test scenarios for smoke test, baseline test, scalability test and stress test
Data was selected from different modules that handle the sales of various goods in each outlet for a specific
time span, depending on the sales record and performance.
Validated and exercised the paths through the code for the different test case inputs and determined the
appropriate outputs.
Tested different functions of the POS system like scanning, payments, returns, reports etc. to make sure that the
system met the requirements and expectations.
Addressed issues of performance, scalability, reliability, extensibility, manageability and security.
Conducted end-to-end testing including system, functional, and regression tests
Documented and tracked defects until resolved
Tested the compatibility of the different system and devices to the software that was installed.
Supervised new contractors by training them and assigning a test plan, and provided daily and weekly status
updates to the team lead.
Performed data transfer from one server to another using SQL Enterprise
Validated daily reports generated from POS applications and analyzed them to ensure accuracy
Environment: HP Quality Center, Windows XP, MS Office suite, SQL Enterprise
Wells Fargo Bank, Des Moines, Iowa Sep 2008 – Dec 2009
QA Analyst
The software package was used to manage the loans administered to the bank's clients. Using this package, bank
officers and administrators were able to instantly access the details of each client's loan, such as the name, account
number, loan amount, interest rate, outstanding balance, and payment history.
Responsibilities:
Involved in gathering business-level requirements and reviewed manual testing methods.
Prepared test plans, which included an introduction, the various test strategies, test schedules, QA team roles,
test deliverables, etc.
Responsible for writing Test cases to cover overall quality assurance using Test Director
Performed initial manual testing of the application as part of sanity testing
Performed various tests, such as positive and negative testing, to check business functionality manually.
Performed Black Box testing and conducted Functionality and Regression testing on various phases of the
Management software
The tests also included GUI testing: testing for the validation and display of screens.
Created dummy accounts on our system and verified the account generation process and data integrity.
Prepared complex queries to retrieve data for database testing
Created and enhanced TSL scripts using Win Runner
Created various checkpoints in the script using Win Runner
Created reports using Mercury Interactive Test Director
Performed application and network performance testing and analysis.
Developed shell scripts to automate loading processes.
Wrote shell scripts to run batch processes and Oracle background processes.
Used SQL queries to test the integrity of data by querying the database.
Interacted with the development team to ensure that all defects were addressed in time.
Participated in the team meetings to discuss the issues arising out of testing
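The database-integrity checks described above can be illustrated with a small self-contained sketch. Here sqlite3 stands in for the production database, and the loan/client schema and column names are hypothetical, not taken from any actual bank system:

```python
import sqlite3

# Hypothetical loan schema; sqlite3 is an in-memory stand-in for the
# production database queried during testing.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE clients (client_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE loans (
        loan_id     INTEGER PRIMARY KEY,
        client_id   INTEGER,
        amount      REAL,
        outstanding REAL
    );
    INSERT INTO clients VALUES (1, 'A. Smith'), (2, 'B. Jones');
    INSERT INTO loans VALUES (10, 1, 5000.0, 1200.0), (11, 2, 8000.0, 8000.0);
""")

# Check 1: every loan must reference an existing client (no orphan rows).
orphans = cur.execute("""
    SELECT l.loan_id
    FROM loans l LEFT JOIN clients c ON c.client_id = l.client_id
    WHERE c.client_id IS NULL
""").fetchall()

# Check 2: the outstanding balance must never exceed the original amount.
overdrawn = cur.execute(
    "SELECT loan_id FROM loans WHERE outstanding > amount"
).fetchall()

print(orphans, overdrawn)  # [] [] on clean data
```

Each check is written so that a clean database returns zero rows; any row returned is a defect to log and track.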
Environment: WinRunner, TestDirector, Windows 2000 Professional, SQL