Software Test Automation Overview

Rohan Bhattarai
Contents
   Overview of STA
   BEP and ROI
   Tools
   Automation Framework
Contents
• Overview of STA
   • History
   • Myths and Truths
   • Why Automation Projects Fail?
   • What is TA?
   • Why TA is needed?
   • Manual vs. Automated
   • Pros & Cons
   • What to Automate?
   • What Not to Automate?
• BEP and ROI
   • BEP
   • BEP - Example
   • ROI
   • Classic ROI
   • Real ROI
   • Benefits of ROI
   • ROI Calculator
• Tools
   • Choosing the right tool – The Strategic Approach
   • Choosing the right tool – The Strategic Steps
   • Tools Classification
   • Short List Tools – Functional
   • Short List Tools – Performance
   • Evaluate Vendors
   • Functional Test Tools - Analysis
   • Feasibility Analysis
• Automation Framework
   • Where to start?
   • What is AF?
   • Benefits of AF
   • How does it work?
   • Architecture
   • Framework Approaches
      • Record and Playback
      • Script Based Approach
      • Keyword Driven Approach
      • Data Driven Approach
      • Hybrid Keyword and Data Driven Approach
Contents
   Overview
       History
       Myths and Truths
       Why Automation Projects Fail?
       What is TA?
       Why TA is needed?
       Manual vs. Automated
       Pros & Cons
       What to Automate?
       What Not to Automate?
History

 Moving swiftly past the hype

 Historically Automation is perceived as a “Silver Bullet” of the Testing
  world
   “The term has been adopted into a general metaphor, where “silver bullet”
    refers to any straight forward solution perceived to have extreme
    effectiveness. The phrase typically appears with an expectation that some
    new technology or practice will easily cure a major prevailing problem”.
History
• Historical trends in test automation frameworks:
   • 1993 – 2001: 1st Gen – Modularity Driven
   • 2001 – 2005: 2nd Gen – Data Driven
   • 2005 – 2008: 3rd Gen – Keyword Driven
   • 2008 – 2011: 4th Gen – Hybrid Keyword Data Driven

• 1993 – 1999: WinRunner x.x
• 1999 – 2001: WinRunner 6.x
• 2001 – 2004: WinRunner 7.x, RobotJ 1.0, XDE Tester 1.0
• 2004 – 2005: QTP 7.x, RFT 6.x
• 2004: Selenium 1
• 2005 – 2006: QTP 8.x, RFT 7.x
• 2006 – 2008: QTP 9.x, RFT 8.x
• 2007: Selenium 2
• 2008 – 2010: QTP 10.x, RFT 8.1
• 2010 – 2011: QTP 11, RFT 8.2
Myths and Truths
1. Test Automation is so simple that every tester can do it

   • Promoted by sales people, who simply say:
      • Record the script
      • Enhance the script by adding functions and data driving
      • Run the scripts
      • Report results

   • Under this influence a QA manager can proudly say
     “All our testers are doing test automation”.
Myths and Truths
• But in reality: TA is a software development task

   • It should be designed, developed and tested
   • You need some kind of programming background to implement test
     automation, although TA is not as complex as C++/C#/Java development
   • TA components are assets that should be treated like application
     source code

• Don't fall for the tool vendor's sales pitch
  …remember, Record & Playback is not real test automation
Myths and Truths
2.       Commercial TA tools are expensive

        Under the influence of this myth some
         companies, especially the small ones:
            Try to develop their own test automation tools
            Use scripting languages like Perl and Ruby
            Use shareware test tools
            Do not consider test automation at all
Myths and Truths
• But in reality: Commercial TA tools are not that expensive

   • A per-seat license for the most expensive automation tool is
     about $8K, and it can be used for 5 years
   • Maintenance/support fees are 20% of the tool cost, or $1,600 per year
   • The cost of this tool is therefore $8K/5 + $1,600 = $3,200 per year
   • The automation developer cost with overhead is $100K per year
   • The tool costs just about 3% of the person who uses it, while the
     productivity gain can be very significant (a quick cost sketch follows)
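A minimal sketch of the cost arithmetic above; the license price, useful life, support rate, and developer cost are the assumptions stated on this slide:

```java
public class ToolCost {
    public static void main(String[] args) {
        double licensePrice = 8_000;            // per-seat license (assumed, from the slide)
        int usefulLifeYears = 5;                // years the license is used
        double supportRate = 0.20;              // yearly maintenance/support as a fraction of price
        double developerCostPerYear = 100_000;  // automation developer, fully loaded

        double toolCostPerYear = licensePrice / usefulLifeYears
                + licensePrice * supportRate;   // 1,600 + 1,600 = 3,200
        double shareOfDeveloper = toolCostPerYear / developerCostPerYear;

        System.out.printf("Tool cost per year: $%,.0f%n", toolCostPerYear);
        System.out.printf("Share of developer cost: %.1f%%%n", shareOfDeveloper * 100);
    }
}
```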
Myths and Truths
• Learning from past experience

• Truth: 92% of automation projects FAIL to meet target ROI

  [Pie charts: in 2004 roughly 40% of automation projects were working vs. 60% failing;
   by 2010 (estimated) only 8% were meeting target ROI]

                                    2004            2010 (Estimated)
  Industry: Test Automation         $1 Billion      $6.3 Billion
  (Net Worth)
  Automation Projects               $0.6 Billion    $3.8 Billion
  (Failure Cost)

Source 1: http://www.nytimes.com/2006/07/26/technology/26hewlett.html
Source 2: http://www.slideshare.net/Jonathon_Wright/hybrid-keyword-data-driven-automation-frameworks-jonathon-wright
Why do Automation Projects typically Fail?
• IDT study (www.idtus.com)

   • Lack of Time – 37%
   • Lack of Expertise – 20%
   • Lack of Budget – 17%
   • Misc – 15%
   • Tool Incompatibility – 11%
Why do Automation Projects typically Fail?
   Lack of defined automation
    methodology
   Automation is not treated as a
    legitimate project with the necessary
    planning / resources
   Test Automation is typically performed
    at the end of the SDLC
   After the initial success the
    automation scripts are not maintained
    for future builds
Why do Automation Projects typically Fail?
• Testers are typically untrained in test tools and programming techniques
• No modularization (reusable functions) in automation scripts
• Automated test cases are usually designed based on front-end
  functionality (black-box testing)
What is Software Test Automation?
• Test Automation is the use of software to execute tests without
  human intervention
• It refers to the activities and efforts intended to automate
  engineering tasks and operations in a software test process using
  well-defined strategies and systematic solutions.

Not like Rube Goldberg cartoons
Why TA is needed?
   Objectives:
       To free engineers from tedious and redundant manual
        testing operations
       To speed up a software testing process, and to reduce
        software testing cost and time during a software life cycle
       To increase the quality and effectiveness of a software test
        process by achieving pre-defined adequate test criteria in a
        limited schedule
Key to Success
To reduce manual testing activities
and redundant test operations using
 a systematic solution to achieve a
      better testing coverage.
Manual vs. Automated Testing
• Manual Testing:
   • Testing is time-consuming and tedious
   • Inefficient in today's shorter SDLC
   • Delays the ability to thoroughly test an application
   • Critical bugs escape undetected
   • What happens when multiple platforms are involved?

• Automated Testing:
   • Higher efficiency that accelerates the testing cycle and promotes
     software quality
   • Optimizes software quality and testing efficiency by delivering
      • Reusability
      • Predictability and Consistency
      • Productivity
   • Enables accurate assessment of quality level
Pros and Cons
• Pros:
   • Speed
   • Reusability
   • Accuracy
   • Run Anytime
   • Efficiency
Pros and Cons
• Cons:
   • Significant Investment
   • Maintenance
   • Not as Robust
   • Error Detection
   • Cannot Think
What to Automate?
• Regression tests: stabilized tests that verify stabilized functionality

• Tests rerun often: tests that are executed regularly vs. rarely

• Tests that will not expire shortly: most tests have a finite lifetime
  during which their automated scripts must recoup the additional cost
  required for automation

• Tedious/boring tests:
   • tests with many calculations and number verifications
   • repetitive tests performing the same operations over and over
   • tests requiring many performance measurements
   • just plain boring tests

• Reliably repeatable tests
What NOT to Automate?
• Unstable functionality: not reliably repeatable

• Rarely executed tests: poor return on investment

• Tests that will soon expire: poor return on investment

• Tests requiring in-depth business analysis:
   • some tests require so much business-specific knowledge that it becomes
     prohibitively time-consuming to include every verification required to
     make the automated script robust enough to be effective
   • exceedingly complex tests are sometimes impossible to automate
     because computers cannot think
Contents
   BEP and ROI
       BEP
       BEP - Example
       ROI
       PayBack Period
       Classic ROI
       Real ROI
       Benefits of ROI
       ROI Calculator
BEP
• Break-Even Point (BEP) is the point at which costs or expenses and
  revenue are equal: there is no net loss or gain
BEP - Example

Resources (R) for n automated test runs: Rn = Aa / Am = (Va + n*Da) / (Vm + n*Dm),
where V is preparation time and D is execution time (in minutes), for automated (a)
and manual (m) testing. A worked example follows the table.

Test        | Prep V (Manual) | Prep V (Auto) | Exec D (Manual) | Exec D (Auto) | ROI using Manual | Rn (n=1) | Rn (n=5) | Rn (n=10) | Rn (n=20)
Scenario 1  | 30              | 60            | 11              | 1.1           | 33%              | 149%     | 77%      | 51%       | 33%
Scenario 2  | 30              | 60            | 11              | 1.1           | 33%              | 149%     | 77%      | 51%       | 33%
Scenario 3  | 30              | 60            | 9               | 0.9           | 36%              | 156%     | 86%      | 58%       | 37%
Scenario 4  | 30              | 60            | 10              | 1             | 34%              | 153%     | 81%      | 54%       | 35%
Scenario 5  | 30              | 60            | 10              | 1             | 34%              | 153%     | 81%      | 54%       | 35%
Scenario 6  | 30              | 60            | 10              | 1             | 34%              | 153%     | 81%      | 54%       | 35%
Scenario 7  | 30              | 60            | 15              | 1.5           | 27%              | 137%     | 64%      | 42%       | 27%
Scenario 8  | 30              | 60            | 30              | 3             | 5%               | 105%     | 42%      | 27%       | 19%
Scenario 9  | 30              | 60            | 22              | 2.2           | 16%              | 120%     | 51%      | 33%       | 22%
Scenario 10 | 30              | 60            | 12              | 1.2           | 31%              | 146%     | 73%      | 48%       | 31%
Total       | 300             | 600           | 140             | 14            | 28%              | 142%     | 71%      | 47%       | 31%
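A minimal sketch of how the Rn values in this table are computed, using Scenario 1's numbers from the slide (Vm = 30, Va = 60, Dm = 11, Da = 1.1 minutes):

```java
public class BreakEven {

    /** Rn = (Va + n*Da) / (Vm + n*Dm): automated vs. manual effort after n runs. */
    static double resourceRatio(double vm, double va, double dm, double da, int n) {
        return (va + n * da) / (vm + n * dm);
    }

    public static void main(String[] args) {
        double vm = 30, va = 60, dm = 11, da = 1.1;   // Scenario 1 from the table
        for (int n : new int[] {1, 5, 10, 20}) {
            System.out.printf("n=%2d  Rn=%.0f%%%n", n, resourceRatio(vm, va, dm, da, n) * 100);
        }
        // Break-even is reached where Rn drops below 100%,
        // i.e. n > (Va - Vm) / (Dm - Da)  ->  here n > 30 / 9.9, so roughly 4 runs.
    }
}
```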
ROI
• Return on Investment
• ROI = Benefit / Cost
• ROI = (total benefit – total cost) / (total cost)
• ROI = (cost of manual – cost of automation) / cost of automation
   • Where:
      • Automation Cost = Price of HW + Price of SW + Development Cost
        + Maintenance Cost + Execution Cost
      • Manual Testing Cost = Development Cost + Maintenance Cost
        + Execution Cost

• Looks right, doesn't it? (a quick sketch of this calculation follows)
Classic ROI
• Problems with the classic ROI calculation:
   • You can't compare automated testing and manual testing. They are
     not the same, and they provide different information about the AUT.
   • You can't compare the cost of multiple executions of automated
     tests vs. manual tests. You would never dream of executing that
     many test cases manually.
• So then… what is the real ROI?
Real ROI
   ROI value is not the value of Automation vs. Cost of
    executing these tests manually
   Automation ROI value is the benefit of this type of
    testing, and it can be:
       Reducing Time to Market
       Increased Test Efficiency (Productivity)
       Increased Test Effectiveness
Benefits of ROI
• Reduced Time to Market
   • Products delivered quickly
   • Makes people available to work on other projects
   • Higher margins, if there are no competitive products in the market

• Productivity and Effectiveness
   • More testing gets done faster, increasing the odds of finding defects
   • Defects found early have a better chance of being fixed
   • Manual testers can concentrate on clever ways of finding defects,
     instead of typing test inputs and verifying output
Benefits of ROI
   About 7% of bug fixes create new bugs, sometimes in
    already tested parts of the system. With automation
    you can rerun tests for those modules. This almost
    never happens when testing is done manually.
ROI Calculator
   ROI Calculator:

       Source 1:
        http://www.aspiresys.com/testautomationroi/index.php
       Source 2:
        http://www.elbrus.com/services/test_automation_roi_
        calc/
Contents
   Tools
       Choosing the right tool – The Strategic Approach
       Choosing the right tool – The Strategic Steps
       Tools Classification
       Short List Tools – Functional
       Short List Tools – Performance
       Evaluate Vendors
       Functional Test Tools - Analysis
       Feasibility Analysis
Tools




        There is no single best testing tool;
        rather, different tools are more or less
        appropriate in different environments.
Tools
   Over 300 Test Tools are available
    (http://www.softwareqatest.com)
       Load/Performance tools – 54
       Web Functional/Regression – 60
       Java Test tools - 48
       Other Web tools – 76


Which tool is right for you?
Choosing the right tool – The Strategic Approach
• Is there an organizational methodology for test
  automation?
• Which applications/processes?
• What is the impact to current project schedules?
• What is the effort in maintaining automated
  tests?
• What are the costs?
• What about tools integration?
• What about Continuous Integration?
Choosing the right tool – The Strategic Steps
•   Step 1: Define and Refine Requirements
•   Step 2: Communicate the Impact
•   Step 3: Develop Evaluation Methodology
•   Step 4: Select, Procure, and Implement
Step 1: Define and refine requirement
   Create a list of organizational requirements
       What problems do you want the tool to solve?
       What capabilities will the tool need to be effective in
        your environment? Other lifecycle tools?
       What constraints, budgetary or otherwise?
   Identify compatibility issues
       What operating systems does your application
        support?
       What is the development environment?
       Does your application integrate with third-party
        software?
       Does the application use custom controls?
Step 1: Define and refine requirement
   Identify tool audience
       Who will use the tool on a day-to-day basis?
       What is the level and mix of user skill levels?
       Is your organization willing to invest in training?
   Define technical or business requirements
       Does your organization have additional requirements?
           Software standards
           Technical standards
           Procurement rules
           Preferred vendor rules
Step 1: Define and refine requirement
• Identify budget constraints
   • How much can we afford?
   • How much is this worth?
• New requirements may surface based on research
   • Requirements you did not know about
   • Requirements you forgot to include
Step 2: Communicate the Impact
   Automated testing is part of the larger strategic
    application development endeavor
       Communicate the effects of implementing a tool
       Chance to discuss and mitigate concerns
           How tool may change job description
           Commitment to training
           Implementation strategy
       Discussion may imply additional requirements
           Business, functional, technical, or operational
Step 3: Develop the evaluation methodology
   How will tools be compared?
   Are there specific features that may differentiate
    one tool from another?
   Are there specific things that can eliminate a tool
    from consideration?
       Preferred vendor list can reduce evaluation scope
       Demos and evaluations are time-consuming
       Identify a representative set of activities to accomplish
        with the tool during the evaluation
Step 4: Select, procure, and implement
   Make an informed selection
   Follow organization’s procurement process
   Develop the implementation plan
       What
       When
       Why
       Who
       How
Step 4: Select, procure, and implement
   Develop an implementation plan
       Enterprise applications requiring multiple releases
       Applications that must produce a consistent set of results
        using stable data
           These characteristics fully leverage reusability and predictability
            benefits of automated testing
   Take implementation one step at a time
       Take time for training
       Keep focus on staff issues and reactions
Step 4: Select, procure, and implement
• Develop a test plan
   • Describes the scope, approach, resources, and schedule for all
     automated and manual activities
      • Rule of thumb: 40% manual vs. 60% automated test scripts
• Create and deploy your automated tests
   • Be selective with the automation of test scripts; automate tests that
      • Verify the most critical functionality
      • Are the most likely to expose defects
      • Are expensive or impossible to perform manually
   • Use the first automated suites you build for
      • Smoke testing
      • Regression testing
Decision?
• How to calculate the cost of functional test automation:

  Cost of test automation = Cost of tool(s)
                            + Labor cost of script creation
                            + Labor cost of script maintenance

• If a test script will be run every week for the next 2 years, automate the
  test if the cost of automation is less than the cost of manually executing
  the test 104 times.

• Automate if:

  Cost of automation  <  Cost of manually executing the test as many times
                         as the automated test will be executed

  (a small decision sketch follows)
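A minimal sketch of this decision rule. The 104 planned runs come from the slide (weekly execution for two years); all cost figures are hypothetical placeholders:

```java
public class AutomationDecision {

    static boolean shouldAutomate(double toolCost, double scriptCreation,
                                  double scriptMaintenance,
                                  double manualRunCost, int plannedRuns) {
        double automationCost = toolCost + scriptCreation + scriptMaintenance;
        double manualCost = manualRunCost * plannedRuns;
        return automationCost < manualCost;   // automate only if it is cheaper overall
    }

    public static void main(String[] args) {
        boolean automate = shouldAutomate(
                500,    // share of tool cost attributed to this test (placeholder)
                800,    // labor cost of script creation (placeholder)
                400,    // labor cost of script maintenance over 2 years (placeholder)
                25,     // cost of one manual execution (placeholder)
                104);   // weekly runs for 2 years, as on the slide
        System.out.println(automate ? "Automate the test" : "Keep it manual");
    }
}
```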
Tools Classification
• Test Information Management – systematic solutions and tools that support test
  engineers and quality assurance people in creating, updating, and maintaining
  diverse test information, including test cases, test scripts, test data, test
  results, and discovered problems.
• Test Execution and Control – systematic solutions and tools that help engineers
  set up and run tests, and collect and validate test results.
• Test Generation – systematic solutions and tools that generate program tests
  automatically.
• Test Coverage Analysis – systematic solutions and tools that analyze test
  coverage during a test process, based on selected test criteria.
• Performance Testing and Measurement – systematic solutions and tools that
  support program performance testing and performance measurement.
• Software Simulators – programs developed to simulate the functions and
  behaviors of external systems or dependent subsystems/components for a
  program under test.
• Regression Testing – test tools that support automated regression testing
  activities, including test recording and replaying.
Tools Classification
Type of Test Tool                   Vendor                  Test Tools
Problem Management Tools            Rational Inc.           ClearQuest, ClearDDTS
                                    Microsoft Corp.         PVCS Tracker
                                    Imbus AG                Imbus Fehlerdatenbank
Test Information Management Tools   Rational Inc.           TestManager
                                    Mercury Interactive     TestDirector
Test Suite Management Tools         eValid                  TestSuiter
                                    Rational Inc.           TestFactory
                                    SUN                     JavaTest, JavaHarness
White-Box Test Tools                McCabe & Associates     McCabe IQ2
                                                            JUnit
                                    IBM                     IBM COBOL Unit Tester
                                                            IBM ATC (Coverage Assistant,
                                                            Source Audit Assistant,
                                                            Distillation Assistant,
                                                            Unit Test Assistant)
Tools Classification
Test Execution Tools           OC Systems                   Aprobe
                               Softbridge                   ATF/TestWright
                               AutoTester                   AutoTester
                               Rational Inc.                Visual Test, Rational Functional Tester
                               SQA                          Robot
                               Mercury Interactive          WinRunner, QuickTest Pro
                               Sterling Software            Vision TestPro
                               Compuware                    QARun
                               Segue Software               SilkTest
                               RSW Software Inc.            e-Test
                               Cyrano GmbH                  Cyrano Robot
Code Coverage Analysis Tools   Case Consult Corp.           Analyzer, Analyzer Java
                               OC Systems                   Aprobe
                               IPL Software Product Group   Cantata/Cantata++
                               ATTOL Testware SA            Coverage
                               Compuware NuMega             TrueCoverage
                               Software Research            TestWorks Coverage
                               Rational Inc.                PureCoverage
                               SUN                          JavaScope
                               ParaSoft                     TCA
                               Software Automation Inc.     Panorama
Tools Classification
Load Test and Performance Tools   Rational Inc.                   Rational Performance Tester
                                  InterNetwork AG                 sma@rtTest
                                  Compuware                       QA-Load
                                  Mercury Interactive             LoadRunner
                                  RSW Software Inc.               e-Load
                                  SUN                             JavaLoad
                                  Segue Software                  SilkPerformer
                                  Client/Server Solutions, Inc.   Benchmark Factory
Regression Testing Tools          IBM                             Regression Testing Tool (ARTT),
                                                                  Distillation Assistant
GUI Record/Replay Tools           Software Research               eValid
                                  Mercury Interactive             XRunner
                                  Astra                           Astra QuickTest
                                  AutoTester                      AutoTester, AutoTester One
Short List Tools - Functional

Vendor            Tool                         Test Suite / Companion Tools
Compuware         TestPartner                  QACenter Enterprise Edition+
Empirix           e-Tester                     e-TEST suite
IBM               Rational Functional Tester   Rational Suite
Mercury           QuickTest Professional       Quality Center
RadView           WebFT                        TestView Suite
Seapine           QA Wizard Pro                TestTrack Pro
Borland (Segue)   SilkTest                     SilkCentral Test Manager
Short List Tools - Performance

Vendor            Tool                          Test Suite / Companion Tools
Compuware         QALoad                        QACenter Enterprise Edition+
Empirix           e-Load                        e-TEST suite
IBM               Rational Performance Tester   Rational Suite
Mercury           LoadRunner                    Quality Center
RadView           WebLOAD                       TestView Suite
Facilita          Forecast                      ForecastWeb, ForecastNet, ForecastDB
Borland (Segue)   SilkPerformer                 SilkCentral Test Manager
Evaluate Vendors
[Vendor evaluation chart: strength of current offerings (weak to strong) plotted
 against strength of strategy (weak to strong), grouping vendors into Risky Bets,
 Contenders, Strong Performers, and Leaders]
Functional Test Tools - Analysis

• IBM/Rational Functional Tester (RFT)
  Pros: built as an Eclipse plug-in with a full IDE and Java support; supports
  Web 2.0, Java, and .NET applications; full GUI object map repository
  Cons: insufficient browser support; licensed product

• HP/Mercury QuickTest Pro (QTP)
  Pros: supports Web 2.0, Java, and .NET applications; full GUI object map
  repository; seamless integration with Quality Center
  Cons: VBScript scripting is limited; no IDE (may change in a new release);
  licensed product

• Selenium RC & IDE
  Pros: good browser support; good language support (Java, Ruby, C#); can
  easily be extended as a JUnit suite; open source
  Cons: no GUI object repository; only web application support
Feasibility Analysis
   FA Matrix Available
   Operational Feasibility
   Technical Feasibility
   Economic Feasibility
   Schedule Feasibility
Contents
   Automation Framework
       Where to start?
       What is AF?
       Benefits of AF
       How does it work?
       Architecture
       Framework Approaches
           Record and Playback
           Script Based Approach
           Keyword Driven Approach
           Data Driven Approach
           Hybrid Keyword and Data Driven Approach
Where to Start?

• “Start SMALL, think BIG”
• Quick wins should be avoided
• NEVER expect to automate 100%
• “Under promise, over deliver?”
• Keep it simple, wherever possible
• Focus on key critical business processes

• First find out:
   • What needs to be tested?
   • What can be tested?
   • What could be tested?
• Then you can work out:
   • What needs to be automated?
   • What can be automated?
   • What could be automated?
What is Automation Framework?
   Framework – independent of application or
    environment under test
   A Test Automation Framework is a set of
    assumptions, concepts and tools that provide
    support for Automated Software Testing.
   A reusable set of libraries or classes for a software
    system (or subsystem).
   A correctly implemented Test Automation Framework
    can further improve ROI by reducing the
    development and maintenance costs.
Benefits of Framework
• Ease of Use – easy to learn and easy to use
• Time – faster than the capture/replay and scripting approaches
• Maintainability – significantly reduces the test maintenance effort
• Reusability – due to modularity of test cases and library functions
• Manageability – effective test design, execution, and traceability
• Accessibility – to design, develop & modify tests whilst executing
• Availability – scheduled execution can run unattended on a 24/7 basis
• Reliability – due to advanced error handling and scenario recovery
• Flexibility – framework independent of AUT or environment
• Measurability – customizable reporting of test results
How does it work?
   Different Implementations

   One Example of Keyword Driven Framework could be:
       Spreadsheets, Spreadsheets, Spreadsheets
       Test Objects
       Keywords and Methods = Toast!
       Parameters
       Description or Call the 911?
Architecture
Framework Approaches
   Record and Playback
   Script Based Approach
   Keyword Driven Approach
   Data Driven Approach
   Hybrid Keyword and Data Driven Approach
Manual Testing – Looking back
                  + easy & cheap to start
                  + flexible testing
                  - expensive every execution
                  - no auto regression testing
                  - less coverage measurement
Record and Playback
+ flexible testing
- expensive first execution
+ auto regression testing
- fragile tests break easily
- less coverage measurement
Script Based Approach
                  +/- test impl. = programming
                  + automatic execution
                  + auto regression testing
                  - fragile tests break easily?
                    (depends on abstraction)
                  - less coverage measurement
Data Driven Approach
• Automation is data-centric
• The user defines just the data sets that drive the tests
• There is an external data source (DB tables, Excel spreadsheets, XML)
  for the data sets
• Flow control (navigation) is normally done by the test script, not by
  the data source

• Example: a data set exercises the "create new sales account"
  functionality; it is stored in a DB table account_data (a minimal
  sketch follows the table):

  CompanyName         PrimarySalesPerson   Street               Zip     City      State
  Genesis Inc.        Phil Collins         5775 Main st         30075   Atlanta   GA
  RollingStones Inc.  Mick Jagger Jr.      2332 Washington st   02111   Boston    MA
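A minimal data-driven sketch under stated assumptions: the AccountPage page object and loadRows() helper below are hypothetical stand-ins for the real UI driver and data source (DB table, spreadsheet, or XML), and only one row from the table above is stubbed in:

```java
import java.util.List;
import java.util.Map;

// Data-driven sketch: the script owns the flow, the data source supplies the rows.
public class CreateAccountDataDrivenTest {

    public static void main(String[] args) {
        for (Map<String, String> row : loadRows("account_data")) {
            AccountPage page = new AccountPage();      // hypothetical page object
            page.createAccount(row.get("CompanyName"), row.get("PrimarySalesPerson"),
                               row.get("Street"), row.get("Zip"),
                               row.get("City"), row.get("State"));
            if (!page.accountExists(row.get("CompanyName"))) {
                throw new AssertionError("Account not created: " + row.get("CompanyName"));
            }
        }
    }

    // Stubbed data source: a real framework would read the DB table,
    // an Excel sheet, or an XML file here.
    static List<Map<String, String>> loadRows(String table) {
        return List.of(Map.of(
                "CompanyName", "Genesis Inc.", "PrimarySalesPerson", "Phil Collins",
                "Street", "5775 Main st", "Zip", "30075", "City", "Atlanta", "State", "GA"));
    }

    // Hypothetical page-object stub standing in for the real UI driver.
    static class AccountPage {
        void createAccount(String company, String salesPerson, String street,
                           String zip, String city, String state) {
            System.out.printf("Creating account '%s' for %s%n", company, salesPerson);
        }
        boolean accountExists(String company) { return true; }
    }
}
```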
Keyword/Action Driven Approach
 + abstract tests
 + automatic execution
 + auto regression testing
 - robust tests
 - less coverage measurement
Keyword/Action Driven Approach
• Automation is action-centric
• Decompose your test cases/modules into granular, reusable keywords
• The idea is for non-coders to be able to create automated test cases
  with action keywords
• The user defines the flow control of the test via action keywords

• Example: Test Case "Verify Checking Account Balance"
  1. Enter Username and Password and click the Submit button
     (step 1 is the action "Login")
  2. Enter "Phil Collins" as a Sales Person and click the Submit button
  3. Verify the Sales Person was successfully created, then Logout

• So you may want to choose the following reusable action keywords:
  EnterText, Click, Login, VerifyExists (a minimal dispatcher sketch follows)
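A minimal keyword-dispatch sketch, assuming the keywords named on this slide; the UI layer is replaced by print statements, so it only illustrates how keywords map to reusable actions:

```java
import java.util.List;

public class KeywordEngine {

    // One row of a keyword-driven test: which action, on which object, with what data.
    record Step(String keyword, String target, String data) {}

    void execute(Step step) {
        switch (step.keyword()) {
            case "Login"        -> System.out.println("Logging in as " + step.data());
            case "EnterText"    -> System.out.printf("Typing '%s' into %s%n", step.data(), step.target());
            case "Click"        -> System.out.println("Clicking " + step.target());
            case "VerifyExists" -> System.out.println("Verifying " + step.target() + " exists");
            default -> throw new IllegalArgumentException("Unknown keyword: " + step.keyword());
        }
    }

    public static void main(String[] args) {
        KeywordEngine engine = new KeywordEngine();
        // In a real framework these steps would come from a spreadsheet or DB table.
        List<Step> testCase = List.of(
                new Step("Login", "LoginPage", "testuser/secret"),
                new Step("EnterText", "SalesPersonName", "Phil Collins"),
                new Step("Click", "SubmitSalesPerson", ""),
                new Step("VerifyExists", "CreationStatus", ""));
        testCase.forEach(engine::execute);
    }
}
```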
Benefits of Keyword Driven Approach
   This Framework addresses the most common
    problem with test automation:
    Automation Engineers do not have domain
    knowledge and the End Users (Subject Matter
    Experts/Test Engineers) usually do not have
    automation expertise.
   When properly implemented and maintained, it
    presents a superior ROI because each business
    event is designed, automated and maintained as
    a discrete entity.
   Keywords can then be used to design test
    cases, but the design and automation overhead
    for the keyword has already been paid.
Benefits of Keyword Driven Approach
• Reduces the cost and time spent maintaining and updating tests
• The modular structure of keyword-driven testing means that new tests
  can easily be created from pre-existing modules
• The test team is capable of entirely automating tests, even without
  programming knowledge
• Can easily be adapted to a different test tool
• Reusability across different projects

   Classic Example:
    Object                 Action       Data

    Textfield (username)   Enter Text   <username>
Keyword/Action Driven Approach
• May have an external data source (DB tables, Excel spreadsheets, XML)
  that holds the action keywords

Step | Description                             | Page                    | Action       | Module    | Type   | Object               | Expected
1    | Login                                   | Home                    | Login        | UserLogin | N/A    | .id:=LoginSubmit     | .text:=Login Successful
2    | Enter New Sales Person data             | CreateSalesPerson       | EnterText    |           | Field  | .text:=SalesPerName  | .value:=Phil Collins
3    | Click Submit                            | CreateSalesPerson       | Click        |           | Button | .id:=SubmitSalesPer  | .url:=.*createdSalesPerStatus.html
4    | Verify Sales Person Creation Successful | CreateSalesPersonStatus | VerifyExists |           | DIV    | .id:=CreationStatus  | .value:=User Created Successfully
Hybrid Keyword and Data Driven Approach
   Combines the best of both worlds
       User defines data sets to drive tests with
       User also defines flow control of the test via action
        keywords
       May have an external data source (DB tables, Excel
        spreadsheets, XML for data sets) with action
        keywords in addition to generic and test case specific
        data sets
Architecture
Open2Test Automation Framework
Model Based Approach
+ abstract tests
+ automatic execution
+ auto regression testing
+ auto design of tests
+ systematic coverage
+ measure coverage of model and
requirements
- modelling overhead




    Emerging Approach
References
• http://www.slideshare.net/Jonathon_Wright/hybrid-keyword-data-driven-automation-frameworks-jonathon-wright?src=related_normal&rel=805408
• http://www.ibm.com/developerworks/rational/library/591.html
• http://www.keane.com/resources%2Fpdf%2FWhitePapers%2FWP_ROIforTestAutomation.pdf
Summarizing…
• Overview of STA
   • History
   • Myths and Truths
   • Why Automation Projects Fail?
   • What is TA?
   • Why TA is needed?
   • Manual vs. Automated
   • Pros & Cons
   • What to Automate?
   • What Not to Automate?
• BEP and ROI
   • BEP
   • BEP - Example
   • ROI
   • Classic ROI
   • Real ROI
   • Benefits of ROI
   • ROI Calculator
• Tools
   • Choosing the right tool – The Strategic Approach
   • Choosing the right tool – The Strategic Steps
   • Tools Classification
   • Short List Tools – Functional
   • Short List Tools – Performance
   • Evaluate Vendors
   • Functional Test Tools - Analysis
   • Feasibility Analysis
• Automation Framework
   • Where to start?
   • What is AF?
   • Benefits of AF
   • How does it work?
   • Architecture
   • Framework Approaches
      • Record and Playback
      • Script Based Approach
      • Keyword Driven Approach
      • Data Driven Approach
      • Hybrid Keyword and Data Driven Approach
THANK YOU

 Rohan Bhattarai

Weitere ähnliche Inhalte

Was ist angesagt?

Test Automation Strategy
Test Automation StrategyTest Automation Strategy
Test Automation StrategyMartin Ruddy
 
Why Test Automation Fails
Why Test Automation FailsWhy Test Automation Fails
Why Test Automation FailsRanorex
 
7 Deadly Sins of Agile Software Test Automation
7 Deadly Sins of Agile Software Test Automation7 Deadly Sins of Agile Software Test Automation
7 Deadly Sins of Agile Software Test AutomationAdrian Smith
 
Test Automation - Everything You Need To Know
Test Automation - Everything You Need To KnowTest Automation - Everything You Need To Know
Test Automation - Everything You Need To KnowBugRaptors
 
Automation testing strategy, approach & planning
Automation testing  strategy, approach & planningAutomation testing  strategy, approach & planning
Automation testing strategy, approach & planningSivaprasanthRentala1975
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testingdidev
 
How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy Impetus Technologies
 
Introduction to Automation Testing
Introduction to Automation TestingIntroduction to Automation Testing
Introduction to Automation TestingArchana Krushnan
 
Introduction to Test Automation - Technology and Tools
Introduction to Test Automation - Technology and ToolsIntroduction to Test Automation - Technology and Tools
Introduction to Test Automation - Technology and ToolsKMS Technology
 
Test Automation failure analysis
Test Automation failure analysisTest Automation failure analysis
Test Automation failure analysisPrashant Chaudhary
 
Programming skills for test automation
Programming skills for test automationProgramming skills for test automation
Programming skills for test automationRomania Testing
 
Test Automation
Test AutomationTest Automation
Test Automationrockoder
 
Framework for Web Automation Testing
Framework for Web Automation TestingFramework for Web Automation Testing
Framework for Web Automation TestingTaras Lytvyn
 
Automation test scripting techniques
Automation test scripting techniquesAutomation test scripting techniques
Automation test scripting techniquesZhu Zhong
 
Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...Edureka!
 
Automation Tools Overview
Automation Tools OverviewAutomation Tools Overview
Automation Tools OverviewMurageppa-QA
 

Was ist angesagt? (20)

Test Automation Strategy
Test Automation StrategyTest Automation Strategy
Test Automation Strategy
 
Why Test Automation Fails
Why Test Automation FailsWhy Test Automation Fails
Why Test Automation Fails
 
Test Automation - Keytorc Approach
Test Automation - Keytorc Approach Test Automation - Keytorc Approach
Test Automation - Keytorc Approach
 
7 Deadly Sins of Agile Software Test Automation
7 Deadly Sins of Agile Software Test Automation7 Deadly Sins of Agile Software Test Automation
7 Deadly Sins of Agile Software Test Automation
 
Automation test scripting guidelines
Automation test scripting guidelines Automation test scripting guidelines
Automation test scripting guidelines
 
Test Automation - Everything You Need To Know
Test Automation - Everything You Need To KnowTest Automation - Everything You Need To Know
Test Automation - Everything You Need To Know
 
Automation testing strategy, approach & planning
Automation testing  strategy, approach & planningAutomation testing  strategy, approach & planning
Automation testing strategy, approach & planning
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testing
 
How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy
 
Introduction to Automation Testing
Introduction to Automation TestingIntroduction to Automation Testing
Introduction to Automation Testing
 
Introduction to Test Automation - Technology and Tools
Introduction to Test Automation - Technology and ToolsIntroduction to Test Automation - Technology and Tools
Introduction to Test Automation - Technology and Tools
 
Test Automation failure analysis
Test Automation failure analysisTest Automation failure analysis
Test Automation failure analysis
 
Programming skills for test automation
Programming skills for test automationProgramming skills for test automation
Programming skills for test automation
 
Introduction to Software Test Automation
Introduction to Software Test AutomationIntroduction to Software Test Automation
Introduction to Software Test Automation
 
Test Automation
Test AutomationTest Automation
Test Automation
 
Test Automation
Test AutomationTest Automation
Test Automation
 
Framework for Web Automation Testing
Framework for Web Automation TestingFramework for Web Automation Testing
Framework for Web Automation Testing
 
Automation test scripting techniques
Automation test scripting techniquesAutomation test scripting techniques
Automation test scripting techniques
 
Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...
 
Automation Tools Overview
Automation Tools OverviewAutomation Tools Overview
Automation Tools Overview
 

Andere mochten auch

Just Java2007 - Daniel Wildt - Tools For Java Test Automation
Just Java2007 - Daniel Wildt - Tools For Java Test AutomationJust Java2007 - Daniel Wildt - Tools For Java Test Automation
Just Java2007 - Daniel Wildt - Tools For Java Test AutomationDaniel Wildt
 
Automation testing overview
Automation testing overviewAutomation testing overview
Automation testing overviewAmrita Bisht
 
Fundamentals of testing 1
Fundamentals of testing 1Fundamentals of testing 1
Fundamentals of testing 1Hoang Nguyen
 
ISTQB - CTFL Summary v1.0
ISTQB - CTFL Summary v1.0ISTQB - CTFL Summary v1.0
ISTQB - CTFL Summary v1.0Samer Desouky
 
Software testing
Software testingSoftware testing
Software testingmkn3009
 
Atlassian Roadshow 2016 - DevOps Session
Atlassian Roadshow 2016 - DevOps SessionAtlassian Roadshow 2016 - DevOps Session
Atlassian Roadshow 2016 - DevOps SessionSourcesense
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual TestingDirecti Group
 
Test Process Improvement with TPI NEXT - what the model does not tell you but...
Test Process Improvement with TPI NEXT - what the model does not tell you but...Test Process Improvement with TPI NEXT - what the model does not tell you but...
Test Process Improvement with TPI NEXT - what the model does not tell you but...SQALab
 
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...Peter Leschev
 
Introduction to Test Automation
Introduction to Test AutomationIntroduction to Test Automation
Introduction to Test AutomationPekka Klärck
 
Tipos de pruebas de software
Tipos de pruebas de softwareTipos de pruebas de software
Tipos de pruebas de softwareGuillermo Lemus
 
Automated vs manual testing
Automated vs manual testingAutomated vs manual testing
Automated vs manual testingKanoah
 

Andere mochten auch (15)

Just Java2007 - Daniel Wildt - Tools For Java Test Automation
Just Java2007 - Daniel Wildt - Tools For Java Test AutomationJust Java2007 - Daniel Wildt - Tools For Java Test Automation
Just Java2007 - Daniel Wildt - Tools For Java Test Automation
 
Automation testing overview
Automation testing overviewAutomation testing overview
Automation testing overview
 
Fundamentals of testing 1
Fundamentals of testing 1Fundamentals of testing 1
Fundamentals of testing 1
 
ISTQB - CTFL Summary v1.0
ISTQB - CTFL Summary v1.0ISTQB - CTFL Summary v1.0
ISTQB - CTFL Summary v1.0
 
Software testing
Software testingSoftware testing
Software testing
 
From QA To Dev-QA-Ops
From QA To Dev-QA-OpsFrom QA To Dev-QA-Ops
From QA To Dev-QA-Ops
 
Atlassian Roadshow 2016 - DevOps Session
Atlassian Roadshow 2016 - DevOps SessionAtlassian Roadshow 2016 - DevOps Session
Atlassian Roadshow 2016 - DevOps Session
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testing
 
Test Process Improvement with TPI NEXT - what the model does not tell you but...
Test Process Improvement with TPI NEXT - what the model does not tell you but...Test Process Improvement with TPI NEXT - what the model does not tell you but...
Test Process Improvement with TPI NEXT - what the model does not tell you but...
 
Testing - Ing. Gabriela Muñoz
Testing - Ing. Gabriela MuñozTesting - Ing. Gabriela Muñoz
Testing - Ing. Gabriela Muñoz
 
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...
How Atlassian's Build Engineering Team Has Scaled to 150k Builds Per Month an...
 
Introduction to Test Automation
Introduction to Test AutomationIntroduction to Test Automation
Introduction to Test Automation
 
Tipos de pruebas de software
Tipos de pruebas de softwareTipos de pruebas de software
Tipos de pruebas de software
 
Automated vs manual testing
Automated vs manual testingAutomated vs manual testing
Automated vs manual testing
 
Automated Testing
Automated TestingAutomated Testing
Automated Testing
 

Ähnlich wie Software test automation_overview

Agile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseAgile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseJim Nickel
 
Error Proofing And Cost Reduction 2
Error Proofing And Cost Reduction 2Error Proofing And Cost Reduction 2
Error Proofing And Cost Reduction 2Brian King
 
Automation Best Practices.pptx
Automation Best Practices.pptxAutomation Best Practices.pptx
Automation Best Practices.pptxpavelpopov43
 
Creating a successful continuous testing environment by Eran Kinsbruner
Creating a successful continuous testing environment by Eran KinsbrunerCreating a successful continuous testing environment by Eran Kinsbruner
Creating a successful continuous testing environment by Eran KinsbrunerQA or the Highway
 
Test_Automation_-_Let's_Talk_Business.ppt
Test_Automation_-_Let's_Talk_Business.pptTest_Automation_-_Let's_Talk_Business.ppt
Test_Automation_-_Let's_Talk_Business.pptGopi Raghavendra
 
Robotics - Mainstream or Marginal for Process Industries?
Robotics - Mainstream or Marginal for Process Industries?Robotics - Mainstream or Marginal for Process Industries?
Robotics - Mainstream or Marginal for Process Industries?Yokogawa1
 
TEA Presentation V 0.3
TEA Presentation V 0.3TEA Presentation V 0.3
TEA Presentation V 0.3Ian McDonald
 
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web Testing
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web TestingThe Automation Firehose: Be Strategic & Tactical With Your Mobile & Web Testing
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web TestingPerfecto by Perforce
 
Testing NodeJS, REST APIs and MongoDB with UFT
Testing NodeJS, REST APIs and MongoDB with UFTTesting NodeJS, REST APIs and MongoDB with UFT
Testing NodeJS, REST APIs and MongoDB with UFTOri Bendet
 
ITLCHN 18 - Automation & DevOps - Automic
ITLCHN 18 -  Automation & DevOps - AutomicITLCHN 18 -  Automation & DevOps - Automic
ITLCHN 18 - Automation & DevOps - AutomicIT Expert Club
 
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...Applitools
 
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012TEST Huddle
 
Susan windsor soft test 16th november 2005
Susan windsor soft test   16th november 2005Susan windsor soft test   16th november 2005
Susan windsor soft test 16th november 2005David O'Dowd
 
Evolution of Test Automation
Evolution of Test AutomationEvolution of Test Automation
Evolution of Test AutomationDharmik Rajput
 
implementing_ai_for_improved_performance_testing_the_key_to_success.pptx
implementing_ai_for_improved_performance_testing_the_key_to_success.pptximplementing_ai_for_improved_performance_testing_the_key_to_success.pptx
implementing_ai_for_improved_performance_testing_the_key_to_success.pptxsarah david
 

Ähnlich wie Software test automation_overview (20)

Qtp - Introduction values
Qtp - Introduction valuesQtp - Introduction values
Qtp - Introduction values
 
Agile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseAgile Development in Aerospace and Defense
Agile Development in Aerospace and Defense
 
UiPath Test Automation Webinar Recap
UiPath Test Automation Webinar RecapUiPath Test Automation Webinar Recap
UiPath Test Automation Webinar Recap
 
Error Proofing And Cost Reduction 2
Error Proofing And Cost Reduction 2Error Proofing And Cost Reduction 2
Error Proofing And Cost Reduction 2
 
Automation Best Practices.pptx
Automation Best Practices.pptxAutomation Best Practices.pptx
Automation Best Practices.pptx
 
Creating a successful continuous testing environment by Eran Kinsbruner
Creating a successful continuous testing environment by Eran KinsbrunerCreating a successful continuous testing environment by Eran Kinsbruner
Creating a successful continuous testing environment by Eran Kinsbruner
 
Test_Automation_-_Let's_Talk_Business.ppt
Test_Automation_-_Let's_Talk_Business.pptTest_Automation_-_Let's_Talk_Business.ppt
Test_Automation_-_Let's_Talk_Business.ppt
 
Robotics - Mainstream or Marginal for Process Industries?
Robotics - Mainstream or Marginal for Process Industries?Robotics - Mainstream or Marginal for Process Industries?
Robotics - Mainstream or Marginal for Process Industries?
 
TEA Presentation V 0.3
TEA Presentation V 0.3TEA Presentation V 0.3
TEA Presentation V 0.3
 
Future of QA
Future of QAFuture of QA
Future of QA
 
Futureofqa
FutureofqaFutureofqa
Futureofqa
 
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web Testing
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web TestingThe Automation Firehose: Be Strategic & Tactical With Your Mobile & Web Testing
The Automation Firehose: Be Strategic & Tactical With Your Mobile & Web Testing
 
Testing NodeJS, REST APIs and MongoDB with UFT
Testing NodeJS, REST APIs and MongoDB with UFTTesting NodeJS, REST APIs and MongoDB with UFT
Testing NodeJS, REST APIs and MongoDB with UFT
 
ITLCHN 18 - Automation & DevOps - Automic
ITLCHN 18 -  Automation & DevOps - AutomicITLCHN 18 -  Automation & DevOps - Automic
ITLCHN 18 - Automation & DevOps - Automic
 
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...
Unlocking the Power of ChatGPT and AI in Testing - NextSteps, presented by Ap...
 
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012
Saksham Sarode - Innovation Through Introspection - EuroSTAR 2012
 
Script less automation
Script less automation  Script less automation
Script less automation
 
Susan windsor soft test 16th november 2005
Susan windsor soft test   16th november 2005Susan windsor soft test   16th november 2005
Susan windsor soft test 16th november 2005
 
Evolution of Test Automation
Evolution of Test AutomationEvolution of Test Automation
Evolution of Test Automation
 
implementing_ai_for_improved_performance_testing_the_key_to_success.pptx
implementing_ai_for_improved_performance_testing_the_key_to_success.pptximplementing_ai_for_improved_performance_testing_the_key_to_success.pptx
implementing_ai_for_improved_performance_testing_the_key_to_success.pptx
 

Software test automation_overview

  • 1. Software Test Automation Overview Rohan Bhattarai
  • 2. Contents  Overview of STA  BEP and ROI  Tools  Automation Framework
  • 3. Contents  Overview of STA  Choosing the right tool – The Strategic  History Steps  Myths and Truths  Tools Classification  Why Automation Projects Fail?  Short List Tools – Functional  What is TA?  Short List Tools – Performance  Why TA is needed?  Evaluate Vendors  Manual vs. Automated  Functional Test Tools - Analysis  Pros & Cons  Feasibility Analysis  What to Automate?  Automation Framework  What Not to Automate?  Where to start?  BEP and ROI  What is AF?  BEP  Benefits of AF  How does it work?  BEP - Example  Architecture  ROI  Framework Approaches  Classic ROI  Record and Playback  Real ROI  Script Based Approach  Benefits of ROI  Keyword Driven Approach  ROI Calculator  Data Driven Approach  Hybrid Keyword and Data Driven Approach  Tools  Choosing the right tool – The Strategic Approach
  • 4. Contents  Overview  History  Myths and Truths  Why Automation Projects Fail?  What is TA?  Why TA is needed?  Manual vs. Automated  Pros & Cons  What to Automate?  What Not to Automate?
  • 5. History  Moving swiftly past the hype  Historically Automation is perceived as a “Silver Bullet” of the Testing world  “The term has been adopted into a general metaphor, where “silver bullet” refers to any straight forward solution perceived to have extreme effectiveness. The phrase typically appears with an expectation that some new technology or practice will easily cure a major prevailing problem”.
  • 6. History  Historical trends in test automation frameworks:
   1993 – 2001: 1st Gen – Modularity Driven
   2001 – 2005: 2nd Gen – Data Driven
   2005 – 2008: 3rd Gen – Keyword Driven
   2008 – 2011: 4th Gen – Hybrid Keyword Data Driven
   Tool timeline: • 1993 – 1999: WinRunner x.x • 1999 – 2001: WinRunner 6.x • 2001 – 2004: WinRunner 7.x, RobotJ 1.0, XDE Tester 1.0 • 2004 – 2005: QTP 7.x, RFT 6.x • 2004: Selenium 1 • 2005 – 2006: QTP 8.x, RFT 7.x • 2006 – 2008: QTP 9.x, RFT 8.x • 2007: Selenium 2 • 2008 – 2010: QTP 10.x, RFT 8.1 • 2010 – 2011: QTP 11, RFT 8.2
  • 7. Myths and Truths  1. Test Automation is so simple that every tester can do it  Promoted by sales people, who simply say:  Record the script  Enhance the script by adding functions and data driving  Run the scripts  Report results  Under this influence a QA manager can proudly say “All our testers are doing test automation”.
  • 8. Myths and Truths  But in Reality: TA is a software development task  It should be designed, developed and tested  Need to have some kind of a programming background to implement test automation. TA is not as complex as C++/C#/Java development.  TA components are assets that should be treated like application source code  Don’t fall into tool vendor sales pitch …remember Record & Playback is not real test automation
  • 9. Myths and Truths 2. Commercial TA tools are expensive  Under the influence of this myth some companies, especially the small ones:  Try to develop their own test automation tools  Use scripting languages like Perl and Ruby  Use shareware test tools  Do not consider test automation at all
  • 10. Myths and Truths  But in Reality: Commercial TA tools are not that expensive  A per-seat license for the most expensive automation tool is $8K, and it can be used for 5 years  Maintenance/support fees are 20% of the tool cost, or $1,600 per year  The cost of the tool is therefore $8K/5 + $1,600 = $3,200 per year  The automation developer cost with overhead is $100K per year  The tool costs just over 3% of the person who uses it, yet the productivity gain can be very significant
  • 11. Myths and Truths  Learning from past experience  Truth: 92% of automation projects FAIL to meet target ROI (chart: split of automation projects into Working vs. Failure, and of ROI targets met vs. missed)
   Industry: Test Automation (Net Worth) – 2004: $1 Billion; 2010 (Estimated): $6.3 Billion
   Automation Projects (Failure Cost) – 2004: $0.6 Billion; 2010 (Estimated): $3.8 Billion
   Source 1: http://www.nytimes.com/2006/07/26/technology/26hewlett.html
   Source 2: http://www.slideshare.net/Jonathon_Wright/hybrid-keyword-data-driven-automation-frameworks-jonathon-wright
  • 12. Why do Automation Projects typically Fail?  IDT study (www.idtus.com):
   Lack of Time – 37%
   Lack of Expertise – 20%
   Lack of Budget – 17%
   Misc – 15%
   Tool Incompatibility – 11%
  • 13. Why do Automation Projects typically Fail?  Lack of defined automation methodology  Automation is not treated as a legitimate project with the necessary planning / resources  Test Automation is typically performed at the end of the SDLC  After the initial success the automation scripts are not maintained for future builds
  • 14. Why do Automation Projects typically Fail?  Testers are typically untrained in test tools and programming techniques  No modularization (reusable functions) in automation scripts  Automated test cases are usually designed based on front end functionality (black box testing)
  • 15. What is Software Test Automation?  Test Automation is the use of software to execute tests without Human intervention  It refers to the activities and efforts that intend to automate engineering tasks and operations in a software test process using well-defined strategies and systematic solutions. Not like Rube Goldberg cartoons
  • 16. Why TA is needed?  Objectives:  To free engineers from tedious and redundant manual testing operations  To speed up a software testing process, and to reduce software testing cost and time during a software life cycle  To increase the quality and effectiveness of a software test process by achieving pre-defined adequate test criteria in a limited schedule
  • 17. Key to Success To reduce manual testing activities and redundant test operations using a systematic solution to achieve a better testing coverage.
  • 18. Manual vs. Automated Testing  Manual Testing:  Testing is time-consuming and tedious  Inefficient in today’s shorter SDLC  Delays the ability to thoroughly test an application  Critical bugs escape undetected  What happens when multiple platforms are involved?  Automated Testing:  Higher efficiency that accelerates the testing cycle and promotes software quality  Optimizes software quality and testing efficiency by delivering  Reusability  Predictability and Consistency  Productivity  Enables accurate assessment of quality level
  • 19. Pros and Cons  Pros:  Speed  Reusability  Accuracy  Run Anytime  Efficiency
  • 20. Pros and Cons  Cons:  Significant Investment  Maintenance  Not as Robust  Error Detection  Cannot Think
  • 21. What to Automate?  Regression Tests: Stabilized tests that verify stabilized functionality  Tests rerun often: Tests that are executed regularly vs. rarely  Tests that will not expire shortly: Most tests have a finite lifetime during which its automated script must recoup the additional cost required for its automation  Tedious/Boring tests:  tests with many calculations and number verifications  repetitive tests performing the same operations over and over  tests requiring many performance measurements  Just plain boring tests  Reliably repeatable
  • 22. What NOT to Automate?  Unstable functionality: Not reliably repeatable  Rarely executed tests: poor Return-On-Investment  Tests that will soon expire: poor Return-On-Investment  Requiring in-depth business analysis:  some tests require so much business specific knowledge that it becomes prohibitive time wise to include every verification required to make its automated script robust enough to be effective  exceedingly complex tests are sometimes not possible to automate because computers cannot think
  • 24. Contents  BEP and ROI  BEP  BEP - Example  ROI  PayBack Period  Classic ROI  Real ROI  Benefits of ROI  ROI Calculator
  • 25. BEP  Break-Even Point (BEP) is the point at which cost or expenses and revenue are equal: there is no net loss or gain
  • 26. BEP - Example  Resources (R) for (n) automated tests, with ROI Rn = Aa / Am = (Va + n*Da) / (Vm + n*Dm):
   Columns: Preparation (V, mins) Manual / Automated; Execution (D, mins) Manual / Automated; Rn using Manual, and for n = 1, 5, 10, 20
   Scenario 1: 30 / 60; 11 / 1.1; 33%, 149%, 77%, 51%, 33%
   Scenario 2: 30 / 60; 11 / 1.1; 33%, 149%, 77%, 51%, 33%
   Scenario 3: 30 / 60; 9 / 0.9; 36%, 156%, 86%, 58%, 37%
   Scenario 4: 30 / 60; 10 / 1; 34%, 153%, 81%, 54%, 35%
   Scenario 5: 30 / 60; 10 / 1; 34%, 153%, 81%, 54%, 35%
   Scenario 6: 30 / 60; 10 / 1; 34%, 153%, 81%, 54%, 35%
   Scenario 7: 30 / 60; 15 / 1.5; 27%, 137%, 64%, 42%, 27%
   Scenario 8: 30 / 60; 30 / 3; 5%, 105%, 42%, 27%, 19%
   Scenario 9: 30 / 60; 22 / 2.2; 16%, 120%, 51%, 33%, 22%
   Scenario 10: 30 / 60; 12 / 1.2; 31%, 146%, 73%, 48%, 31%
   Total: 300 / 600; 140 / 14; 28%, 142%, 71%, 47%, 31%
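The ratio in the table can be checked with a few lines of code. A minimal sketch, assuming the slide's definitions (Vm/Va are manual/automated preparation minutes, Dm/Da are manual/automated execution minutes, n is the number of executions); the break-even point is the smallest n at which the automated effort drops to or below the manual effort:

```python
# Sketch of the slide's ratio Rn = Aa / Am = (Va + n*Da) / (Vm + n*Dm),
# plus the break-even number of executions (smallest n with Rn <= 100%).
import math


def automation_ratio(vm, va, dm, da, n):
    """Automated effort as a fraction of manual effort after n executions."""
    return (va + n * da) / (vm + n * dm)


def break_even_runs(vm, va, dm, da):
    """Smallest whole number of executions at which automation pays off."""
    if dm <= da:
        return None  # automation never catches up if it is not faster per run
    return max(0, math.ceil((va - vm) / (dm - da)))


# Scenario 1 from the table: Vm=30, Va=60, Dm=11, Da=1.1
for n in (1, 5, 10, 20):
    print(f"n={n:>2}: Rn = {automation_ratio(30, 60, 11, 1.1, n):.0%}")
print("Break-even after", break_even_runs(30, 60, 11, 1.1), "executions")
```

With Scenario 1's numbers this reproduces the 149%, 77%, 51% and 33% values shown in the table, and the break-even point falls at the fourth execution.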
  • 27. ROI  Return on Investment  ROI = BENEFIT/COST  ROI = (total benefit – total cost) / (total cost)  ROI = (cost of manual – cost of automation) / cost of automation  Where,  Automation Cost = Price Of HW + Price of SW + Development Cost + Maintenance Cost + Execution Cost  Manual Testing Cost = Development Cost + Maintenance Cost + Execution Cost  Looks right, Doesn’t it?
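For concreteness, the "classic" calculation above might look like the sketch below. The cost breakdown into hardware, software, development, maintenance and execution follows the slide, but every figure is made up; the next slides explain why this comparison is misleading.

```python
# "Classic" ROI exactly as defined above; all cost figures are illustrative.
automation_cost = {
    "hardware": 2_000,
    "software": 8_000,       # tool licence
    "development": 30_000,   # script development labour
    "maintenance": 10_000,
    "execution": 2_000,
}
manual_cost = {
    "development": 5_000,    # manual test case authoring
    "maintenance": 2_000,
    "execution": 60_000,     # repeated manual execution labour
}

total_automation = sum(automation_cost.values())   # 52,000
total_manual = sum(manual_cost.values())           # 67,000

roi = (total_manual - total_automation) / total_automation
print(f"Classic ROI = {roi:.0%}")   # about 29% with these numbers
```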
  • 29. Classic ROI  Problems with classic ROI calculation:  You can’t compare Automated Testing and Manual Testing. They are not the same and they provide different information about the AUT.  You can’t compare cost of multiple execution of automated tests vs. manual tests. You would never dream of executing that many test cases manually  So then…what is real ROI?
  • 30. Real ROI  ROI value is not the value of Automation vs. Cost of executing these tests manually  Automation ROI value is the benefit of this type of testing, and it can be:  Reducing Time to Market  Increased Test Efficiency (Productivity)  Increased Test Effectiveness
  • 31. Benefits of ROI  Reduced Time to Market  Products delivered quickly  Makes people available to work on other projects  Higher margins, if no competitive products in market  Productivity and Effectiveness  More testing gets done faster, increasing the odds of finding defects  Defects found early have better chances of being fixed  Manual Testers can concentrate on clever ways of finding defects, instead of typing test inputs and verifying output.
  • 32. Benefits of ROI  About 7% of bug fixes create new bugs, sometimes in already tested parts of the system. With automation you can rerun tests for those modules. This almost never happens when testing is done manually.
  • 33. ROI Calculator  ROI Calculator:  Source 1: http://www.aspiresys.com/testautomationroi/index.php  Source 2: http://www.elbrus.com/services/test_automation_roi_calc/
  • 36. Contents  Tools  Choosing the right tool – The Strategic Approach  Choosing the right tool – The Strategic Steps  Tools Classification  Short List Tools – Functional  Short List Tools – Performance  Evaluate Vendors  Functional Test Tools - Analysis  Feasibility Analysis
  • 37. Tools There is no single best testing tool; rather, different tools are more or less appropriate in different environments.
  • 38. Tools  Over 300 Test Tools are available (http://www.softwareqatest.com)  Load/Performance tools – 54  Web Functional/Regression – 60  Java Test tools - 48  Other Web tools – 76 Which tool is right for you?
  • 39. Choosing the right tool – The Strategic Approach • Is there an organizational methodology for test automation? • Which applications/processes? • What is the impact to current project schedules? • What is the effort in maintaining automated tests? • What are the costs? • What about tools integration? • What about Continuous Integration?
  • 40. Choosing the right tool – The Strategic Steps • Step 1: Define and Refine Requirements • Step 2: Communicate the Impact • Step 3: Develop Evaluation Methodology • Step 4: Select, Procure, and Implement
  • 41. Step 1: Define and refine requirement  Create a list of organizational requirements  What problems do you want the tool to solve?  What capabilities will the tool need to be effective in your environment? Other lifecycle tools?  What constraints, budgetary or otherwise?  Identify compatibility issues  What operating systems does your application support?  What is the development environment?  Does your application integrate with third-party software?  Does the application use custom controls?
  • 42. Step 1: Define and refine requirement  Identify tool audience  Who will use the tool on a day-to-day basis?  What is the level and mix of user skill levels?  Is your organization willing to invest in training?  Define technical or business requirements  Does your organization have additional requirements?  Software standards  Technical standards  Procurement rules  Preferred vendor rules
  • 43. Step 1: Define and refine requirement  Identify budget constraints  How much can we afford?  How much is this worth?  New requirements may surface based on research  Did not know about  Forgot to include
  • 44. Step 2: Communicate the Impact  Automated testing is part of the larger strategic application development endeavor  Communicate the effects of implementing a tool  Chance to discuss and mitigate concerns  How tool may change job description  Commitment to training  Implementation strategy  Discussion may imply additional requirements  Business, functional, technical, or operational
  • 45. Step 3: Develop the evaluation methodology  How will tools be compared?  Are there specific features that may differentiate one tool from another?  Are there specific things that can eliminate a tool from consideration?  Preferred vendor list can reduce evaluation scope  Demos and evaluations are time-consuming  Identify a representative set of activities to accomplish with the tool during the evaluation
  • 46. Step 4: Select, procure, and implement  Make an informed selection  Follow organization’s procurement process  Develop the implementation plan  What  When  Why  Who  How
  • 47. Step 4: Select, procure, and implement  Develop an implementation plan  Enterprise applications requiring multiple releases  Applications that must produce a consistent set of results using stable data  These characteristics fully leverage reusability and predictability benefits of automated testing  Take implementation one step at a time  Take time for training  Keep focus on staff issues and reactions
  • 48. Step 4: Select, procure, and implement  Develop a test plan  Describes scope, approach, resources and schedule for all automated and manual activities  Rule of Thumb: (Test scripts) 40% manual - 60% automated  Create and deploy your automated tests  Be selective with the automation of test scripts  Verify the most critical functionality  Are the most likely to expose defects  Are expensive or impossible to perform manually  Use the first automated suites you build for  Smoke testing  Regression testing
  • 49. Decision? How to calculate the cost of functional test automation:
   Cost of test automation = Cost of tool(s) + Labor costs of script creation + Labor costs of script maintenance
   If a test script will be run every week for the next 2 years, automate the test if the cost of automation is less than the cost of manually executing the test 104 times.
   Automate if: Cost of automation < Cost of manually executing the test as many times as the automated test will be executed
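The rule of thumb above can be expressed as a quick check; this is only a sketch, and every figure in it is hypothetical.

```python
# Automate if the cost of automation is less than the cost of running the
# test manually as many times as the automated test will actually be run.
tool_cost = 500           # share of the tool licence attributed to this script
script_creation = 400     # labour cost to automate the test
script_maintenance = 600  # labour cost to keep the script working over its lifetime
cost_of_automation = tool_cost + script_creation + script_maintenance

manual_run_cost = 25      # cost of one manual execution
planned_runs = 104        # e.g. weekly for two years

if cost_of_automation < manual_run_cost * planned_runs:
    print("Automate this test")   # 1,500 < 2,600 with these numbers
else:
    print("Keep it manual")
```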
  • 50. Tools Classification  Test tool types and basic descriptions:
   Test Information Management – Systematic solutions and tools that support test engineers and quality assurance people in creating, updating, and maintaining diverse test information, including test cases, test scripts, test data, test results, and discovered problems.
   Test Execution and Control – Systematic solutions and tools that help engineers set up and run tests, and collect and validate test results.
   Test Generation – Systematic solutions and tools that generate program tests in an automatic way.
   Test Coverage Analysis – Systematic solutions and tools that analyze test coverage during a test process based on selected test criteria.
   Performance Testing and Measurement – Systematic solutions and tools that support program performance testing and performance measurement.
   Software Simulators – Programs developed to simulate the functions and behaviors of external systems or dependent subsystems/components of the program under test.
   Regression Testing – Test tools that support the automation of regression testing activities, including test recording and re-playing.
  • 51. Tools Classification  Types of test tools, vendors, and tools:
   Problem Management Tools – Rational Inc.: ClearQuest, ClearDDTS; Microsoft Corp.: PVCS Tracker; Imbus AG: Imbus Fehlerdatenbank
   Test Information Management Tools – Rational Inc.: TestManager; Mercury Interactive: TestDirector
   Test Suite Management Tools – eValid: TestSuiter; Rational Inc.: TestFactory; SUN: JavaTest, JavaHarness
   White-Box Test Tools – McCabe & Associates: McCabe IQ2; JUnit; IBM: IBM COBOL Unit Tester, ATC (Coverage Assistant, Source Audit Assistant, Distillation Assistant, Unit Test Assistant)
  • 52. Tools Classification
   Test Execution Tools – OC Systems: Aprob; Softbridge: ATF/TestWright; AutoTester: AutoTester; Rational Inc.: Visual Test, Rational Functional Tester, SQA Robot; Mercury Interactive: WinRunner, QuickTest Pro; Sterling Software: Vision TestPro; Compuware: QARun; Segue Software: SilkTest; RSW Software Inc.: e-Test; Cyrano GmbH: Cyrano Robot
   Code Coverage Analysis Tools – Case Consult Corp.: Analyzer, Analyzer Java; OC Systems: Aprob; IPL Software Product Group: Cantata/Cantata++; ATTOL Testware SA: Coverage; Compuware: NuMega TruCoverage; Software Research: TestWorks Coverage; Rational Inc.: PureCoverage; SUN: JavaScope; ParaSoft: TCA; Software Automation Inc.: Panorama
  • 53. Tools Classification
   Load Test and Performance Tools – Rational Inc.: Rational Performance Tester; InterNetwork AG: sma@rtTest; Compuware: QA-Load; Mercury Interactive: LoadRunner; RSW Software Inc.: e-Load; SUN: JavaLoad; Segue Software: SilkPerformer; Client/Server Solutions, Inc.: Benchmark Factory
   Regression Testing Tools – IBM: Regression Testing Tool (ARTT), Distillation Assistant
   GUI Record/Replay – Software Research: eValid; Mercury Interactive: XRunner; Astra: Astra QuickTest; AutoTester: AutoTester, AutoTester One
  • 54. Short List Tools - Functional  (Vendor – Tool – Test Suite / Companion Tools)
   Compuware – TestPartner – QACenter Enterprise Edition+
   Empirix – e-Tester – e-TEST suite
   IBM – Rational Functional Tester – Rational Suite
   Mercury – QuickTest Professional – Quality Center
   RadView – WebFT – TestView Suite
   Seapine – QA Wizard Pro – TestTrack Pro
   Borland (Segue) – SilkTest – SilkCentral Test Manager
  • 55. Short List Tools - Performance  (Vendor – Tool – Test Suite / Companion Tools)
   Compuware – QALoad – QACenter Enterprise Edition+
   Empirix – e-Load – e-TEST suite
   IBM – Rational Performance Tester – Rational Suite
   Mercury – LoadRunner – Quality Center
   RadView – WebLOAD – TestView Suite
   Facilita – Forecast – ForecastWeb, ForecastNet, ForecastDB
   Borland (Segue) – SilkPerformer – SilkCentral Test Manager
  • 56. Evaluate Vendors  (chart: vendors plotted by strength of current offering vs. strength of strategy, grouped into Risky Bets, Contenders, Strong Performers, and Leaders)
  • 57. Functional Test Tools - Analysis
   IBM/Rational Functional Tester (RFT) – Pros: Built as an Eclipse plug-in with full IDE and Java support; Supports Web 2.0, Java or .NET applications; Full GUI Object Map repository. Cons: Insufficient browser support; Licensed product.
   HP/Mercury QuickTest Pro (QTP) – Pros: Supports Web 2.0, Java or .NET applications; Full GUI Object Map repository; Seamless integration with QualityCenter. Cons: VBScript scripting is limited; No IDE (may change in new release); Licensed product.
   Selenium RC & IDE – Pros: Good browser support; Good language support (Java, Ruby, C#); Can be easily extended as a JUnit suite; Open-source. Cons: No GUI Object repository; Only supports web applications.
  • 58. Feasibility Analysis  FA Matrix Available  Operational Feasibility  Technical Feasibility  Economic Feasibility  Schedule Feasibility
  • 61. Contents  Automation Framework  Where to start?  What is AF?  Benefits of AF  How does it work?  Architecture  Framework Approaches  Record and Playback  Script Based Approach  Keyword Driven Approach  Data Driven Approach  Hybrid Keyword and Data Driven Approach
  • 62. Where to Start?  “Start SMALL, think BIG”  Quick wins should be avoided  NEVER expect to automate 100%
   First find out: What needs to be tested? What can be tested? What could be tested?
   Then you can work out: What needs to be automated? What can be automated? What could be automated?
   Focus on key critical business processes  “Under promise, Over deliver?”  Keep it simple, wherever possible
  • 63. What is Automation Framework?  Framework – independent of application or environment under test  A Test Automation Framework is a set of assumptions, concepts and tools that provide support for Automated Software Testing.  A reusable set of libraries or classes for a software system (or subsystem).  A correctly implemented Test Automation Framework can further improve ROI by reducing the development and maintenance costs.
  • 64. Benefits of Framework  Ease of Use – easy to learn and easy to use  Time – faster than capture/replay and scripting approach  Maintainability – significantly reduces the test maintenance effort  Reusability – due to modularity of test cases and library functions  Manageability – effective test design, execution, and traceability  Accessibility – to design, develop & modify tests whilst executing  Availability – scheduled execution can run unattended on a 24/7 basis  Reliability – due to advanced error handling and scenario recovery  Flexibility – framework independent of AUT or environment  Measurability – customizable reporting of test results
  • 65. How does it work?  Different Implementations  One Example of Keyword Driven Framework could be:  Spreadsheets, Spreadsheets, Spreadsheets  Test Objects  Keywords and Methods = Toast!  Parameters  Description or Call the 911?
  • 67. Framework Approaches  Record and Playback  Script Based Approach  Keyword Driven Approach  Data Driven Approach  Hybrid Keyword and Data Driven Approach
  • 68. Manual Testing – Looking back + easy & cheap to start + flexible testing - expensive every execution - no auto regression testing - less coverage measurement
  • 69. Record and Playback + flexible testing - expensive first execution + auto regression testing - fragile tests break easily - less coverage measurement
  • 70. Script Based Approach +/- test impl. = programming + automatic execution + auto regression testing - fragile tests break easily? (depends on abstraction) - less coverage measurement
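As an illustration of the script-based approach, a minimal test might look like the sketch below (Selenium WebDriver for Python). The URL, element IDs and expected banner text are hypothetical; note that data, flow and locators are all hard-coded in the script, which is exactly why such tests can be fragile.

```python
# Minimal script-based test (Selenium WebDriver for Python).
# The URL, element IDs and expected banner text are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.test/login")
    driver.find_element(By.ID, "username").send_keys("demo_user")
    driver.find_element(By.ID, "password").send_keys("demo_pass")
    driver.find_element(By.ID, "LoginSubmit").click()

    banner = driver.find_element(By.ID, "welcome").text
    assert "Login Successful" in banner, f"unexpected banner: {banner}"
finally:
    driver.quit()
```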
  • 71. Data Driven Approach  Automation is data-centric  User defines just data sets to drive tests with  Will have an external data source (DB tables, Excel spreadsheets, XML for data sets)  Flow control (navigation) is normally done by the test script, not by the data source
   Ex: data set exercises creation of new sales accounts functionality; stored in a DB table account_data:
   CompanyName: Genesis Inc. | PrimarySalesPerson: Phil Collins | Street: 5775 Main st | Zip: 30075 | City: Atlanta | State: GA
   CompanyName: RollingStones Inc. | PrimarySalesPerson: Mick Jagger Jr. | Street: 2332 Washington st | Zip: 02111 | City: Boston | State: MA
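A sketch of the data-driven idea using pytest's parametrize: the test logic is written once and each row of the account_data example above becomes one test run. The two helper functions are hypothetical stand-ins for whatever UI or API driver is actually in use, backed here by an in-memory dict so the sketch runs on its own.

```python
# Data-driven sketch: one test function, one execution per data row.
import pytest

_ACCOUNTS = {}  # in-memory stand-in for the application under test


def create_sales_account(company, sales_person, street, zip_code, city, state):
    """Hypothetical helper that would drive the AUT's 'new sales account' flow."""
    _ACCOUNTS[company] = (sales_person, street, zip_code, city, state)


def account_exists(company):
    """Hypothetical helper that would verify the account in the AUT."""
    return company in _ACCOUNTS


ACCOUNT_DATA = [
    # CompanyName,         PrimarySalesPerson, Street,               Zip,     City,      State
    ("Genesis Inc.",       "Phil Collins",     "5775 Main st",       "30075", "Atlanta", "GA"),
    ("RollingStones Inc.", "Mick Jagger Jr.",  "2332 Washington st", "02111", "Boston",  "MA"),
]


@pytest.mark.parametrize("company, sales_person, street, zip_code, city, state", ACCOUNT_DATA)
def test_create_sales_account(company, sales_person, street, zip_code, city, state):
    create_sales_account(company, sales_person, street, zip_code, city, state)
    assert account_exists(company)
```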
  • 72. Keyword/Action Driven Approach + abstract tests + automatic execution + auto regression testing + robust tests - less coverage measurement
  • 73. Keyword/Action Driven Approach  Automation is action-centric  De-compose your test cases/modules into granular re-usable keywords  The idea is for non-coders to be able to create automated test cases with action keywords  User defines flow control of the test via action keywords Example: Test Case “Verify Checking Account Balance” 1. Enter Username and Password and Click submit button  step 1 is action Login 2. Enter “Phil Collins” as a Sales Person and Click Submit button 3. Verify the Sales Person was successfully created and Logout So you may want to choose the following re-usable action keywords: EnterText, Click, Login, VerifyExists
  • 74. Benefits of Keyword Driven Approach  This Framework addresses the most common problem with test automation: Automation Engineers do not have domain knowledge and the End Users (Subject Matter Experts/Test Engineers) usually do not have automation expertise.  When properly implemented and maintained, it presents a superior ROI because each business event is designed, automated and maintained as a discrete entity.  Keywords can then be used to design test cases, but the design and automation overhead for the keyword has already been paid.
  • 75. Benefits of Keyword Driven Approach  Reduces the cost and time spent maintaining and updating tests  The modular structure of keyword-driven testing means that new tests can easily be created from pre-existing modules  The test team is capable of entirely automating tests, even without programming knowledge  Can be easily modified for use with a different test tool  Reusability across different projects  Classic Example: Object = Textfield (username), Action = EnterText, Data = <username>
  • 76. Keyword/Action Driven Approach  May have an external data source (DB tables, Excel spreadsheets, XML for data sets) with action keywords:
   Step 1 – Login: Page = Home, Module = UserLogin, Action = Login, Type = N/A, Object = .id:=LoginSubmit, Expected = .text:=Login Successful
   Step 2 – Enter New Sales Person data: Module = CreateSalesPerson, Action = EnterText, Type = Field, Object = .text:=SalesPerName, Expected = .value:=Phil Collins
   Step 3 – Click Submit: Module = CreateSalesPerson, Action = Click, Type = Button, Object = .id:=SubmitSalesPer, Expected = .url:=.*createdSalesPerStatus.html
   Step 4 – Verify Sales Person Creation Successful: Module = CreateSalesPersonStatus, Action = VerifyExists, Type = DIV, Object = .id:=CreationStatus, Expected = .value:=User Created Successfully
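A compact sketch of how such a table can be executed: each row's Action keyword is looked up in a keyword library and dispatched with the row's Object and Expected values. The handlers below only log what a real (for example Selenium-backed) implementation would do, and the rows simply mirror the hypothetical steps above.

```python
# Keyword-driven sketch: steps live as data, the engine dispatches on the
# Action keyword. Handlers only log here; a real library would drive the AUT.

def do_login(obj, expected):
    print(f"Login: submit via {obj}, expect {expected}")

def do_enter_text(obj, expected):
    print(f"EnterText: type {expected} into {obj}")

def do_click(obj, expected):
    print(f"Click: {obj}, expect {expected}")

def do_verify_exists(obj, expected):
    print(f"VerifyExists: {obj} should show {expected}")

KEYWORDS = {
    "Login": do_login,
    "EnterText": do_enter_text,
    "Click": do_click,
    "VerifyExists": do_verify_exists,
}

# (Action, Object, Expected) rows mirroring the table above.
TEST_STEPS = [
    ("Login",        ".id:=LoginSubmit",    ".text:=Login Successful"),
    ("EnterText",    ".text:=SalesPerName", ".value:=Phil Collins"),
    ("Click",        ".id:=SubmitSalesPer", ".url:=.*createdSalesPerStatus.html"),
    ("VerifyExists", ".id:=CreationStatus", ".value:=User Created Successfully"),
]

def run(steps):
    for action, obj, expected in steps:
        KEYWORDS[action](obj, expected)   # dispatch on the keyword

run(TEST_STEPS)
```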
  • 77. Hybrid Keyword and Data Driven Approach  Combines the best of both worlds  User defines data sets to drive tests with  User also defines flow control of the test via action keywords  May have an external data source (DB tables, Excel spreadsheets, XML for data sets) with action keywords in addition to generic and test case specific data sets
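A sketch of the hybrid idea: the keyword flow is written once with placeholders for the data it needs, and an external data set supplies one record per run, so flow control comes from the keywords and variation comes from the data. The flow, locators and records are hypothetical and follow the earlier examples.

```python
# Hybrid sketch: one keyword flow, driven once per record in the data set.
SALES_PERSON_FLOW = [
    ("Login",        ".id:=LoginSubmit",    None),
    ("EnterText",    ".text:=SalesPerName", "{sales_person}"),
    ("Click",        ".id:=SubmitSalesPer", None),
    ("VerifyExists", ".id:=CreationStatus", None),
]

DATA_SET = [
    {"sales_person": "Phil Collins"},
    {"sales_person": "Mick Jagger Jr."},
]


def resolve(flow, record):
    """Substitute the record's values into the flow's data placeholders."""
    return [
        (action, obj, data.format(**record) if data else data)
        for action, obj, data in flow
    ]


for record in DATA_SET:
    for action, obj, data in resolve(SALES_PERSON_FLOW, record):
        # A real engine would dispatch these to keyword handlers
        # (as in the previous sketch); here we just print the resolved steps.
        print(action, obj, data or "")
    print("---")
```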
  • 82. Model Based Approach + abstract tests + automatic execution + auto regression testing + auto design of tests + systematic coverage + measure coverage of model and requirements - modelling overhead Emerging Approach
  • 83. References
   http://www.slideshare.net/Jonathon_Wright/hybrid-keyword-data-driven-automation-frameworks-jonathon-wright
   http://www.ibm.com/developerworks/rational/library/591.html
   http://www.keane.com/resources%2Fpdf%2FWhitePapers%2FWP_ROIforTestAutomation.pdf
  • 84. Summarizing…
   Overview of STA: History; Myths and Truths; Why Automation Projects Fail?; What is TA?; Why TA is needed?; Manual vs. Automated; Pros & Cons; What to Automate?; What Not to Automate?
   BEP and ROI: BEP; BEP - Example; ROI; Classic ROI; Real ROI; Benefits of ROI; ROI Calculator
   Tools: Choosing the right tool – The Strategic Approach; Choosing the right tool – The Strategic Steps; Tools Classification; Short List Tools – Functional; Short List Tools – Performance; Evaluate Vendors; Functional Test Tools - Analysis; Feasibility Analysis
   Automation Framework: Where to start?; What is AF?; Benefits of AF; How does it work?; Architecture; Framework Approaches (Record and Playback; Script Based Approach; Keyword Driven Approach; Data Driven Approach; Hybrid Keyword and Data Driven Approach)
  • 85. THANK YOU Rohan Bhattarai