Glossary of Testing
Terms and Concepts
 AGS QA and Testing CoE
 December 18, 2009
General terms
QA & Software Testing
 Quality assurance, or QA for short, refers to the systematic
 monitoring and evaluation of various aspects of a project,
 program or service, to ensure that standards of quality are
 being met.
 Software testing, or Quality Control (QC for short), is the
 Verification and Validation (V&V) activity aimed at
 evaluating an attribute or capability of a program or system
 and determining whether it meets the desired results.
Verification
Verification (the first V) is the process of evaluating a system or component to determine whether the output of a given development phase satisfies the conditions expected at the start of that phase.
Validation
Validation (the second V) is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
Test Automation
Test automation is the use of software to control the
execution of tests, the comparison of actual outcomes to
predicted outcomes, the setting up of test preconditions, and
other test control and test reporting functions.
Commonly, test automation involves automating a manual
process already in place that uses a formalized testing
process.
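The control loop described above can be sketched in a few lines. Everything here is illustrative: `apply_discount` is a hypothetical function under test, and the case table stands in for a formal test repository.

```python
# Minimal sketch of automated test control: set up preconditions, execute,
# compare actual to predicted outcomes, and report. All names are illustrative.

def apply_discount(price, percent):
    """Hypothetical function under test."""
    return round(price * (1 - percent / 100), 2)

def run_tests():
    # Each tuple: (test name, input arguments, predicted outcome).
    cases = [
        ("ten_percent", (200.0, 10), 180.0),
        ("no_discount", (200.0, 0), 200.0),
    ]
    report = {}
    for name, args, predicted in cases:
        actual = apply_discount(*args)                            # execution
        report[name] = "PASS" if actual == predicted else "FAIL"  # comparison
    return report                                                 # reporting

print(run_tests())
```

The same loop is what commercial and open-source runners provide in richer form: fixtures for preconditions, assertions for comparison, and logs for reporting.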
Types of Test Automation
Frameworks
 The different test automation frameworks available are as
 follows:
    Test Script Modularity
    Test Library Architecture
    Data-Driven Testing
    Keyword-Driven or Table-Driven Testing
    Hybrid Test Automation
Test Script Modularity
 The test script modularity framework is the most basic of the
 frameworks. It's a programming strategy to build an abstraction layer
 in front of a component to hide the component from the rest of the
 application. This insulates the application from modifications in the
 component and provides modularity in the application design.
 When working with test scripts (in any language or proprietary
 environment) this can be achieved by creating small, independent
 scripts that represent modules, sections, and functions of the
 application-under-test. These small scripts are then combined
 in a hierarchical fashion to construct larger tests. The
 use of this framework yields a higher degree of modularization and
 adds to the overall maintainability of the test scripts.
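The hierarchy can be sketched as follows, assuming a hypothetical application with login and search sections; every function name here is invented for illustration.

```python
# Test script modularity sketch: small, independent scripts per section of a
# hypothetical application-under-test, combined into a larger test.

def login_module(user, password):
    """Small, independent script covering the login section."""
    return user == "demo" and password == "secret"

def search_module(term):
    """Small, independent script covering the search section."""
    catalog = ["apple pie", "apple tart", "pear cake"]
    return [item for item in catalog if term in item]

def end_to_end_test():
    """Larger test built hierarchically from the module scripts."""
    assert login_module("demo", "secret"), "login module failed"
    assert len(search_module("apple")) == 2, "search module failed"
    return "PASS"
```

If the login screen changes, only `login_module` needs editing; every larger test that calls it is insulated from the change.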
Test Library Architecture
 The test library architecture framework is very similar to the test script
 modularity framework and offers the same advantages, but it divides
 the application-under-test into procedures and functions (or objects
 and methods depending on the implementation language) instead of
 scripts.
 This framework requires the creation of library files (SQABasic
 libraries, APIs, DLLs, and such) that represent modules, sections,
 and functions of the application-under-test. These library files are
 then called directly from the test case script. Much like script
 modularization this framework also yields a high degree of
 modularization and adds to the overall maintainability of the tests.
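A rough sketch of the split between library and test case script, with both layers in one file for readability; in practice the library layer would live in a separate importable module, and all names here are hypothetical.

```python
# Test library architecture sketch: reusable procedures for the
# application-under-test, called directly from the test case script.

# --- library layer (would normally be a separate module or DLL) ---
def open_session():
    return {"logged_in": False, "cart": []}

def login(session, user):
    session["logged_in"] = (user == "demo")
    return session["logged_in"]

def add_to_cart(session, item):
    session["cart"].append(item)
    return len(session["cart"])

# --- test case script: calls library functions directly ---
def test_checkout_flow():
    session = open_session()
    assert login(session, "demo")
    assert add_to_cart(session, "book") == 1
    return "PASS"
```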
Data-Driven Testing
 A data-driven framework is where test input and output values are
 read from data files (ODBC sources, CSV files, Excel files, DAO
 objects, ADO objects, and such) and are loaded into variables in
 captured or manually coded scripts.
 In this framework, variables are used for both input values and output
 verification values. Navigation through the program, reading of the
 data files, and logging of test status and information are all coded in
 the test script.
 This is similar to table-driven testing in that the test case is contained
 in the data file and not in the script; the script is just a "driver," or
 delivery mechanism, for the data. In data-driven testing, only test
 data is contained in the data files.
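A minimal driver in this style might look like the sketch below. The CSV content is inlined as a string so the example is self-contained; `add` is a stand-in for the real program under test.

```python
import csv
import io

def add(a, b):
    """Hypothetical function under test."""
    return a + b

# Test data would normally live in an external CSV file.
DATA = "a,b,expected\n1,2,3\n10,-4,6\n0,0,0\n"

def run_data_driven():
    """Driver script: reads rows, loads variables, verifies outputs."""
    failures = []
    for row in csv.DictReader(io.StringIO(DATA)):
        actual = add(int(row["a"]), int(row["b"]))
        if actual != int(row["expected"]):
            failures.append(row)
    return failures  # an empty list means every data row passed

print(run_data_driven())
```

Adding a new test case is now a matter of adding a row to the data file; the driver script never changes.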
Keyword-Driven Testing
 This requires the development of data tables and
 keywords, independent of the test automation tool used to
 execute them and the test script code that "drives" the
 application-under-test and the data. Keyword-driven tests
 look very similar to manual test cases.
 In a keyword-driven test, the functionality of the application-
 under-test is documented in a table as well as in step-by-
 step instructions for each test. In this method, the entire
 process is data-driven, including functionality.
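The table-plus-interpreter shape can be sketched as below; the keywords, the page model, and the search behavior are all invented for illustration.

```python
# Keyword-driven sketch: the test case is a data table of (keyword, argument)
# rows, and a small driver maps each keyword to an action.

STATE = {"page": None, "input": None, "result": None}

def open_page(name):
    STATE["page"] = name

def enter_text(text):
    STATE["input"] = text

def click_search(_):
    STATE["result"] = "results for " + STATE["input"]

KEYWORDS = {"OpenPage": open_page, "EnterText": enter_text,
            "ClickSearch": click_search}

# The test case itself reads much like a manual test.
TEST_TABLE = [
    ("OpenPage", "search"),
    ("EnterText", "testing"),
    ("ClickSearch", None),
]

def run(table):
    for keyword, arg in table:
        KEYWORDS[keyword](arg)
    return STATE["result"]
```

Because the table is plain data, non-programmers can write and review tests; only the keyword implementations require coding.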
Hybrid Test Automation
Framework
 The most commonly implemented framework is a
 combination of all of the above techniques, pulling from
 their strengths and trying to mitigate their weaknesses. This
 hybrid test automation framework is what most frameworks
 evolve into over time and multiple projects. The most
 successful automation frameworks generally accommodate
 both keyword-driven testing and data-driven scripts.
 This allows data-driven scripts to take advantage of the
 powerful libraries and utilities that usually accompany a
 keyword-driven architecture. The framework utilities can
 make the data-driven scripts more compact and less prone
 to failure than they otherwise would have been.
Errors, Bugs, Defects…
 Mistake – a human action that produces an incorrect
 result.
 Bug, Fault [or Defect] – an incorrect step, process, or data
 definition in a program.
 Failure – the inability of a system or component to perform
 its required function within the specified performance
 requirement.
 Error – the difference between a computed, observed, or
 measured value or condition and the true, specified, or
 theoretically correct value or condition.
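The four terms can be tied together with a deliberately buggy function; the function and values below are invented purely to illustrate the definitions.

```python
# A fault (defect) deliberately planted to illustrate the terminology:
# the function should average two numbers but divides by 3.
def average_of_two(a, b):
    return (a + b) / 3   # fault: incorrect step (should be / 2)

expected = (4 + 6) / 2          # the true, specified value: 5.0
actual = average_of_two(4, 6)   # the computed value
error = actual - expected       # error: difference between computed and true
failure = (actual != expected)  # failure: the component did not perform
                                # its required function
```

The mistake was the human act of typing `/ 3`; the fault is that line of code; the failure is the wrong result at runtime; the error is how far off it is.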
The progression of a software
failure
A purpose of testing is to expose as many failures as possible
before delivering the code to customers.
Test Visibility
 Black box testing (also called functional testing or
 behavioral testing) is testing that ignores the internal
 mechanism of a system or component and focuses solely
 on the outputs generated in response to selected inputs
 and execution conditions.

 White box testing (also called structural testing and glass
 box testing) is testing that takes into account the internal
 mechanism of a system or component.
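The distinction is easiest to see on one function tested both ways. `classify` is a hypothetical function under test; the point is how the test cases are chosen, not the function itself.

```python
# Hypothetical function under test: classifies a triangle by side lengths.
def classify(a, b, c):
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Black-box tests: chosen from the specification alone (inputs -> outputs).
def black_box_tests():
    assert classify(2, 2, 2) == "equilateral"
    assert classify(2, 2, 3) == "isosceles"
    assert classify(2, 3, 4) == "scalene"

# White-box test: chosen by reading the code so every comparison is
# exercised, including the a == c branch a spec-only tester might miss.
def white_box_tests():
    assert classify(3, 2, 3) == "isosceles"  # hits the a == c comparison
```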
Comparing Black- and White-Box Testing
Specification
Specification – a document that specifies in a complete,
precise, verifiable manner, the requirements, design,
behavior, or other characteristic of a system or component,
and often the procedures for determining whether these
provisions have been satisfied.
Some examples:
     Functional Requirements Specification
     Non-Functional Requirements Specification
     Design Specification
Testing Scope
 Functional Requirements (FR), also termed Business
 Requirements, of the software or program under test
 Non-Functional Requirements (NFR) of the software or program
 under test. These are implicit requirements which the software or
 program is expected to satisfy for the end user to be able to use the
 software or program successfully. Security, Performance,
 Compatibility, Internationalization, and Usability requirements are
 examples of NFRs.
 White-box or black-box testing is performed to validate that
 the software or program meets the FR and NFR.
Types of Testing
Unit Testing
  Opacity: White box testing
  Specification: Low-level design and/or code structure
  Unit testing is the testing of individual hardware or
  software units or groups of related units. Using white box
  testing techniques, testers (usually the developers creating
  the code implementation) verify that the code does what it
  is intended to do at a very low structural level.
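A unit test at this level might look like the sketch below, using Python's standard `unittest` module; `clamp` is an invented low-level unit.

```python
import unittest

# Unit under test: a single low-level function, tested in isolation.
def clamp(value, low, high):
    """Restrict value to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampUnitTest(unittest.TestCase):
    # White-box knowledge of the code drives the cases: exercise the
    # min() path, the max() path, and the pass-through path.
    def test_below_range(self):
        self.assertEqual(clamp(-5, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(15, 0, 10), 10)

    def test_within_range(self):
        self.assertEqual(clamp(7, 0, 10), 7)

if __name__ == "__main__":
    unittest.main()
```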
Integration Testing
    Opacity: Black- and white-box testing
    Specification: Low- and High-level design
    Integration test is testing in which software components, hardware
    components, or both are combined and tested to evaluate the
    interaction between them.
Using both black and white box testing techniques, the tester (still
    usually the software developer) verifies that units work together
    when they are integrated into a larger code base.
Just because the components work individually, that doesn’t mean that
    they all work together when assembled or integrated.
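A small sketch of that idea: two invented units that each pass their own tests, checked together at their shared boundary.

```python
# Integration sketch: two units that work individually, combined and
# tested together. All names and formats are illustrative.

def parse_amount(text):
    """Unit 1: parse a money string like '$12.50' into integer cents."""
    return int(round(float(text.lstrip("$")) * 100))

def format_cents(cents):
    """Unit 2: render integer cents back to a display string."""
    return "${:.2f}".format(cents / 100)

def test_round_trip():
    """Integration test: verify the units agree at their interface."""
    for text in ["$0.99", "$12.50", "$100.00"]:
        assert format_cents(parse_amount(text)) == text
    return "PASS"
```

A mismatch here (say, one unit using cents and the other using dollars) would pass both unit test suites and surface only at integration time.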
Functional Testing
   Opacity: Black-box testing
   Specification: High-level design, Requirements
   specification
   Using black box testing techniques, testers examine the
   high-level design and the customer requirements
   specification to plan the test cases to ensure the code
   does what it is intended to do.
Functional testing involves ensuring that the functionality
   specified in the requirement specification works.
System Testing
    Opacity: Black-box testing
    Specification: High-level design, Requirements specification
      System testing is testing conducted on a complete, integrated
      system to evaluate the system's compliance with its specified
      requirements.
Because system testing is done with a full system implementation and
      environment, several classes of testing can be done that
      examine non-functional properties of the system.
Integration, functional, and system testing are best done from an
      unbiased, independent perspective (i.e., not by the programmer).
Acceptance Tests
    Opacity: Black-box testing
    Specification: Requirements Specification
   After functional and system testing, the product is delivered to a
   customer, and the customer runs black box acceptance tests based
   on their expectations of the functionality.
Acceptance testing is formal testing conducted to determine whether or
   not a system satisfies its acceptance criteria (the criteria the system
   must satisfy to be accepted by a customer) and to enable the
   customer to determine whether or not to accept the system.
Beta Tests
  Opacity: Black-box testing
  Specification: None.
  When an advanced partial or full version of a software
  package is available, the development organization can
  offer it free to one or more (and sometimes thousands of)
  potential users, or beta testers.
Regression Tests
Regression testing is selective retesting of a system or
component to verify that modifications have not
caused unintended effects and that the system or
component still complies with its specified
requirements.
Test Plan
A test plan is a document describing the scope,
approach, resources, and schedule of intended test
activities. It identifies test items, the features to be
tested, the testing tasks, who will do each task, and
any risks requiring contingency plans. An important
component of the test plan is the individual test cases.
Test Scenarios/Test Case
  The terms test scenario and test case are often used
  synonymously.
  A test case is a set of test steps each of which defines
  inputs, execution conditions, and expected results
  developed for a particular objective, such as to exercise a
  particular program path or to verify compliance with a
  specific requirement.
  Test scenarios ensure that all business process flows are
  tested from end to end.
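A test case in this sense can be captured as structured data; every field name and value below is illustrative, not a mandated format.

```python
# A test case as structured data: an objective plus steps, each with an
# action and an expected result. Field names are illustrative only.
test_case = {
    "id": "TC-042",
    "objective": "Verify login rejects an empty password",
    "preconditions": ["user 'demo' exists", "login page is reachable"],
    "steps": [
        {"action": "open login page",
         "expected": "login form is displayed"},
        {"action": "enter username 'demo', leave password empty",
         "expected": "submit button is enabled"},
        {"action": "click submit",
         "expected": "error 'password required' is shown"},
    ],
}
```

A test scenario would then be a named sequence of such cases covering one business flow end to end.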
Test Suite/Scripts
A test suite/test script is a combination of test
scenarios, test steps, and test data. The term originally
derived from the work product created by
automated regression test tools. Test suites/test
scripts can be manual, automated, or a combination of
both.
Application Terms
 Application: An application is software with features.
 Feature / Functional point: A functional point is a feature
 of the application. Some examples of features might be
 search, login, signup, and edit preferences.
 Session: A session is the means of grouping together
 functional points of a single application. The session keeps
 state and tracks variables of all the functional points
 grouped in a single session.
Action Point
An action point can be thought of as the act of doing. Take
search for example. Usually search only has two UI elements
to it, a text box where the search terms are entered and a
search button that submits the search terms.
The action point doesn't necessarily care about the search
results. It only cares that the search page was loaded
correctly, the search term was inserted and the search button
was clicked.
In other words, an action point doesn't validate what happened
after the action occurred; it only performs the action.
Validation Point
A validation point verifies the outcome of the action point.
Usually an action has several possible outcomes.
For example, login might behave differently depending on
whether the username and password are correct, incorrect,
too long, too short or just non-existent.
The action point is the act of logging in and the validation
point verifies the outcome when using valid, invalid, too long,
too short or just non-existent usernames and passwords.
Navigation Point
Navigation points traverse to the action point or the validation
point so it can be executed.
For example, there are websites that have a navigation bar on
the top no matter which page is loaded.
In this case the navigation point simply clicks on the link in the
navigation bar while on any of the pages in order to load the
page to perform the action or validation.
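The three point types can be sketched over a toy page model; the dictionary below stands in for a real browser session, and every name is invented for illustration.

```python
# Hypothetical page model standing in for a real browser session.
app = {"page": "home", "field": "", "outcome": None}

def navigate_to_login():
    """Navigation point: traverse to where the action can be performed."""
    app["page"] = "login"

def action_login(username):
    """Action point: performs the act only; it does not judge the outcome."""
    assert app["page"] == "login"   # the page loaded correctly
    app["field"] = username         # the credential was entered
    # the submit button was clicked:
    app["outcome"] = "ok" if username == "demo" else "rejected"

def validate_login(expected):
    """Validation point: verifies the outcome of the action."""
    return app["outcome"] == expected
```

One navigation point and one action point can thus be reused across many validation points, one per possible outcome.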
Data Flow Testing
In data flow-based testing, the control flow graph is
annotated with information about how the program
variables are defined and used. Different coverage criteria
exercise, with varying degrees of precision, how a
value assigned to a variable is used along different
control flow paths.
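On a tiny invented function, the definitions and uses of a single variable look like this:

```python
# Data-flow view of a small function: each variable has definitions (defs)
# and uses; a def-use (du) pair is covered when some path executes the
# definition and then reaches that use without an intervening redefinition.

def sign_label(x):
    label = "non-negative"   # def of `label` (d1)
    if x < 0:                # use of `x`
        label = "negative"   # redefinition of `label` (d2); kills d1 here
    return label             # use of `label`: du-pairs (d1, return), (d2, return)

# Covering all du-pairs needs both paths:
#   sign_label(5)  exercises the (d1, return) pair,
#   sign_label(-5) exercises the (d2, return) pair.
```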
Additional Types of Tests
Performance testing   Testing conducted to evaluate the compliance of a system
                      or component with specified performance requirements.
Usability testing     Testing conducted to evaluate the extent to which a user
                      can learn to operate, prepare inputs for, and interpret
                      outputs of a system or component.
Stress testing        Testing conducted to evaluate a system or component at
                      or beyond the limits of its specification or requirement.
Smoke test            A group of test cases that establish that the system is
                      stable and all major functionality is present and works
                      under “normal” conditions.
Robustness testing    Testing whereby test cases are chosen outside the
                      domain to test robustness to unexpected, erroneous input.
Thank You
Glossary of Testing Terms and Concepts

  • 1. Glossary of Testing Terms and Concepts AGS QA and Testing CoE December 18, 2009
  • 3. QA & Software Testing Quality assurance, or QA for short, refers to the systematic monitoring and evaluation of various aspects of a project, program or service, to ensure that standards of quality are being met. Software testing or Quality Control, or QC for short, is the Validation and Verification (V&V) activity aimed at evaluating an attribute or capability of a program or system and determining that it meets the desired results.
  • 4. Verification Verification (the first V) is the process of evaluating a system or component to determine whether the output of a given development phase satisfies the conditions expected at the start of that phase.
  • 5. Validation Validation is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
  • 6. Test Automation Test automation is the use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions. Commonly, test automation involves automating a manual process already in place that uses a formalized testing process.
  • 7. Types of Test Automation Frameworks The different test automation frameworks available are as follows: Test Script Modularity Test Library Architecture Data-Driven Testing Keyword-Driven or Table-Driven Testing Hybrid Test Automation
  • 8. Test Script Modularity The test script modularity framework is the most basic of the frameworks. It's a programming strategy to build an abstraction layer in front of a component to hide the component from the rest of the application. This insulates the application from modifications in the component and provides modularity in the application design. When working with test scripts (in any language or proprietary environment) this can be achieved by creating small, independent scripts that represent modules, sections, and functions of the application-under-test. Then these small scripts are taken and combined them in a hierarchical fashion to construct larger tests. The use of this framework will yield a higher degree of modularization and add to the overall maintainability of the test scripts.
  • 9. Test Library Architecture The test library architecture framework is very similar to the test script modularity framework and offers the same advantages, but it divides the application-under-test into procedures and functions (or objects and methods depending on the implementation language) instead of scripts. This framework requires the creation of library files (SQABasic libraries, APIs, DLLs, and such) that represent modules, sections, and functions of the application-under-test. These library files are then called directly from the test case script. Much like script modularization this framework also yields a high degree of modularization and adds to the overall maintainability of the tests.
  • 10. Data-Driven Testing A data-driven framework is where test input and output values are read from data files (ODBC sources, CVS files, Excel files, DAO objects, ADO objects, and such) and are loaded into variables in captured or manually coded scripts. In this framework, variables are used for both input values and output verification values. Navigation through the program, reading of the data files, and logging of test status and information are all coded in the test script. This is similar to table-driven testing in that the test case is contained in the data file and not in the script; the script is just a "driver," or delivery mechanism, for the data. In data-driven testing, only test data is contained in the data files.
  • 11. Keyword-Driven Testing This requires the development of data tables and keywords, independent of the test automation tool used to execute them and the test script code that "drives" the application-under-test and the data. Keyword-driven tests look very similar to manual test cases. In a keyword-driven test, the functionality of the application- under-test is documented in a table as well as in step-by- step instructions for each test. In this method, the entire process is data-driven, including functionality.
  • 12. Hybrid Test Automation Framework The most commonly implemented framework is a combination of all of the above techniques, pulling from their strengths and trying to mitigate their weaknesses. This hybrid test automation framework is what most frameworks evolve into over time and multiple projects. The most successful automation frameworks generally accommodate both Keyword-Driven testing as well as Data-Driven scripts. This allows data driven scripts to take advantage of the powerful libraries and utilities that usually accompany a keyword driven architecture. The framework utilities can make the data driven scripts more compact and less prone to failure than they otherwise would have been.
  • 13. Errors, Bugs, Defects… Mistake – a human action that produces an incorrect result. Bug, Fault [or Defect] – an incorrect step, process, or data definition in a program. Failure – the inability of a system or component to perform its required function within the specified performance requirement. Error – the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
  • 14. The progression of a software failure A purpose of testing is to expose as many failures as possible before delivering the code to customers.
  • 15. Test Visibility Black box testing (also called functional testing or behavioral testing) is testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions. White box testing (also called structural testing and glass box testing) is testing that takes into account the internal mechanism of a system or component.
  • 17. Specification Specification – a document that specifies in a complete, precise, verifiable manner, the requirements, design, behavior, or other characteristic of a system or component, and often the procedures for determining whether these provisions have been satisfied. Some examples: Functional Requirements Specification Non-Functional Requirements Specification Design Specification
  • 18. Testing Scope Functional Requirements (FR), also termed as Business Requirements, of the software or program under test Non-Functional Requirements (NFR) of the software or program under test. These are non-explicit requirements which the software or program is expected to satisfy for end user to be able to use the software or program successfully. Security, Performance, Compatibility, Internationalization, Usability … requirements are examples of NFR. Whitebox or blackbox type of testing to be performed validate that the software or program meets the FR and NFR
  • 20. Unit Testing Opacity: White box testing Specification: Low-level design and/or code structure Unit testing is the testing of individual hardware or software units or groups of related units. Using white box testing techniques, testers (usually the developers creating the code implementation) verify that the code does what it is intended to do at a very low structural level.
  • 21. Integration Testing Opacity: Black- and white-box testing Specification: Low- and High-level design Integration test is testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them. Using both black and white box testing techniques, the tester (still usually the software developer) verifies that units work together when they are integrated into a larger code base. Just because the components work individually, that doesn’t mean that they all work together when assembled or integrated.
  • 22. Functional Testing Opacity: Black-box testing Specification: High-level design, Requirements specification Using black box testing techniques, testers examine the high-level design and the customer requirements specification to plan the test cases to ensure the code does what it is intended to do. Functional testing involves ensuring that the functionality specified in the requirement specification works.
  • 23. System Testing Opacity: Black-box testing Specification: High-level design, Requirements specification System testing is testing conducted on a complete, integrated system to evaluate the system compliance with its specified requirements. Because system test is done with a full system implementation and environment, several classes of testing can be done that can examine non-functional properties of the system. It is best when Integration, Function and System testing is done by an unbiased, independent perspective (e.g. not the programmer).
• 24. Acceptance Tests Opacity: Black-box testing Specification: Requirements Specification After functional and system testing, the product is delivered to a customer, and the customer runs black box acceptance tests based on their expectations of the functionality. Acceptance testing is formal testing conducted to determine whether or not a system satisfies its acceptance criteria (the criteria the system must satisfy to be accepted by a customer) and to enable the customer to determine whether or not to accept the system.
• 25. Beta Tests Opacity: Black-box testing Specification: None. When an advanced partial or full version of a software package is available, the development organization can offer it free of charge to one or more (sometimes thousands of) potential users, known as beta testers.
  • 26. Regression Tests Regression testing is selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements.
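One way to picture selective retesting (a toy sketch; the bug numbers and checks are hypothetical): keep the checks that guard previously working behavior in a registered suite, and rerun them all after every modification:

```python
# A regression suite is the accumulated set of checks that guard
# previously working behavior; after any modification, rerun them all.
REGRESSION_SUITE = []

def regression(check):
    """Register a check for retesting after changes."""
    REGRESSION_SUITE.append(check)
    return check

@regression
def bug_142_empty_string_title():
    assert "".title() == ""          # once broken, must stay fixed

@regression
def bug_207_half_rounds_to_even():
    assert round(2.5) == 2           # Python 3 banker's rounding

def run_regression_suite():
    for check in REGRESSION_SUITE:   # any failure signals a regression
        check()

run_regression_suite()
```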
  • 27. Test Plan A test plan is a document describing the scope, approach, resources, and schedule of intended test activities. It identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency plans. An important component of the test plan is the individual test cases.
• 28. Test Scenarios/Test Case The terms test scenario and test case are often used synonymously. A test case is a set of test steps, each of which defines inputs, execution conditions, and expected results, developed for a particular objective, such as exercising a particular program path or verifying compliance with a specific requirement. Test scenarios ensure that all business process flows are tested from end to end.
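The structure of a test case described above can be sketched as data, with each step carrying its inputs and expected results. The login case, IDs, and wording here are hypothetical examples:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str
    input_data: str
    expected_result: str

@dataclass
class Case:
    case_id: str
    objective: str
    preconditions: list = field(default_factory=list)
    steps: list = field(default_factory=list)

# Hypothetical test case: one objective, explicit inputs and expected results.
login_case = Case(
    case_id="TC-LOGIN-001",
    objective="Verify login succeeds with valid credentials",
    preconditions=["User account 'jdoe' exists and is active"],
    steps=[
        Step("Open login page", "", "Login form is displayed"),
        Step("Enter credentials", "jdoe / secret", "Fields accept input"),
        Step("Click Login", "", "User lands on the dashboard"),
    ],
)
```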
• 29. Test Suite/Scripts A test suite/test script is the combination of test scenarios, test steps, and test data. Initially the term was derived from the work product created by automated regression test tools. Test suites/test scripts can be manual, automated, or a combination of both.
  • 30. Application Terms Application: An application is software with features. Feature / Functional point: A functional point is a feature of the application. Some examples of features might be search, login, signup, and edit preferences. Session: A session is the means of grouping together functional points of a single application. The session keeps state and tracks variables of all the functional points grouped in a single session.
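An illustrative sketch of the session concept (the application and feature names are hypothetical): a session groups functional points of one application and keeps shared state for all of them:

```python
class Session:
    """Sketch: a session groups the functional points of a single
    application and tracks state shared between them."""
    def __init__(self, application):
        self.application = application
        self.functional_points = []
        self.variables = {}           # state kept for the whole session

    def register(self, feature_name):
        self.functional_points.append(feature_name)

session = Session("webshop")          # hypothetical application name
for feature in ("search", "login", "signup", "edit preferences"):
    session.register(feature)
session.variables["user"] = "jdoe"    # tracked across functional points
```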
• 31. Action Point An action point can be thought of as the act of doing. Take search, for example. Usually search has only two UI elements: a text box where the search terms are entered and a search button that submits them. The action point doesn't necessarily care about the search results. It only cares that the search page was loaded correctly, the search term was inserted, and the search button was clicked. In other words, an action point doesn't validate what happened after the action occurred; it only performs the action.
  • 32. Validation Point A validation point verifies the outcome of the action point. Usually an action has several possible outcomes. For example, login might behave differently depending on whether the username and password are correct, incorrect, too long, too short or just non-existent. The action point is the act of logging in and the validation point verifies the outcome when using valid, invalid, too long, too short or just non-existent usernames and passwords.
  • 33. Navigation Point Navigation points traverse to the action point or the validation point so it can be executed. For example, there are websites that have a navigation bar on the top no matter which page is loaded. In this case the navigation point simply clicks on the link in the navigation bar while on any of the pages in order to load the page to perform the action or validation.
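The three kinds of points can be sketched together with a hypothetical stand-in for a real UI driver (the page, credentials, and method names are invented for illustration):

```python
class LoginFlow:
    """Hypothetical stand-in for a UI driver, separating navigation,
    action, and validation points."""
    def __init__(self):
        self.current_page = "home"
        self.outcome = None

    def go_to_login(self):
        # Navigation point: only gets us where the action can happen.
        self.current_page = "login"

    def submit_login(self, username, password):
        # Action point: performs the act, checks nothing afterwards.
        assert self.current_page == "login"
        self.outcome = (username == "jdoe" and password == "secret")

    def verify_login_succeeded(self):
        # Validation point: verifies the outcome of the action.
        assert self.outcome is True, "expected a successful login"

flow = LoginFlow()
flow.go_to_login()                    # navigation point
flow.submit_login("jdoe", "secret")   # action point
flow.verify_login_succeeded()         # validation point
```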
• 34. Data Flow Testing In data flow-based testing, the control flow graph is annotated with information about how the program variables are defined and used. Different coverage criteria exercise, with varying degrees of precision, how a value assigned to a variable is used along different control flow paths.
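A tiny worked example (a hypothetical function, chosen only to show definitions and uses): data flow criteria ask for inputs that exercise each place a variable is defined and each place that definition is later used:

```python
def absolute(x):
    if x < 0:        # predicate use of x
        y = -x       # definition of y (computation use of x)
    else:
        y = x        # alternative definition of y
    return y         # use of y

# Covering every def-use pair for y requires inputs that reach both
# definitions along their respective control flow paths:
assert absolute(-3) == 3   # exercises the "x < 0" definition of y
assert absolute(5) == 5    # exercises the "else" definition of y
```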
• 35. Additional Types of Tests Performance testing: testing conducted to evaluate the compliance of a system or component with specified performance requirements. Usability testing: testing conducted to evaluate the extent to which a user can learn to operate, prepare inputs for, and interpret outputs of a system or component. Stress testing: testing conducted to evaluate a system or component at or beyond the limits of its specification or requirements. Smoke test: a group of test cases that establishes that the system is stable and all major functionality is present and works under “normal” conditions. Robustness testing: testing whereby test cases are chosen from outside the input domain to check robustness against unexpected, erroneous input.
  • 36. Thank You Delivering…