Unit 9: Web Application Testing

 Testing is the activity conducted to evaluate the quality of a
    product and to improve it by finding errors.




Testing Terminology
 An error is “the difference between a computed, observed, or
  measured value or condition and the true, specified, or
  theoretically correct value or condition” (IEEE Standard 610.12-1990).
 This “true, specified, or theoretically correct value or condition”
  comes from
    A well-defined requirements model, if available and complete
    An incomplete set of fuzzy and contradictory goals, concerns,
      and expectations of the stakeholders
 A test is a set of test cases for a specific object under test: the
  whole Web application, components of a Web application, a
  system that runs a Web application, etc.
 A single test case describes a set of inputs, execution conditions,
  and expected results, which are used to test a specific aspect of the
  object under test.
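
To make the terminology concrete, here is a minimal sketch of a test case written with Python's unittest module; the ShoppingCart class is a hypothetical object under test. The setUp method establishes the execution conditions, each test method supplies the inputs, and the assertions encode the expected results.

```python
# A minimal sketch of a test case. ShoppingCart is a hypothetical unit
# under test, not part of the lecture material.
import unittest

class ShoppingCart:
    """Hypothetical unit under test."""
    def __init__(self):
        self.items = []

    def add(self, price, quantity=1):
        if price < 0 or quantity < 1:
            raise ValueError("invalid price or quantity")
        self.items.append((price, quantity))

    def total(self):
        return sum(price * qty for price, qty in self.items)

class CartTotalTestCase(unittest.TestCase):
    def setUp(self):
        # Execution condition: every test starts from an empty cart.
        self.cart = ShoppingCart()

    def test_total_of_two_items(self):
        # Inputs
        self.cart.add(10.0, 2)
        self.cart.add(5.0)
        # Expected result
        self.assertAlmostEqual(self.cart.total(), 25.0)

    def test_negative_price_rejected(self):
        # Expected result: invalid input raises an error.
        with self.assertRaises(ValueError):
            self.cart.add(-1.0)

if __name__ == "__main__":
    unittest.main()
```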
Testing [and] Quality

 Testing should address compliance not only to functional
    requirements but also to quality requirements, i.e., the kinds of
    quality characteristics expected by stakeholders.
 ISO/IEC 9126-1 [Software] Quality Model:
[Figure: the six ISO/IEC 9126-1 quality characteristics — functionality, reliability, usability, efficiency, maintainability, and portability]
Goals of Testing
 The main goal of testing is to find errors, not to prove their
    absence.
 A test run is successful if errors are detected. Otherwise, it is
    unsuccessful and “a waste of time”.
 Testing should adopt a risk-based approach:
       Test first and with the greatest effort those critical parts of an
        application where the most dangerous errors are still
        undetected
       A further aim of testing is to bring risks to light, not simply to
        demonstrate conformance to stated requirements.
 Test as early as possible in a project: errors introduced in early
    development phases are harder to localize and more expensive to fix
    in later phases.
Test Levels (1/2)
 Unit tests
       Test the smallest testable units (classes, Web pages, etc.)
        independently of one another.
       Performed by the developer during implementation.

 Integration tests
       Evaluate the interaction between distinct and separately tested
        units once they have been integrated.
       Performed by a tester, a developer, or both jointly.

 System tests
       Test the complete, integrated system.
       Typically performed by a specialized test team.




Test Levels (2/2)
 Acceptance tests
Evaluate the system with the client in a “realistic”
           environment, i.e., with real conditions and real data.
 Beta tests
       Let friendly users work with early versions of a product to get
        early feedback.
       Beta tests are unsystematic tests which rely on the number and
        “malevolence” of potential users.




Fitting Testing in the Development Process




 Planning: Defines the quality goals, the general testing strategy, the test
  plans for all test levels, the metrics and measuring methods, and the test
  environment.
 Preparing: Involves selecting the testing techniques and tools and
  specifying the test cases (including the test data).
 Performing: Prepares the test infrastructure, runs the test cases, and then
  documents and evaluates the results.
 Reporting: Summarizes the test results and produces the test reports.
Web Testing: A Road Map

[Diagram: a road map of Web testing, with test types ordered from user-oriented to technology-oriented:]
 Content Testing
 Interface Testing
 Usability Testing
 Navigation Testing
 Component Testing
 Configuration Testing
 Performance Testing
 Security Testing
Usability
 Usability is a quality attribute that assesses how easy user
    interfaces are to use. The term also refers to methods for improving
    ease of use during the design process.
 Usability is defined by five quality components:
         Learnability: How easy is it for users to accomplish basic tasks
          the first time they encounter the design?
         Efficiency: Once users have learned the design, how quickly
          can they perform tasks?
         Memorability: When users return to the design after a period
          of not using it, how easily can they reestablish proficiency?
         Errors: How many errors do users make, how severe are these
          errors, and how easily can they recover from the errors?
         Satisfaction: How pleasant is it to use the design?
Why Usability matters*
 62% of web shoppers gave up looking for an item. (Zona
    study)
 50% of web sales are lost because visitors can’t easily find
    content. (Gartner Group)
 40% of repeat visitors do not return due to a negative
    experience. (Zona study)
 85% of visitors abandon a new site due to poor design.
    (cPulse)
 Only 51% of sites complied with simple web usability
    principles. (Forrester study of 20 major sites)

(*) data from www.usabilitynet.org/management/c_cost.htm

Why people fail

Usability problems weighted by how frequently they caused users to fail a task [NL06]:
 Search
 Findability (IA, category names, navigation, links)
 Page design (readability, layout, graphics, amateur look, scrolling)
 Information (content, product info, corporate info, prices)
 Task support (workflow, privacy, forms, comparison, inflexibility)
 Fancy design (multimedia, Back button, PDF/printing, new windows, sound)
 Other (bugs, presence on the Web, ads, new site, metaphors)

Top Ten (Usability) Mistakes in Web Design

1.    Bad search
2.    PDF files for online reading
3.    Not changing the color of visited links
4.    Non-scannable text
5.    Fixed font size
6.    Page titles with low search engine visibility
7.    Anything that looks like an advertisement
8.    Violating design conventions
9.    Opening new browser windows
10. Not answering users' questions

Assessing Usability
 Two major types of assessment methods:
          Usability evaluations:
                Evaluators and no users
                Techniques: surveys/questionnaires, observational
                 evaluations, guideline-based reviews, cognitive
                 walkthroughs, expert reviews, heuristic evaluations
         Usability tests: focus on users working with the product
 Usability testing is the only way to know if the Web site
    actually has problems that keep people from having a
    successful and satisfying experience.




Usability Testing
 Usability testing is a methodology that employs potential
    users to evaluate the degree to which a website/software
    meets predefined usability criteria.
 Basic Process:
      1. Watch Customers
      2. They Perform Tasks
      3. Note Their Problems
      4. Make Recommendations
      5. Iterate




Measures of Usability
 Effectiveness (Ability to successfully accomplish tasks)
       Percentage of goals/tasks achieved (success rate)
       Number of errors

 Efficiency (Ability to accomplish tasks with speed and ease)
       Time to complete a task
       Frequency of requests for help
       Number of times facilitator provides assistance
       Number of times user gives up




Measures of Usability
 Satisfaction (Pleasing to users)
         Positive and negative ratings on a satisfaction scale
         Percent of favorable comments to unfavorable comments
         Number of good vs. bad features recalled after test
         Number of users who would use the system again
         Number of times users express dissatisfaction or frustration
 Learnability (Ability to learn how to use site and remember it)
       Ratio of successes to failures
       Number of features that can be recalled after the test




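
As a small sketch of how such counts become report numbers, the snippet below derives a success rate, mean completion time, and error total from hypothetical session records; the data are made up for illustration.

```python
# Turning raw usability observations into the effectiveness/efficiency
# measures listed above. The session data are hypothetical.
from statistics import mean

# One record per participant and task: (completed?, seconds, error count)
sessions = [
    (True, 95, 1),
    (True, 140, 0),
    (False, 300, 4),
    (True, 120, 2),
    (False, 280, 5),
]

success_rate = sum(done for done, _, _ in sessions) / len(sessions)
mean_time_success = mean(t for done, t, _ in sessions if done)
total_errors = sum(e for _, _, e in sessions)

print(f"Success rate: {success_rate:.0%}")                         # effectiveness
print(f"Mean time (successful tasks): {mean_time_success:.0f} s")  # efficiency
print(f"Total errors observed: {total_errors}")                    # effectiveness
```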
Usability Testing Roles
 Facilitator:
       Oversees the entire test process
Plans the test, conducts it, and reports the results.

 Participant:
       Actual or potential customer.
Stand-ins for real users (e.g., marketing staff or designers) should be avoided.

 Observer (optional):
       Records events as they occur.
       Limits interaction with the customer.
Contributes to the report.




Usability Testing Process
Step 1: Planning The Usability Test
         Define what to test
         Define which customers should be tested
         Define what tasks should be tested
         Write usability scenarios and tasks
         Select participants
Step 2: Conducting The Usability Test
       Conduct a test
       Collect data

Step 3: Analyzing and Reporting The Usability Test
       Compile results
       Make recommendations

People – Context – Activities
Step 1: Planning The Usability Test
         Define what to test
               → Activities (Use Cases)
Define which customers (user profiles) should be tested
               → People (Actors)
         Provide a background for the activities to test
               → Context




Usability Scenarios and Tasks
 Provide the participant with motivation and context to make
    the situation more realistic
 Include several tasks:
       Make the first task simple
       Give a goal, without describing steps

 Set some success criteria, examples:
       N% of test participants will be able to complete x% of tasks in
        the time allotted.
       Participants will be able to complete x% of tasks with no more
        than one error per task.
N% of test participants will rate the system as highly usable on
         a scale of x to y.


Example of Scenario with Tasks
 Context:
You want to book a sailing on Royal Caribbean International for
            next June with your church group. The group is called “Saint
            Francis Summer 2010”. The group is selling out fast, so you
            want to book a cabin that is close to an elevator, because
            your leg hurts from a recent injury.
 Tasks to perform:
      1.    Open your browser
      2.    Click the link labeled “Royal Caribbean”
      3.    Tell me the available cabins in the “Saint Francis Summer
            2010” group
      4.    Tell me a cabin number closest to an elevator
5.    Book the cabin that best suits your needs

Selecting Participants
 Recruit participants:
        In-house
        Recruitment firms, databases, conferences

 Match participants with user profiles
 Decide numbers of participants and floaters (stand-bys)

 Schedule test sessions

 Incentives:
       Gift checks ($100 per session)
       Food or gift cards




How Many Test Participants Are Required?

 The number of usability problems found in a usability test
    with n participants is:
                               N × (1 − (1 − L)^n)
       N : total number of usability problems in the design
       L : the proportion of usability problems discovered while testing
        a single participant.

[Plot: proportion of usability problems found vs. number of test participants, for L = 31%]




How Many Test Participants Are Required?

 It seems that you need to test with at least 15 participants to
    discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each
    than to perform one test with 15 participants:
         After the first test with 5 participants has found 85% of the
          usability problems, you will want to fix them in a redesign.
         After creating the new design, you need to test again.
The second test with 5 users will discover most of the
           remaining 15% of the original usability problems that were not
           found in the first test (and some new ones).
         The new test will be able to uncover structural usability
          problems that were obscured in initial studies as users were
          stumped by surface-level usability problems.
         Fix the new problems, and test …
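
A quick way to check the numbers behind this argument is to evaluate the formula from the previous slide; the sketch below prints the proportion of problems found, 1 − (1 − L)^n, for L = 0.31.

```python
# Proportion of usability problems found with n participants, per
# N * (1 - (1 - L)**n); dividing by N gives the proportion found.
L = 0.31  # problems discovered per single participant (Nielsen's estimate)

for n in (1, 3, 5, 10, 15):
    found = 1 - (1 - L) ** n
    print(f"n = {n:2d}: {found:.0%} of problems found")

# n = 5 yields about 84%, matching the "5 users find about 85% of the
# problems" rule of thumb used above.
```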
Usability Labs … Not Necessary




The testing room contains office furniture, video tape equipment, a
microphone, and a computer with appropriate software. The observer side
contains a powerful computer to collect and analyze the usability data.
A one-way mirror separates the rooms.

Test Side-by-Side




Conducting Tests: Facilitator’s Role
 Start with an easy task to build confidence
 Sit beside the person not behind the glass

 Use “think-out-loud” protocol

 Give participants time to think it through

 Offer appropriate encouragement

 Guide participants rather than answering their questions (be an enabler)
 Don’t act knowledgeable (treat them as the experts)

 Don’t get too involved in data collection

 Don’t jump to conclusions

 Don’t solve their problems immediately

Collecting Data
 Performance
       Objective (what actually happened)
       Usually Quantitative
               Time to complete a task
               Time to recover from an error
               Number of errors
               Percentage of tasks completed successfully
               Number of clicks
               Pathway information
 Preference
       Subjective (what participants say/thought)
       Usually Qualitative
               Preference of versions
               Suggestions and comments
               Ratings or rankings (can be quantitative)

Report findings and recommendations
 Make the report usable for your users
 Include quantitative data (success rates, times, etc.)

 Avoid words like “few, many, several”; include counts instead

 Use quotes

 Use screenshots

 Mention positive findings
 Do not use participant names, use P1, P2, P3, etc.

 Include recommendations

 Make it short



Component Testing
 Focuses on a set of tests that attempt to uncover errors in
    WebApp functions
 Conventional black-box and white-box test case design
    methods can be used at each architectural layer
    (presentation, domain, data access)
 Form data can be exploited systematically to find errors (see the
    sketch after this slide):
         Missing/incomplete data
         Type conversion problems
         Value boundary violations
         Fake data
         Etc.
 Database testing is often an integral part of the component-
    testing regime
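
As a sketch of exploiting form data systematically, the snippet below uses pytest parametrization to cover missing data, type-conversion problems, and value-boundary violations; the validate_age function is hypothetical.

```python
# Systematic form-data testing with pytest.mark.parametrize.
# validate_age is a hypothetical server-side form validator.
import pytest

def validate_age(raw):
    """Return the age as int if valid, else raise ValueError."""
    if raw is None or raw == "":
        raise ValueError("missing value")          # missing/incomplete data
    try:
        age = int(raw)                             # type conversion
    except (TypeError, ValueError):
        raise ValueError("not a number")
    if not 0 <= age <= 130:
        raise ValueError("out of range")           # value boundary violation
    return age

@pytest.mark.parametrize("raw", ["", None, "abc", "-1", "131", "12.5"])
def test_invalid_inputs_rejected(raw):
    with pytest.raises(ValueError):
        validate_age(raw)

@pytest.mark.parametrize("raw,expected", [("0", 0), ("130", 130), ("42", 42)])
def test_boundary_and_typical_values(raw, expected):
    assert validate_age(raw) == expected
```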
Configuration Testing: Server-Side Issues
 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly
    when the WebApp is operational?
   Do system security measures (e.g., firewalls or encryption) allow the
    WebApp to execute and service users without interference or
    performance degradation?
   Has the WebApp been tested with the distributed server configuration (if
    one exists) that has been chosen?
   Is the WebApp properly integrated with database software? Is the
    WebApp sensitive to different versions of database software?
   Do server-side WebApp scripts execute properly?
   Have system administrator errors been examined for their effect on
    WebApp operations?
   If proxy servers are used, have differences in their configuration been
    addressed with on-site testing?

Configuration Testing: Client-Side Issues

 Hardware—CPU, memory, storage and printing devices
 Operating systems—Linux, Macintosh OS, Microsoft
    Windows, a mobile-based OS
 Browser software—Internet Explorer, Mozilla/Netscape,
    Opera, Safari, and others
 User interface components—ActiveX, Java applets, and others
 Plug-ins—QuickTime, RealPlayer, and many others

 Connectivity—cable, DSL, regular modem, T1




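
One straightforward way to turn these client-side dimensions into concrete test configurations is to enumerate their cross product and, when it grows too large, fall back to pairwise selection. A minimal sketch with illustrative (non-exhaustive) value lists:

```python
# Enumerating client-side test configurations from the dimensions above.
# The value lists are illustrative, not exhaustive.
from itertools import product

operating_systems = ["Linux", "macOS", "Windows"]
browsers = ["Internet Explorer", "Mozilla/Netscape", "Opera", "Safari"]
connectivity = ["cable", "DSL", "modem", "T1"]

configurations = list(product(operating_systems, browsers, connectivity))
print(f"{len(configurations)} full-product configurations")  # 3*4*4 = 48

for os_name, browser, link in configurations[:3]:
    print(f"test on {os_name} / {browser} / {link}")

# The full product explodes quickly as dimensions are added, so pairwise
# (all-pairs) selection is often used to keep the count manageable.
```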
Security Testing
 Designed to probe vulnerabilities
       of the client-side environment,
       the network communications that occur as data are passed
        from client to server and back again,
       and the server-side environment

 On the client-side, vulnerabilities can often be traced to pre-
    existing bugs in browsers, e-mail programs, or
    communication software.
 On the network infrastructure

 On the server-side:
        At host level
        At WebApp level
(Review the DSBW Unit on WebApp Security.)


Performance Testing: Main Questions
 Does the server response time degrade to a point where it is
    noticeable and unacceptable?
   At what point (in terms of users, transactions or data loading) does
    performance become unacceptable?
   What system components are responsible for performance
    degradation?
   What is the average response time for users under a variety of
    loading conditions?
   Does performance degradation have an impact on system security?
   Is WebApp reliability or accuracy affected as the load on the
    system grows?
   What happens when loads that are greater than maximum server
    capacity are applied?


Performance Testing: Load Tests
 A load test verifies whether or not the system meets the
    required response times and the required throughput.
 Steps:
      1.   Determine load profiles (what access types, how many visits per day,
           at what peak times, how many visits per session, how many
           transactions per session, etc.) and the transaction mix (which
           functions shall be executed with which percentage).
      2.   Determine the target values for response times and throughput (in
           normal operation and at peak times, for simple or complex accesses,
           with minimum, maximum, and average values).
      3.   Run the tests, generating the workload with the transaction mix
           defined in the load profile, and measure the response times and the
           throughput.
      4.   Evaluate the results and identify potential bottlenecks.


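
As a minimal sketch of step 3, the snippet below generates a concurrent workload against a hypothetical endpoint and measures response times; a real load test would use a dedicated tool (e.g., JMeter) and the transaction mix defined in the load profile.

```python
# A minimal load-generation sketch: N concurrent workers issue requests
# and response times are collected. The URL is hypothetical.
from concurrent.futures import ThreadPoolExecutor
from statistics import mean, quantiles
import time
import urllib.request

URL = "http://localhost:8080/app/home"  # hypothetical endpoint under test
CONCURRENT_USERS = 20
REQUESTS_PER_USER = 10

def one_user(_):
    times = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        times.append(time.perf_counter() - start)
    return times

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    all_times = [t for user in pool.map(one_user, range(CONCURRENT_USERS))
                 for t in user]

print(f"requests: {len(all_times)}")
print(f"mean response time: {mean(all_times) * 1000:.1f} ms")
print(f"p95 response time: {quantiles(all_times, n=20)[-1] * 1000:.1f} ms")
```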
Performance Testing: Stress Tests
 A stress test verifies whether or not the system reacts in a
    controlled way in “stress situations”, which are simulated by
    applying extreme conditions, such as unrealistic overload, or
    heavily fluctuating load.
 The test is aimed at answering the questions:
       Does the server degrade ‘gently’ or does it shut down as
        capacity is exceeded?
       Does server software generate “server not available”
        messages? More generally, are users aware that they cannot
        reach the server?
       Are transactions lost as capacity is exceeded?
       Is data integrity affected as capacity is exceeded?




Performance Testing: Stress Tests (cont.)

Under what load conditions does the server environment fail? How
         does failure manifest itself? Are automated notifications sent to
         technical support staff at the server site?
       If the system does fail, how long will it take to come back on-
        line?
Are certain WebApp functions (e.g., compute-intensive
         functionality, data streaming capabilities) discontinued as
         capacity reaches the 80 or 90% level?




Performance Testing: Interpreting Graphics


 Load: the number of requests that arrive at the system per time unit
 Throughput: the number of requests served per time unit
 SLA: Service Level Agreement




Test Automation
 Automation can significantly increase the efficiency of testing and
  enables new types of tests that also increase the scope (e.g.,
  different test objects and quality characteristics) and depth of
  testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
     Running automated regression tests on new versions of a
       WebApp makes it possible to detect defects caused by side effects
       on unchanged functionality (see the sketch below).
     Various test methods and techniques would be difficult or
       impossible to perform manually; for example, load and stress
       testing requires simulating a large number of concurrent users.
     Automation makes it possible to run more tests in less time and
       thus to run them more often, leading to greater confidence in the
       system under test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
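
As a tiny illustration of the regression case mentioned above, here is a sketch of an automated smoke test that re-runs a fixed set of HTTP checks against each new build; the base URL, paths, and expected content are hypothetical.

```python
# A tiny automated regression smoke test: re-run fixed HTTP checks
# against each new build. Endpoints and expected content are hypothetical.
import urllib.request

BASE = "http://localhost:8080"  # hypothetical deployment under test
CHECKS = [
    ("/app/home", 200, b"Welcome"),
    ("/app/login", 200, b"<form"),
    ("/app/search?q=test", 200, b"results"),
]

failures = 0
for path, want_status, want_body in CHECKS:
    try:
        with urllib.request.urlopen(BASE + path, timeout=10) as resp:
            ok = resp.status == want_status and want_body in resp.read()
    except OSError as exc:  # covers HTTPError/URLError and socket errors
        ok = False
        print(f"ERROR {path}: {exc}")
    if not ok:
        failures += 1
        print(f"FAIL {path}")

print(f"{len(CHECKS) - failures}/{len(CHECKS)} checks passed")
```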
References
 R. S. Pressman, D. Lowe: Web Engineering: A Practitioner’s
    Approach. McGraw-Hill, 2008. Chapter 15.
 G. Kappel et al.: Web Engineering. John Wiley & Sons, 2006.
    Chapter 7.
 [NL06] J. Nielsen, H. Loranger: Prioritizing Web Usability. New
    Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov




dsbw 2011/2012 q1                                              40

Weitere ähnliche Inhalte

Was ist angesagt?

User Interface Design- Module 2 Uid Process
User Interface Design- Module 2 Uid ProcessUser Interface Design- Module 2 Uid Process
User Interface Design- Module 2 Uid ProcessbrindaN
 
Software Testing Fundamentals
Software Testing FundamentalsSoftware Testing Fundamentals
Software Testing FundamentalsChankey Pathak
 
Architectural Patterns - Interactive and Event Handling Patterns
Architectural Patterns  - Interactive and Event Handling PatternsArchitectural Patterns  - Interactive and Event Handling Patterns
Architectural Patterns - Interactive and Event Handling Patternsassinha
 
User Experience 3: User Experience, Usability and Accessibility
User Experience 3: User Experience, Usability and AccessibilityUser Experience 3: User Experience, Usability and Accessibility
User Experience 3: User Experience, Usability and AccessibilityMarc Miquel
 
Jmeter vs loadrunner vs neoload
Jmeter vs loadrunner vs neoloadJmeter vs loadrunner vs neoload
Jmeter vs loadrunner vs neoloadpratik mohite
 
What is Web Testing?
What is Web Testing?   What is Web Testing?
What is Web Testing? QA InfoTech
 
SELECT THE PROPER DEVICE BASED CONTROLS
SELECT THE PROPER DEVICE BASED CONTROLSSELECT THE PROPER DEVICE BASED CONTROLS
SELECT THE PROPER DEVICE BASED CONTROLSDhanya LK
 
User interface design
User interface designUser interface design
User interface designSlideshare
 
Introduction to Software Project Management
Introduction to Software Project ManagementIntroduction to Software Project Management
Introduction to Software Project ManagementReetesh Gupta
 
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTION
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTIONPARADIGM SHIFT IN HUMAN COMPUTER INTERACTION
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTIONRamkumar Kannan
 
Graphical User Interface (Gui)
Graphical User Interface (Gui)Graphical User Interface (Gui)
Graphical User Interface (Gui)Bilal Amjad
 
ccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfVijayakumarKadumbadi
 
Hci – Project Presentation
Hci – Project PresentationHci – Project Presentation
Hci – Project Presentationslmsaady
 
Advanced topics in software engineering
Advanced topics in software engineeringAdvanced topics in software engineering
Advanced topics in software engineeringRupesh Vaishnav
 
Software Requirement Specification
Software Requirement SpecificationSoftware Requirement Specification
Software Requirement SpecificationNiraj Kumar
 

Was ist angesagt? (20)

User Interface Design- Module 2 Uid Process
User Interface Design- Module 2 Uid ProcessUser Interface Design- Module 2 Uid Process
User Interface Design- Module 2 Uid Process
 
Software Testing Fundamentals
Software Testing FundamentalsSoftware Testing Fundamentals
Software Testing Fundamentals
 
Architectural Patterns - Interactive and Event Handling Patterns
Architectural Patterns  - Interactive and Event Handling PatternsArchitectural Patterns  - Interactive and Event Handling Patterns
Architectural Patterns - Interactive and Event Handling Patterns
 
User Experience 3: User Experience, Usability and Accessibility
User Experience 3: User Experience, Usability and AccessibilityUser Experience 3: User Experience, Usability and Accessibility
User Experience 3: User Experience, Usability and Accessibility
 
Jmeter vs loadrunner vs neoload
Jmeter vs loadrunner vs neoloadJmeter vs loadrunner vs neoload
Jmeter vs loadrunner vs neoload
 
What is Web Testing?
What is Web Testing?   What is Web Testing?
What is Web Testing?
 
SELECT THE PROPER DEVICE BASED CONTROLS
SELECT THE PROPER DEVICE BASED CONTROLSSELECT THE PROPER DEVICE BASED CONTROLS
SELECT THE PROPER DEVICE BASED CONTROLS
 
GUI Testing
GUI TestingGUI Testing
GUI Testing
 
User interface design
User interface designUser interface design
User interface design
 
User interface-design
User interface-designUser interface-design
User interface-design
 
Mobile hci
Mobile hciMobile hci
Mobile hci
 
Introduction to Software Project Management
Introduction to Software Project ManagementIntroduction to Software Project Management
Introduction to Software Project Management
 
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTION
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTIONPARADIGM SHIFT IN HUMAN COMPUTER INTERACTION
PARADIGM SHIFT IN HUMAN COMPUTER INTERACTION
 
Graphical User Interface (Gui)
Graphical User Interface (Gui)Graphical User Interface (Gui)
Graphical User Interface (Gui)
 
ccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdf
 
Hci – Project Presentation
Hci – Project PresentationHci – Project Presentation
Hci – Project Presentation
 
HCI
HCIHCI
HCI
 
Screen based controls in HCI
Screen based controls in HCIScreen based controls in HCI
Screen based controls in HCI
 
Advanced topics in software engineering
Advanced topics in software engineeringAdvanced topics in software engineering
Advanced topics in software engineering
 
Software Requirement Specification
Software Requirement SpecificationSoftware Requirement Specification
Software Requirement Specification
 

Andere mochten auch

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web ApplicationsSeth McLaughlin
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering PresentationHira Anwer Khan
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_tomheck
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdomiiiapdst
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist Febin Chacko
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanuDhanu P G Naik
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developerscboecking
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process ReengineeringHira Anwer Khan
 
Mobile testing
Mobile testingMobile testing
Mobile testingAlex Hung
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testingLakshmi Nandoor
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Sauce Labs
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guideNugroho Hermanto
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software TestingAndrew Kandels
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Tom Eston
 
Mobile applications testing
Mobile applications testingMobile applications testing
Mobile applications testingRahul Ranjan
 

Andere mochten auch (20)

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web Applications
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering Presentation
 
Group3
Group3Group3
Group3
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdom
 
Unit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UMLUnit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UML
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist
 
Unit 05: Physical Architecture Design
Unit 05: Physical Architecture DesignUnit 05: Physical Architecture Design
Unit 05: Physical Architecture Design
 
A perspective on web testing.ppt
A perspective on web testing.pptA perspective on web testing.ppt
A perspective on web testing.ppt
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanu
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developers
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process Reengineering
 
Mobile testing
Mobile testingMobile testing
Mobile testing
 
Web testing
Web testingWeb testing
Web testing
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testing
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guide
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software Testing
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
 
Mobile applications testing
Mobile applications testingMobile applications testing
Mobile applications testing
 

Ähnlich wie Unit 09: Web Application Testing

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Carles Farré
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Visiontechmeetup
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web TestingCarles Farré
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentSameer Chavan
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque AutomationMD Tareque
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Usabilitydraft
UsabilitydraftUsabilitydraft
UsabilitydraftKimGriggs
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_ResumeAditya Vad
 

Ähnlich wie Unit 09: Web Application Testing (20)

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Vision
 
Unit03: Process and Business Models
Unit03: Process and Business ModelsUnit03: Process and Business Models
Unit03: Process and Business Models
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface Development
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque Automation
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Raji_QA
Raji_QARaji_QA
Raji_QA
 
Usabilitydraft
UsabilitydraftUsabilitydraft
Usabilitydraft
 
Kasi Viswanath
Kasi ViswanathKasi Viswanath
Kasi Viswanath
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA Consultant
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_Resume
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
Ooad
OoadOoad
Ooad
 
PARVATHY INDIRA
PARVATHY INDIRAPARVATHY INDIRA
PARVATHY INDIRA
 
195
195195
195
 

Mehr von DSBW 2011/2002 - Carles Farré - Barcelona Tech (9)

Unit 08: Security for Web Applications
Unit 08: Security for Web ApplicationsUnit 08: Security for Web Applications
Unit 08: Security for Web Applications
 
Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)
 
Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)
 
Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)
 
Unit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX ModelUnit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX Model
 
Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)
 
Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)
 
Unit 01 - Introduction
Unit 01 - IntroductionUnit 01 - Introduction
Unit 01 - Introduction
 
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
 

Kürzlich hochgeladen

MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesManik S Magar
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfLoriGlavin3
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality AssuranceInflectra
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Strongerpanagenda
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI AgeCprime
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Hiroshi SHIBATA
 
React Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkReact Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkPixlogix Infotech
 
Bridging Between CAD & GIS: 6 Ways to Automate Your Data Integration
Bridging Between CAD & GIS:  6 Ways to Automate Your Data IntegrationBridging Between CAD & GIS:  6 Ways to Automate Your Data Integration
Bridging Between CAD & GIS: 6 Ways to Automate Your Data Integrationmarketing932765
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxfnnc6jmgwh
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch TuesdayIvanti
 
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Alkin Tezuysal
 
Decarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityDecarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityIES VE
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 
Zeshan Sattar- Assessing the skill requirements and industry expectations for...
Zeshan Sattar- Assessing the skill requirements and industry expectations for...Zeshan Sattar- Assessing the skill requirements and industry expectations for...
Zeshan Sattar- Assessing the skill requirements and industry expectations for...itnewsafrica
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentPim van der Noll
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024Lonnie McRorey
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructureitnewsafrica
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPathCommunity
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxLoriGlavin3
 

Kürzlich hochgeladen (20)

MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdf
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI Age
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024
 
React Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkReact Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App Framework
 
Bridging Between CAD & GIS: 6 Ways to Automate Your Data Integration
Bridging Between CAD & GIS:  6 Ways to Automate Your Data IntegrationBridging Between CAD & GIS:  6 Ways to Automate Your Data Integration
Bridging Between CAD & GIS: 6 Ways to Automate Your Data Integration
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch Tuesday
 
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
Unleashing Real-time Insights with ClickHouse_ Navigating the Landscape in 20...
 
Decarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityDecarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a reality
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
Zeshan Sattar- Assessing the skill requirements and industry expectations for...
Zeshan Sattar- Assessing the skill requirements and industry expectations for...Zeshan Sattar- Assessing the skill requirements and industry expectations for...
Zeshan Sattar- Assessing the skill requirements and industry expectations for...
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to Hero
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 

Unit 09: Web Application Testing

  • 1. Unit 9: Web Application Testing  Testing is the activity conducted to evaluate the quality of a product and to improve it by finding errors. Testing dsbw 2011/2012 q1 1
  • 2. Testing Terminology  An error is “the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition” (IEEE standard 610.12- 1990).  This “true, specified, or theoretically correct value or condition” comes from  A well-defined requirements model, if available and complete  An incomplete set of fuzzy and contradictory goals, concerns, and expectations of the stakeholders  A test is a set of test cases for a specific object under test: the whole Web application, components of a Web application, a system that runs a Web application, etc.  A single test case describes a set of inputs, execution conditions, and expected results, which are used to test a specific aspect of the object under test dsbw 2011/2012 q1 2
  • 3. Testing [and] Quality  Testing should address compliance not only to functional requirements but also to quality requirements, i.e., the kinds of quality characteristics expected by stakeholders.  ISO/IEC 9126-1 [Software] Quality Model: dsbw 2011/2012 q1 3
  • 4. Goals of Testing  The main goal of testing is to find errors but not to prove their absence  A test run is successful if errors are detected. Otherwise, it is unsuccessful and “a waste of time”.  Testing should adopt a risk-based approach:  Test first and with the greatest effort those critical parts of an application where the most dangerous errors are still undetected  A further aim of testing is to bring risks to light, not simply to demonstrate conformance to stated requirements.  Test as early as possible at the beginning of a project: errors happened in early development phases are harder to localize and more expensive to fix in later phases. dsbw 2011/2012 q1 4
  • 5. Test Levels (1/2)  Unit tests  Test the smallest testable units (classes, Web pages, etc.) independently of one another.  Performed by the developer during implementation.  Integration tests  Evaluate the interaction between distinct and separately tested units once they have been integrated.  Performed by a tester, a developer, or both jointly.  System tests  Test the complete, integrated system.  Typically performed by a specialized test team. dsbw 2011/2012 q1 5
  • 6. Test Levels (2/2)  Acceptance tests  Evaluate the system with the client in an “realistic” environment, i.e. with real conditions and real data.  Beta tests  Let friendly users work with early versions of a product to get early feedback.  Beta tests are unsystematic tests which rely on the number and “malevolence” of potential users. dsbw 2011/2012 q1 6
  • 7. Fitting Testing in the Development Process  Planning: Defines the quality goals, the general testing strategy, the test plans for all test levels, the metrics and measuring methods, and the test environment.  Preparing: Involves selecting the testing techniques and tools and specifying the test cases (including the test data).  Performing: Prepares the test infrastructure, runs the test cases, and then documents and evaluates the results.  Reporting: Summarizes the test results and produces the test reports. dsbw 2011/2012 q1 7
  • 8. Web Testing: A Road Map Content Interface Testing Testing Usability Testing user Navigation Testing Component Testing Configuration Testing Performance Security technology Testing Testing dsbw 2011/2012 q1 8
  • 9. Usability  Usability is a quality attribute that assesses how easy user interfaces are to use. Also refers to methods for improving ease-of-use during the design process.  Usability is defined by five quality components:  Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?  Efficiency: Once users have learned the design, how quickly can they perform tasks?  Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?  Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?  Satisfaction: How pleasant is it to use the design? dsbw 2011/2012 q1 9
  • 10. Why Usability matters*  62% of web shoppers gave up looking for an item. (Zona study)  50% of web sales are lost because visitors can’t easily find content. (Gartner Group)  40% of repeat visitors do not return due to a negative experience. (Zona study)  85% of visitors abandon a new site due to poor design. (cPulse)  Only 51% of sites complied with simple web usability principles. (Forrester study of 20 major sites) (*) data from www.usabilitynet.org/management/c_cost.htm dsbw 2011/2012 q1 10
  • 11. Why people fail Search Findability (IA, Category names, Navigation, Links) Page design (Readability, Layout, Graphics, Amateur, Scrolling) Information (Content, Product info, Corporate info, Prices) Task support (Workflow, Privacy, Forms, Comparison, Inflexible) Fancy design (Multimedia, Back button, PDF/Printing, New window, Sound) Other (Bugs, Presence on Web, Ads, New site, Usability problems weighted by how frequently they Metaphors) caused users to fail a task [NL06] dsbw 2011/2012 q1 11
  • 12. Top Ten (Usability) Mistakes in Web Design 1. Bad search 2. Pdf files for online reading 3. Not changing the color of visited links 4. Non-scannable text 5. Fixed font size 6. Page titles with low search engine visibility 7. Anything that looks like an advertisement 8. Violating design conventions 9. Opening new browser windows 10. Not answering users' questions dsbw 2011/2012 q1 12
  • 13. Assessing Usability  Two major types of assessing methods:  Usability evaluations:  Evaluators and no users  Techniques: surveys/questionnaires, observational evaluations, guideline based reviews, cognitive walkthroughs, expert reviews, heuristic evaluations  Usability tests: focus on users working with the product  Usability testing is the only way to know if the Web site actually has problems that keep people from having a successful and satisfying experience. dsbw 2011/2012 q1 13
  • 14. Usability Testing  Usability testing is a methodology that employs potential users to evaluate the degree to which a website/software meets predefined usability criteria.  Basic Process: 1. Watch Customers 2. They Perform Tasks 3. Note Their Problems 4. Make Recommendations 5. Iterate dsbw 2011/2012 q1 14
Measures of Usability
 Effectiveness (ability to successfully accomplish tasks)
 Percentage of goals/tasks achieved (success rate)
 Number of errors
 Efficiency (ability to accomplish tasks with speed and ease)
 Time to complete a task
 Frequency of requests for help
 Number of times the facilitator provides assistance
 Number of times the user gives up
dsbw 2011/2012 q1                                                 15
Measures of Usability
 Satisfaction (pleasing to users)
 Positive and negative ratings on a satisfaction scale
 Ratio of favorable to unfavorable comments
 Number of good vs. bad features recalled after the test
 Number of users who would use the system again
 Number of times users express dissatisfaction or frustration
 Learnability (ability to learn how to use the site and remember it)
 Ratio of successes to failures
 Number of features that can be recalled after the test
dsbw 2011/2012 q1                                                 16
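To make these measures concrete, here is a minimal sketch of how effectiveness and efficiency metrics could be computed from logged test sessions. The record layout (task, success, seconds, errors) and the sample values are illustrative assumptions, not part of the original material.

    # Minimal sketch: computing usability-test metrics from logged sessions.
    # The record fields (task, success, seconds, errors) are hypothetical.
    from statistics import mean

    sessions = [
        {"task": "find cabin", "success": True,  "seconds": 95,  "errors": 1},
        {"task": "find cabin", "success": False, "seconds": 240, "errors": 4},
        {"task": "book cabin", "success": True,  "seconds": 130, "errors": 0},
    ]

    def success_rate(records):
        """Effectiveness: percentage of tasks completed successfully."""
        return 100.0 * sum(r["success"] for r in records) / len(records)

    def avg_completion_time(records):
        """Efficiency: mean time of the successful attempts only."""
        times = [r["seconds"] for r in records if r["success"]]
        return mean(times) if times else None

    print(f"Success rate: {success_rate(sessions):.0f}%")
    print(f"Avg. completion time: {avg_completion_time(sessions)} s")
    print(f"Errors per session: {mean(r['errors'] for r in sessions):.1f}")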
Usability Testing Roles
 Facilitator:
 Oversees the entire test process
 Plans, tests, and reports
 Participant:
 An actual or potential customer
 Stand-ins for real users (e.g., marketing staff or designers) should be avoided
 Observer (optional):
 Records events as they occur
 Limits interaction with the customer
 Contributes to the report
dsbw 2011/2012 q1                                                 17
Usability Testing Process
Step 1: Planning the usability test
 Define what to test
 Define which customers should be tested
 Define what tasks should be tested
 Write usability scenarios and tasks
 Select participants
Step 2: Conducting the usability test
 Conduct a test
 Collect data
Step 3: Analyzing and reporting the usability test
 Compile results
 Make recommendations
dsbw 2011/2012 q1                                                 18
People – Context – Activities
Step 1: Planning the usability test
 Define what to test → Activities (Use Cases)
 Define which customers (user profiles) are to be tested → People (Actors)
 Provide a background for the activities to test → Context
dsbw 2011/2012 q1                                                 19
Usability Scenarios and Tasks
 Provide the participant with motivation and context to make the situation more realistic
 Include several tasks:
 Make the first task simple
 Give a goal, without describing steps
 Set some success criteria, for example:
 N% of test participants will be able to complete x% of tasks in the time allotted.
 Participants will be able to complete x% of tasks with no more than one error per task.
 N% of test participants will rate the system as highly usable on a scale of x to y.
dsbw 2011/2012 q1                                                 20
Example of Scenario with Tasks
 Context:
 You want to book a sailing on Royal Caribbean International for next June with your church group. The group is called “Saint Francis Summer 2010”. The group is selling out fast, so you want to book a cabin that is close to an elevator, because your leg hurts from a recent injury.
 Tasks to perform:
1. Open your browser
2. Click the link labeled “Royal Caribbean”
3. Tell me the available cabins in the “Saint Francis Summer 2010” group
4. Tell me a cabin number closest to an elevator
5. Book the cabin that best suits your needs
dsbw 2011/2012 q1                                                 21
Selecting Participants
 Recruit participants:
 In-house, or via recruitment firms, databases, conferences
 Match participants with user profiles
 Numbers: how many participants, how many floaters
 Schedule test sessions
 Incentives:
 Gift checks ($100 per session)
 Food or gift cards
dsbw 2011/2012 q1                                                 22
How Many Test Participants Are Required?
 The number of usability problems found in a usability test with n participants is: N(1 − (1 − L)^n)
 N: the total number of usability problems in the design
 L: the proportion of usability problems discovered while testing a single participant
[Figure: problems found vs. number of participants, plotted for L = 31%]
dsbw 2011/2012 q1                                                 23
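A quick computation of the formula above, using the average per-participant discovery rate of L = 31% mentioned on the slide, shows how fast the curve flattens. The sketch below is illustrative only.

    # Problem-discovery formula: found(n) = N * (1 - (1 - L)**n).
    # L = 0.31 is the average discovery rate quoted on the slide.
    L = 0.31

    for n in (1, 3, 5, 10, 15):
        fraction_found = 1 - (1 - L) ** n   # share of the N problems uncovered
        print(f"{n:2d} participants -> {fraction_found:.0%} of problems found")

    # Five participants already find roughly 84-85% of the problems,
    # while about 15 are needed to approach 100%.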
How Many Test Participants Are Required?
 It seems that you need to test with at least 15 participants to discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each than one test with 15 participants:
 After the first test with 5 participants has found 85% of the usability problems, you will want to fix them in a redesign.
 After creating the new design, you need to test again.
 The second test with 5 users will discover most of the remaining 15% of the original usability problems that were not found in the first test (and some new ones).
 The new test will also be able to uncover structural usability problems that were obscured in the initial study, when users were stumped by surface-level usability problems.
 Fix the new problems, and test again …
dsbw 2011/2012 q1                                                 24
Usability Labs … Not Necessary
 The testing room contains office furniture, video tape equipment, a microphone, and a computer with appropriate software.
 The observer side contains a powerful computer to collect the usability data and analyze it.
 A one-way mirror separates the rooms.
dsbw 2011/2012 q1                                                 25
Conducting Tests: Facilitator’s Role
 Start with an easy task to build confidence
 Sit beside the person, not behind the glass
 Use the “think-out-loud” protocol
 Give participants time to think it through
 Offer appropriate encouragement
 Lead participants, don’t answer questions (be an enabler)
 Don’t act knowledgeable (treat them as the experts)
 Don’t get too involved in data collection
 Don’t jump to conclusions
 Don’t solve their problems immediately
dsbw 2011/2012 q1                                                 27
Collecting Data
 Performance
 Objective (what actually happened)
 Usually quantitative:
 Time to complete a task
 Time to recover from an error
 Number of errors
 Percentage of tasks completed successfully
 Number of clicks
 Pathway information
 Preference
 Subjective (what participants say/think)
 Usually qualitative:
 Preference between versions
 Suggestions and comments
 Ratings or rankings (can be quantitative)
dsbw 2011/2012 q1                                                 28
Report Findings and Recommendations
 Make the report usable for your users
 Include quantitative data (success rates, times, etc.)
 Avoid words like “few”, “many”, “several”; include counts
 Use quotes
 Use screenshots
 Mention positive findings
 Do not use participant names; use P1, P2, P3, etc.
 Include recommendations
 Make it short
dsbw 2011/2012 q1                                                 29
Component Testing
 Focuses on a set of tests that attempt to uncover errors in WebApp functions
 Conventional black-box and white-box test case design methods can be used at each architectural layer (presentation, domain, data access)
 Form data can be exploited systematically to find errors:
 Missing/incomplete data
 Type conversion problems
 Value boundary violations
 Fake data
 Etc.
 Database testing is often an integral part of the component-testing regime
dsbw 2011/2012 q1                                                 30
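As an illustration of systematic form-data testing, here is a minimal black-box sketch in Python's unittest style. The validate_quantity handler and its [1, 99] range are hypothetical stand-ins for a real form-processing component.

    # Minimal black-box sketch of form-data component testing.
    # validate_quantity is a hypothetical form handler: it should accept
    # integers in [1, 99] and reject anything else.
    import unittest

    def validate_quantity(raw):
        """Parse a form field; return an int in [1, 99] or raise ValueError."""
        value = int(raw)            # type-conversion errors surface here
        if not 1 <= value <= 99:
            raise ValueError("quantity out of range")
        return value

    class FormDataTests(unittest.TestCase):
        def test_value_boundaries(self):
            self.assertEqual(validate_quantity("1"), 1)    # lower bound
            self.assertEqual(validate_quantity("99"), 99)  # upper bound
            for bad in ("0", "100"):                       # just outside
                with self.assertRaises(ValueError):
                    validate_quantity(bad)

        def test_missing_and_fake_data(self):
            for bad in ("", "abc", "12.5", None):          # missing/fake data
                with self.assertRaises((ValueError, TypeError)):
                    validate_quantity(bad)

    if __name__ == "__main__":
        unittest.main()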
Configuration Testing: Server-Side Issues
 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly when the WebApp is operational?
 Do system security measures (e.g., firewalls or encryption) allow the WebApp to execute and service users without interference or performance degradation?
 Has the WebApp been tested with the distributed server configuration (if one exists) that has been chosen?
 Is the WebApp properly integrated with database software? Is the WebApp sensitive to different versions of database software?
 Do server-side WebApp scripts execute properly?
 Have system administrator errors been examined for their effect on WebApp operations?
 If proxy servers are used, have differences in their configuration been addressed with on-site testing?
dsbw 2011/2012 q1                                                 31
Configuration Testing: Client-Side Issues
 Hardware: CPU, memory, storage, and printing devices
 Operating systems: Linux, Macintosh OS, Microsoft Windows, a mobile-based OS
 Browser software: Internet Explorer, Mozilla/Netscape, Opera, Safari, and others
 User interface components: ActiveX, Java applets, and others
 Plug-ins: QuickTime, RealPlayer, and many others
 Connectivity: cable, DSL, regular modem, T1
dsbw 2011/2012 q1                                                 32
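One pragmatic way to handle these dimensions is to enumerate a configuration matrix and then prune it. The sketch below uses illustrative dimension values; a real matrix would come from the project's supported-platform list.

    # Sketch: enumerating a client-side configuration test matrix.
    # The dimension values are illustrative, not an exhaustive list.
    from itertools import product

    operating_systems = ["Linux", "macOS", "Windows"]
    browsers = ["Internet Explorer", "Firefox", "Opera", "Safari"]
    connections = ["cable", "DSL", "modem"]

    matrix = list(product(operating_systems, browsers, connections))
    print(f"{len(matrix)} combinations to cover")   # 3 * 4 * 3 = 36

    # Exhaustive testing rarely scales; in practice the matrix is pruned,
    # e.g. dropping combinations that cannot occur in the field.
    matrix = [(os_, br, net) for (os_, br, net) in matrix
              if not (br == "Safari" and os_ == "Linux")]
    print(f"{len(matrix)} combinations after pruning")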
Security Testing
 Designed to probe vulnerabilities of:
 the client-side environment,
 the network communications that occur as data are passed from client to server and back again,
 and the server-side environment.
 On the client side, vulnerabilities can often be traced to pre-existing bugs in browsers, e-mail programs, or communication software.
 On the network infrastructure.
 On the server side:
 At host level
 At WebApp level
 Review the DSBW Unit on WebApp Security.
dsbw 2011/2012 q1                                                 33
Performance Testing: Main Questions
 Does the server response time degrade to a point where it is noticeable and unacceptable?
 At what point (in terms of users, transactions, or data loading) does performance become unacceptable?
 What system components are responsible for performance degradation?
 What is the average response time for users under a variety of loading conditions?
 Does performance degradation have an impact on system security?
 Is WebApp reliability or accuracy affected as the load on the system grows?
 What happens when loads greater than maximum server capacity are applied?
dsbw 2011/2012 q1                                                 34
Performance Testing: Load Tests
 A load test verifies whether or not the system meets the required response times and the required throughput.
 Steps:
1. Determine the load profiles (what access types, how many visits per day, at what peak times, how many visits per session, how many transactions per session, etc.) and the transaction mix (which functions are executed, and with which percentage).
2. Determine the target values for response times and throughput (in normal operation and at peak times, for simple or complex accesses, with minimum, maximum, and average values).
3. Run the tests, generating the workload with the transaction mix defined in the load profile, and measure the response times and the throughput.
4. Evaluate the results and identify potential bottlenecks.
dsbw 2011/2012 q1                                                 35
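To make step 3 concrete, here is a minimal load-generation sketch using only Python's standard library. TARGET_URL and the load-profile values are placeholders; a real project would normally use a dedicated tool such as JMeter or Locust.

    # Minimal load-test sketch: N_CLIENTS concurrent virtual users issuing
    # N_REQUESTS requests, measuring response times and throughput.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean

    TARGET_URL = "http://localhost:8080/"   # assumption: app under test
    N_CLIENTS = 10                          # concurrent virtual users
    N_REQUESTS = 100                        # total requests in the mix

    def timed_request(_):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=N_CLIENTS) as pool:
        latencies = list(pool.map(timed_request, range(N_REQUESTS)))
    elapsed = time.perf_counter() - start

    print(f"throughput: {N_REQUESTS / elapsed:.1f} req/s")
    print(f"response time: avg {mean(latencies)*1000:.0f} ms, "
          f"max {max(latencies)*1000:.0f} ms")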
Performance Testing: Stress Tests
 A stress test verifies whether or not the system reacts in a controlled way in “stress situations”, which are simulated by applying extreme conditions such as unrealistic overload or a heavily fluctuating load.
 The test is aimed at answering these questions:
 Does the server degrade “gently”, or does it shut down as capacity is exceeded?
 Does the server software generate “server not available” messages? More generally, are users aware that they cannot reach the server?
 Are transactions lost as capacity is exceeded?
 Is data integrity affected as capacity is exceeded?
dsbw 2011/2012 q1                                                 36
Performance Testing: Stress Tests (cont.)
 Under what load conditions does the server environment fail? How does failure manifest itself? Are automated notifications sent to technical support staff at the server site?
 If the system does fail, how long will it take to come back on-line?
 Are certain WebApp functions (e.g., compute-intensive functionality, data streaming capabilities) discontinued as capacity reaches the 80 or 90% level?
dsbw 2011/2012 q1                                                 37
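A stress test can reuse the same request helper as the load-test sketch above while ramping the load until the system degrades, which locates the breaking point. In this sketch the URL, the load steps, and the 5% error / 2-second latency thresholds are all assumptions for illustration.

    # Sketch: escalate the load step by step until the target degrades.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean

    TARGET_URL = "http://localhost:8080/"   # assumption: app under test
    MAX_LATENCY = 2.0                       # seconds; assumed SLA limit
    MAX_ERROR_RATE = 0.05                   # assumed acceptable failure share

    def timed_request(_):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                resp.read()
            return time.perf_counter() - start, False
        except Exception:
            return time.perf_counter() - start, True    # count the failure

    for n_clients in (10, 20, 50, 100, 200, 500):       # escalating steps
        with ThreadPoolExecutor(max_workers=n_clients) as pool:
            results = list(pool.map(timed_request, range(n_clients * 10)))
        latencies = [t for t, failed in results if not failed]
        error_rate = sum(failed for _, failed in results) / len(results)
        avg = mean(latencies) if latencies else float("inf")
        print(f"{n_clients:4d} clients: avg {avg:.2f} s, errors {error_rate:.0%}")
        if error_rate > MAX_ERROR_RATE or avg > MAX_LATENCY:
            print(f"degradation starts near {n_clients} concurrent clients")
            break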
Performance Testing: Interpreting Graphics
 Load: the number of requests that arrive at the system per time unit
 Throughput: the number of requests served per time unit
 SLA: Service Level Agreement
 In a load/throughput graph, throughput typically tracks the load up to the server’s capacity and then flattens or degrades; the SLA marks the response-time limit the system must stay under.
dsbw 2011/2012 q1                                                 38
Test Automation
 Automation can significantly increase the efficiency of testing and enables new types of tests that also increase the scope (e.g., different test objects and quality characteristics) and the depth of testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
 Running automated regression tests on new versions of a WebApp makes it possible to detect defects caused by side effects on unchanged functionality.
 Various test methods and techniques would be difficult or impossible to perform manually. For example, load and stress testing requires simulating a large number of concurrent users.
 Automation makes it possible to run more tests in less time and, thus, to run the tests more often, leading to greater confidence in the system under test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
dsbw 2011/2012 q1                                                 39
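As a sketch of an automated regression test, the following pytest-style check re-fetches a few pages on each new version and asserts that known content is still there. BASE_URL, the paths, and the expected markers are hypothetical placeholders for the app under test.

    # Minimal automated regression smoke test, runnable with pytest.
    import urllib.request

    BASE_URL = "http://localhost:8080"   # assumption: app under test

    PAGES = {                            # path -> text expected in the response
        "/": "Welcome",
        "/login": "Password",
        "/search?q=test": "results",
    }

    def fetch(path):
        with urllib.request.urlopen(BASE_URL + path, timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", errors="replace")

    def test_pages_unchanged():
        """Re-run on every new version to catch side effects on old pages."""
        for path, marker in PAGES.items():
            status, body = fetch(path)
            assert status == 200, f"{path} returned {status}"
            assert marker in body, f"{path} no longer contains {marker!r}"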
References
 PRESSMAN, R. S. and LOWE, D. Web Engineering: A Practitioner’s Approach. McGraw-Hill, 2008. Chapter 15.
 KAPPEL, G. et al. Web Engineering. John Wiley & Sons, 2006. Chapter 7.
 [NL06] NIELSEN, J. and LORANGER, H. Prioritizing Web Usability. New Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov
dsbw 2011/2012 q1                                                 40