1. Case Study: How CA Went From 40 Days to Three Days Building Crystal-Clear Test Cases While Improving Test Coverage
Stephen Tyler
DO3T022S
VP, Software Engineering
CA Technologies
DEVOPS – CONTINUOUS DELIVERY
2. 2 COPYRIGHT © 2017 CA. ALL RIGHTS RESERVED #CAWORLD #NOBARRIERS
© 2017 CA. All rights reserved. All trademarks referenced herein belong to their respective companies.
The content provided in this CA World 2017 presentation is intended for informational purposes only and does not form any type
of warranty. The information provided by a CA partner and/or CA customer has not been reviewed for accuracy by CA.
For Informational Purposes Only
Terms of This Presentation
3.
Abstract
Here at CA Technologies, our development teams share many of the same challenges
producing quality software as our customers. This session will cover how we:
Introduced collaborative modeling between our product owners and engineers to drive
extreme clarity around requirements during backlog refinement, before a story was
picked up for a sprint.
Built out a requirements model to generate a set of test cases that fully covers the
acceptance criteria for our stories.
Achieved the result: reducing the time it took to build out regression test cases from 40
days to three, while improving test coverage with fewer test cases.
Stephen
Tyler
CA Technologies
VP, Software
Engineering
4.
Agenda
WHY: OUR CHALLENGES AND WHY WE THOUGHT MBT MIGHT HELP
WHAT: OUR HYPOTHESES AND METHODS
RESULTS: WHAT WORKED, LESSONS LEARNED, NEXT STEPS
1
2
3
5.
Challenges and Aspirations
Incomplete/ambiguous requirements
Inadequate functional coverage
Too many defects leading to rework
ACCEPTANCE TESTING
FOR NEW STORIES
Lack of visibility into functional
coverage/test efficiency
Test Suite maintenance
Automation cost
REGRESSION TESTING
Clear, complete,
unambiguous
requirements
What We Hoped MBT Would Get Us
Confidence in
functional
coverage
Fewer defects
escaping the
sprint
Increased
effectiveness
and efficiency
Ease of
Maintenance
Reduced
automation cost
6.
Acceptance Testing Example 1
Configure Connection and Resources Dialog
7.
In the MBT tool we used, the functionality is
modeled as a flowchart/graph, as shown here.
Each node of the graph is an object on the UI
into which data is entered; metadata associated
with the node defines what data inputs and
outputs it supports.
Each path through the graph is a test case,
driven by the particular data entries or
decisions at the nodes along that path.
In this model there are 23 distinct paths, i.e.
23 test cases to fully cover the functionality
as modelled.
Acceptance Testing Example 1
The Model
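The "each path is a test case" idea can be sketched as a simple depth-first path enumeration. This is a hypothetical illustration, not the tool's actual algorithm, and the toy dialog graph below is invented rather than the real Configure Connection dialog:

```python
# Hypothetical sketch: an MBT tool treats each start-to-end path through the
# flowchart as one test case. The toy model below is invented for illustration.

def enumerate_paths(graph, start, end, path=None):
    """Depth-first enumeration of all simple start-to-end paths."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # skip nodes already on this path, avoiding cycles
            paths.extend(enumerate_paths(graph, nxt, end, path))
    return paths

# Toy connection dialog: one optional branch for the port, one for the outcome.
model = {
    "open_dialog": ["enter_host"],
    "enter_host": ["enter_port", "use_default_port"],
    "enter_port": ["test_connection"],
    "use_default_port": ["test_connection"],
    "test_connection": ["save", "cancel"],
    "save": ["close"],
    "cancel": ["close"],
    "close": [],
}

for p in enumerate_paths(model, "open_dialog", "close"):
    print(" -> ".join(p))
# 2 port choices x 2 outcomes = 4 distinct paths, i.e. 4 test cases
```

A real dialog model with more branching nodes multiplies the same way, which is how a modest-looking flowchart yields the 23 paths mentioned above.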
8.
All 23 possible paths
through the model
Path 15 of 23 visually highlighted on the graph and
textually represented (human readable test steps)
Model-generated Protractor
script to execute path 15
Generating Test Cases and Automated Tests
9.
Automating the Automation
• Attach automation “actions” to the outputs of each node
in the model. Each action is defined with parameterized
text snippets.
• Export the “automation layer” for each path through the
model.
• The script snippets for the relevant actions on each node
in the exported paths are concatenated, with appropriate
variable substitution, to produce an executable script
corresponding to that path in the model.
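The concatenate-and-substitute step can be sketched in a few lines. The node names, snippet contents, and field IDs below are invented stand-ins, not the actual automation layer the team built:

```python
from string import Template

# Hypothetical sketch of "automating the automation": each node carries a
# parameterized script snippet (here, Protractor-style lines held as strings);
# generating the executable test for a path is concatenation plus variable
# substitution. All names and snippets are invented for illustration.

node_snippets = {
    "enter_host": Template('element(by.id("host")).sendKeys("$host");'),
    "enter_port": Template('element(by.id("port")).sendKeys("$port");'),
    "test_connection": Template('element(by.id("test")).click();'),
}

def generate_script(path, data):
    """Concatenate each node's snippet along the path, substituting test data."""
    return "\n".join(node_snippets[node].substitute(data)
                     for node in path if node in node_snippets)

script = generate_script(
    ["enter_host", "enter_port", "test_connection"],
    {"host": "db.example.com", "port": "5432"},
)
print(script)
```

Because the snippets live on the nodes rather than in the scripts, every path that reuses a node reuses its automation logic for free, which is what makes the marginal cost per test fall later in the deck.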
10.
Acceptance Testing Example 2
Service Configuration Sub-dialog
We will focus on the “client-facing certificate configuration”
requirement, implemented in the sub-dialog shown here.
11.
The initial flowchart shown here describes the basic
functional requirement (not the UI), i.e. what should
happen with various combinations of inputs to the fields,
with constraints that define which “paths” through the
“graph” should lead to successful vs. unsuccessful
validation.
However, there were also UI behavior requirements where
not just the combination of entries but also the order of
entry was significant, i.e. the Validate button should only
be enabled (and therefore clickable at all) when the fields
for the other data items were in certain states.
This required us to change our approach to modeling...
Acceptance Testing Example 2
Model
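One way to see why order-significance complicates the model: the "click Validate" step is only a legal move once the other fields are in the right state, so the model has to track state, not just combinations. A minimal sketch, with invented field and class names (the real sub-dialog's fields may differ):

```python
# Hypothetical sketch of modelling order-significant UI behaviour: the model
# carries field state so that "click Validate" is only reachable when its
# preconditions hold. Field names are invented for illustration.

class CertDialogModel:
    def __init__(self):
        # Required fields start empty; Validate must stay disabled until
        # all of them are filled, regardless of the order they are filled in.
        self.fields = {"cert_file": False, "passphrase": False}

    def set_field(self, name, filled=True):
        self.fields[name] = filled

    def validate_enabled(self):
        # The button is clickable only when every required field is filled.
        return all(self.fields.values())

m = CertDialogModel()
assert not m.validate_enabled()   # nothing entered yet
m.set_field("passphrase")
assert not m.validate_enabled()   # order matters: cert still missing
m.set_field("cert_file")
assert m.validate_enabled()       # now clicking Validate is a legal test step
```

A path generator working over such a model can only emit sequences where the Validate click appears after the enabling entries, which is exactly the refactoring the next slide describes.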
12.
This is the same flow, refactored to be suitable for
generating UI tests. It is a little more complex: it
required thinking creatively about how to model the
requirement, and there was a learning curve for our
engineers.
More challenging to model, but worth it:
• No escaped defects relating to this story were found
in the subsequent pre-release testing, and so far none
have been logged by users who have adopted the release
it shipped in.
• Automated regression tests based on the model and
running in CI have proven effective in detecting
regression defects and preventing defect escape from
subsequent sprints.
Acceptance Testing Example 2
Tailoring for UI Order Significance and Automation
13.
Example 3 - TDM Shredder Regression Testing
TDM Shredder – Synthetic Data Generation (e.g. for XML)
[Flow diagram: TDM Shredder feeds TDM DataMaker via a Staging DB (for derived tables) and the TDM Repository. Inputs: XSD and/or Sample XML. Output: generated XML files.]
• Register XSD, or Sample XML if no XSD
• Create tables in Staging DB
• Register derived tables with TDM Repository
• Import data from Sample XML to derived tables in Staging DB
• Template-ize for data generation
• Publish synthetic data to Staging DB
• Export from Staging DB to XML files
• Import data to a TDM Data Pool from Staging DB
TDM Shredder lines of code: 130,000, i.e. a medium-to-high complexity sub-component of a product. There is a lot of complexity in the types of input file
and specs supported, especially with XML; the testing complexity is in the data inputs as much as in the UI behaviour.
14.
Example 3 - TDM Shredder Model
The main flow.
2 of the 6 sub-flows.
1 main flow, 6 sub-flows, 85 total nodes.
15.
Example 3 - TDM Shredder Regression Testing
Test Configurations – Data-driven Test Steps
Decision table nodes can be driven with many specific
examples of each type; each one is a separate test case.
For each variety of file, different inputs can
be passed using decision table inputs.
A very efficient way to model and represent a large number of test cases that differ only in the nature of the test data.
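The fan-out from a decision-table node can be sketched as a cross product of input varieties. The file types and spec variants below are invented examples, not the actual decision-table rows used for TDM Shredder:

```python
from itertools import product

# Hypothetical sketch of a decision-table node: one structural path through
# the model fans out into many data-driven test cases, one per combination
# of input data. The varieties below are invented for illustration.

file_types = ["xml_with_xsd", "xml_sample_only", "csv"]
spec_variants = ["nested_elements", "attributes", "mixed_content"]

# Each (file type, spec variant) pair is a distinct test case that follows
# the same path through the model but with different test data.
test_cases = [
    {"file_type": f, "spec": s} for f, s in product(file_types, spec_variants)
]
print(len(test_cases))  # 3 x 3 = 9 test cases from one modelled path
```

This is why the node count (85) stays small while the path/test-case count grows into the thousands: the data combinations multiply, not the model.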
16.
Example 3 - TDM Shredder Regression Testing
Test Case Authoring With and Without MBT
Manual Test Authoring
# of TC: 318 | Man Days to Create: 40 | Code Coverage: 55% | Functional Coverage: ~65% (guess)

Model-based Test Authoring
# of TC: 450 * | Man Days to Create: 3 | Code Coverage: 63% | Functional Coverage: 67% (known)
* Test cases: the model has 17,000 distinct paths (test cases), far too many to execute, so we used the modelling tool’s path selection
capability to select a manageable number of paths smartly, minimizing duplication while preserving the best possible functional coverage. We used
an “all in/out edges” algorithm to select 450 test cases (a number comparable to the manually written suite).
MBT enabled us to create a better regression suite with a lot less time and effort than doing it
manually. This applies to initial creation and maintenance over time.
Furthermore, with the MBT approach we know what the functional coverage is, and we know what we
are choosing not to test and why.
Coverage: the 450 selected paths gave 67% functional coverage and 8% better code coverage than the manually authored test cases. Had
we executed all 17,000, we would have had 100% functional coverage.
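The idea of pruning 17,000 paths down to a few hundred while keeping coverage can be sketched as a greedy cover over the model's edges. This is in the spirit of an "all in/out edges" criterion, but it is a plain greedy set cover for illustration; the tool's actual algorithm is not documented here and may differ:

```python
# Hypothetical sketch of coverage-driven path selection: greedily pick paths
# until every edge of the model is exercised, instead of running all paths.
# Illustrative only; the real tool's "all in/out edges" algorithm may differ.

def edges_of(path):
    """The set of directed edges a path traverses."""
    return set(zip(path, path[1:]))

def select_paths(all_paths):
    """Greedy set cover: repeatedly take the path covering most new edges."""
    uncovered = set().union(*(edges_of(p) for p in all_paths))
    selected = []
    while uncovered:
        best = max(all_paths, key=lambda p: len(edges_of(p) & uncovered))
        if not edges_of(best) & uncovered:
            break  # remaining edges unreachable (defensive; shouldn't happen)
        selected.append(best)
        uncovered -= edges_of(best)
    return selected

paths = [
    ["a", "b", "d"],
    ["a", "c", "d"],
    ["a", "b", "d"],   # duplicates another path's coverage; never selected
]
chosen = select_paths(paths)
print(len(chosen))  # 2 paths cover all 4 edges
```

Scaled up, the same trade applies: a small selected subset exercises every modelled transition, while the paths left out differ only in combinations already covered, which is how 450 paths out of 17,000 can still beat the manual suite's coverage.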
17.
Example 3 - TDM Shredder Regression Testing
Test Automation
We automated 68 of the 318 manually authored test cases. Required 30 man days.
We chose one highly complex end-to-end path (43 nodes) and added automation logic to enable
the model to produce a working Protractor script. Required 6 man days.
This involved around half of the 85 nodes in the model, so we created a lot of “node code” that would be
re-used heavily if we continued to automate other paths.
It required just 3 man days to automate the next 13 paths, due to increasing re-use of the automation logic
already attached to each node of the model. As we cover more nodes and add automation to more
paths, the marginal cost of automating each additional test case drops rapidly.
From this we estimate that:
The time required to complete the initial coding of the automation in the tool would be similar to what it
took to manually author the existing automated tests.
The time and cost involved in maintaining the automated test suite when the functionality changes will
be much lower with the model-based approach.
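The declining marginal cost follows directly from node reuse and can be sketched as a back-of-envelope model. The paths and per-node cost below are invented numbers to show the shape of the curve, not CA's actual figures:

```python
# Hypothetical cost model: automating a path only costs effort for nodes that
# do not yet have automation logic attached. Paths and the unit cost are
# invented for illustration; the pattern (front-loaded cost, rapidly falling
# marginal cost) is what the slide reports.

def automate(paths, cost_per_new_node=1.0):
    """Return the incremental cost of automating each path in order."""
    automated_nodes, costs = set(), []
    for path in paths:
        new = [n for n in path if n not in automated_nodes]
        costs.append(len(new) * cost_per_new_node)
        automated_nodes.update(new)
    return costs

paths = [list("abcdefg"), list("abcxy"), list("abxyz"), list("abcde")]
print(automate(paths))  # [7.0, 2.0, 1.0, 0.0]
```

The first path pays for every node it touches; later paths pay only for nodes not yet automated, and a path made entirely of automated nodes is free, matching the 6-man-day first path versus 3 man days for the next 13.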
18.
Key Learnings and Insights
The most compelling benefits were in the areas of requirements clarification and designing acceptance test cases for
new stories.
Modelling entire components to create high-functional-coverage regression test suites is possible and highly valuable, but it is
a lot more work and requires access to information and people that may be very hard to obtain for large components or systems that are older and not
well understood.
Getting started is easy, but there are significant learning curves around how to model, especially when trying to model UI /
event-driven behaviour where order is significant. This was fairly easy for us to work through because we have highly technical testers
embedded in our dev teams. YMMV.
There is huge potential for automated test suite maintenance to become an order of magnitude simpler if automation scripts become fully
model-generated. Our experience, though, was that debugging was a challenge, especially when trying to generate raw
automation script (e.g. Protractor or Selenium) directly from the model. Scripting at a higher level of abstraction, e.g. against
application-specific test automation frameworks, may help.
“We modeled a flow for a recent new feature. While we had spent over a week
with the architects and UX designer to define the story and particularly the
acceptance criteria, it took only 5 minutes to realize our requirements were
incomplete. Not only that, but I quickly realized we had not even discussed
edge cases and interactions that were immediately visible from the
requirement modeling and analysis.”
Anand Kameswaran
Product Owner
“The combination of the clarity of requirements that had been forced through the
process of modelling, together with the ability to create test cases from the
model to cover every possible path, raised my confidence to a 9/10 that the
testing done was adequate for story acceptance and to prevent defect escape.
I’d say for the prior approach it was more like 3/10.”
Robert Eaglestone
Principal QA Engineer
19.
Challenges and Aspirations
Incomplete/ambiguous requirements
Inadequate functional coverage
Too many defects/rework
ACCEPTANCE TESTING
FOR NEW STORIES
Lack of visibility into functional
coverage/test efficiency
Automated Test Suite maintenance
Automation cost
REGRESSION TESTING
Clear,
complete,
unambiguous
requirements
What We Hoped MBT Would Get Us
Confidence in
functional
coverage
Fewer defects
escaping the
sprint
Increased
effectiveness
and efficiency
Ease of
Maintenance
Reduced
automation cost
20.
Questions?
21.
Must See Demos
Drive Quality and Testing Efficiency – Theater 3, 308
Test at the Speed of Agile – Theater 3, 311
Automate Your Ecosystem End-to-end – Theater 3, 312
22.
Stay connected at communities.ca.com
Thank you.
23.
DevOps:
Continuous Delivery
For more information on DevOps: Continuous Delivery,
please visit: http://cainc.to/CAW17-CD