Value Flow ScoreCards - For better strategies, coverage & processes (2008)
- 1. BCS SIGiST
British Computer Society Specialist Interest Group in Software Testing
18 Sep 2008 – "Testers of Tomorrow" – v1.1

Value Flow ScoreCards
for better strategies, coverage and processes

Neil Thompson, Thompson information Systems Consulting Ltd
23 Oast House Crescent, Farnham, Surrey, GU9 0NP, England, UK
www.TiSCL.com

& Mike Smith, Testing Solutions Group Ltd
St Mary's Court, 20 St Mary at Hill, London, EC3R 8EE, England, UK
www.testing-solutions.com
- 2. What is a Value Flow ScoreCard?

SIX VIEWPOINTS of what stakeholders want:
Supplier | Process | Product | Customer | Financial | Improvement & Infrastructure

…crossed with four rows:
• Objectives – WHY we do things
• Measures – WHAT (will constitute success)
• Targets
• Initiatives – HOW to do things well

It's a simple table which we can use to help control our work:
• do things "well enough" for an appropriate balance of stakeholders
• in this presentation: test strategy, test coverage, process improvement and process definition
• (but arguably we can apply it to anything!)
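The six-viewpoints-by-four-rows table described above can be sketched as a simple data structure. This is only an illustrative sketch, not tooling from the presentation; the cell entries and the `cells_missing_entries` helper are hypothetical.

```python
# A Value Flow ScoreCard as a table: six stakeholder viewpoints
# (columns) crossed with the four classic scorecard rows.

VIEWPOINTS = ["Supplier", "Process", "Product", "Customer",
              "Financial", "Improvement & Infrastructure"]
ROWS = ["Objectives", "Measures", "Targets", "Initiatives"]  # WHY / WHAT / HOW

# The scorecard is just a dict keyed by (row, viewpoint); each
# cell holds a list of free-text entries (or sticky notes!).
scorecard = {(row, vp): [] for row in ROWS for vp in VIEWPOINTS}

# Illustrative entries for the Customer column:
scorecard[("Objectives", "Customer")].append('Get users to "sign off"')
scorecard[("Measures", "Customer")].append("Signatures")
scorecard[("Targets", "Customer")].append("One for User Acceptance")

def cells_missing_entries(card):
    """Return the (row, viewpoint) cells still empty -- a quick
    completeness check across all six viewpoints."""
    return [key for key, entries in card.items() if not entries]

print(len(cells_missing_entries(scorecard)))  # 24 cells, 3 filled -> 21
```

The point of the structure is the completeness check: a glance shows which stakeholder viewpoints have no objectives or measures at all.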
- 3. Background

Mike Smith:
• white papers on test process (1999 & 2002)
• keynote presentation to Ericsson on measurement in testing (2007)

Neil Thompson:
• Organisation before automation (EuroSTAR 1993; multidimensionality of test coverage)
• Goldratt's Theory of Constraints & Systems Thinking in process definition (STAREast 2003), SDLC (EuroSP3 2004), and process improvement (EuroSTAR 2006)

Both of us: participation in the Software Testing Retreat – "Test entities" and "Appropriate Testing" (ApT)

Holistic Test Analysis & Design (STARWest 2007):
• a flexible tabular format used for test coverage
• relating this to Balanced ScoreCards (Kaplan & Norton, business strategy etc)

Separating "what" from "how" (ICST 2008):
• Test Conditions as the keystone test entity

Value Flow ScoreCards take this further…
- 4. Rationale: Why invent Value Flow ScoreCards?

• Trends in Information Systems:
  – more agility: lean lifecycles, rapid testing, "good enough quality" (eg James Bach)
  – more control: outsourcing, offshoring, Sarbanes-Oxley
• However, these trends seem to pull in opposite directions!?
  – see "Balancing Agility and Discipline" (Boehm & Turner)
  – … but agile is also disciplined! (or should be)
• So – what can IS development & testing learn from:
  – Business Performance Measurement & Management?
  – Lean manufacturing / agile & Systems Thinking?
• Our agenda for doing things "well enough", then better:
  – the Systems Development LifeCycle as a flow of value
  – Balanced ScoreCards beyond strategy & six-sigma
  – test & measurement models combined – the Treble-V model, informing development through early Test Analysis
  – practical uses of Value Flow ScoreCards in test strategy, coverage, process improvement & definition
- 5. The SDLC as a flow of value

• Working systems have value; documents in themselves do not; so the quickest route would be straight from stated requirements, via demonstrations & acceptance tests and programming, to the finished product
• SDLCs are necessary, but introduce impediments to value flow: misunderstandings, disagreements… documents are like inventory/stock, or "waste"

[Diagram: raw materials (stated & implicit requirements) flow through documented requirements, documented acceptance tests, meetings / escalations to agree, and intermediate documentation, then programming, to the finished product]
- 6. Lean manufacturing, Goldratt's Theory of Constraints… agile IS methods…

• Customers should pull "good enough" value

[Diagram: levels of documentation (Requirements + Test Specifications, + Func Spec, + Technical Design, + Unit / Component specifications) are pushed by specifiers; the flow of fully-working software (Unit / Component-tested → Integrated → System-tested → Accepted) is pulled by customer demand]
- 7. But customers are not the only stakeholders

• ScoreCards – first published by Kaplan & Norton:
  – "Translating Strategy into Action"
  – using four complementary views around Vision & Strategy: Financial, Customer, Internal Processes, Learning & Growth
• Intentions to:
  – drive behaviour
  – measure outcomes
  – improve predictability
• Now that software testing is not only finding bugs, but measuring quality, ScoreCards seem useful here…
- 8. ScoreCard principles we can use

• For all four views (Financial, Customer, Internal & Learning):
  – "what" needs doing, and "why":
    • Objectives, with associated…
    • …Measures & Targets
  – "how" to achieve that:
    • Initiatives
• Cascading ScoreCards:
  – one person's "how" is another person's "what"
  – Measures & Targets are cascaded down to subordinates
• Lead & Lag indicators (Measures & Targets):
  – "Goal" indicators (reactive, known when achieved)
  – "Performance" indicators (proactive, ongoing monitoring)
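The cascading principle above – one person's "how" is another person's "what" – can be shown in a few lines. A hypothetical sketch (the objectives and initiatives are illustrative, not from the deck):

```python
# Cascading scorecards: the Initiatives ("how") agreed at one
# level become the Objectives ("what") of the level below.

def cascade(upper_initiatives):
    """Start a lower-level scorecard whose WHY is inherited from
    the upper level's HOW; the rest is to be agreed locally."""
    return {
        "Objectives": list(upper_initiatives),  # WHY, inherited
        "Measures": [],                         # WHAT, to be agreed
        "Targets": [],
        "Initiatives": [],                      # HOW, for the next level down
    }

organisation = {
    "Objectives": ["Constant improvement of dev & test processes"],
    "Initiatives": ["Define a Test Policy", "Adopt risk-based testing"],
}

project = cascade(organisation["Initiatives"])
print(project["Objectives"][0])  # Define a Test Policy
```

Each level fills in its own Measures, Targets and Initiatives, and the cascade repeats downwards.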
- 9. Taking Balanced ScoreCard beyond strategy: TSG's views of quality

[Diagram: the Kaplan & Norton scorecard recast as complementary views of quality – after the Software Quality version published by Isabel Evans (www.testing-solutions.com), adapted here by Neil Thompson; scorecard version © Paul Arveson 1998, www.balancedscorecard.org –
• Financial: value, efficiency, productivity, on-time / in budget
• Customer (User): benefits, acceptance, satisfaction; minus complaints
• Process ("Manufacturing"): compliance eg ISO9000, predictability, repeatability
• Product: risks; minus faults & failures
• Improvement: eg TPI/TMM…; Learning (minus mistakes); Innovation]

• We can apply these (complementary) views of quality to testing: WHY / WHAT / HOW
- 10. Cascading Balanced ScoreCards: "Translating strategy into action", eg…

• Organisation's objectives (WHY / WHAT / HOW)
• Objectives of an IS/IT project (WHY / WHAT / HOW)
• Project Test Plan (WHY / WHAT / HOW)

(a) A top-down view: down the business / organisation
- 11. But then also: the test process as a scorecard

System development starts with the logical ("what") before specifying the physical ("how"), so let's do this for testing also!

[Diagram: TEST BASIS (eg SYSTEM SPEC) → TEST ANALYSIS → TEST DESIGN → TEST EXECUTION, each stage carrying its own WHY / WHAT / HOW scorecard]

(b) This is a left-to-right view (complementary to the top-down view, which is aligned with the project's and business / organisation's objectives)
- 12. And then: if we add distinct Test Analysis to the W-model… the Treble-V model!

[Diagram: at each level – Project Req'ts Spec, Logical Design, Physical Design, Component Design, Build – the row reads: STATIC TESTING → DYNAMIC TEST ANALYSIS → DYNAMIC TEST DESIGN → DYNAMIC TEST EXECUTION]

(c) This is a third "cascade" view – down then up, through:
• layers of stakeholders
• levels of integration
(It's not only for waterfall SDLCs, eg iterative…)
- 13. The Treble-V model develops cascading ScoreCards a little further

[Diagram: Organisation & Project Objectives feed the Test Policy, Strategy & Project Test Plan (incl. reviews, inspections etc); below them, Project Requirements, Functional Specification, Technical Design, Component Specifications and Build each flow through Static Testing → Dynamic Test Analysis → Dynamic Test Design → Dynamic Test Execution]

1: "Translating strategy into action"
2: the test process as a scorecard (at each test level)
3: scorecard applied to activities in the test process
and… (even more interestingly!)
4: scorecard applied to activities in the development process
- 14. There exists a "Six-Sigma Business ScoreCard": but is Six-Sigma applicable to IS value flow?

• Principles:
  – in a multi-step manufacturing process, if "quality" of any step is <100%, overall quality falls dramatically with numerous steps & components
  – for overall quality to be "good enough", each step / component should be within 6σ, ie 99.9996% perfect
• IS is not exactly like manufacturing, but we can learn…

[Diagram: the development model – Real World → Requirements (simplification, with risk of distortion) → Functional Specification (refinement) → Technical Design (with risk of compromises) → Component Spec → Software (programming, with risk of mistakes); test execution runs back up: CT, IT, ST, AT Execution → Working System]
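The multi-step quality principle on this slide is a one-line calculation: if each of n steps is a fraction p "perfect", overall quality is p to the power n. A minimal sketch (the step count of 50 is illustrative, not from the deck):

```python
# Overall quality of a multi-step process collapses quickly
# unless each step is very close to 100% perfect.

def overall_quality(step_quality, n_steps):
    """Quality of the whole chain when every step independently
    has the given per-step quality."""
    return step_quality ** n_steps

# 50 steps/components at "only" 99% each:
print(round(overall_quality(0.99, 50), 3))      # 0.605
# 50 steps at the six-sigma level of 99.9996%:
print(round(overall_quality(0.999996, 50), 5))  # 0.9998
```

This is why six-sigma demands 99.9996% per step: at merely 99% a fifty-step chain delivers acceptable output barely 60% of the time.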
- 15. We learn two things from Six-Sigma: (i) it confirms we need Validation in addition to Verification

[Diagram: the development model (as on the previous slide) paired with a test model – Component Test, Integration Test and System Test Analysis & Design feed CT, IT and ST Execution (verification testing); Acceptance Test Analysis & Design feeds AT Execution (validation testing) against the Real World, up to the Working System. Based on a flipchart drawn by Neil Thompson at the Software Testing Retreat, Llangadog, Wales]
- 16. (ii) the Six-Sigma ScoreCard includes Suppliers

• Value chain ≈ supply chain!
  – in the IS SDLC, each participant should try to "manage their supplier"
  – this is an instance of the (test) scorecard applied to activities in the development process
  – we add this Supplier view (upward management, information gathering; minus cost of quality) to the other five, giving a 6th view of quality
• Now each step in the value chain can manage its inputs, outputs and other stakeholders

[Diagram: six views – Supplier; Process (compliance eg ISO9000, repeatability); Product (VERIFICATION, risks, test coverage; minus faults & failures); Customer (VALIDATION, risks, benefits, acceptance, satisfaction; minus complaints); Financial (efficiency, productivity, on-time / in budget); Improvement (eg TPI/TMM…, predictability, learning, innovation; minus mistakes). Six-Sigma Business ScoreCard published by Praveen Gupta (2nd ed., McGraw-Hill 2007); this slide shows Neil Thompson's version]
- 17. The Treble-V model is a cascade of Value Flow ScoreCards

[Diagram: (1) Organisation & Project Objectives cascade into the Test Policy, Strategy and Project Test Plan; (2) Coverage Objectives come from the Requirements, Functional Spec, Technical Design and Module Specs; (3) Static Testers' and Test Analysts' own Objectives feed their scorecards (Supplier / Process / Product / Customer / Financial / Improvement), whose Initiatives become Objectives for the next stage of the test process (etc); (4) feedback objectives flow to Business Analysts, Architects & Developers, and Initiatives flow to the next level down the Treble-V model]
- 18. Potential number of ScoreCards depends on how your SDLC is handled by different roles

Business Analysts → Requirements Reviewers → Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers
Architects → Func Spec Reviewers → Sys Test Analysts → ST Designers & Scripters → Sys Testers
Designers → Tech Design Reviewers → Int Test Analysts → IT Designers, Scripters & Executers
Developers → Component Test Analysts, Designers & Executers? (via pair programming?)

Pieces of a jig-saw! This example is a "full-ish" set:
• higher-level tests are scripted – other staff may then execute
- 19. The Value Flow ScoreCard in action

(Supplier | Process | Product | Customer | Financial | Improvement)

• Yes – it's just a table! …into which we can put useful things…
• We start with repositionable paper notes, then can put them in spreadsheet(s)
- 20. Value Flow ScoreCard contents

Columns:
• Supplier – upward management; info from other levels of Treble-V model
• Process – compliance eg ISO9000; repeatability; minus mistakes
• Product – VERIFICATION; risks; test coverage; minus faults & failures
• Customer – VALIDATION; risks; benefits; acceptance; satisfaction; minus complaints
• Financial – efficiency; productivity; on-time, in budget; minus cost of quality
• Improvement & Infrastructure – eg TPI/TMM…; predictability; learning; innovation

• What kind of useful things? Here's a simple example (Supplier column):
  – Objectives (WHY): give input to upstream reviews
  – Measures ("Indicators", WHAT): staff-days invested
  – Targets: 1 staff-day per Test Policy
  – Initiatives (HOW): send Denis every time
- 21. Example set of Objectives, Measures, Targets & Initiatives

(Columns as on the previous slide.) These are for testing in general, in a project context:

• Supplier – Objectives: give input to upstream reviews; Measures: staff-days invested; Targets: 1 staff-day per Test Policy; Initiatives: send Denis every time
• Process – Objectives: maintain minimum standard compliance; Measures: frequency of audits; Targets: 1 audit per year; Initiatives: react to correspondence from auditors
• Product – Objectives: run enough tests (?); Measures: test cases executed; Targets: 1479 (?) <see next slide!>; Initiatives: <see later slides!>
• Customer – Objectives: get users to "sign off"; Measures: signatures; Targets: one for User Acceptance, one for Operational Acceptance; Initiatives: invite users & operators to specify Acceptance Tests
• Financial – Objectives: appear successful as Project Manager; Measures: go-live date, expenditure; Targets: go-live date as originally planned, expenditure < budget
• Improvement & Infrastructure – Objectives: gain industry respectability; Measures: maturity levels; Targets: level 2 by 2008, level 3 by 2010; Initiatives: improvement actions
- 22. Lag & Lead indicators; Goal-Question-Metric; making Measures & Targets SMART

• Goal–Question–Metric maps onto the rows: Objectives (GOAL) → Measures (QUESTION) → Targets (METRIC, in the language of stakeholders)
• SMART: Specific, Measurable, Achievable, Relevant, Timely – the six viewpoints assure "Relevance"
• Many of the example measures & targets (test cases executed, signatures, go-live date, expenditure, maturity levels) are "lag" indicators: reactive, known only when achieved
• Two are "lead" indicators, proactive – staff-days invested (3 per year) and test conditions agreed:
  – timely influence on quality, in advance
  – help assess & maintain Achievability
- 23. Four practical uses of Value Flow ScoreCards

A. Test coverage
  – extend control to stakeholders
  – transcend the small "textbook" repertoire of techniques
  – holistic Test Analysis & Design: integrates and clarifies test items, features, bases and product risks
  – better information traceability
B. Test Policy, Strategy & Planning
  – ensure alignment with organisational objectives
  – help completeness, no subjects forgotten
  – Goal–Question–Metric traceability
C. Process improvement
  – not just test, but whole lifecycle
  – prioritised treatment of symptoms
  – transcend limitations of TMM™ or TPI®
D. Process definition
  – "Appropriate Testing" (ApT) in different project/product circumstances
- 24. A. Test coverage: Do you control your testing, or does your testing control you?

Either:
• Test Cases thought of
• Scripts / Procedures written
• expectation that "those are the tests"… for the remainder of your life (on that project)

or flexible, risk-managed test execution, driven by:
• what you really want to cover
• governance / management needs
• product risks
- 25. Test coverage: other common problems

• Have you seen any of these?
  – important tests omitted
  – large numbers of low-value tests
  – higher levels of testing merely repeating Component Testing
  – insufficient attention to non-functional tests
  – unstructured piles of detailed scripts
  – difficult-to-maintain testware…
- 26. Numerous test cases & scripts are almost meaningless to stakeholders, without a "map"

"1479 test cases, so it must be good, right?"

Now, let's start with a classification tree, a test specification process, and documentation to agree coverage.
- 27. Testware: not a rigid hierarchy

WHY → WHAT → HOW:
System's specifications → Test Conditions → Test Cases → Scripts / Procedures → Test Execution Schedule

Does a strict hierarchy work? No, because these items have cross-cutting hierarchies of their own. So we need many-to-many entity relationships.
- 28. An easy way for many-to-many relationships: a flexible table

System's specifications | Test Conditions | Test Cases | Scripts / Procedures | Test Execution Schedule
(with ditto marks, so that one row's entry can relate to many entries in the next column)

• But merely decomposing the system's specification is not a recipe for very good tests. What about TEST ITEMS? FEATURES TO BE TESTED? PRODUCT RISKS?
• We want a Validation (customer) view of quality in addition to the traditional Verification (product) view…
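The many-to-many relationships the flexible table captures can also be sketched as link tables (pairs), rather than a rigid hierarchy. A minimal sketch; the condition and case names are illustrative, not from the deck:

```python
# One Test Condition may be exercised by several Test Cases,
# and one Test Case may cover several Test Conditions:
# a set of (condition, case) pairs captures this directly.

condition_case = {
    ("TC-login-locked", "Case-07"),
    ("TC-login-locked", "Case-12"),
    ("TC-login-expired", "Case-12"),
}

def cases_for(condition):
    """All Test Cases covering a given Test Condition."""
    return sorted(case for cond, case in condition_case if cond == condition)

def conditions_for(case):
    """All Test Conditions a given Test Case covers."""
    return sorted(cond for cond, c in condition_case if c == case)

print(cases_for("TC-login-locked"))  # ['Case-07', 'Case-12']
print(conditions_for("Case-12"))     # ['TC-login-expired', 'TC-login-locked']
```

The same pattern extends to the other links in the chain (specifications ↔ conditions, cases ↔ scripts), which is what a relational implementation would make explicit.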
- 29. So – Value Flow ScoreCards can measure test coverage: for Test Analysts…

(From the Level Test Plan and Test Bases, into the six-column scorecard:)
• Objectives: Test Items (level of integration) & Test Basis references; Features to be tested; Product benefits & Constraints; Product Risks across every column
• Measures: Test Conditions (we could cover)
• Targets: Test Conditions we intend to cover
• Initiatives: Objectives for Test Cases (to test design & execution)

NB this is the "manual" Holistic Test Analysis & Design spreadsheet perspective. Formal relational database implementations, eg T-Plan, may require a more rigorous treatment.
- 30. … then for Test Designers (+ Scripters if used), and Executers

(From Test Analysis, into the six-column scorecard:)
• Objectives: Objectives for Test Cases, and for the Test Execution Schedule
• Measures: S-curves of Test Cases executed, passed, failed; incidents fixed, retested, closed
• Targets: coverage of Test Conditions executed; S-curves projected to target dates
• Initiatives: (to next level of Treble-V model) Objectives for incident-fixers (to Developers)
- 31. Holistic Test Analysis & Design spreadsheet centres on Test Conditions: usable also with Exploratory Testing?

Spreadsheet columns: 1. Test Items & Sub-items | 2. Test Features & Sub-features | 3. Test Basis References | 4. Product Risks | 5. Test Conditions (+ whether Behavioural or Structural) | Ver / Val Mechanism | Test Data Indications | Technique Names | Test Objectives | Exploratory Test Execution Record (and/or Test Script Ref)

Compare the exploratory testing mission: model the test space (product elements, quality criteria, project environment), determine oracles, determine coverage, configure / operate / observe the test system, evaluate test results, report test results – from script or exploratory regime to perceived quality. Elements from the "Heuristic Test Strategy Model", "Universal Testing Method v2.0" & "Improving By Doing", quoted from Rapid Software Testing v2.1.2, training from James Bach & Michael Bolton (www.satisfice.com, www.developsense.com), cross-referred here by Neil Thompson.
- 32. B. Test Policy, Strategy & Planning: some common problems

• Testing not obviously (or at all) aligned to organisation's objectives
• Test Policy, Strategy & Plan documents which are:
  – cut-and-paste, "boilerplate", same for all projects, copied from textbooks…
  – tedious, dreary verbiage, too long
  – too short!
  – wishful thinking
  – unbalanced
  – etc?
- 33. Value Flow ScoreCards for Test Policy, Strategy & Planning

• Organisation level: organisation's ScoreCards, Goals, Objectives… → Test Policy & Organisational Test Strategy
• Programme level: Programme Test Strategy
• Project / product level: Project Test Strategy & Project Test Plan
• Test levels: Requirements Reviewers → Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers; Func Spec Reviewers → Sys Test Analysts → ST Designers & Scripters → Sys Testers; (continues as on earlier slide)
- 34. Example for Test Policy

(The organisation's Goals & Objectives feed its ScoreCards; Goal–Question–Metric runs down the rows. Improvement & Infrastructure column:)
• Objectives (GOAL): constant improvement of development & test processes
• Measures (QUESTION): TMM levels (and for development?)
• Targets (METRIC): TMM level 2 at least, now; TMM level 3 within 2 years
• Initiatives: (for Test Strategy / Strategies)
- 35. Test Policy (more): Have we thought of all the viewpoints? Do we have Measures, Targets & Initiatives?

• Supplier – Objectives: IS actively supports employees
• Process – Objectives: products to satisfy specified requirements; Bus Mgt is responsible for enforcing Test Policy; Measures: planning, preparation & evaluation
• Product – Objectives: products to be fit for purpose; detect defects early; Measures: both static & dynamic testing; Defect Detection Percentage; software & related work products
• Customer – Objectives: Proj Mgr is responsible for quality; independence increases with test type; Measures: product risks; importance of requirements
• Financial – Objectives: staff must be certified; testing prioritised & managed; Measures: ISTQB; Targets: Advisors – Expert, Managers – Advanced, Analysts – Foundation
• Improvement & Infrastructure – Objectives: constant improvement of dev & test processes (comprehensive scope); use TestFrame for test analysis & execution; automate regression tests as much as possible; defect source analysis; Measures: TMM levels; frequency of process adjustments heeding metrics; Targets: TMM level 2 at least, now; TMM level 3 within 2 years; twice per year

Source: summarised from an example in TestGrip by Marselis, van Royen, Schotanus & Pinkster (CMG, 2007)
- 36. Summary for Test Policy, Strategy & Planning: what's the point of Value Flow ScoreCards?

• Remind the authors of all the viewpoints which should be considered
• Encourage balance across those viewpoints
• Focus on ways of measuring success – expose vague / wishful-thinking assertions
• Get the key points recorded & agreed, before writing indigestible documents
• Pave the way for achievably implementing these aspirations
• Consider qualitative measures, eg rubrics, where quantitative seems inappropriate ("Tyranny of Numbers")
- 37. C. Process improvement

• In software testing, the popular process improvement methods have fixed subject areas ("moment of involvement" etc)
• TMM™ and TPI® ascend through maturity levels – "these things are good for you, in this sequence"
• TOM™ is symptom-driven but still has a fixed structure, and suggested causes (a "built-in improvement model")
• But some other fields (eg manufacturing, supply chains) use Goldratt-Dettmer (or Toyota-style equivalents) on symptoms, giving flexible focus
• Driving principle: the "constraint" is the weakest link; address that first before attacking everything
• Look for causes, not just symptoms…
- 38. Process improvement with Goldratt-Dettmer thinking tools

CURRENT ILLS → CONFLICT RESOLUTION → FUTURE REMEDIES:
• Current ills: symptoms x, y & z trace down through intermediate causes n & o to root causes a & b
• Conflict resolution: "It seems here that…" / "On the other hand…" → revelation(s) → distilled "truth"
• Future remedies: alleviation of symptoms; fixes for intermediate causes n & o; fix for root cause a (can't fix root cause b)

For use where:
• different stakeholders want different things
• evidence seems contradictory
• there are rival theories for remedies
- 39. An easy route into Goldratt-Dettmer: Strengths, Weaknesses, Opportunities & Threats

• Current ills (Weaknesses): culture of our testers is to prefer large text documents to diagrams; SDLC method does not encourage diagrams; system specs are heavy text documents; testers not trained in techniques; test specs are large & "texty"; test coverage omissions & overlaps; too many failures in Live
• Strengths (use them to help amplify Opportunities): some managers are considering agile methods; business analysts may be motivated by UML training; can still improve coverage at macro level with informal techniques (80/20)
• Threats (use them to help identify obstacles): pre-requisites; anticipating & overcoming obstacles
• Future remedies – TACTICAL: address culture by worked examples of diagrams; include tables & diagrams in test specifications. ONGOING: techniques training & coaching; action planning. STRATEGIC: improve SDLC method. Then: TRANSITION
• (Actually, as shown here, Goldratt-Dettmer has five diagram sets in its full version, to cater for "making change stick")
- 40. Value Chain ScoreCards allow us to "swimlane" symptoms & causes (and proposed remedies)

The Goldratt-Dettmer stages (Current Ills → Conflict Resolution → Future Remedies → Pre-requisites → Transition) run down the scorecard rows (Objectives, Measures, Targets, Initiatives), while the six viewpoint columns approximate TMap/TPI's subject areas (lifecycle, organisation, techniques, infrastructure & tools).

Note: this is similar to Kaplan & Norton's "Strategy Maps" (Harvard Business School Press, 2004)
- 41. Systems Thinking: some cause-effect trees "re-root" to form vicious / virtuous feedback loops

Example (vicious) loops, in Systems Thinking notation (each author seems to vary; this is Neil Thompson's, incorporating elements of Jerry Weinberg's & Dennis Sherwood's):
• REINFORCING LOOP: too many failures in Live → testers too busy to "weed" tests → redundant regression tests → more tests added → test suites grow ever larger → regression tests take too long to run → impact of live failures…; size of test suites feeds ignorance of test coverage (overlaps & omissions), worsened by regression tests not being automated
• BALANCING LOOP: time spent "firefighting" → reluctance to remove tests from suites
• Root causes below the loops: system specs are heavy text documents; testers not trained in techniques; test specs are large & "texty"; culture of our testers is to prefer large text documents to diagrams; SDLC method does not encourage diagrams
• A loop is balancing if it contains an odd number of opposing links; else it is reinforcing
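The loop rule stated above is mechanical enough to write down. A minimal sketch, assuming we represent each causal link around a loop as "+" (same-direction) or "-" (opposing); the example loops are paraphrased from this slide:

```python
# A feedback loop is balancing if it contains an odd number of
# opposing ("-") links; otherwise it is reinforcing.

def loop_polarity(links):
    """links: the '+'/'-' signs of the causal links around one loop."""
    opposing = links.count("-")
    return "balancing" if opposing % 2 == 1 else "reinforcing"

# Vicious loop: more live failures -> busier testers -> bigger
# suites -> slower regression -> more live failures (all '+'):
print(loop_polarity(["+", "+", "+", "+"]))  # reinforcing

# Firefighting loop with one opposing link (removing tests
# reduces suite size):
print(loop_polarity(["+", "-", "+"]))       # balancing
```

The parity rule holds because an even number of sign flips around the loop cancels out, so a disturbance returns amplified (reinforcing); an odd number returns inverted (balancing).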
- 42. Value Flow ScoreCards give two nested sets of feedback loops: do things well enough now, then improve

• Via the cascade of Value Flow ScoreCards (roles 1…n, each balancing scope, risk & quality against time & cost):
  – find vicious loops… balance them, and/or… maybe even reverse them ("Tipping Point"!)
  – … but tailor quality to the balance of stakeholders. Then iterate
• "It's a Test Policy, Jim, but… is it a good Test Policy?"
• Process improvements, but focussed on "constraints", ie where they will have most payback
• Can still use the structure of TMM™, TPI®, TOM™… if desired; they may be mapped on to these value flow columns
- 43. D. Process definition: where we want to be in the range formal-informal for these circumstances, and how

(Note: comparable with Paul Gerrard's "Axioms", but in our version these are the only two axioms)

From context / circumstances:
• Objectives – legal (regulation, standards) & moral (safety); process constraints (eg quality mgmt, configuration mgmt); application characteristics; technical & business risks; sector, culture, technology; job type & size; resources (money, time, skills, environments); efficiency & effectiveness
• Measures – CURRENT SITUATION (unhappy with insurance / assurance / efficiency; unsure how best to test; methodology): type of "V-model", handover & acceptance criteria etc (about 30 categories)
• Targets – CONFLICT RESOLUTION → DESIRED SITUATION: where in the range formal … informal (specific aspects)
• Initiatives – Appropriate Testing in this context / circumstances