'Best Practices' & 'Context-Driven' - Building a bridge (2003)
- 1. “Best Practices” and “Context-Driven”: Building a Bridge
Neil Thompson © 2003
Thompson information Systems Consulting Limited
www.TiSCL.com
+44 (0)7000 NeilTh (634584)
Test Management track presentation, STAREast T15
- 2. Presentation contents
• Theme: what to bridge, and how
• Learning objectives
• Instead of Best Practice: “always-good” practices
• How deep is the schism, how strong the bridge supports?
• Goldratt’s Theory of Constraints
• The thinking tools and how to apply them
• Relationship to process improvement
• Conclusions & key references
• Lessons learned, and way forward
- 3. Theme: from conflict to integrated framework
[Diagram: “Best Practice” (dismissed by critics as “fossilised thinking”) and “Context-Driven” (dismissed as “formalised sloppiness”) are joined by unifying principles: “Always-Good” Practices supply the What, and Goldratt’s “thinking tools” supply the How, applied to constraints, requirements, objectives etc. The result: expert pragmatism with structure.]
- 4. Learning objectives
• Understand the Best Practice v. Context-Driven debate
• If your allegiance is already fixed, please un-fix at least for
now: consider “always-good” principles & practices
• Gain knowledge of Goldratt’s Theory of Constraints
(like it or not, it’s widely used)
• Appreciate the subtleties of extending its use beyond
manufacturing etc
• Learn enough about the thinking tools to use them
• Understand enough of my examples to determine if, and
how, you can use some or all of this framework
- 5. Instead of Best Practice: “always-good” practices (the top-down view)
[Diagram: “always-good” divides into:
• Effectiveness: “doing the right things” (risk management & quality management)
• Efficiency: “doing things right” (eg optimising speed & minimising cost)
Risk management and quality management pair with:
• Insurance: detecting errors, faults & failures
• Assurance: giving confidence in fitness-for-purpose]
- 6. Always-good practices (bottom-up)
Effectiveness principles and their elements:
• What testing against, QA context: detecting errors: the V-model; giving confidence: the W-model; good enough quality; quality & risk management
• Risk use & test prioritisation: risk management & testing; risks by layer of V-model; tests’ priorities based on risks
• Appropriate stepwise refinement: strategy, plan, design, cases, scripts, data, execution & management procedures
• Structured & controlled execution: test, check results, debug, fix, retest & regression-test (problems & changes, by urgency & importance)
• Informed decision-making: test coverage & handover criteria; metrics (progress & problems); role of metrics in risk management; summary: quantify residual risks & confidence
• Appropriate skills: business, technical & testing; roles & responsibilities; independent testing
Efficiency principles and their elements:
• Appropriate techniques: glass-box, black-box etc; rehearsing acceptance
• Appropriate tools: capture-replay, management etc
• Testing process overall: process review & improvement, eg symptom-based, Capability Maturity Model; decide targets, and improve as appropriate
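To make “tests’ priorities based on risks” concrete: a minimal Python sketch, not from the presentation, where risk exposure is the familiar likelihood × impact product and tests are simply sorted by it. The field names, 1-5 scales and example risks are invented for illustration.

```python
# Hypothetical sketch of risk-based test prioritisation: each test is
# linked to a product risk, and risk exposure = likelihood x impact.
# Names and the 1-5 scales are illustrative, not from the talk.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (trivial) .. 5 (severe)

    @property
    def exposure(self) -> int:
        return self.likelihood * self.impact

@dataclass
class Test:
    name: str
    risk: Risk

def prioritise(tests: list[Test]) -> list[Test]:
    """Highest risk exposure first: run these even if time runs out."""
    return sorted(tests, key=lambda t: t.risk.exposure, reverse=True)

payments = Risk("wrong payment amounts", likelihood=3, impact=5)
cosmetic = Risk("mis-aligned screen labels", likelihood=4, impact=1)
for t in prioritise([Test("label layout", cosmetic),
                     Test("payment calculation", payments)]):
    print(t.name, t.risk.exposure)  # payment calculation 15, then 4
```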
- 7. How deep is the schism?
• Context-Driven views of Best Practice: “fossilised thinking, disenfranchising, a disservice to testing”
• Best Practice views of Context-Driven: “formalised sloppiness, posturing, ‘I agree’, stating the obvious”
Each side may be merely competitive, or ethically outraged:
• CD more outraged?
• BP more willing to appease / compromise?
Not trying to be controversial: is this schism good for testing?
- 8. How strong are the bridge supports?
This bridge is supported by “always-good” practices and Goldratt
How always-good are these practices really?
• survived some scrutiny so far
• not guaranteed; I modify them slightly over time
• but you could use your own
• other authors say fixed “principles”: this is just adding a level below
• deliberately omit trivial-obvious, eg use good env’ts (?)
• omit if unclear message, eg “test before code” (?)
• the ? marks illustrate value of the thinking tools: I may add these!
How applicable is Goldratt to this situation?
• began in manufacturing
• since applied to other areas, especially project management
• limited attempts on software dev’t, but...
• principles used by Agile methods
• new here is applying all the thinking tools to process context (not just improvement)
- 9. Goldratt’s Theory of Constraints
(diagram based on those in “The Race”, E.M. Goldratt & R. Fox 1986)
Goal: to win the war
Objective: to maximise throughput (right soldiers doing right things)
Constraint on throughput: the slowest marcher (Drum, Buffer, Rope)
Critical chain: the weakest link is all we need fix, by means of...
Five focussing steps: identify constraint, exploit it, subordinate all else, elevate it (i.e. strengthen so not now weakest), then... identify next constraint
But now it’s no longer simple: so need iterative tools for what to change, what to change to, and how to change
Five thinking tools (based on sufficient causes & necessary conditions)
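The five focussing steps form a loop, which a toy sketch can make explicit. This is an illustration only: the stages, capacities and the crude ×1.5 “elevation” are invented, not Goldratt’s.

```python
# Toy illustration of Goldratt's five focussing steps on a pipeline of
# process stages. Stage names and capacities are invented for the example.
def identify_constraint(capacities: dict[str, float]) -> str:
    """Step 1: the constraint is the stage with the lowest throughput."""
    return min(capacities, key=capacities.get)

def improve(capacities: dict[str, float], rounds: int) -> None:
    for _ in range(rounds):
        bottleneck = identify_constraint(capacities)
        # Steps 2-3 (exploit, subordinate) would happen here: keep the
        # bottleneck busy and pace everything else to it (drum-buffer-rope).
        print(f"constraint: {bottleneck} at {capacities[bottleneck]}/day")
        # Step 4: elevate - invest so the constraint is no longer weakest...
        capacities[bottleneck] *= 1.5
        # Step 5: go back to step 1 and find the next constraint.

improve({"design tests": 40.0, "build environments": 15.0,
         "execute tests": 30.0, "analyse failures": 25.0}, rounds=3)
```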
- 10. Five logical tools for process improvement (Dettmer*, after Goldratt)
[Diagram of the five tools, grouped by question:
What to change: (1) CURRENT REALITY: undesirable effects trace back through intermediate effects to (other) root causes and a core problem
What to change to: (2) CONFLICT RESOLUTION: an objective, its requirements and their prerequisites, in conflict until resolved by injections; (3) FUTURE REALITY: injections lead through intermediate effects to desired effects
How to change: (4) PREREQUISITES: obstacles are overcome via intermediate objectives towards the objective; (5) TRANSITION: needs plus specific actions lead through intermediate effects to the objective]
* very slightly paraphrased here
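Underneath the tool names, a current reality tree is just a cause-and-effect graph read from undesirable effects down to root causes. A minimal sketch, with invented example edges:

```python
# Minimal current-reality-tree sketch: "causes" maps each effect to its
# sufficient causes; root causes are nodes with no causes of their own.
# The example edges are invented, not taken from the presentation.
causes = {
    "late releases": ["testing starts late"],
    "testing starts late": ["no test plan", "environments ready late"],
    "environments ready late": ["no one owns environments"],
}

def root_causes(effect: str) -> set[str]:
    """Walk the tree from an undesirable effect down to root causes."""
    if effect not in causes:          # nothing causes it: a root cause
        return {effect}
    roots: set[str] = set()
    for cause in causes[effect]:
        roots |= root_causes(cause)
    return roots

print(root_causes("late releases"))
# {'no test plan', 'no one owns environments'}
```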
- 11. The five logical tools applied to testing context, practices etc.
[Diagram mapping the five tools onto testing. Starting points: “methodology unhappy with” (→ actions) and “unsure how best to test” (→ conditions).
1 Context: CURRENT REALITY, with effects traced back to root causes
2a Always-good practices: CONFLICT RESOLUTION upper level (objectives, requirements, “prerequisites”)
2b Good practices in this context: CONFLICT RESOLUTION lower level (extremes, sub-requirements with valid & invalid assumptions, positioning + justifications)
3 What “appropriate” means in this context: FUTURE REALITY (injections, intermediate effects, desired effects)
4 Questions to consider: PREREQUISITES (obstacles, intermediate effects, sub-prerequisite objectives)
5a Choices: TRANSITION lower level
5b Choice categories & actions: TRANSITION upper level (needs + specific actions, with INTERACTIONS, sub-objectives)]
- 12. 1 Context (CURRENT REALITY)
[Diagram of context factors feeding QUALITY / RISK FACTORS and SCOPE, COST, TIME:
• Business/org. sector; app type; corporate culture; nation (eg USA); technology
• Legal constraints: regulation; standards
• Moral constraints, eg: human safety; money, property; convenience
• Job type & size: project/programme; bespoke/product; new/maintenance
• Resources: money (→ skills, environments); time
• Process constraints, eg: quality management; configuration mgmt
These lead either to “methodology happy with”, or to “methodology unhappy with” / “unsure how best to test”]
- 13. 2a Always-good practices (CONFLICT RESOLUTION upper level)
[Mind-map of always-good practices, arranged under Effectiveness (risk management & quality management; insurance & assurance) and Efficiency:
• V-model: what testing against; W-model: quality management
• Define & detect errors (UT, IT, ST); give confidence (AT); plan early, then rehearse-run acceptance tests
• Risks: list & evaluate; tailor risks & priorities etc to factors; prioritise tests based on risks
• Refine test specifications progressively; plan based on priorities & constraints; design flexible tests to fit
• Define & measure test coverage; allow & assess for coverage changes; allow appropriate script format(s); use synthetic + lifelike data
• Document execution & management procedures; distinguish problems from change requests; prioritise urgency & importance; distinguish retesting from regression testing
• Measure progress & problem significance; quantify residual risks & confidence
• Define & use metrics; use handover & acceptance criteria; be pragmatic over quality targets; assess where errors originally made
• Use independent system & acceptance testers; use appropriate skills mix; define & agree roles & responsibilities
• Use appropriate techniques & patterns; use appropriate tools; optimise efficiency
• Decide process targets & improve over time]
- 14. What are the dimensions of formality?
Next diagram will take each box from the previous diagram and assess it on a formal-informal continuum, so...
In preparation for this: what do we mean by “formality”?
• adherence to standards and/or proprietary methods
• detail
• amount of documentation
• scientific-ness
• degree of control
• repeatability
• consistency
• contracted-ness
• trained-ness and certification of staff
• “ceremony”, eg degree to which tests need to be witnessed, results audited, progress reported
• any others?
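One hypothetical way to place a practice on the formal-informal continuum is to score it against some of these dimensions and average. The 0-1 scale, the dimensions chosen and every score below are assumptions for illustration, not from the talk.

```python
# Hypothetical scoring of a practice on the formal-informal continuum:
# 0.0 = wholly informal, 1.0 = wholly formal. All scores are invented.
def formality(scores: dict[str, float]) -> float:
    """Simple unweighted mean over whichever dimensions were scored."""
    return sum(scores.values()) / len(scores)

acceptance_testing = {"standards adherence": 0.8, "documentation": 0.9,
                      "ceremony": 0.9, "repeatability": 0.7}
unit_testing = {"standards adherence": 0.2, "documentation": 0.1,
                "ceremony": 0.1, "repeatability": 0.6}
print(f"acceptance: {formality(acceptance_testing):.2f}")  # 0.82
print(f"unit:       {formality(unit_testing):.2f}")        # 0.25
```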
- 15. 2b Good practices in this context (CONFLICT RESOLUTION lower level)
[Conflict cloud: “use a V-model” versus “don’t use a V-model”, resolved into APPROPRIATE USE OF V-MODEL: what testing against.
Assumed reasons to use one: we’re too lazy to think (use a waterfall); we want baselines to test against; we want to test the viewpoints of users, someone expert & independent, designers and programmers; two heads are better than one; different levels mitigate different risks; many people stand by the V-model.
Assumed reasons not to: we’re doing iterative development; we’re doing adaptive development (no specs); documentation must be minimised; we have little time; we’re object-oriented; the V-model is discredited; we want to be trendy, anyway.
Injections challenging the assumptions: THEY ARE LEVELS, NOT STAGES; all systems are integrated from parts; SOME SPECS ARE OUT OF DATE / IMPERFECT, BUT WE COPE; NEED NOT BE 1-1 CORRESPONDENCE OF SPECS TO LEVELS; NEED NOT BE 4 LEVELS; MULTIPLE PARTIAL PASSES; CAN USE EXPLORATORY TESTING AGAINST A CONSENSUS BASIS; THE V-MODEL IS IMPLICIT IN BINDER’S BOOK Testing OO systems: models, patterns & tools]
- 16. Summary of conflict resolution
The positioning of our always-good practice on the formal-informal scale has been prepared by:
• splitting it into assumed reasons, eg “documentation must be minimised”
• challenging the validity of those assumptions and inserting “injections” of things not originally thought of
• questioning generally, which also helps understand causes & effects...
Questioning some “positive” statements: we must always... / it’s absolutely impossible to...
Questioning “negatively”: oh, really? / who says? / is that still true? / are there no exceptions at all?
Neutral questions: so what? / why is that?
By doing the above, we have also distilled out some basic justifications of why this practice is “always-good”
- 17. 3 What “appropriate” means in this context (FUTURE REALITY)
[Example future reality tree. The system has: users; (potentially) expert & independent testers; designers (where significant); programmers.
Context facts: our user requirements are out of date and were vague when written; we do have a good functional spec and independent testers available; our system is very simple; we do need separate development & acceptance; we don’t need a separate system integration test level; our programmers hate documentation.
The injection NEED NOT BE 4 LEVELS leads to the desired effect: a V-model with only 3 test levels — acceptance (v. consensus), system (v. spec), unit (informal)]
- 18. Where are we up to?
Remember, can apply this partially
[Recap of the slide 11 diagram: 1 CURRENT REALITY (context) → 2a CONFLICT RESOLUTION upper (always-good practices: overall effectiveness & efficiency) → 2b CONFLICT RESOLUTION lower (good practices in this context: specifics, positioning) → 3 FUTURE REALITY (what “appropriate” means in this context) → 4 PREREQUISITES (questions to consider) → 5a/5b TRANSITION (choices; choice categories & actions). “Can stop here” markers show early exit points around the conflict-resolution and future-reality steps.]
- 19. 4 Questions to consider (PREREQUISITES)
USING ON THIS JOB: a V-model with only 3 levels: acceptance (v. consensus), system (v. spec), unit (informal)
Based on Kaner, Bach & Pettichord (2002) Lessons learned in software testing: a Context-Driven approach
• Under what circumstances would this work? Under what circumstances would this not work?
• Who is most likely to benefit? Who is most likely to be disadvantaged?
• What are the best and worst effects (effectiveness, efficiency) on: you? other stakeholders?
• How will you know?
• Tried before? What happened & why? What’s different here, and would it matter?
• What if it worked? How will you learn? (consider a pilot)
• How overcome objections & sell this? What if a key person disagrees?
- 20. 5a Choices (TRANSITION lower level)
USING ON THIS JOB: a V-model with only 3 levels: acceptance (v. consensus), system (v. spec), unit (informal)
For each testing level:
• Test against one basis or multiple?
• Each basis formally CM-baselined, or loose CM?
• Each basis documented, partially verbal, or wholly verbal?
• If documented, what format(s)? natural language; pseudo-code; UML; ELHs, STDs, FDs etc; formal / mathematical
• If verbal, how communicate and how get agreement?
• Functional / non-functional constituents? If mixed, how manage?
• Where non-functional, relationship to quality standards?
• In what “language” and level of detail are risks expressed?
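These per-level choices amount to a small configuration record for each test level. A sketch of recording them as data so they can be reviewed and challenged; the types, enum values and example entries mirror the three-level example above but are otherwise invented.

```python
# Sketch: capture the slide's per-level choices explicitly as data.
# Enum values, field names and the example levels are illustrative.
from dataclasses import dataclass, field
from enum import Enum

class Documentation(Enum):
    DOCUMENTED = "documented"
    PARTIALLY_VERBAL = "partially verbal"
    WHOLLY_VERBAL = "wholly verbal"

@dataclass
class TestLevel:
    name: str
    bases: list[str]                 # one basis or multiple?
    cm_baselined: bool               # formal CM baseline, or loose CM?
    documentation: Documentation
    formats: list[str] = field(default_factory=list)  # if documented

levels = [
    TestLevel("acceptance", bases=["user consensus"], cm_baselined=False,
              documentation=Documentation.PARTIALLY_VERBAL),
    TestLevel("system", bases=["functional spec"], cm_baselined=True,
              documentation=Documentation.DOCUMENTED,
              formats=["natural language", "UML"]),
    TestLevel("unit", bases=["code + verbal intent"], cm_baselined=False,
              documentation=Documentation.WHOLLY_VERBAL),
]
for lv in levels:
    print(lv.name, "->", lv.documentation.value)
```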
- 21. 5b Choice categories & actions (TRANSITION upper level)
[Diagram linking choice categories (with INTERACTIONS between them) to actions, objectives and needs:
• Processes: V-model, what testing against
• Independence: use independent system & acceptance testers
• Techniques: use appropriate techniques
• Tools: use appropriate tools
• Coverage: allow & assess for coverage changes
• Ceremony: provide standard progress reports (why?)
• Investment: define & use metrics; assess where errors originally made
Communications plan: training courses; workshops; documented in-house guidelines; intranet site
Objectives: overall, eg determine process improvement targets and improve over time; effectiveness, eg quantify residual risks & confidence for each job; efficiency, eg optimise (within context)
NEEDS: CMM/TMM level targets; TPI® targets; symptom-based targets eg TOM™]
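The effectiveness objective “quantify residual risks & confidence” can be approximated by summing the exposure of risks whose tests have not all passed. A minimal sketch with invented risks, exposures and outcomes:

```python
# Sketch: residual risk = total exposure of risks whose tests have not
# all passed; "confidence" reported as the covered share. Data invented.
risks = {  # risk name -> (exposure, list of test outcomes, True = pass)
    "wrong payment amounts": (15, [True, True]),
    "slow end-of-day batch": (12, [True, False]),
    "mis-aligned labels":    (4,  []),            # not yet tested
}

total = sum(exposure for exposure, _ in risks.values())
residual = sum(exposure for exposure, outcomes in risks.values()
               if not outcomes or not all(outcomes))
print(f"residual risk: {residual} of {total}")   # 16 of 31
print(f"confidence:    {1 - residual / total:.0%}")  # 48%
```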
- 22. So why not just start with a
process improvement method?
• Very good question! Possible answers:
– “we’re not ready for process improvement, just want a
structure to apply Context-Driven”
– “we want to choose a process improvement method”
– “the methods are too restrictive, we want to build our own”
• Theory of Constraints answer:
– need to do some analysis before we know where best
to focus (or to improve processes)
• Thinking tools answer:
– want to look at interactions between choices...
- 23. Interactions between choice categories
[Diagram: an influence graph over the choice categories — Processes, Independence, Techniques, Tools, Coverage, Ceremony, Investment — with labelled arrows such as INFLUENCES, CONSTRAIN, HINDERS, GREATLY HELP, HELP MEASURE, PROVIDE SOME DATA FOR, IMPROVES and HELPS DOCUMENTATION]
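Because choices interact, changing one category ripples into others; representing the diagram as a labelled directed graph lets you trace those knock-on effects. In this sketch only some edge labels come from the flattened slide, so the wiring is partly assumed for illustration.

```python
# Sketch of the interactions diagram as a labelled directed graph.
# Only some edges are legible in the flattened slide; the rest of the
# wiring here is assumed purely for illustration.
influences = {
    "Processes": [("influences", "Independence"), ("influences", "Techniques")],
    "Techniques": [("constrain", "Tools")],
    "Tools": [("provide some data for", "Coverage")],
    "Coverage": [("helps measure", "Investment")],
    "Investment": [("greatly helps", "Tools")],
}

def ripple(category: str, seen: set[str] | None = None, depth: int = 0):
    """Print everything a change to one category may knock on to."""
    seen = seen or {category}
    for label, target in influences.get(category, []):
        print("  " * depth + f"{category} {label} {target}")
        if target not in seen:       # avoid looping on cycles
            seen.add(target)
            ripple(target, seen, depth + 1)

ripple("Processes")
```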
- 24. Can use tables before / instead of / after diagrams
[Diagram: the same framework flow rendered as tables — CURRENT REALITY (context; extremes & variables) → CONFLICT RESOLUTION (good practices; principles & elements) → FUTURE REALITY (what “appropriate” means) → PREREQUISITES (questions to consider) → TRANSITION (choice categories & actions; needs, eg TMM) — with “can stop here” points along the way]
- 25. Conclusions
• Yes, there are differences between Context-Driven
& Best Practice, but:
– not as great as may first appear
– each side attempts to be inclusive (but sees other as not!)
– partly due to different backgrounds & industry sectors
– this public-domain method can bridge the gap
– need not be interpreted rigidly, or executed in full
– the basic thinking is causes & effects: anyone can do it
• To summarise:
– the framework, in effect, deconstructs then reconstructs in your context...
- 26. How many diagrams, where and when
[Diagram summarising where to use each tool, when, how many diagrams, and the stop points:
• “Universal”: CONFLICT RESOLUTION upper (2a-b), one diagram
• Your context: CURRENT REALITY, one
• Each requirement to examine: CONFLICT RESOLUTION lower (2b-3), 2-6 (out of 25), eg a V-model, a testware lifecycle, patterns + techniques
• What “appropriate” means in your context: FUTURE REALITY (3-4), 2-6 or more (out of 25)
• Questions to consider: PREREQUISITES (4-5a), 2-6 (out of 25)
• Choices: TRANSITION lower (5a-b), 2-6 (out of 25)
• Choice categories & actions: TRANSITION upper, one]
- 27. Deconstruct then reconstruct: the framework is a “meta-V-model”
[Diagram: the framework drawn as a V. Descending left side: all possible contexts (CONFLICT RESOLUTION upper), your context (CURRENT REALITY), each practice to examine (CONFLICT RESOLUTION lower). At the base: FUTURE REALITY, what “appropriate” means in your context. Ascending right side: questions to consider (PREREQUISITES), choices (TRANSITION lower), choice categories & actions (TRANSITION upper)]
- 28. Key references
• Context-Driven:
– Kaner, Bach & Pettichord (2002) Lessons learned in software testing, Wiley
• Best Practice:
– Erik van Veenendaal et al. (2002) The Testing Practitioner, U.T. Nolthenius
• My inspiration:
– Jens Pas (EuroSTAR 1998) Software testing metrics
– Gregory Daich (STAREast 2002) Software documentation superstitions
• Theory of Constraints understanding:
– Eliyahu M. Goldratt (1984 then 1992 with Jeff Cox) The Goal; (1997) Critical Chain, N. River Press
• TOC overview and the thinking tools:
– H. William Dettmer (1997) Goldratt’s TOC: a systems approach to cont. improv’t, ASQ
• Related (but differently-specialised) thinking from Agile community:
– Alistair Cockburn: A methodology per project, www.crystalmethodologies.org
– Mary Poppendieck: Lean development: an agile toolkit, www.poppendieck.com
- 29. Lessons learned, and way forward
• Reactions so far:
– “but this looks very like our “Best Practice” training course syllabus”
– “this may be too controversial”
– “I don’t understand what you’re on about”
– “why bother to analyse statements of the obvious?”
– “we should be able to come up with something more specific / detailed /
quantitative / prescriptive than this”
• So main lessons so far are:
– it may be obvious (and excitingly new) to me, but others differ, and...
– the reactions are mutually inconsistent!
• Ways forward:
– for me: can the other parts of Goldratt’s thinking also usefully apply to testing?
– for you (and therefore also for me!): can you use this framework?
- 30. Questions please? Thank you!
Neil Thompson
NeilT@TiSCL.com
Contact details:
+44 (0)7000 NeilTh (634584), www.TiSCL.com
23 Oast House Crescent, Farnham, Surrey, England, GU9 0NP, United Kingdom
Thompson information Systems Consulting Limited © 2003
Test Management track presentation, STAREast T15