Adam Suchley - Predictive Delivery Assurance - APM Assurance SIG Conference 2018
1.
2.
Using data to drive predictive
assurance
Assurance in Uncertain Times
29th November 2018
3.
How your data can support and drive
proactive and predictive assurance activities
Adam Suchley, Senior Delivery Manager, Mudano
4.
Contents
1. How do we target effective assurance activity?
2. How do we remove subjectivity from project assessment?
3. How do we perform assurance interventions proactively before
problems arrive?
• The power of machine learning
4. Summary
5. Q&A
9.
Assurance activities tend to be based on backward-looking, subjective data
Typically based on infrequent risk assessments and Project Status reports provided by the Project Managers responsible for delivery
Subject to Project Manager subjectivity:
• Biased
• A reflection of PM personality
• RAG is coarse
– Many ‘degrees of amber’
– Doesn’t change that often
Backward-looking – always behind the curve
[Charts: typical reported RAG distributions – roughly 80–90% green, 5–15% amber and 1–5% red]
How do we target effective assurance activity?
10.
How do we remove subjectivity from project assessment?
Develop a quantitative assessment framework tailored to the business:
1. Define ultimate project success indicators and categorise these
2. Define evidence and metrics that support quantification of project success
3. Define signals that can be derived from the raw data to evidence these metrics
4. Combine these signals to generate a quantitative Project Health Score
11.
Mapping project success indicators to signals
• This process will be specific to your business
• Canvass a wide range of opinions
• The approach is to iterate and refine; it will evolve over time through implementation and monitoring
Define ultimate project success indicators and categorise these → Define evidence and metrics that support quantification of each project success indicator → Define signals that can be derived from the raw data to evidence each metric
Example 1: Deliverables meet or exceed agreed targets → User acceptance and associated processes successfully completed → Count of change requests per project per time period
Example 2: Budgetary expectations met → Project-level financial records over time → Number and relative size of changes to the in-flight project budget
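As a concrete illustration of the first example signal, here is a minimal pandas sketch; the frame layout and column names (`project_id`, `raised_date`) are hypothetical, not from the deck.

```python
import pandas as pd

# Hypothetical change-request log: one row per change request raised.
change_requests = pd.DataFrame({
    "project_id": ["P1", "P1", "P2", "P1", "P2"],
    "raised_date": pd.to_datetime(
        ["2018-01-05", "2018-01-20", "2018-02-02", "2018-02-10", "2018-02-25"]
    ),
})

# Signal: count of change requests per project per calendar month.
cr_signal = (
    change_requests
    .groupby(["project_id", pd.Grouper(key="raised_date", freq="M")])
    .size()
    .rename("change_request_count")
    .reset_index()
)
print(cr_signal)
```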
12.
To derive meaningful and accurate signals of project success, data management, quality and curation issues first need to be resolved
To make data core to your business…
• Ride the wave of regulation
• Align data change to business change projects
• Trigger demand through step-change value
• Quantify the impact of poor data quality
• Embed in leadership scorecards
Think big, start small…
Prioritise key data fields for signal derivation for data capture / data quality remediation / consolidation into a central hub
13.
Combine signals to generate a Project Health Score
Range of options for signal combination, depending on the maturity of ultimate project success measurement:
• Independent Subject Matter Experts
– SME-based weights
– Independent scoring of a representative sample of projects; a simple machine learning model based on category and overall scores provides the weights
• Data-driven
– Machine learning model to determine weights analytically
• Validate and benchmark the Project Health Score
– SME or analytical
[Diagram: signals roll up into categories, which combine into the overall Project Health Score]
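A minimal sketch of the SME-weighted option – the signal names, weights and the [0, 1] normalisation are illustrative assumptions, not taken from the deck:

```python
# Hypothetical signal values for one project, each normalised to [0, 1]
# (1 = healthy); the names and SME weights below are illustrative only.
signals = {"change_request_rate": 0.6, "budget_change_size": 0.8, "milestone_slippage": 0.4}
sme_weights = {"change_request_rate": 0.5, "budget_change_size": 0.3, "milestone_slippage": 0.2}

def health_score(signals: dict, weights: dict) -> float:
    """Weighted average of normalised signals -> Project Health Score in [0, 1]."""
    total = sum(weights.values())
    return sum(signals[name] * weight for name, weight in weights.items()) / total

print(f"Project Health Score: {health_score(signals, sme_weights):.2f}")  # 0.62
```

For the data-driven option, the weights could instead be fitted analytically – e.g. a simple linear model regressing independent SME scores for a sample of projects on their signals.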
14.
Project Health Score has multiple possible use cases
Project Health Score and RAG:
1. Project Health Score can augment and eventually replace RAG reporting
2. Project Health Score can be benchmarked against RAG
3. Compare with RAG to identify potential underreported risk and target assurance interventions
[Chart: Project Health Score plotted against reported RAG; projects reporting healthy RAG but scoring poorly represent potential underreported risk]
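A minimal sketch of use case 3, assuming health scores normalised to [0, 1] and an illustrative 0.5 threshold:

```python
import pandas as pd

# Hypothetical snapshot: reported RAG vs computed Project Health Score.
portfolio = pd.DataFrame({
    "project": ["P1", "P2", "P3", "P4"],
    "rag": ["green", "green", "amber", "red"],
    "health_score": [0.85, 0.35, 0.55, 0.30],
})

# Green-reported projects scoring below an assumed 0.5 threshold are
# candidates for underreported risk and a targeted assurance intervention.
underreported = portfolio[(portfolio["rag"] == "green") & (portfolio["health_score"] < 0.5)]
print(underreported)  # flags P2
```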
15.
Portfolio delivery risk analysis
1. Calculate Project Health Score across the current portfolio, segmented by delivery area
2. Compare implied delivery risk to understand potential systemic issues by area
[Chart: median Project Health Score by delivery area, Divisions A–D]
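A sketch of the segmentation step; the divisions and scores are made up for illustration:

```python
import pandas as pd

# Hypothetical current-portfolio scores tagged with delivery area.
scores = pd.DataFrame({
    "division": ["A", "A", "B", "B", "C", "D", "D"],
    "health_score": [0.70, 0.80, 0.40, 0.50, 0.60, 0.30, 0.45],
})

# A consistently low median in one area suggests a systemic delivery
# issue rather than a single struggling project.
print(scores.groupby("division")["health_score"].median())
```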
16.
Individual project trend analysis
1. Understand historical variability in project health
2. Compare with RAG to analyse individual PM behaviour
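A sketch of the trend analysis, assuming a weekly scoring cadence (the cadence is not stated in the deck):

```python
import pandas as pd

# Hypothetical weekly health-score history for a single project.
history = pd.Series(
    [0.80, 0.78, 0.75, 0.60, 0.62, 0.50, 0.55],
    index=pd.date_range("2018-10-07", periods=7, freq="W"),
)

# A rolling mean and standard deviation expose the trend and variability
# that a static RAG status hides; a falling mean with rising volatility
# is a warning sign even while RAG stays green.
print(pd.DataFrame({
    "rolling_mean": history.rolling(3).mean(),
    "rolling_std": history.rolling(3).std(),
}))
```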
17.
How do we perform assurance interventions proactively before problems arrive?
Develop machine learning algorithms trained on historical performance data:
1. Define a Project Health Score deterioration ‘target’ for machine learning
2. Collate historical data to train the algorithms
3. Validate algorithms at both portfolio and individual project level
4. Apply algorithms to target assurance interventions: test and learn
19.
Firstly, we need a definition of Project Health Score deterioration
1. Define ‘slumping’ as a decrease in Project Health Score over a number of time steps
2. Flex the definition based on the assurance intervention operating model and target incidence rate
[Chart: Project Health Score over time steps – with no threshold the definition fires at a ~20% incidence rate; adding a 5% threshold on the size of the decrease reduces this to ~5%]
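A minimal sketch of such a definition – the three-step window and 5% threshold are the tuning knobs the slide describes, with illustrative values:

```python
import pandas as pd

def label_slumps(scores: pd.Series, window: int = 3, threshold: float = 0.05) -> pd.Series:
    """Mark a time step as slumping when the Project Health Score has fallen
    by more than `threshold` over the previous `window` time steps."""
    drop = scores.shift(window) - scores
    return drop > threshold

scores = pd.Series([0.80, 0.79, 0.78, 0.70, 0.68, 0.67, 0.66])
print(label_slumps(scores))  # True at steps 3-5, where the 3-step fall exceeds 5%
```

Widening the window or lowering the threshold raises the incidence rate; tune both until the number of flagged projects matches what the assurance team can actually act on.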
20.
Build machine learning models to predict project health deterioration
1. Machine learning algorithms can first be used to predict RAG, to build confidence in the approach
2. Historical data is needed to train the algorithms. More historical data is better, but it needs to be consistent and of good data quality – this can be extremely challenging
3. Validate that the predictive features in the data make sense: e.g. an escalation in the number of new risks indicating elevated slumping probability. Insightful common features of failure can feed back into change strategy
4. Validate the algorithms on selected individual projects
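A hedged sketch of the training step, using a random forest as one plausible model choice (the deck shows random forests on a later slide); the synthetic features stand in for your real historical signals:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

# Placeholder training set: one row per project per time step. In practice the
# columns would be the derived signals (change requests, budget moves, new
# risks, commentary sentiment, ...) and y the slumping label defined earlier.
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = rng.random(500) < 0.1  # ~10% slumping incidence, illustrative only

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Held-out precision feeds the cost/benefit case later on; feature importances
# help sanity-check that the predictive features make sense (point 3 above).
print("precision:", precision_score(y_test, model.predict(X_test), zero_division=0))
print("feature importances:", model.feature_importances_)
```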
21.
PM commentary fields can be key predictors!
• We find that PM commentary length is negatively correlated with RAG status
– PMs on projects running red tend to write more
– Conversely, if the project is green, why is the PM writing so much? Just overly keen?
• Publicly available machine learning libraries can identify the sentiment – positivity and negativity – and the subjectivity of text
• Algorithms can be trained on the language used in your organisation
– Annotate tens of thousands of sentences with sentiment
– Likely to be more predictive than generic sentiment
– Could be used to automatically annotate text in any tool to highlight positive and negative sentiment
• Neural nets can find hidden features in commentary!
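For illustration, TextBlob is one publicly available library that exposes exactly these two quantities; the deck does not name a specific tool, and the commentary text below is invented:

```python
from textblob import TextBlob  # pip install textblob

commentary = (
    "Testing has slipped again and the vendor has still not delivered the fix. "
    "We remain confident the plan can be recovered."
)

blob = TextBlob(commentary)
# polarity: -1 (negative) to +1 (positive); subjectivity: 0 (factual) to 1 (opinion).
print(f"polarity={blob.sentiment.polarity:.2f}, subjectivity={blob.sentiment.subjectivity:.2f}")

# Commentary length itself is a cheap extra feature: per the observation above,
# longer commentary tends to accompany worse RAG.
print(f"length={len(commentary.split())} words")
```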
22.
Start using slumping predictions on a test-and-learn basis
1. Track assurance intervention success to refine the machine learning algorithms – is there any pattern in where the algorithms tend to work better or worse?
2. Model prediction precision is key to the cost/benefit analysis
[Chart: prediction precision (0–50%) by model, increasing with model complexity: Random; Sentiment and Subjectivity; Text Encoder; Project Features + Text; Random Forest with Signals; Neural Network with Signals and Text Encoder]
Worked example:
• 1,000 projects of £100k each (a £100m portfolio)
• 5% slumping probability, costing 10% of budget (50 projects slump; worth £5m, costing us £500k)
• 50% prediction precision saves £250k
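A minimal sketch of the cost/benefit arithmetic above; it assumes we flag 50 projects and that an intervention fully recovers the loss on each true positive, which the slide does not state:

```python
n_projects = 1_000
budget_gbp = 100_000               # per project; £100m portfolio in total
slump_rate = 0.05                  # 5% of projects slump...
slump_cost = 0.10 * budget_gbp     # ...each losing 10% of budget

n_slumping = int(n_projects * slump_rate)   # 50 projects
total_loss = n_slumping * slump_cost        # £500,000

# At 50% precision, half of the 50 flagged projects are true positives,
# and each successful intervention recovers the £10k at risk (assumption).
precision = 0.50
saved = precision * n_slumping * slump_cost  # £250,000
print(f"expected loss £{total_loss:,.0f}, saved £{saved:,.0f}")
```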
23.
In summary
TODAY → FUTURE:
1. Infrequent risk assessments and subjective, backward-looking status reports (today)
2. Quantitative framework for development of an objective Project Health Score
3. Machine learning driving proactive assurance interventions
4. Change investment strategy informed by machine learning models
5. Automated decision making? (future)
Data Management is key to support this
24.
Data Analytics • Data Innovation and Applied Machine Learning • Regulatory Compliance • Data Transformation • Data Strategy
https://www.youtube.com/watch?time_continue=62&v=zHi3oufTUmg
Thank you!
Any questions?
http://deliveryscience.mudano.com/sharktower/