1. 500D
Rights Reserved
Glen B. Alleman
Lewis & Fowler
galleman@lewisandfowler.com
(303) 241 9633
1/64
Lesson D: Implementing Technical
Performance Measurement
22nd Annual International IPM Conference
November 8-10, 2010
Bethesda, MD
Professional Education Program (Training Track) presented by
PMI-College of Performance Management faculty
CPM-500(D) : Principles of
Technical Management
2. The Purpose Of This Lesson
Defines the term and associated concept of Technical Performance Measurement (TPM) on projects.
Discusses the interrelations between TPM and Earned Value Management (EVM).
Introduces the student to the implementation of computer-based TPM tools, such as those used by the Defense Contract Management Agency (DCMA).
3. Learning Objectives
TLO #9: The student will understand the role of Technical Performance Measurement
(TPM) in the project office.
ELO #1: The student will recognize the policy requirements for Technical Performance
Measures.
ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the
entire technical scope of work has been planned.
ELO #3: The student will recognize the role of the WBS in supporting Technical Performance
Measure requirements.
TLO #9: The student will understand the scope of DCMA’s (or other) TPM software
management tool implementation.
ELO #1: The student will recognize the benefits and challenges of Technical Performance
Measure implementation.
ELO #2: The student will recognize the use of control limit charts to track Technical
Performance Measure metrics.
ELO #3: The student will understand the methodology and approach used to show the
effect of Technical Performance Measure on Earned Value.
4. Can Earned Value Alone Get Us To Our Destination?
How do we increase visibility into program performance?
How do we reduce cycle time to deliver the product?
How do we foster accountability?
How do we reduce risk?
How do we start our journey to success?
Increasing the Probability of Success means we have to Connect The Dots Between EVM and TPM to Reach Our Destination.
6. Increasing the Probability of Program Success Means …
[Diagram: Risk, SOW, Cost, WBS, IMP/IMS, and TPM feeding the PMB]
Building A Credible Performance Measurement Baseline
This is actually harder than it looks!
7. Doing This Starts With Some Guidance
Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user-defined capabilities.
― Defense Acquisition Guide (https://dag.dau.mil/Pages/Default.aspx)
In The End ― It's All About Systems Engineering
9. Just A Reminder Of The Primary Elements of Earned Value
[Diagram: three overlapping circles – Cost, Technical Performance, and Schedule. The overlaps: funding margin for under performance; schedule margin for over target baseline (OTB); schedule margin for underperformance or schedule extension; over cost or under performance; over cost or over schedule; over schedule or under performing.]
10. Previous Approaches Using EV Are Mostly Unsuccessful In Connecting These
Traditional approaches to program management are retrospective:
– Cost and schedule of Earned Value,
– Risk Management, and
– Systems Engineering.
Reporting past performance:
– Sometimes 30 to 60 days old, and
– Variances reported beyond the window of opportunity for correction.
11. This Has All Been Said Before. We Just Weren't Listening…
… the basic tenets of the process are the need for seamless management tools that support an integrated approach … and "proactive identification and management of risk" for critical cost, schedule, and technical performance parameters.
― Secretary of Defense, Perry memo, May 1995
Why Is This Hard To Understand?
We seem to be focused on EV reporting, not the use of EV to manage the program.
Getting the CPR out the door is the end of Program Planning and Control's efforts, not the beginning.
TPM Handbook 1984
12. The Gap Seems To Start With A Common Problem
Many Times, The Information from Cost, Schedule, Technical Performance, and Risk Management Gets Mixed Up When We Try to Put Them Together
13. When We Put The Cart Before The Horse, We Discover …
EVM really doesn't do its job effectively.
Most of the time EV has no measure of quality or compliance with technical requirements.
EV measures progress to plan in units of "money," not tangible value to the customer.
Most EV System Descriptions fail to connect the dots between cost, schedule, and technical performance – even though instructed to do so in official guidance.
14. The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with Cost and Schedule.
That's the next step in generating Value from Earned Value.
EV MUST include the Technical Performance Measures.
15. Back To Our Technical Performance Measures
Technical Performance Measures do what they say: Measure the Technical Performance of the product or service produced by the program.
16. What's Our Motivation for "Connecting the Dots?"
Technical Performance Measures …
Provide program management with information to make better decisions,
Increase the probability of delivering a solution that meets both the requirements and mission need.
TPMs are a set of measures that provide the supplier and acquirer with insight into progress to plan of the technical solution, the associated risks, and emerging issues.
We've been talking about this since as early as 1984, in the Technical Performance Measurement Handbook, Defense Systems Management College, Fort Belvoir, VA 22060.
17. Measure of Effectiveness (MoE)
Measures of Effectiveness …
Are stated in units meaningful to the buyer,
Focus on capabilities independent of any technical implementation,
Are connected to mission success.
The operational measures of success that are closely related to the achievement of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions.
― "Technical Measurement," INCOSE–TP–2003–020–01
MoEs Belong to the End User
18. Measure of Performance (MoP)
Measures of Performance are …
Attributes that assure the system has the capability to perform,
Assessments of the system to assure it meets the design requirements that satisfy the MoE.
Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
― "Technical Measurement," INCOSE–TP–2003–020–01
MoPs belong to the Program – Developed by the Systems Engineer, Measured by CAMs, and Analyzed by PP&C
19. Key Performance Parameters (KPP)
Key Performance Parameters …
Have a threshold or objective value,
Characterize the major drivers of performance,
Are considered Critical to Customer (CTC).
Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.
― "Technical Measurement," INCOSE–TP–2003–020–01
The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like.
20. Technical Performance Measures (TPM)
Technical Performance Measures …
Assess design progress,
Define compliance to performance requirements,
Identify technical risk,
Are limited to critical thresholds,
Include projected performance.
Attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal.
― "Technical Measurement," INCOSE–TP–2003–020–01
21. Dependencies Between Measures
[Diagram: Mission Need → MoE → KPP → MoP → TPM]
The Acquirer defines the needs and capabilities in terms of Operational Scenarios; the Supplier defines physical solutions that meet the needs of the Stakeholders.
MoE – Operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP – Measures that characterize physical or functional attributes relating to the system operation.
TPM – Measures used to assess design progress, compliance to performance requirements, and technical risks.
― "Coming to Grips with Measures of Effectiveness," N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58
22. When Do We First Encounter The Technical Performance Measures?
At the IBR of course…
That's when all the pieces come together.
That's when we can have line of sight from the requirements to the TPMs to the work needed to produce the deliverables.
§5.3.1 of the 1 Sept 2010 version.
23. "Candidates" for Technical Measures
INCOSE Systems Engineering Handbook
Concept – Description
Physical Size and Stability – Useful Life, Weight, Volumetric capacity
Functional Correctness – Accuracy, Power performance
All the "ilities" – Supportability, Maintainability, Dependability
Reliability – Mean Time Between Failure
Efficiency – Utilization, Response time, Throughput
Suitability for Purpose – Readiness
24. "Measures" of Technical Measures
INCOSE Systems Engineering Handbook
Attribute – Description
Achieved to Date – Measured technical progress or estimate of progress
Current Estimate – Value of a technical parameter that is predicted to be achieved
Milestone – Point in time when an evaluation of a measure is accomplished
Planned Value – Predicted value of the technical parameter
Planned Performance Profile – Profile representing the project time-phased demonstration of a technical parameter
Tolerance Band – Management alert limits
Threshold – Limiting acceptable value of a technical parameter
Variances – Demonstrated technical variance; Predicted technical variance
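These attributes map naturally onto a small data structure. A minimal Python sketch of how a TPM observation might be tracked; the class and field names are illustrative, not from any DCMA tool:

```python
from dataclasses import dataclass

@dataclass
class TpmObservation:
    """One evaluation of a technical parameter at a milestone."""
    milestone: str            # point in time when the measure is evaluated
    planned_value: float      # predicted value of the technical parameter
    achieved_to_date: float   # measured progress (or best estimate of it)

@dataclass
class Tpm:
    """A technical performance measure with its control limits."""
    name: str
    threshold: float          # limiting acceptable value
    tolerance_band: tuple     # (lower, upper) management alert limits

    def variance(self, obs: TpmObservation) -> float:
        """Demonstrated technical variance: achieved minus planned."""
        return obs.achieved_to_date - obs.planned_value

    def in_tolerance(self, obs: TpmObservation) -> bool:
        """True when the measurement sits inside the management alert band."""
        lower, upper = self.tolerance_band
        return lower <= obs.achieved_to_date <= upper

# Example: airframe weight at PDR, planned 25.0 kg, measured 25.5 kg
weight = Tpm(name="Airframe Weight (kg)", threshold=28.0, tolerance_band=(23.0, 26.0))
pdr = TpmObservation(milestone="PDR", planned_value=25.0, achieved_to_date=25.5)
print(weight.variance(pdr))      # 0.5 kg over plan
print(weight.in_tolerance(pdr))  # True
```

The same structure extends directly to the planned performance profile: a list of `TpmObservation` records, one per milestone, gives the time-phased plan the handbook describes.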
25. A Familiar Graphic of TPMs
[Chart: a TPM for Mean Time Between Failure plotted against Time = Program Maturity, showing the Planned Profile through the Planned Values at each Milestone, the Current Estimate, Achieved to Date, the Tolerance band (Upper and Lower Limits), the Threshold, and the Variance between achieved and planned.]
26. A Simple Method of Assembling the TPMs
Select Technical Performance Parameters → Define the planned progress for each TPM → Assess the impact on Risk from this progress.
Parameters (MOE / MOP) | Progress (KPP / TPM) | Risks
Weight – XXXX
Speed – XXXX
MTBF – XXXX
Loiter Time – XXXX
29. What Does A Real Technical Performance Measure Look Like?
Not that the bagels in Lessons 1 and 2 weren't interesting, but let's get ready to look at a flying machine.
30. TPMs Start With The WBS – The WBS for a UAV
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
1.1.3 Propulsion
1.1.4 On Board Comm
1.1.5 Auxiliary Equipment
1.1.6 Survivability Modules
1.1.7 Electronic Warfare Module
1.1.8 On Board Application & System SW
1.3 Mission Control / Ground Station SW
1.3.1 Signal Processing SW
1.3.2 Station Display
1.3.3 Operating System
1.3.4 ROE Simulations
1.3.5 Mission Commands
Highlighted element: 1.1.2 Airframe
31. What Do We Need To Know About This Program Through TPMs
What WBS elements represent the TPMs?
What Work Packages produce these WBS elements?
Where do these Work Packages live in the IMS?
What are the Earned Value baseline values for these Work Packages?
How are we going to measure all these variables?
What does the curve look like for these measurements?
32. Let's Connect The Dots
[Diagram: SOW, WBS, IMP/IMS, Cost, Risk, and TPM connected to the PMB]
SOW – Named Deliverables defined in the WBS.
Cost – BCWS at the Work Package, rolled to the Control Account.
TPM – TPMs attached to each critical deliverable in the WBS and identified in each Work Package in the IMS, used to assess maturity in the IMP.
WBS – The Products and the Processes that produce them in a "well structured" decomposition.
IMP/IMS – The IMS contains all the Work Packages, BCWS, and Risk mitigation plans, and rolls to the Integrated Master Plan to measure increasing maturity.
Risk – Technical and Programmatic Risks connected to the WBS and IMS.
33. Verifying Each TPM
Evidence that we're in compliance:
CA – Do we know what we promised to deliver, now that we've won? – With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? How do we measure weight at each program event?
SFR – Can we proceed into preliminary design? – The contributors to the vehicle weight are confirmed and the upper limits defined in the product architecture and requirements flow-down database (DOORS) into a model.
SRR – Can we proceed into the System Development and Demonstration (SDD) phase? – Do we know all the drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints, using a high-fidelity model?
PDR – Can we start detailed design, and meet the stated performance requirements within cost, schedule, risk, and other constraints? – Does each subsystem designer have the target component weight and some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?
CDR – Can the system proceed to fabrication, demonstration, and test, within cost, schedule, risk, and other system constraints? – Do we know all we need to know to start the fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we're inside the lines.
TRR – Is the system ready to proceed into formal test? – Does the assembled vehicle fall within the weight range limits for 1st flight – will this thing get off the ground?
34. TPM Trends & Responses
Dr. Falk Chart – modified
[Chart: Technical Performance Measure – Vehicle Weight, plotted against the program events CA, SFR, SRR, PDR, CDR, TRR. The planned profile steps down from 28kg through 26kg and 25kg toward 23kg. The basis of measurement matures at each event: ROM in Proposal, Design Model, Bench Scale Model Measurement, Detailed Design Model, Prototype Measurement, Flight 1st Article. EV taken, planned values met, tolerances kept, etc.]
35. The Assessment Of Weight As A Function Of Time
At Contract Award there is a Proposal-grade estimate of vehicle weight.
At the System Functional Review, the Concept of Operations is validated for the weight.
At the System Requirements Review, the weight targets are flowed down to the subsystem components.
At PDR the CAD model starts the verification process.
At CDR actual measurements are needed to verify all models.
At the Test Readiness Review we need to know how much fuel to put on board for the 1st flight test.
36. Airframe Weight TPM
The WBS for a UAV: 1.1 Air Vehicle → 1.1.2 Airframe

Event: CA | SFR | SRR | PDR | CDR | TRR
Planned Value: 28.0kg | 27.0kg | 26.0kg | 25.0kg | 24.0kg | 23.0kg
Actual Value: 30.4kg | 29.0kg | 27.5kg | 25.5kg | – | –
Assessed Risk to TRR: Moderate (>2.0kg off target) | Low (1–2kg off target) | Low (1–2kg off target) | Very Low (less than 1.0kg off target) | – | –
Planned Method: "Similar to" Estimate ROM | Program-unique design model | Program-unique design model with validated data | Actual measurement of bench-test components | Actual measurement of prototype airframe | –
Actual Method: "Similar to" Estimate ROM | ROM | ROM | ROM | – | –

Here's the Problem: The planned weight is 25kg. The actual weight is 25.5kg. Close to plan! So we are doing okay, right?
37. Is This A Problem? You Bet'ya It's A Problem!
The measurement is close to the planned value,
But the planned method of measurement is a program-unique design model with validated data,
And the actual method of measurement is a Rough Order of Magnitude estimate,
No improvement in fidelity since the System Functional Review (SFR), and
The TPM provides no new information – so we're probably late and don't know it yet.
38. Raison d'être for Technical Performance Measures
The real purpose of Technical Performance Measures is to reduce Programmatic and Technical RISK.
[Diagram: Risk, SOW, Cost, WBS, IMP/IMS, TPM, PMB]
39. Buying Down Risk with TPMs
"Buying down" risk is planned in the IMS.
MoE, MoP, and KPP are defined in the work package for the critical measure – weight.
If we can't verify we've succeeded, then the risk did not get reduced.
The risk may have gotten worse.
[Chart: Risk CEV-037 – Loss of Critical Functions During Descent. The planned risk level steps down from 24 toward 0 as each planned mitigation completes between 31 Mar 2005 and 1 Jul 2011: force-and-moment wind tunnel tests, development and correlation of the analytical model, a focus splinter review, block wind tunnel testing, in-flight development tests, and a damaged TPS flight test. Annotations: weight risk reduced from RED to YELLOW; weight confirmed ready to fly – GREEN at this point.]
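The "no verification, no buy-down" rule above can be sketched in a few lines of Python; the event names and risk levels are illustrative, not taken from the CEV-037 chart:

```python
# Planned risk burn-down: each verification event, once complete,
# buys the risk level down to a new value (events and levels illustrative).
BURN_DOWN = [
    ("wind tunnel test, block 1", 20),
    ("analytical model correlated", 14),
    ("prototype flight test", 6),
]

def current_risk(initial: int, completed: set) -> int:
    """Risk is only reduced by events we can verify as complete;
    a planned event that didn't happen buys down nothing."""
    level = initial
    for event, new_level in BURN_DOWN:
        if event in completed:
            level = new_level
        else:
            break  # later buy-downs depend on earlier ones
    return level

# Only the first verification happened: risk drops to 20, not to 14 or 6.
print(current_risk(24, {"wind tunnel test, block 1"}))  # 20
```

The `break` captures the slide's point: skipping a planned verification leaves the risk at its last verified level, no matter what the plan says should have happened by now.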
40. Increasing the Probability of Success with Risk Management
Going outside the TPM limits always means cost and schedule impacts.
"Coloring Inside the Lines" means knowing how to keep the program GREEN, or at least stay close to GREEN.
So much for our strategy of winning through technical dominance.
41. Connecting the EV Variables
Integrating Cost, Schedule, and Technical Performance assures Program Management has the needed performance information to deliver on-time, on-budget, and on-specification.
Conventional Earned Value (Cost + Schedule) + Technical Performance Measures:

Cost Baseline:
– The Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around.
– Probabilistic cost estimating uses past performance and cost risk modeling.
– Labor, Materiel, and other direct costs are accounted for in Work Packages.
– Risk adjustments for all elements of cost.

Earned Value dilution:
– Earned Value is diluted by missing technical performance.
– Earned Value is diluted by postponed features.
– Earned Value is diluted by non-compliant quality.
– All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).

Schedule Baseline:
– Technical Performance Requirements are decomposed into physical deliverables.
– Deliverables are produced through Work Packages.
– Work Packages are assigned to an accountable manager.
– Work Packages are sequenced to form the highest value stream with the lowest technical and programmatic risk.
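The dilution can be shown numerically. A minimal Python sketch, not the official DCMA algorithm: scale the earned value (BCWP) by the fraction of technical performance actually demonstrated, then recompute TCPI. The 80% fraction and the budget numbers are illustrative:

```python
def adjusted_earned_value(bcwp: float, tpm_fraction: float) -> float:
    """Dilute earned value by the fraction of technical performance
    actually demonstrated (1.0 = full compliance with the TPMs)."""
    return bcwp * tpm_fraction

def tcpi(bac: float, bcwp: float, acwp: float) -> float:
    """To Complete Performance Index: the cost efficiency needed on
    the remaining work to finish within the Budget at Complete."""
    return (bac - bcwp) / (bac - acwp)

# Claimed EV says the program is on track, but the TPMs show only 80%
# of the required technical performance is demonstrated.
bac, bcwp, acwp = 1000.0, 500.0, 500.0
print(tcpi(bac, bcwp, acwp))            # 1.0 before adjustment
bcwp_adj = adjusted_earned_value(bcwp, 0.80)
print(tcpi(bac, bcwp_adj, acwp))        # 1.2 after adjustment
```

The adjusted TCPI makes the slide's point concrete: once missing technical performance is counted, the remaining work must be done 20% more efficiently than budgeted, a signal the unadjusted numbers hide.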
42. TPM Checklist

MoE checklist:
– Traceable to needs, goals, objectives, and risks.
– Defined with associated KPPs.
– Each MoE independent from the others.
– Each MoE independent of any technical solution.
– Address the required KPPs.

MoP checklist:
– Traceable to applicable MOEs, KPPs, system-level performance requirements, and risks.
– Focused on technical risks and supports trades between alternative solutions.
– Provide insight into system performance.
– Decomposed, budgeted, and allocated to system elements.
– Assigned an "owner," the CAM and Technical Manager.

TPM checklist:
– Traceable to applicable MoPs, system element performance, requirements, objectives, risks, and WBS elements.
– Further decomposed, budgeted, and allocated to lower-level system elements in the WBS and IMS.
– Assigned an owner, the CAM and Work Package Manager.
– Sources of measure identified and processes for generating the measures defined.
– Integrated into the program's IMS as part of the exit criteria for the Work Package.
43. Increasing the Probability of Program Success Means …
[Diagram: Risk, SOW, Cost, WBS, IMP/IMS, TPM, PMB]
Building A Credible Performance Measurement Baseline
Using the Check List – "Connect the Dots"
44. Did We Accomplish the Learning Objectives?
TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project
office.
ELO #1: The student will recognize the policy
requirements for TPM.
Policies and supporting guidance, with links and
reference numbers provided.
ELO #2: The student will recognize the role of IBRs in
confirming the entire technical scope of work has
been planned.
This is the first place where cost, schedule and
technical performance come together – in the
Integrated Master Schedule (IMS)
ELO #3: The student will recognize the role of the
WBS in supporting TPM requirements.
TPMs are first located in the WBS
TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool
implementation.
ELO #1: The student will recognize the benefits and
challenges of TPM implementation.
Progress is measured in units of physical percent
complete. TPMs are those units.
ELO #2: The student will recognize the use of control
limit charts to track TPM metrics.
We’ve seen notional and actual charts
ELO #3: The student will understand the
methodology and approach used to show the effect
of TPMs on earned value.
The example of our “flying machine” connects the
dots for TPMs, risk, cost, and schedule.
47. Many Sources for Connecting the Dots
– OMB Circular A–11, Section 300
– Interim Defense Acquisition Guidebook (DAG) 6/15/09
– GAO Report 06–250
– Systems Engineering Plan (SEP) Preparation Guide 4/08
– DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
– WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
– Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
– Guide for Integrating SE into DOD Acquisition Contracts 12/06
– Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
– Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
– Standard for Application and Management of the SE Process (IEEE 1220)
– Capability Maturity Model Integration (CMMI®)
– Processes for Engineering a System (ANSI/EIA–632)
– NASA EVM Guide NPG 9501.3
48. Office of Management and Budget, Circular No. A–11, Section 300
Planning, Budgeting, Acquisition and Management of Capital Assets
Section 300–5:
– Performance-based acquisition management
– Based on the EVMS standard
– Measure progress towards milestones:
• Cost
• Capability to meet specified requirements
• Timeliness
• Quality
49. Need: Accurate Performance Measurement
GAO Report 06–250, Findings and Recommendations
Information Technology: Improve the Accuracy and Reliability of Investment Information
2. If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information.
3. Agencies are not measuring actual versus expected performance in meeting IT performance goals.
50. DOD Guides: Technical Performance
Department of Defense Guidelines for Technical Performance Measures:
– DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
– Interim Defense Acquisition Guidebook (DAG) 6/15/09
– Systems Engineering Plan (SEP) Preparation Guide 4/08
– WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
– Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
– Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06
– Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
51. DoD: TPMs in Technical Baselines and Reviews
DoD Policy or Guide: POL, DAG, SEP, WBS, IMP/IMS, Integrated Systems Engineering, DAPS
Technical Baselines (IMP/IMS): Functional (SFR), Allocated (PDR), Product (CDR)
– Event-driven timing
– Success criteria of technical reviews
– Entry and exit criteria for technical reviews
– Assess technical maturity
52. DoD: TPMs in Integrated Plans
DoD Policy or Guide: POL, DAG, SEP, WBS, IMP/IMS, Integrated Systems Engineering, DAPS
Integrated SEP with: IMP/IMS, TPMs, EVM
Integrated WBS with: Requirement Specification, Statement of Work, IMP/IMS/EVMS
Link risk management, technical reviews, TPMs, EVM, WBS, IMS
53. Guidance in Standards, Models, and the Defense Acquisition Guide
Processes for Engineering a System (ANSI/EIA–632)
Standard for Application and Management of the SE Process (IEEE 1220)
Capability Maturity Model Integration (CMMI®)
– CMMI for Development, Version 1.2
– CMMI for Acquisition, Version 1.2
– Using CMMI to Improve Earned Value Management, 2002
Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
54. Technical Performance Measures (TPM): More Sources

IEEE 1220: 6.8.1.5, Performance-based progress measurement
– TPMs are key to progressively assess technical progress.
– Establish dates for checking progress and for meeting full conformance to requirements.
– The planned value profile is the time-phased achievement projected: achievement to date, and the technical milestone where the TPM evaluation is reported.

EIA–632: Glossary
– Predict future value of key technical parameters of the end system based on current assessments.

CMMI for Development: Requirements Development
– Specific Practice (SP) 3.3, Analyze Requirements.
– Typical work product: TPMs.
– Subpractice: Identify TPMs that will be tracked during development.
55. PMBOK® Guide
10.5.1.1 Project Management Plan
Performance Measurement Baseline:
– Typically integrates scope, schedule, and cost parameters of a project
– May also include technical and quality parameters
56. PMBOK® Guide
8.3.5.4 Work Performance Measurements
Used to produce project activity metrics
Evaluate actual progress as compared to planned progress
Include, but are not limited to:
– Planned vs. actual technical performance,
– Planned vs. actual schedule performance, and
– Planned vs. actual cost performance.
57. TPMs in DAG and DAPS
Defense Acquisition Guide:
– Performance measurement of WBS elements, using objective measures: essential for EVM and Technical Assessment activities.
– Use TPMs and Critical Technical Parameters (CTP) to report progress in achieving milestones.
DAPS:
– Use TPMs to determine whether % completion metrics accurately reflect quantitative technical progress and quality toward meeting Key Performance Parameters (KPP) and Critical Technical Parameters.
58. TPMs in DAG
Compare the actual versus planned technical development and design.
Report progress in the degree to which system performance requirements are met.
The plan is defined in terms of:
– Expected performance at specific points
• Defined in the WBS and IMS
– Methods of measurement at those points
– Variation limits for corrective action
59. PMBOK® Guide
11.6.2.4 Technical Performance Measurement
Compares technical accomplishments … to … the project management plan's schedule of technical achievement.
Requires definition of objective, quantifiable measures of technical performance which can be used to compare actual results against targets.
Might include weight, transaction times, number of delivered defects, storage capacity, etc.
Deviation, such as demonstrating more or less functionality than planned at a milestone … forecasts the degree of success in achieving the project's scope.
60. CMMI–ACQ
Acquisition Technical Management
SP 1.3 Conduct Technical Reviews
Typical supplier deliverables:
– Progress reports and process, product, and service level measurements
– TPMs
61. SMS Shall: Monitor Progress Against the Plan
4.2.12.2 Monitoring
– Contractor SHALL monitor progress against plan to validate, approve, and maintain each baseline and functional architecture.
4.2.12.2.2 Required Product Attributes
– Each documented assessment includes:
• TPMs, metrics
• Metrics and technical parameters for tracking that are critical indicators of technical progress and achievement
62. NASA EVM Guide: Technical Performance
NASA EVM Guide NPG 9501.3
– 4.5 Technical Performance Requirements (TPR): When TPRs are used, appropriate and relevant metrics must be defined in the solicitation.
– Appendix A.7, 14.1 TPR:
• Compares expected performance and physical characteristics with contractually specified values.
• Basis for reporting established milestones.
• Progress toward meeting technical requirements.
63. Derivation and Flow Down of TPMs
See next chart for linkage of technical baselines to technical reviews.
Document, Baseline – IMS, EVM Parameter
IMP, Functional Baseline – Measures Of Effectiveness (MOE)
IMP, WBS, Functional Baseline – Measures Of Performance (MOP)
IMP, Allocated Baseline – Technical Performance Measure
IMS – TPM Milestones And Planned Values
Work Packages – TPM % Complete Criteria
64. Interesting Attributes of TPMs
Achieved to Date (sounds like EV)
Current Estimate (sounds like EAC/ETC)
Milestone
Planned (target) value (sounds like PV)
Planned performance profile (sounds like a PMB)
Tolerance band (sounds like reporting thresholds)
Threshold (yep, just what we thought)
Variance (sounds like variance!)