CMMI Model Changes
for High Maturity
Herb Weiner
Pat O’Toole
2008 SEPG Conference
Tampa, Florida
Problem Statement
 High maturity practices are not consistently
understood, applied, or appraised
 SEI is addressing the training and appraisal portions of
the CMMI Product Suite; e.g.,
 Understanding CMMI High Maturity Practices course
 Several recent presentations by SEI personnel
 High Maturity Lead Appraisers certification
 However, there is insufficient foundation for these
“raise-the-floor” interpretations in CMMI v1.2
 Goals do not establish the requirements
 Practices do not establish the expectations
 Informative material is purported to take on greater importance.
Eating Your Own Dog Food
 Requirements Management SG1:
 Requirements are managed and inconsistencies
with project plans and work products are
identified
 CMMI Product Suite Management SG1:
 CMMI model requirements are managed and
inconsistencies with CMMI training courses and
appraisal methods are identified.
Approach
 Draft proposed changes
 CMMI Model & SCAMPI Method Changes for High Maturity
(Herb Weiner, May 2007)
 Solicit feedback from SEI authorized people via ATLAS
 ATLAS = Ask The Lead AppraiserS
 ATLAS has been expanded to include CMMI instructors
 Candidate lead appraisers and instructors also included
 Publish results to SEI authorized individuals
 Submit CRs to SEI for consideration
 Update model to re-align the CMMI Product Suite.
ATLAS Feedback
 For each proposed change, respondents indicated:
 Strongly support (It’s perfect!)
 Support (It’s better)
 Are ambivalent (It’s OK either way)
 Disagree (It’s worse)
 Strongly disagree (What were you thinking?)
 Ratings were determined on a +1 to -1 scale as follows:
 Strongly support = +1.0
 Support = +0.5
 Ambivalent = 0.0
 Disagree = -0.5
 Strongly disagree = -1.0
 For each change, the average rating is displayed for:
 [High Maturity Lead Appraisers, Other SEI authorized individuals]
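As a minimal sketch of the rating arithmetic above, the mapping and averaging might look like this in Python (the response counts are made up for illustration; they are not actual ATLAS data):

```python
# Map each ATLAS response to the +1..-1 scale defined above,
# then average the ratings for one proposed change.
SCALE = {
    "strongly support": 1.0,
    "support": 0.5,
    "ambivalent": 0.0,
    "disagree": -0.5,
    "strongly disagree": -1.0,
}

def average_rating(responses):
    """Average the mapped ratings for one respondent group."""
    ratings = [SCALE[r] for r in responses]
    return round(sum(ratings) / len(ratings), 2)

# Hypothetical group: 4 strongly support, 3 support, 2 ambivalent, 1 disagree
group = (["strongly support"] * 4 + ["support"] * 3
         + ["ambivalent"] * 2 + ["disagree"])
print(average_rating(group))  # -> 0.5 for this made-up distribution
```

Each change below is annotated with two such averages: one for High Maturity Lead Appraisers, one for other SEI authorized individuals.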
Proposed OPP Changes
OPP Proposed Change #1 of 4
Move SP 1.3 to SP 1.1
Current:
SP 1.1 Select Processes
SP 1.2 Establish Process-Performance Measures
SP 1.3 Establish Quality and Process-Performance
Objectives
Proposed:
SP 1.1 Establish Quality and Process-Performance
Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process-Performance Measures
 MA, OPF, and QPM establish objectives in SP 1.1.
(.50, .51)
OPP Proposed Change #2 of 4
Revise OPP SP 1.4
Current:
Establish and maintain the organization’s process-
performance baselines.
Proposed:
Conduct process-performance analyses on the selected
processes and subprocesses to verify process stability
and to establish and maintain the organization’s
process-performance baselines.
 SP 1.1 & 1.2 indicate process-performance analysis will
be conducted, but that’s the last we hear of it
 Baselines are established for stable processes
 Elevate this from informative to expected.
(.39, .42)
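One common way to "verify process stability," as the revised practice would require, is a control chart. The sketch below uses an individuals (XmR) chart; the data are hypothetical, and 2.66 is the standard XmR constant for deriving natural process limits from the mean moving range:

```python
# Illustrative stability check before establishing a baseline:
# an individuals (XmR) control chart with 3-sigma natural limits.
def xmr_limits(samples):
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def is_stable(samples):
    """Stable if no point falls outside the natural process limits."""
    lcl, ucl = xmr_limits(samples)
    return all(lcl <= x <= ucl for x in samples)

defect_density = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 3.2]  # hypothetical data
print(is_stable(defect_density))  # -> True: eligible for the baseline
```

A process that fails such a check exhibits special causes of variation and, per the proposed wording, would be excluded from the organization's process-performance baselines until stabilized.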
OPP Proposed Change #3 of 4
Revise OPP SP 1.5
Current:
Establish and maintain the process-performance models for the
organization’s set of standard processes.
Proposed:
Establish and maintain models that predict process performance
related to the quality and process-performance objectives.
 The SEI’s new training courses emphasize use of process-
performance models with respect to quantitative objectives
 Focusing this practice on these objectives achieves better alignment
between the model and training.
(.59, .50)
OPP Proposed Change #4 of 4
Enhance the informative material
Proposed:
Modify informative material that suggests improving process
performance such as the examples found in OPP SP 1.3 (which
imply that common causes of variation be addressed)
Add new informative material indicating that, at ML4/CL4,
achieving such improvement might be addressed via OPF and
GP3.1, while at ML5/CL5, it is more likely to be achieved through
CAR, OID, and GP5.2
 In order to delineate level 4 from level 5, the model should avoid
implying that common causes of variation are addressed at level 4
 ML4/CL4: Process stability / execution consistency / special causes
 ML5/CL5: Improving capability / systemic improvement / common causes.
(.36, .44)
Proposed QPM Changes
QPM Proposed Change #1 of 4
Revise QPM SP 1.4
Current:
SP 1.4 Manage Project Performance
Monitor the project to determine whether the project’s objectives
for quality and process performance will be satisfied, and identify
corrective action as appropriate.
Proposed:
SP 1.4 Analyze Project Performance
Analyze the collective performance of the project's subprocesses
to predict whether the project's objectives for quality and process
performance will be satisfied and identify the need for corrective
action as appropriate.
 Fixes mismatch between the current title and practice statement
 Recognizes that project management deals with both
quantitatively managed, and non-quantitatively managed
processes.
(.54, .57)
QPM Proposed Change #2 of 4
Add QPM SP 1.5
Current: <None>
Proposed:
SP 1.5 Use Process-Performance Models
Use calibrated process-performance models
throughout the life cycle to identify, analyze, and
execute corrective action when necessary.
 Currently, PPMs aren’t expected to be used in QPM
 But use throughout life cycle appears to be expected by SEI
 PPMs may support process or subprocess activities
 Added practice to SG 1, but it could have been added to SG2.
(.39, .46)
QPM Proposed Change #3 of 4
Add QPM SP 2.3
Current: <None>
Proposed:
SP 2.3 Address Special Causes of Variation
Identify, address, and prevent reoccurrence of special causes of
variation in the selected subprocesses.
 “Special causes” are featured in SEI materials
 Currently “special causes” are only in QPM’s informative material
 The Glossary definition of “stable process” includes “…and prevent
reoccurrences of special causes”
 Add informative material to ensure that process performance data
and statistical techniques are used appropriately.
(.64, .48)
QPM Proposed Change #4 of 4
Revise QPM SP 2.3 (now SP 2.4)
Current:
SP 2.3 Monitor Performance of the Selected Subprocesses
Monitor the performance of the selected subprocesses to
determine their capability to satisfy their quality and process-
performance objectives, and identify corrective action as
necessary.
Proposed:
SP 2.4 Analyze Performance of the Selected Subprocesses
Analyze the performance of the selected subprocesses to predict
their capability to satisfy their quality and process-performance
objectives, and identify and take corrective action as necessary.
 “Analyze” is a much stronger word than “monitor”
 “Predict” is a much stronger word than “determine”
 Emphasize “taking corrective action,” not just identifying it.
(.59, .46)
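"Predicting capability to satisfy objectives" is commonly operationalized with a process capability index. The sketch below uses the standard Cpk index; the objective limits and sample data are hypothetical:

```python
# Illustrative capability prediction for a selected subprocess,
# using the standard Cpk index against objective limits.
import statistics

def cpk(samples, lsl, usl):
    """Capability index: distance from mean to nearest limit, in 3-sigma units."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

review_rate = [180, 195, 210, 188, 202, 197, 205]  # hypothetical data
capability = cpk(review_rate, lsl=150, usl=250)
# A common rule of thumb treats Cpk >= 1.33 as capable; otherwise,
# identify and take corrective action per the proposed practice.
print(capability >= 1.33)
```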
Proposed CAR Changes
CAR Proposed Change #1 of 7
Thematic Change
 Currently, there is little to suggest that CAR should target
statistically managed subprocesses to identify and analyze
common causes of variation to address:
 Stable processes with unacceptably high standard deviations;
 Stable processes not capable of achieving quality or process
performance objectives; and
 Stable and capable processes that might be improved to enhance
competitive advantage
 Change the focus of CAR’s specific goals and practices from
“defects and other problems” to “problems”
 By collapsing this phrase, model users will not limit their application of
CAR to the subset of problem candidates called “defects”
 Also include a discussion of “opportunities” in the informative material.
(.50, .46)
CAR Proposed Change #2 of 7
Revise CAR SG 1
Current:
SG 1 Determine Causes of Defects
Root causes of defects and other problems are
systematically determined.
Proposed:
SG 1 Determine and Analyze Causes
Common causes of variation and root causes of
problems are systematically analyzed.
 Reflects the Thematic Change
 “Analyzed” is a stronger word than “determined”.
(.56, .63)
CAR Proposed Change #3 of 7
Revise CAR SP 1.1
Current:
SP 1.1 Select Defect Data for Analysis
Select the defects and other problems for analysis.
Proposed:
SP 1.1 Select Data for Analysis
Select for analysis, using established criteria, quantitatively managed
processes that are candidates for improvement as well as problems
that have a significant effect on quality and process performance.
 Reflects the Thematic Change
 “Significant effect” emphasizes quantitatively managed processes.
(.64, .53)
CAR Proposed Change #4 of 7
Revise CAR SP 1.2 and add SP1.3-SP 1.4
Current:
SP 1.2 Analyze Causes
Perform causal analysis of selected defects and other problems and propose
actions to address them.
Proposed:
SP 1.2 Analyze Common Causes
Analyze common causes of variation to understand the inherent quality and
process performance constraints.
SP 1.3 Analyze Root Causes
Perform causal analysis on selected problems to determine their root
causes.
SP 1.4 Propose Actions to Address Causes
Propose actions to address selected common causes of variation and to
prevent recurrence of selected problems.
 Reflects the Thematic Change.
 Establishes expectations for BOTH common causes and root causes.
(.44, .57)
CAR Proposed Change #5 of 7
Add CAR SP 1.5
Current: <None>
Proposed:
SP 1.5 Predict Effects of Proposed Actions
Use process performance models and statistical
techniques to predict, in quantitative terms, the effects
of the proposed actions, as appropriate.
 Reflects the SEI’s expected use of PPMs and statistical
methods in high maturity organizations
 Supports proper cost/benefit analysis.
(.52, .58)
CAR Proposed Change #6 of 7
Revise CAR SG 2, SP 2.1 – SP 2.2
Current:
SG 2 Analyze Causes
Root causes of defects and other problems are systematically addressed to
prevent their future occurrence.
SP 2.1 Implement the Action Proposals
Implement the selected action proposals that were developed in causal analysis.
SP 2.2 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.
Proposed:
SG 2 Address Causes
Common causes of variation and root causes of problems are systematically
addressed to quantitatively improve quality and process performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a measurable
improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process performance.
 Reflects the Thematic Change
 Wording enhanced to focus on measurable improvement of “quality and
process performance” – a phrase reserved for high maturity practices
 SP 2.2 modified to include quality as well as process performance
 A perceived oversight in the current practice.
(.46, .64)
CAR Proposed Change #7 of 7
Revise CAR SP 2.3
Current:
SP 2.3 Record Data
Record causal analysis and resolution data for use across the
project and organization.
Proposed:
SP 2.3 Submit Improvement Proposals
Submit process- and technology-improvement proposals based
on implemented actions, as appropriate.
 Proposed practice relies on OID to determine “use across the
project and organization”
 Recognizes that CAR may have been applied locally but the resulting
improvements may be more broadly applicable.
(.48, .41)
CAR Proposed Change #8 of 7
 CAR is the only high maturity process area with
no lower-level foundation
 OPP – OPD & MA
 QPM – PP, PMC & IPM
 OID – OPF & OPD
 Several alternatives were explored via ATLAS:
0. Leave CAR exactly as it is (-.08, -.19)
1. Add “Causal Analysis” PA at ML2 (-.45, -.55)
2. Add “Causal Analysis” PA at ML3 (-.45, -.26)
3. Add “Causal Analysis” practice to PMC SG2 (+.09, +.16)
4. Add “Issue & Causal Analysis” PA at ML2 (-.55, -.22)
5. Add “Causal Analysis” goal to OPF (-.45, -.22)
Proposed OID Changes
OID Proposed Change #1 of 7
Revise OID SG 1
Current:
SG 1 Select Improvements
Process and technology improvements, which contribute to
meeting quality and process-performance objectives, are selected.
Proposed:
SG 1 Select Improvements
Process and technology improvements are identified proactively,
evaluated quantitatively, and selected for deployment based on
their contribution to quality and process performance.
 Somewhat passive vs. very proactive
 Focus on quantitative evaluation and ongoing improvement.
(.66, .63)
OID Proposed Change #2 of 7
Revise OID SP 1.1
Current:
SP 1.1 Collect and Analyze Improvement Proposals
Collect and analyze process- and technology-improvement
proposals.
Proposed:
SP 1.1 Solicit Improvement Proposals
Solicit proposals for incremental process and technology
improvements.
 “Solicit” is more proactive than “collect”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Explicitly targets incremental improvements.
(.66, .43)
OID Proposed Change #3 of 7
Revise OID SP 1.2
Current:
SP 1.2 Identify and Analyze Innovations
Identify and analyze innovative improvements that could increase
the organization’s quality and process performance.
Proposed:
SP 1.2 Seek Innovations
Seek and investigate innovative processes and technologies that
have potential for significantly improving the organization’s quality
and process performance.
 “Seek and investigate” is more proactive than “identify”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Focuses on “significant” performance enhancement.
(.65, .50)
OID Proposed Change #4 of 7
Add OID SP 1.3
Current: <None>
Proposed:
SP 1.3 Model Improvements
Use process performance models, as appropriate, to
predict the effect of incremental and innovative
improvements in quantitative terms.
 Adds modeling as an additional “filter”
 Supports quantitative cost/benefit analysis.
(.68, .44)
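A process-performance model used as such a "filter" can be as simple as a regression fit on historical data. The sketch below fits escaped defects against peer-review coverage and predicts an improvement's effect in quantitative terms; the variables and data are entirely hypothetical:

```python
# Illustrative one-variable PPM: least-squares fit of escaped defects
# against peer-review coverage, used to predict an improvement's effect.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical history: % of work products peer reviewed vs. escaped defects
coverage = [50, 60, 70, 80, 90]
escaped = [40, 34, 27, 22, 16]
slope, intercept = fit_line(coverage, escaped)

def predict(cov):
    return slope * cov + intercept

# Predicted effect of raising coverage from 70% to 85%
print(round(predict(70) - predict(85), 1))  # -> 9.0 fewer escaped defects
```

Real PPMs typically involve more factors and an uncertainty estimate; the point here is only that the predicted effect is stated in quantitative terms before committing to the improvement.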
OID Proposed Change #5 of 7
Revise OID SP 1.3 (now SP 1.4)
Current:
SP 1.3 Pilot Improvements
Pilot process and technology improvements to select
which ones to implement.
Proposed:
SP 1.4 Pilot Improvements
Pilot proposed improvements, as appropriate, to
evaluate the actual effect on quality and process
performance in quantitative terms.
 Piloting performed “as appropriate”
 Provides rationale for implementation.
(.70, .61)
OID Proposed Change #6 of 7
Revise OID SP 1.4 (now SP 1.5)
Current:
SP 1.4 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization.
Proposed:
SP 1.5 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization based on an
evaluation of costs, benefits, and other factors.
 Provides cost and benefits as the basis for selection
 “Other factors” provides flexibility.
(.67, .51)
OID Proposed Change #7 of 7
Replace OID SP 2.3
Current:
SP 2.3 Measure Improvement Effects
Measure the effects of the deployed process and
technology improvements.
Proposed:
SP 2.3 Measure Improvement Effects
Evaluate the effects of deployed improvements on
quality and process performance in quantitative terms.
 Specifies evaluation criteria
 Indicates “quantitative” evaluation
 New informative material – update baselines/models.
(.70, .63)
What’s Next?
Change Requests
1. Since the feedback related to the
proposed changes was primarily
supportive, all will be submitted as Change
Requests to the SEI for consideration.
2. Change request submitted for UCHMP
course – add exercise to re-write high
maturity practices using ATLAS results as
the base.
Now It’s YOUR Turn!
Handout contains ATLAS #12Z proposing:
Consolidating ML5 PAs into ML4
Changing ML5 to “Sustaining Excellence”
Achieve ML4
 ML4 = OPP, QPM, CAR, & OID
 No additional process areas at ML5
Perform at high maturity for 2 contiguous years
Demonstrate sustained business benefit as well
Submit your input to PACT.otoole@att.net
Results will be published to all submitters.
Questions?
Download & Contact Information
Refer to the following websites to:
 Contact the authors
 Download the final SEPG 2008 presentation
 Download the supporting ATLAS 12A – 12D results
 Download the CMMI Model and SCAMPI Method
Changes presentation from the May 2007 San
Francisco Beyond CMMI v1.2 Workshop
Herb Weiner
Herb.Weiner@welchallyn.com
www.highmaturity.com
Pat O’Toole
PACT.otoole@att.net
www.pactcmmi.com

Evaluating and comparing software metrics in the software engineering laboratory
 
Cmmi six sigma bok
Cmmi six sigma bokCmmi six sigma bok
Cmmi six sigma bok
 
The complexity of social networks
The complexity of social networksThe complexity of social networks
The complexity of social networks
 

Cmmi%20 model%20changes%20for%20high%20maturity%20v01[1]

  • 1. CMMI Model Changes for High Maturity Herb Weiner Pat O’Toole 2008 SEPG Conference, Tampa, Florida
  • 2. Problem Statement  High maturity practices are not consistently understood, applied, or appraised  SEI is addressing the training and appraisal portions of the CMMI Product Suite; e.g.,  Understanding CMMI High Maturity Practices course  Several recent presentations by SEI personnel  High Maturity Lead Appraisers certification  However, there is insufficient foundation for these “raise-the-floor” interpretations in CMMI v1.2  Goals do not establish the requirements  Practices do not establish the expectations  Informative material is purported to take on greater importance.
  • 3. Eating Your Own Dog Food  Requirements Management SG1:  Requirements are managed and inconsistencies with project plans and work products are identified  CMMI Product Suite Management SG1:  CMMI model requirements are managed and inconsistencies with CMMI training courses and appraisal methods are identified.
  • 4. Approach  Draft proposed changes  CMMI Model & SCAMPI Method Changes for High Maturity (Herb Weiner, May 2007)  Solicit feedback from SEI authorized people via ATLAS  ATLAS = Ask The Lead AppraiserS  ATLAS has been expanded to include CMMI instructors  Candidate lead appraisers and instructors also included  Publish results to SEI authorized individuals  Submit CRs to SEI for consideration  Update model to re-align the CMMI Product Suite.
  • 5. ATLAS Feedback  For each proposed change, respondents indicated:  Strongly support (It’s perfect!)  Support (It’s better)  Are ambivalent (It’s OK either way)  Disagree (It’s worse)  Strongly disagree (What were you thinking?)  Ratings were determined on a +1 to -1 scale as follows:  Strongly support = +1.0  Support = +0.5  Ambivalent = 0.0  Disagree = -0.5  Strongly disagree = -1.0  For each change, the average rating will be displayed for:  [High Maturity Lead Appraisers, Other SEI authorized individuals]
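The rating arithmetic above is simple enough to sketch in a few lines; a minimal illustration in Python, where the labels and weights come from the slide and the function name and sample data are our own assumptions:

```python
# Weights taken from the ATLAS rating scale on the slide;
# everything else here is an illustrative assumption.
WEIGHTS = {
    "strongly support": 1.0,
    "support": 0.5,
    "ambivalent": 0.0,
    "disagree": -0.5,
    "strongly disagree": -1.0,
}

def average_rating(responses):
    """Average the +1.0 .. -1.0 weights over a list of response labels."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

# One respondent per label except "strongly disagree":
print(average_rating(["strongly support", "support", "ambivalent", "disagree"]))  # 0.25
```

Each respondent pool's average is then reported per proposed change, which is what the "(.50, .51)"-style pairs on the following slides show.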
  • 7. OPP Proposed Change #1 of 4 Move SP 1.3 to SP 1.1 Current: SP 1.1 Select Processes SP 1.2 Establish Process-Performance Measures SP 1.3 Establish Quality and Process-Performance Objectives Proposed: SP 1.1 Establish Quality and Process-Performance Objectives SP 1.2 Select Processes SP 1.3 Establish Process-Performance Measures  MA, OPF, and QPM establish objectives in SP 1.1. (.50, .51)
  • 8. OPP Proposed Change #2 of 4 Revise OPP SP 1.4 Current: Establish and maintain the organization’s process-performance baselines. Proposed: Conduct process-performance analyses on the selected processes and subprocesses to verify process stability and to establish and maintain the organization’s process-performance baselines.  SP 1.1 & 1.2 indicate process-performance analysis will be conducted, but that’s the last we hear of it  Baselines are established for stable processes  Elevate this from informative to expected. (.39, .42)
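"Verify process stability" is not elaborated in the practice itself; one conventional reading, sketched below, is an XmR (individuals and moving range) control chart. This is a standard SPC technique and our assumption, not wording from the model; the function names and data are illustrative.

```python
def xmr_baseline(samples):
    """Individuals-chart baseline: (mean, lower control limit, upper control limit)."""
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(a - b) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128          # d2 constant for moving ranges of size 2
    return mean, mean - 3 * sigma, mean + 3 * sigma

def is_stable(samples):
    """Stable, for this sketch, means no point falls outside the 3-sigma limits."""
    _, lcl, ucl = xmr_baseline(samples)
    return all(lcl <= x <= ucl for x in samples)

# e.g. weekly inspection yields: the second series has a special-cause spike
print(is_stable([10, 11, 9, 10, 11, 10, 9, 11, 10, 12]))   # True
print(is_stable([10, 11, 9, 10, 11, 10, 9, 11, 10, 30]))   # False
```

Under this reading, a baseline would only be published for series that pass such a stability check, which is the prerequisite the proposed wording makes explicit.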
  • 9. OPP Proposed Change #3 of 4 Revise OPP SP 1.5 Current: Establish and maintain the process-performance models for the organization’s set of standard processes. Proposed: Establish and maintain models that predict process performance related to the quality and process-performance objectives.  The SEI’s new training courses emphasize use of process-performance models with respect to quantitative objectives  Focusing this practice on these objectives achieves better alignment between the model and training. (.59, .50)
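At its simplest, such a model can be a least-squares fit relating a controllable factor to an outcome, used to predict whether a quality objective will be met. The data, names, and choice of a one-variable linear model are all illustrative assumptions; real process-performance models are usually richer (multiple regression, Monte Carlo simulation, and so on).

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical history: peer-review hours per KLOC vs. escaped defects
review_hours    = [2, 4, 6, 8, 10]
escaped_defects = [9, 7, 6, 4, 2]
slope, intercept = fit_line(review_hours, escaped_defects)

predicted = slope * 9 + intercept      # prediction for a plan of 9 review hours
objective_met = predicted <= 2.0       # objective: at most 2 escaped defects
print(round(predicted, 2), objective_met)
```

Tying the practice to "the quality and process-performance objectives" means the model's output is judged against a threshold like `objective_met` above, not merely archived.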
  • 10. OPP Proposed Change #4 of 4 Enhance the informative material Proposed: Modify informative material that suggests improving process performance, such as the examples found in OPP SP 1.3 (which imply that common causes of variation be addressed) New informative material should indicate that, at ML4/CL4, achieving such improvement might be addressed via OPF and GP3.1, while at ML5/CL5, it is more likely to be achieved through CAR, OID, and GP5.2  In order to delineate level 4 from level 5, the model should avoid implying that common causes of variation are addressed at level 4  ML4/CL4: Process stability / execution consistency / special causes  ML5/CL5: Improving capability / systemic improvement / common causes. (.36, .44)
  • 12. QPM Proposed Change #1 of 4 Revise QPM SP 1.4 Current: SP 1.4 Manage Project Performance Monitor the project to determine whether the project’s objectives for quality and process performance will be satisfied, and identify corrective action as appropriate. Proposed: SP 1.4 Analyze Project Performance Analyze the collective performance of the project's subprocesses to predict whether the project's objectives for quality and process performance will be satisfied and identify the need for corrective action as appropriate.  Fixes mismatch between the current title and practice statement  Recognizes that project management deals with both quantitatively managed and non-quantitatively managed processes. (.54, .57)
  • 13. QPM Proposed Change #2 of 4 Add QPM SP 1.5 Current: <None> Proposed: SP 1.5 Use Process-Performance Models Use calibrated process-performance models throughout the life cycle to identify, analyze, and execute corrective action when necessary.  Currently, PPMs aren’t expected to be used in QPM  But use throughout the life cycle appears to be expected by SEI  PPMs may support process or subprocess activities  Added practice to SG 1, but it could have been added to SG 2. (.39, .46)
  • 14. QPM Proposed Change #3 of 4 Add QPM SP 2.3 Current: <None> Proposed: SP 2.3 Address Special Causes of Variation Identify, address, and prevent reoccurrence of special causes of variation in the selected subprocesses.  “Special causes” are featured in SEI materials  Currently “special causes” appear only in QPM’s informative material  The Glossary definition of “stable process” includes “…and prevent reoccurrences of special causes”  Add informative material to ensure that process performance data and statistical techniques are used appropriately. (.64, .48)
  • 15. QPM Proposed Change #4 of 4 Revise QPM SP 2.3 (now SP 2.4) Current: SP 2.3 Monitor Performance of the Selected Subprocesses Monitor the performance of the selected subprocesses to determine their capability to satisfy their quality and process-performance objectives, and identify corrective action as necessary. Proposed: SP 2.4 Analyze Performance of the Selected Subprocesses Analyze the performance of the selected subprocesses to predict their capability to satisfy their quality and process-performance objectives, and identify and take corrective action as necessary.  “Analyze” is a much stronger word than “monitor”  “Predict” is a much stronger word than “determine”  Emphasize “taking corrective action,” not just identifying it. (.59, .46)
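One common quantitative reading of "capability to satisfy their quality and process-performance objectives" is a process capability index such as Cpk, sketched below under the assumption that the objectives can be expressed as lower/upper specification limits. CMMI does not mandate this statistic; the 1.33 threshold is a conventional rule of thumb, and the data are illustrative.

```python
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    """Cpk: distance from the process mean to the nearer spec limit, in 3-sigma units."""
    m, s = mean(samples), stdev(samples)
    return min(usl - m, m - lsl) / (3 * s)

# A Cpk of at least 1.33 is a conventional threshold for a "capable" process.
defect_density = [10, 11, 9, 10, 11, 10, 9, 11, 10, 12]
print(cpk(defect_density, lsl=5, usl=15) >= 1.33)   # True
```

"Predict" rather than "determine" then amounts to projecting such an index forward (e.g., from a fitted trend) instead of only computing it for data already collected.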
  • 17. CAR Proposed Change #1 of 7 Thematic Change  Currently, there is little to suggest that CAR should target statistically managed subprocesses to identify and analyze common causes of variation to address:  Stable processes with unacceptably high standard deviations;  Stable processes not capable of achieving quality or process performance objectives; and  Stable and capable processes that might be improved to enhance competitive advantage  Change the focus of CAR’s specific goals and practices from “defects and other problems” to “problems”  By collapsing this phrase, model users will not limit their application of CAR to the subset of problem candidates called “defects”  Also include a discussion of “opportunities” in the informative material. (.50, .46)
  • 18. CAR Proposed Change #2 of 7 Revise CAR SG 1 Current: SG 1 Determine Causes of Defects Root causes of defects and other problems are systematically determined. Proposed: SG 1 Determine and Analyze Causes Common causes of variation and root causes of problems are systematically analyzed.  Reflects the Thematic Change  “Analyzed” is a stronger word than “determined”. (.56, .63)
  • 19. CAR Proposed Change #3 of 7 Revise CAR SP 1.1 Current: SP 1.1 Select Defect Data for Analysis Select the defects and other problems for analysis. Proposed: SP 1.1 Select Data for Analysis Select for analysis, using established criteria, quantitatively managed processes that are candidates for improvement as well as problems that have a significant effect on quality and process performance.  Reflects the Thematic Change  “Significant effect” emphasizes quantitatively managed processes. (.64, .53)
  • 20. CAR Proposed Change #4 of 7 Revise CAR SP 1.2 and add SP 1.3 and SP 1.4 Current: SP 1.2 Analyze Causes Perform causal analysis of selected defects and other problems and propose actions to address them. Proposed: SP 1.2 Analyze Common Causes Analyze common causes of variation to understand the inherent quality and process performance constraints. SP 1.3 Analyze Root Causes Perform causal analysis on selected problems to determine their root causes. SP 1.4 Propose Actions to Address Causes Propose actions to address selected common causes of variation and to prevent recurrence of selected problems.  Reflects the Thematic Change  Establishes expectations for BOTH common causes and root causes. (.44, .57)
  • 21. CAR Proposed Change #5 of 7 Add CAR SP 1.5 Current: <None> Proposed: SP 1.5 Predict Effects of Proposed Actions Use process performance models and statistical techniques to predict, in quantitative terms, the effects of the proposed actions, as appropriate.  Reflects the SEI’s expected use of PPMs and statistical methods in high maturity organizations  Supports proper cost/benefit analysis. (.52, .58)
  • 22. CAR Proposed Change #6 of 7 Revise CAR SG 2, SP 2.1 – SP 2.2 Current: SG 2 Analyze Causes Root causes of defects and other problems are systematically addressed to prevent their future occurrence. SP 2.1 Implement the Action Proposals Implement the selected action proposals that were developed in causal analysis. SP 2.2 Evaluate the Effect of Changes Evaluate the effect of changes on process performance. Proposed: SG 2 Address Causes Common causes of variation and root causes of problems are systematically addressed to quantitatively improve quality and process performance. SP 2.1 Implement the Action Proposals Implement selected action proposals that are predicted to achieve a measurable improvement in quality and process performance. SP 2.2 Evaluate the Effect of Implemented Actions Evaluate the effect of implemented actions on quality and process performance.
  • 23. CAR Proposed Change #6 of 7 Proposed: (Copied from previous slide) SG 2 Address Causes Common causes of variation and root causes of problems are systematically addressed to quantitatively improve quality and process performance. SP 2.1 Implement the Action Proposals Implement selected action proposals that are predicted to achieve a measurable improvement in quality and process performance. SP 2.2 Evaluate the Effect of Implemented Actions Evaluate the effect of implemented actions on quality and process performance.  Reflects the Thematic Change  Wording enhanced to focus on measurable improvement of “quality and process performance” – a phrase reserved for high maturity practices  SP 2.2 modified to include quality as well as process performance  A perceived oversight in the current practice. (.46, .64)
  • 24. CAR Proposed Change #7 of 7 Revise CAR SP 2.3 Current: SP 2.3 Record Data Record causal analysis and resolution data for use across the project and organization. Proposed: SP 2.3 Submit Improvement Proposals Submit process- and technology-improvement proposals based on implemented actions, as appropriate.  Proposed practice relies on OID to determine “use across the project and organization”  Recognizes that CAR may have been applied locally but the resulting improvements may be more broadly applicable. (.48, .41)
  • 25. CAR Proposed Change #8 of 7  CAR is the only high maturity process area with no lower-level foundation  OPP – OPD & MA  QPM – PP, PMC & IPM  OID – OPF & OPD  Several alternatives were explored via ATLAS: 0. Leave CAR exactly as it is 1. Add “Causal Analysis” PA at ML2 2. Add “Causal Analysis” PA at ML3 3. Add “Causal Analysis” practice to PMC SG2 4. Add “Issue & Causal Analysis” PA at ML2 5. Add “Causal Analysis” goal to OPF (-.08,-.19) (-.45,-.55) (-.45,-.26) (+.09,+.16) (-.55,-.22) (-.45,-.22)
  • 27. OID Proposed Change #1 of 7 Revise OID SG 1 Current: SG 1 Select Improvements Process and technology improvements, which contribute to meeting quality and process-performance objectives, are selected. Proposed: SG 1 Select Improvements Process and technology improvements are identified proactively, evaluated quantitatively, and selected for deployment based on their contribution to quality and process performance.  Somewhat passive vs. very proactive  Focus on quantitative evaluation and ongoing improvement. (.66, .63)
  • 28. OID Proposed Change #2 of 7 Revise OID SP 1.1 Current: SP 1.1 Collect and Analyze Improvement Proposals Collect and analyze process- and technology-improvement proposals. Proposed: SP 1.1 Solicit Improvement Proposals Solicit proposals for incremental process and technology improvements.  “Solicit” is more proactive than “collect”  “Analysis” is deferred to SP 1.3 and SP 1.4  Explicitly targets incremental improvements. (.66, .43)
  • 29. OID Proposed Change #3 of 7 Revise OID SP 1.2 Current: SP 1.2 Identify and Analyze Innovations Identify and analyze innovative improvements that could increase the organization’s quality and process performance. Proposed: SP 1.2 Seek Innovations Seek and investigate innovative processes and technologies that have potential for significantly improving the organization’s quality and process performance.  “Seek and investigate” is more proactive than “identify”  “Analysis” is deferred to SP 1.3 and SP 1.4  Focuses on “significant” performance enhancement. (.65, .50)
  • 30. OID Proposed Change #4 of 7 Add OID SP 1.3 Current: <None> Proposed: SP 1.3 Model Improvements Use process performance models, as appropriate, to predict the effect of incremental and innovative improvements in quantitative terms.  Adds modeling as an additional “filter”  Supports quantitative cost/benefit analysis. (.68, .44)
  • 31. OID Proposed Change #5 of 7 Revise OID SP 1.3 (now SP 1.4) Current: SP 1.3 Pilot Improvements Pilot process and technology improvements to select which ones to implement. Proposed: SP 1.4 Pilot Improvements Pilot proposed improvements, as appropriate, to evaluate the actual effect on quality and process performance in quantitative terms.  Piloting performed “as appropriate”  Provides rationale for implementation. (.70, .61)
  • 32. OID Proposed Change #6 of 7 Revise OID SP 1.4 (now SP 1.5) Current: SP 1.4 Select Improvements for Deployment Select process and technology improvements for deployment across the organization. Proposed: SP 1.5 Select Improvements for Deployment Select process and technology improvements for deployment across the organization based on an evaluation of costs, benefits, and other factors.  Provides cost and benefits as the basis for selection  “Other factors” provides flexibility. (.67, .51)
  • 33. OID Proposed Change #7 of 7 Replace OID SP 2.3 Current: SP 2.3 Measure Improvement Effects Measure the effects of the deployed process and technology improvements. Proposed: SP 2.3 Measure Improvement Effects Evaluate the effects of deployed improvements on quality and process performance in quantitative terms.  Specifies evaluation criteria  Indicates “quantitative” evaluation  New informative material – update baselines/models. (.70, .63)
  • 35. Change Requests 1. Since the feedback related to the proposed changes was primarily supportive, all will be submitted as Change Requests to the SEI for consideration. 2. Change request submitted for UCHMP course – add exercise to re-write high maturity practices using ATLAS results as the base.
  • 36. Now It’s YOUR Turn! Handout contains ATLAS #12Z proposing: Consolidating ML5 PAs into ML4 Changing ML5 to “Sustaining Excellence” Achieve ML4  ML4 = OPP, QPM, CAR, & OID  No additional process areas at ML5 Perform at high maturity for 2 contiguous years Demonstrate sustained business benefit as well Submit your input to PACT.otoole@att.net Results will be published to all submitters.
  • 38. Download & Contact Information Refer to the following websites to:  Contact the authors  Download the final SEPG 2008 presentation  Download the supporting ATLAS 12A – 12D results  Download the CMMI Model and SCAMPI Method Changes presentation from the May 2007 San Francisco Beyond CMMI v1.2 Workshop Herb Weiner Herb.Weiner@welchallyn.com www.highmaturity.com Pat O’Toole PACT.otoole@att.net www.pactcmmi.com

Editor’s Notes

  1. The full wording of Item #1 and its rationale is: Move SP 1.3 to SP 1.1, shifting SP 1.1 and SP 1.2 to SP 1.2 and SP 1.3 respectively Current: SP 1.1 Select Processes SP 1.2 Establish Process-Performance Measures SP 1.3 Establish Quality and Process-Performance Objectives Proposed: SP 1.1 Establish Quality and Process-Performance Objectives SP 1.2 Select Processes SP 1.3 Establish Process-Performance Measures Rationale: There are three other process areas in which “objectives” are established – MA, OPF, and QPM. In each of these other process areas, objectives are established in SP 1.1, and the other practices focus on accomplishing them. It is suggested here that OPP be structured in the same manner. Granted, the current ordering may have more intuitive appeal to an “emerging” ML4 organization, but the proposed ordering reflects more of a steady state (i.e., institutionalized) condition.
  2. The full wording of Item #2 and its rationale is: OPP SP 1.4 Current: OPP SP 1.4: Establish and maintain the organization’s process-performance baselines. Proposed: OPP SP 1.4: Conduct process-performance analyses on the selected processes and subprocesses to verify process stability and to establish and maintain the organization’s process-performance baselines. Rationale: The current OPP SP 1.1 and SP 1.2 both imply that process-performance analysis will be conducted and yet that’s the last we hear of it – so it is proposed that such analyses are explicitly performed here. In addition, as currently emphasized in the informative material, the proposed practice wording suggests establishing baselines for stable processes, a necessary prerequisite for quantitative management.
  3. The full wording of Item #3 and its rationale is: OPP SP 1.5 Current: OPP SP 1.5: Establish and maintain the process-performance models for the organization’s set of standard processes. Proposed: OPP SP 1.5: Establish and maintain models that predict process performance related to the quality and process-performance objectives. Rationale: The new training courses emphasize the use of process-performance models to compose the defined process and to predict future performance throughout the life cycle with respect to the quantitative objectives. Focusing the expected model component on these objectives achieves better alignment between the model and training.
  4. The full wording of Item #4 and its rationale is: Proposed: Add informative material that addresses improving process performance such as that found in the example box associated with OPP SP 1.3, subpractice 1, “Decrease the cost of maintenance of the products by a specified %.” (And OPP SP 1.3, subpractice 2, “Shorten time to delivery to a specified % of the process-performance baseline.”) The new informative material should indicate that, at ML4, achieving such improvement might be addressed (or at least attempted) via OPF and GP3.1, while at ML5, such improvement is more likely to be achieved through CAR, OID, and GP5.2. It may be best to add this new informative material as one or more subpractices for SP 1.3 (proposed to become SP 1.1), Establish Quality and Process-Performance Objectives, and/or as a paragraph supplementing OPP SP 1.4, subpractice 5, “Compare the organization’s process-performance baselines to the associated objectives.” In addition, the possibility of revising the objectives to align more closely with the known level of process performance should also be discussed in this new informative material. Rationale: In order to delineate CL4/ML4 from CL5/ML5 more clearly, the model should avoid implying that common causes of variation are systematically addressed at CL4/ML4. The SP 1.3 examples indicated above would require changing the currently stable range of process performance, a concept more closely aligned with the ML5 process areas. Similar additions to the informative material will be proposed in ATLAS #12B for QPM as well.
  9. The full wording of Item #4’s rationale is: Similar to the proposed change for QPM SP 1.4, the word “monitor” is much too weak; “analyze” is a word that better reflects the activity expected to be performed by high maturity organizations. Similarly, it is suggested that the performance of selected subprocesses be used to “predict” rather than merely “determine” their capability to satisfy their quality and process-performance objectives. “Predict” implies a higher degree of sophistication than does “determine,” and is more closely aligned with the expected behavior of high maturity organizations. The current practice expects project personnel to “identify corrective action;” the proposed practice expects them to “identify and take corrective action.” The informative material of this practice should be expanded to refer to the proposed practice SP 1.5 and its use of process-performance models to “identify, analyze, and execute corrective action when necessary.” It should also refer to PMC SG 2 for corrective action that does not warrant the use of process-performance models. The informative material should also be enhanced to discuss managing the inherent variation of the measurement system to heighten the probability that the measurement system is providing more “signal” than “noise.”
  10. The full wording of Item #4 and its rationale is: Proposed:Add informative material that addresses improving process performance such as that found in the example box associated with OPP SP 1.3, subpractice 1, “Decrease the cost of maintenance of the products by a specified %.” (And OPP SP 1.3, subpractice 2, “Shorten time to delivery to a specified % of the process-performance baseline.” ) The new informative material should indicate that, at ML4, achieving such improvement might be addressed (or at least attempted) via OPF and GP3.1, while at ML5, such improvement is more likely to be achieved through CAR, OID, and GP5.2.It may be best to add this new informative material as one or more subpractices for SP 1.3 (proposed to become SP 1.1), Establish Quality and Process-Performance Objectives and/or as a paragraph supplementing OPP SP 1.4, subpractice 5, “Compare the organization’s process-performance baselines to the associated objectives.” In addition, the possibility of revising the objectives to align more closely with the known level of process performance should also be discussed in this new informative material. Rationale:In order to delineate CL4/ML4 from CL5/ML5 more clearly, the model should avoid implying that common causes of variation are systematically addressed at CL4/ML4. The SP 1.3 examples indicated above would require changing the currently stable range of process performance, a concept more closely aligned with the ML5 process areas. Similar additions to the informative material will be proposed in ATLAS #12B for QPM as well.
  11. The full wording of Item #4 and its rationale is identical to that quoted in note #10 above.
  12. Full explanation of Rationale: The current wording of SG 1 is somewhat passive, as it focuses on meeting current quality and process-performance objectives. It implies that once the objectives are being met, the urgency for ongoing improvement is diminished. The ML5 concept of “optimizing” demands that organizations continuously and proactively seek ways to exceed, not merely meet, these expectations (i.e., once they’ve achieved “world class,” they then strive for “universe class”!). In addition to proactively soliciting improvement proposals, ML5 organizations should experiment with both existing and emerging technologies in an effort to push their quality and process performance to the next level. The proposed changes to the specific practices supporting SG 1 also reflect this more proactive posture. Note: new informative material should be added to every OID specific practice that uses the word “quantitatively.” Although not explicitly stated in the practices themselves, the informative material should strongly encourage the use of process-performance baselines and statistical methods where appropriate.
  13. The full wording of the Rationale is: The proposed wording of SP 1.1 introduces the following changes:
     a. “Collect” is replaced by the more proactive verb, “Solicit.”
     b. The “analyze” portion of this practice is deferred to OID SP 1.3 and SP 1.4.
     c. The practice explicitly targets “incremental” improvements, thereby differentiating SP 1.1 more clearly from the “innovative” improvements covered by SP 1.2.
  14. The full wording of the rationale is: The proposed wording of SP 1.2 introduces the following changes:
     a. “Identify” is replaced by the more proactive verbs, “seek and investigate.”
     b. The “analyze” portion of this practice is deferred to OID SP 1.3 and SP 1.4.
     c. The practice explicitly targets improvements that significantly enhance performance. Innovative change is disruptive and may not be warranted to achieve marginal benefits.
  15. The full wording of the rationale is: The proposed wording of SP 1.4 introduces the following changes:
     a. The term “as appropriate” was added to indicate that not all incremental or innovative improvements need to be piloted.
     b. “…to select which ones to implement” is somewhat mealy-mouthed and does not reflect the expected behavior of an ML5 organization. The reworded practice is much more explicit in this regard.
  16. The full wording of Item #4 and its rationale is identical to that quoted in note #10 above.
  17. The full wording of the rationale is: The wording of the existing practice provides little direction as to what should be measured and how the measures are to be used. The subpractices extend the measures beyond “the effects of the … improvements,” suggesting that actual cost, effort, and schedule for deploying each improvement be captured as well. The proposed wording of SP 2.3 indicates that the organization should evaluate the improvement using the same metrics that were predicted via process modeling and/or initially achieved during the pilots. Furthermore, new informative material should indicate that this evaluation may result in the need to adjust the implementation of recently deployed improvements, to enhance the associated tailoring guidelines, and/or to initiate other forms of corrective action. It is not enough to simply “measure the effects”; rather, the organization should strive to achieve the benefits. Finally, new informative material should remind the organization to update the corresponding process-performance baselines and models based on the quantitative results achieved by the deployed improvements. If quality and process performance has, indeed, been improved, then the organization should expect to continue deriving these benefits in the future.
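The evaluation loop note 17 describes can be sketched as a comparison of the measured benefit of a deployed improvement against the value predicted by the process-performance model (or achieved in the pilot). This is a hypothetical illustration: the function name, the 10% tolerance, and the cycle-time figures are assumptions chosen for the example, not values from the proposal.

```python
# Sketch of the post-deployment evaluation: compare the measured effect
# of a deployed improvement against the predicted (or piloted) effect,
# then either take corrective action or fold the result into the
# process-performance baselines and models.

def evaluate_deployment(predicted, measured, tolerance=0.10):
    """Return the follow-up action implied by the measured benefit,
    where predicted/measured are improvement fractions (e.g. 0.12 = 12%)."""
    shortfall = (predicted - measured) / predicted
    if shortfall > tolerance:
        return "corrective action: benefit fell short of prediction"
    return "update process-performance baselines and models"

# Pilot predicted a 12% cycle-time reduction; deployment measured 11%
print(evaluate_deployment(0.12, 0.11))  # update baselines and models
# Pilot predicted 12%; deployment measured only 5%
print(evaluate_deployment(0.12, 0.05))  # corrective action
```

The two branches mirror the note’s point: striving to achieve the predicted benefit (corrective action when it falls short) and, when it is achieved, updating the baselines and models so the organization continues to derive it.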