Applying Six Sigma to Financial Software Development   Ali Raza Khan Giant Plc. 1 New Oxford Street, London  ♦  United Kingdom (+44) 18443247700  ♦  [email_address] © 2009 Giant Plc.
Motivation © 2009 Giant Plc.
But Software Development is Not a Typical Application
Software development is process oriented, but:
Inputs are often ill-defined
Outputs are often difficult to fully evaluate (“you can't use testing to verify the absence of errors”)
Performance is highly influenced by human factors, leading to a high degree of natural variation
© 2009 Giant Plc.
Key Factors in Software Project Failures (% of “MIS” projects affected by each risk factor)
Requirements Failures: Creeping Requirements, 80%
Expectation Failures: Excessive Schedule Pressure, 65%
Execution Failures: Low Quality, 60%; Cost Overruns, 55%; Inadequate Configuration Management, 50%
Applying Six Sigma to Software Development © 2009 Giant Plc.
Fuzzy Front End Six Sigma DFSS © 2009 Giant Plc
Balance the VOC and the VOB Voice of the Customer Voice of Business
VOC – Voice of the Customer © 2009 Giant Plc.
Building a Customer Matrix: segments (U.S., Europe, Asia) × types of customers (Lead User, Demanding, Lost: a lead we had but lost). © 2009 Giant Plc.
VOC – Voice of the Customer © 2009 Giant Plc.
Kano Analysis
Kano analysis divides requirements into three basic categories: must-bes, satisfiers, and delighters.
Airline example (New York–Washington shuttle flight): getting there is a must-be; getting there in less time is a satisfier; excellent champagne in crystal glasses during the flight is a delighter.
© 2009 Giant Plc.
The Kano Model
[Diagram: x-axis “Level of Functionality Delivered for a particular requirement”, from Low/None to High; y-axis “How the Customer Feels”, from Very Dissatisfied through Neutral to Delighted; three curves: Must-Be (“I can live with it”), Satisfier (“Satisfied, but I expect it this way”), and Delighter.]
© 2009 Giant Plc
VOC Output: Prioritized CTQs © 2009 Giant Plc.
Requirement | Use-case | Kano | Priority
Provide real-time user access | Minimizing system response time | M | 5
Manage Network I/O | Optimizing data transfer | D | 3.5
Manage Network I/O | Moving client-server data | M | 3
Manage database interfaces | Verifying data content integrity | S | 4
VOC – Voice of the Customer © 2009 Giant Plc.
VOC Output: Fully Characterized CTQs © 2009 Giant Plc.
Requirement | Use-Case | Kano | Priority | Measure (Minimum / Average / Strong)
Manage Database Interfaces | Verifying data content integrity | S | 4 | ≤ 1 record/1,000 / ≤ 1 record/10,000 / ≤ 1 record/100,000
Manage Network I/O | Moving client-server data | M | 3 | 100 records/min. / 500 records/min. / 800 records/min.
Manage Network I/O | Optimizing data transfer | D | 3.5 | Hooks for user-supplied compression / Top 5 compression schemes supplied / Top 10+ compression schemes supplied and fully integrated
VOB – Voice of Business © 2009 Giant Plc.
Analyze Design Options: Design Options and Level of Effort © 2009 Giant Plc.
Requirement | Use-Case | Kano | Priority | Customer Sat. Score (Base / Full) | Effort Score (Base Effort / Full Effort)
Manage database interfaces | Verifying data content integrity | S | 4 | 1 / 3 | 1000 / 1500
Manage Network I/O | Moving client-server data | M | 3 | 1 / 3 | 5500 / 7500
Manage Network I/O | Optimizing data transfer | D | 3.5 | 1 / 3 | 12000 / 18000
Analyze Design Options: Customer Sat. Score = F(Kano, Priority, Feature Level) [same design-option matrix as the previous slide] © 2009 Giant Plc.
Analyze Design Options: Effort Score = ∑ effort estimates [same design-option matrix as the previous slide] © 2009 Giant Plc.
Concept Selection © 2009 Giant Plc.
Computing Productivity: historically, for each project we should know Size, Effort, and Duration. © 2009 Giant Plc.
Size (SLOC) = PP × (Effort (StaffYears) / B)^(1/3) × Duration (years)^(4/3)
Schedule Compression: the Manpower Buildup Index, MBI, relates to Effort (StaffYears) / Duration (years)^3. © 2009 Giant Plc.
MBI | Buildup Rate | Equation Output
1 | Slow | 7.3
2 | Mod. Slow | 14.7
3 | Moderate | 26.9
4 | Rapid | 55
5 | Very Rapid | 89
Rayleigh Curve © 2009 Giant Plc.
Balancing VOC and VOB © 2009 Giant Plc.
Business Value: $20,000,000; Feature Value: $688,000
Metric | Concept 1, MBI = 1 (Slow) | Concept 1, MBI = 3 (Moderate)
Duration (months) | 15.2 | 13
Effort (staff months) | 77.3 | 119.4
Released Defects | 14.1 | 267
Effort Cost | $966,250 | $1,492,500
Defect Repair Cost | $239,700 | $453,900
Duration Adjustment | – | $1,400,000
Net Value | $19,482,050 | $20,141,600
Balancing VOC and VOB © 2009 Giant Plc.
Business Value: $20,000,000; Feature Value: $688,000
Metric | Concept 1, MBI = 3 (Moderate) | Concept 1, MBI = 5 (Very Rapid)
Duration (months) | 13 | 11.7
Effort (staff months) | 119.4 | 220.7
Released Defects | 267 | 508
Effort Cost | $1,492,500 | $2,758,750
Defect Repair Cost | $453,900 | $663,600
Duration Adjustment | $1,400,000 | $1,400,000
Net Value | $20,141,600 | $18,665,650
VOB – Voice of Business © 2009 Giant Plc.
Capability to Deliver on Time: Probabilistic Scheduling © 2009 Giant Plc.
[Monte Carlo forecast chart: 1,000 trials, 4 outliers; frequency distribution of completion times between 250.00 and 300.00 days; certainty is 94.90% from -Infinity to 289.17 days; Upper Spec Limit (USL) marked.]
Process Improvement Standard Six Sigma DMAIC Process © 2009 Giant Plc.
Application to Software X: prerequisite – the software development process must be well defined. © 2009 Giant Plc.
DMAIC Example: Define – problem statement and goal statement. © 2009 Giant Plc.
Measure – Data Collection
Metrics collected to characterize the problem:
Total problems fixed prior to release, per project
Total post-release problems, per project
Types of post-release problems, overall and per project
© 2009 Giant Plc
Analysis: relationship between pre-release defects and project size – strong correlation. © 2009 Giant Plc.
Analysis: relationship between escaped defects and pre-release defects – fairly linear. Taken together, the first two results suggest no significant variation in defect containment effectiveness across projects. © 2009 Giant Plc.
Analysis: histogram of escaped defects by problem type – code-related problems are the most common. © 2009 Giant Plc.
Improve
Improve the effectiveness of code inspections (number of defects identified), treated as a function of:
The size of the unit inspected
The preparation time
The inspection time
The number of reviewers
Conduct a designed experiment to determine the optimal combination of these factors.
© 2009 Giant Plc.
Improve: pilot test the optimal combination on real projects to verify the results. © 2009 Giant Plc.
Control: establish a performance standard for code inspections (defects/KLOC); monitor the process; respond when unacceptable performance is observed. © 2009 Giant Plc.
Questions © 2009 Giant Plc.
Editor's Notes
1. Why apply Six Sigma to software development? Let's look at software development first. Progress has been made since the term “software crisis” was first coined in the 1960s to describe the poor quality of software development, but software projects are still notoriously late, over budget, and failing to satisfy their requirements. Consider the following statistics. Now, let's look at Six Sigma.
2. However, software development is not a typical Six Sigma application. While software development is process oriented, <click> inputs are often ill-defined, <click> outputs are often difficult to fully evaluate (“you can't use testing to verify the absence of errors”), and <click> performance is highly influenced by human factors, leading to a high degree of natural variation.
3. “Assessment and Control of Software Risks”, Capers Jones, 1994 (p. 29)
4. So, in applying Six Sigma to software development, we'll use both techniques. We'll apply DFSS to the fuzzy front end to improve our ability to: gather and interpret requirements; establish a product concept; develop product specifications; determine implementation requirements (i.e., CTPs, which for software could include things like the acquisition of new tools, new technology, new skill sets, etc.); and establish the project's schedule and cost structure. We'll use DMAIC to strengthen the actual software development process, which begins for our purposes with design. Our primary objectives will be to: improve overall productivity; improve quality; reduce the number of introduced defects; improve defect containment; and reduce the number of defects delivered to the customer.
  5. Let’s begin by looking at how we can use DFSS to improve the fuzzy front end.
6. For example, we may begin by looking at different geographic areas and the different types of customers in each of those areas (here a lead user is one who customizes the product themselves). In addition to geography, we could also divide customers based on size (small, medium, and large) or type of application (real-time financial services, real-time simulation, etc.). The idea is to identify all of the slices in the pie chart that represents the product's market, so that our view of requirements is complete. <Draw this on the board>
7. Kano analysis divides requirements into three basic categories: must-bes, satisfiers, and delighters. Illustrate the difference using the airline example, a shuttle flight between New York and Washington: getting there is a must-be; getting there in less time is a satisfier (the faster the better); enjoying excellent American champagne in crystal glasses during the flight is a delighter. The relationship between the categories and customer satisfaction is shown in the next slide. <click>
8. The information can be summarized in a matrix that shows the relationship between requirements and use cases, and the Kano classification and priority associated with each use case. Note: requirements and use cases should be numbered for tracking.
  9. < Walk through the diagram>
10. In the matrix, we start by summarizing the voice of the customer by listing the requirements and use cases, and the customer's view of each use case (Kano classification and priority). We then create two columns for each of our design options: one specifying the level of support for each use case, where 0 = no support, 1 = minimum support, 2 = average support, and 3 = strong support, based on the measures we determined earlier (<back two slides if necessary>); the other specifying the level of effort, in LOC, required to deliver the specified level of support for each use case. Note: in defining the design options, you must be realistic. That is, the resulting options must all be viable. Therefore, you must consider the relationships between use cases as well as conflicts between use cases. For example, providing strong support for one use case may dictate the level of support provided for another use case. Or, in order to provide strong support for one use case, we may have to provide strong support for another, even though this would not be our choice.
11. For each option we can calculate a customer satisfaction score. This will be a function of the Kano classification, priority, and level of support for each use case. This is not yet a science, and there are different ways to do this. The simplest is to ignore the Kano classification and calculate the customer satisfaction score by summing priority × level of support over all of the use cases. More complete approaches take the Kano classifications into account. For example: penalize the score for every must-be that is not supported; scores for must-bes = priority × level of support, with maximum = priority × average; scores for satisfiers = priority × level of support; scores for delighters = priority² × level of support. The most important thing is to be consistent so that the comparison is valid, and to regularly tune your approach based on pre-release estimates of customer satisfaction and post-release measurements of customer satisfaction.
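A minimal sketch of this scoring arithmetic in Python, using the three use cases from the design-option matrix earlier in the deck. The Kano weighting follows the variant described in this note; the unsupported-must-be penalty constant and all function names are illustrative assumptions, not the deck's method.

```python
# Sketch of a Kano-weighted customer satisfaction score (one variant of the
# approaches described in this note; the penalty constant is an assumption).

def sat_score(kano, priority, support, avg_support=2):
    """Score one use case. kano: 'M' = must-be, 'S' = satisfier, 'D' = delighter.
    support: 0 = none, 1 = minimum, 2 = average, 3 = strong."""
    if kano == "M":
        # Penalize unsupported must-bes; cap credit at 'average' support.
        return -10 * priority if support == 0 else priority * min(support, avg_support)
    if kano == "S":
        return priority * support
    if kano == "D":
        return priority ** 2 * support  # weight delighters more heavily
    raise ValueError(f"unknown Kano class: {kano}")

# (use case, Kano class, priority, level of support under the 'base' option)
base_option = [
    ("Verifying data content integrity", "S", 4,   1),
    ("Moving client-server data",        "M", 3,   1),
    ("Optimizing data transfer",         "D", 3.5, 1),
]

total = sum(sat_score(k, p, s) for _, k, p, s in base_option)
print(f"Customer satisfaction score (base option): {total}")
```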
  12. For each option, we can calculate an effort score by summing the level of effort estimates for all of the use cases.
13. Finally, we can refine our options by performing a rough benefit/cost analysis. This can be done by determining the center of the grid (calculating the median customer satisfaction score and the median level-of-effort score for the set of options), plotting the options, and selecting the options with the best benefit/cost profile for further analysis. These will likely be options in Quadrant 2, or Quadrants 1 and 3; options in Quadrant 4 are not attractive.
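A small sketch of the quadrant screening, assuming four hypothetical options with (satisfaction score, effort score) pairs; the scores are placeholders, and the quadrant labels follow this note.

```python
# Quadrant screening of design options around median satisfaction/effort.
from statistics import median

options = {"A": (30, 8000), "B": (55, 7500), "C": (42, 18000), "D": (20, 15000)}
sat_med = median(sat for sat, _ in options.values())
eff_med = median(eff for _, eff in options.values())

labels = {
    (True, True):   "Quadrant 2: high satisfaction, low effort (best)",
    (True, False):  "Quadrant 1: high satisfaction, high effort",
    (False, True):  "Quadrant 3: low satisfaction, low effort",
    (False, False): "Quadrant 4: low satisfaction, high effort (not attractive)",
}
for name, (sat, eff) in options.items():
    print(name, "->", labels[(sat >= sat_med, eff <= eff_med)])
```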
14. We use Putnam's equation as an illustration of an estimating model because it is based on thousands of real software projects, well documented, and well explained in published materials. However, there is no shortage of estimating models, and another model may be better for your situation. The important point is to use a consistent, quantitative approach for evaluating your capability to develop and deliver products based on the different options, and to continuously improve your estimating model based on actual results. Note: B in the above equation is called a skill factor. It's a function of project size and accounts for the additional effort required for integration. Note: Effort = B × (Size / (Productivity × Duration^(4/3)))^3, and Duration = (Size / (Productivity × (Effort/B)^(1/3)))^(3/4).
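The algebra above, as a runnable sketch: calibrate PP (process productivity) from one historical project, then explore the duration/effort tradeoff. The skill factor B and the project figures are illustrative assumptions, not calibrated values.

```python
# Putnam's software equation: Size = PP * (Effort/B)**(1/3) * Duration**(4/3)
# (Effort in staff-years, Duration in years, Size in SLOC).

def process_productivity(size, effort, duration, B):
    return size / ((effort / B) ** (1 / 3) * duration ** (4 / 3))

def effort_for_duration(size, pp, duration, B):
    # Rearranged: Effort = B * (Size / (PP * Duration**(4/3)))**3
    return B * (size / (pp * duration ** (4 / 3))) ** 3

B = 0.39                                          # assumed skill factor
pp = process_productivity(120_000, 30.0, 1.5, B)  # calibrate from a past project
for d in (1.2, 1.5, 1.8):
    e = effort_for_duration(120_000, pp, d, B)
    print(f"duration {d:.1f} yr -> effort {e:.1f} staff-years, "
          f"Effort/Duration^3 = {e / d**3:.1f}")  # MBI input (next note)
```

Because effort grows as the inverse fourth power of duration here, even modest schedule compression inflates effort sharply, which is the tradeoff the MBI table on the slide captures.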
15. Next, we calculate a manpower buildup index, which represents how quickly we staff projects. This is another historical parameter that is useful in characterizing the organization's software development capability. The more schedule compression, the shorter the duration of projects, but with a disproportionate increase in staffing (cost) and risk.
16. At the end of the project in month 10, we will have discovered 80.6% of the total defects. We can use this information in two ways. First, together with historical data on the effort required to find and fix defects, we can sanity-check our plan: for example, if it takes 23 hours to find and fix each defect, we can check whether the allocated effort is sufficient given the defect discovery profile. Second, since we would be delivering approximately 20% of the defects to customers, we might want to revise our plan to start testing earlier, or to institute other defect containment strategies (e.g., inspections) to reduce the anticipated number of defects at the start of testing. The goal is to ensure that our plans for staffing and testing are sufficient to deliver the required level of quality. Needless to say, if significant changes are made, earlier parts of our evaluation may have to be repeated.
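A sketch of the Rayleigh arithmetic behind this note, with the cumulative curve 1 - exp(-k·t²) calibrated so that 80.6% of defects are discovered by month 10. The total-defect estimate is a hypothetical input; 23 hours/defect is the note's example figure.

```python
import math

k = -math.log(1 - 0.806) / 10**2     # calibrated: 80.6% discovered by month 10
TOTAL_DEFECTS = 500                  # hypothetical total-defect estimate
HOURS_PER_DEFECT = 23                # find-and-fix effort from the note

def found_by(month):
    """Cumulative defects discovered by the given month (Rayleigh model)."""
    return TOTAL_DEFECTS * (1 - math.exp(-k * month**2))

for t in (4, 7, 10):
    print(f"month {t:2d}: {found_by(t):5.1f} defects found, "
          f"fix effort ~ {found_by(t) * HOURS_PER_DEFECT:6.0f} hours")
print(f"escaping to customers: ~{TOTAL_DEFECTS - found_by(10):.0f}")
```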
17. Here, we're comparing one design option against two possible schedules, one more aggressive than the other. Business value = intrinsic value of the product. Feature value = added value based on customer satisfaction rating. Duration adjustment = estimated value of delivering early. Net value = (business value + feature value + duration adjustment) − (effort cost + defect repair cost).
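As a quick check of the table arithmetic, a one-screen reproduction of the Concept 1, MBI = 3 column from the slide:

```python
business_value      = 20_000_000
feature_value       = 688_000
duration_adjustment = 1_400_000   # estimated value of delivering early
effort_cost         = 1_492_500
defect_repair_cost  = 453_900

net_value = (business_value + feature_value + duration_adjustment
             - effort_cost - defect_repair_cost)
print(f"Net value: ${net_value:,}")   # $20,141,600, matching the slide
```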
  18. This slide continues the comparison, this time between the moderate and very rapid schedules. Results show that very compressed schedules are not always advantageous because of the disproportionate increase in cost and effort.
19. Finally, once our overall plan has been established, the next step is to transform this into a detailed project plan and perform simulation using schedule estimates (best, average, and worst cases) for the critical path to determine the overall schedule risk. <click – next slide>
  20. Monte Carlo simulation of project schedule risk can be performed when a critical path plan has been developed (i.e., predecessors and successors identified for each task). For each task in the critical path the team is asked to provide ‘best case’, ‘expected’, and ‘worst case’ duration estimates (sometimes called “PERT” estimates). These are used as input to the simulation, typically run 1000 times, to get a probability distribution of expected completion times. Simulation results can be used to determine earliest and latest dates associated with a given probability (e.g., 95% as illustrated here), or to determine the probability associated with a particular date (e.g., ‘what is the probability the project will complete by 1/1/2003?’)
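A minimal sketch of this simulation, sampling each critical-path task from a triangular distribution over its (best, expected, worst) estimates; the task data and the 250-day target are illustrative assumptions.

```python
import random

critical_path = [   # (best, expected, worst) task durations in days
    (20, 30, 50),
    (40, 55, 90),
    (60, 80, 130),
    (25, 35, 60),
]

trials = sorted(
    sum(random.triangular(best, worst, expected)     # args: (low, high, mode)
        for best, expected, worst in critical_path)
    for _ in range(1_000)
)

print(f"95th-percentile completion: {trials[949]:.1f} days")
target = 250
print(f"P(finish within {target} days) = "
      f"{sum(t <= target for t in trials) / len(trials):.1%}")
```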
  21. Now, let’s look at the application of Six Sigma’s DMAIC process to the software development process.
22. Before describing the actual process, let me mention a few prerequisites. First, the software development process must be well defined in order to apply DMAIC to achieve process improvements.
  23. Perhaps the best way to illustrate how DMAIC can be applied to improve the software development process is to walk through an example. This slide shows the problem statement and goal statement developed during the Define phase <review slide>
24. The team decided to collect three types of metrics during the Measure phase to more fully characterize the problem: total problems fixed prior to release per project; total post-release problems per project; and types of post-release problems, overall and per project. These metrics were selected to provide a more complete picture of the company's defect containment capability. This data will allow the team to determine overall defect containment and study defect containment as a function of project characteristics and as a function of error type.
25. They analyzed the collected data in several ways. First, they looked at the relationship between pre-release defects and project size and found a strong correlation.
  26. Next, they looked at the relationship between escaped defects and pre-release defects and found that it was fairly linear. The first two results taken together suggest that there is no significant variation in defect containment effectiveness across projects.
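A sketch of this kind of linearity check using Python's statistics module (3.10+); the per-project defect counts below are made-up placeholders, not the team's data.

```python
from statistics import correlation, linear_regression  # Python 3.10+

pre_release = [120, 310, 95, 480, 220, 150]   # defects found before release
escaped     = [ 24,  66, 18, 101,  45,  33]   # defects found after release

r = correlation(pre_release, escaped)
fit = linear_regression(pre_release, escaped)
print(f"Pearson r = {r:.3f}")
print(f"escaped ~ {fit.slope:.3f} * pre_release + {fit.intercept:.1f}")
```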
27. Next, the team created a histogram of escaped defects showing the distribution of the different types of problems. This showed that code-related problems are the most common. The team decided that the root causes of the increased maintenance effort were the increased size of recent projects, which is unlikely to change, and poor error containment for code-related problems.
28. The team decided that they could improve the situation by improving the effectiveness of code inspections. They decided that the effectiveness of code inspections (the number of identified defects) was a function of the size of the unit inspected, the preparation time, the inspection time, and the number of reviewers, and decided to conduct a designed experiment to determine the optimal combination of these factors. <click – next slide>
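A sketch of laying out such an experiment as a two-level full factorial over the four factors named above; the factor levels are illustrative assumptions.

```python
from itertools import product

factors = {                      # assumed low/high levels for each factor
    "unit_size_loc":    (200, 600),
    "prep_hours":       (1, 3),
    "inspection_hours": (1, 2),
    "reviewers":        (2, 4),
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(runs)} runs in the 2^4 full factorial")
for i, run in enumerate(runs, 1):
    print(i, run)
# Each run is performed on real code, defects/KLOC recorded, and main
# effects and interactions estimated from the results.
```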
  29. Once they determined the optimal combination they conducted a pilot test using real projects to verify the results.
30. In order to ensure that the improvement will be maintained and managed, the team established a performance standard for code inspections based on defects/KLOC, and established a plan for monitoring the process and responding to situations where unacceptable performance is observed.
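A minimal sketch of that monitoring step: compare each inspection's yield against the performance standard and flag shortfalls. The standard and the inspection records are illustrative assumptions.

```python
STANDARD = 8.0   # assumed minimum acceptable defects found per KLOC

inspections = [("module_a", 2.1, 21), ("module_b", 1.4, 7), ("module_c", 3.0, 30)]
for name, kloc, defects in inspections:        # (unit, size in KLOC, defects found)
    rate = defects / kloc
    flag = "ok" if rate >= STANDARD else "BELOW STANDARD: investigate"
    print(f"{name}: {rate:.1f} defects/KLOC -> {flag}")
```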