Software project estimation - the differences between expert estimates and parametric estimates. The results of a study on the accuracy (effort, schedule and cost) of the two types of estimates is part of the presentation
Van Heeringen - Seminar Software Metrics Network Sweden 18-04-2013
1. Software Estimation and
Performance Measurement
Harold van Heeringen
Sogeti Sizing, Estimating & Control (SEC)
Sogeti Nederland B.V.
Stockholm, April 18, 2013
@haroldveendam
2. Topics
• Software projects
• Software estimation
• Expert vs. Parametric Estimates
• Challenge for parametric estimates
• E&PM process
• Accuracy of the different estimation types
• The Swedish software industry?
3. Software projects
• Software project industry: low maturity
− Low estimation maturity
− No or little formal estimation processes
− No or little use of historical data
• Lots of schedule and cost overruns
− Standish Chaos reports: Most projects fail or are at least
unsuccessful
• Low customer satisfaction rates
− In Europe: only slightly higher than the financial sector
4. Software project estimation
• Most of the projects are estimated by ‘experts’
− Bottom up, task by task effort estimation
• Usually very optimistic (>30%)
− Experts estimate, but other people (juniors) do the job
− Forgotten activities (e.g. test script reviews)
− No feedback loop with past projects: experts don’t learn from
past estimates and actuals
− No scenarios: duration, team size, etc.
− Not objective, transparent, verifiable or repeatable
• Not defendable!
− ‘Easy’ to push back by stakeholders
• No risk assessment (distribution of the estimate)
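A distribution can be attached even to a bottom-up expert estimate, for instance by Monte Carlo sampling over per-task min/likely/max ranges. A minimal sketch; the task figures and percentile choices are hypothetical, not from the deck:

```python
import random

# Hypothetical tasks: (min, most likely, max) effort in hours
tasks = [(30, 40, 80), (50, 60, 120), (20, 25, 60), (80, 100, 200)]

def simulate_totals(tasks, runs=10_000, seed=42):
    """Monte Carlo: sample each task from a triangular distribution
    and return the simulated project totals."""
    rng = random.Random(seed)
    return [sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
            for _ in range(runs)]

totals = sorted(simulate_totals(tasks))
p50 = totals[len(totals) // 2]          # median outcome
p85 = totals[int(len(totals) * 0.85)]   # a more defensible commitment level
print(f"P50: {p50:.0f} h, P85: {p85:.0f} h")
```

Quoting the P85 rather than a single number makes the optimism of a point estimate visible to stakeholders.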
6. Project Estimates
• Two types of project estimation:
− Expert estimation
− Parametric estimation
• Expert
− Knowledge and experience of experts
− Assign effort hours to tasks (bottom-up)
− Subjective, but always applicable
• Parametric
− Size measurement, historical data and tooling
− Size measurement methods: NESMA FPA, COSMIC, IFPUG
− Objective, but well documented specifications required
7. Comparing the two types
• Expert Estimates
− Bottom-up estimation
− Usually optimistic (up to 30% underestimation is common)
− Forgotten activities
− Hard to defend
− The expert is not going to do all the work
− The expert may not be an expert on the new project
− Are the real experts available?
• Parametric Estimates
− Top-down estimation
− Estimating & Performance Measurement process needed
− Effort = size * productivity
◦ Size is objectively measurable (COSMIC, FPA)
◦ Productivity from historical data (organization / ISBSG)
◦ Scenarios through tools (QSM / SEER-SEM / ISBSG)
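The parametric formula above fits in a few lines; the size and hours/FP figures below are made-up placeholders, not calibrated historical data:

```python
def parametric_effort(size_fp: float, hours_per_fp: float) -> float:
    """Effort = size * productivity, with productivity expressed as hours/FP."""
    return size_fp * hours_per_fp

# Hypothetical example: a 367 FP project at 8.0 hours/FP
print(parametric_effort(367, 8.0))  # → 2936.0 hours
```

In practice the hours/FP rate comes from the organization's own project database or from ISBSG benchmark data.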
8. Estimate & Performance Measurement
The E&PM process follows the Deming (PDCA) cycle:
• PLAN – Estimate
− Start: estimate request
− Size measurement (FPA), historical data, estimation tools, expert estimate
− Results: parametric estimation, expert estimation
• DO – Administrate
− Start: project start
− Continuous data collection: effort hours registration, defect registration, change measurement, project characteristics
− Result: project data
• CHECK – Evaluate
− Start: project completed
− Data collection and administration: collect project data, measure size, benchmark the project
− Results: growing project DB, performance measurement, updated expert knowledge
• ACT – Adjust & Report
− Start: periodically
− Fine-tune estimation model, analyse productivity, report productivity
− Results: management report, adjusted model
9. Challenge – ‘sell parametric estimates’
• Project/bid management still trusts experts
more than parametric estimates
− ‘More detail must mean more accurate, right?’
− ‘This project is very different from past projects’
− ‘I see that we don’t get any function points for module XYZ,
but we have to do a lot of work for that!’
− ‘I think the project is quite easy and I think that the
parametric estimate overestimates the effort’
− But what about: team size, duration, forgotten activities,
past performance.
• How can we convince project management?
11. Project at different durations
(Chart: effort in hours vs. project duration)
• Plan A
− Duration: 6 months
− Effort: 4,500 hours
− Max. team size: 5.8 FTE
− MTTD: 1.764 days
• Plan B
− Duration: 7 months
− Effort: 2,400 hours
− Max. team size: 2.7 FTE
− MTTD: 2.816 days
Size and productivity held constant.
Which duration do the experts have in mind?
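The two plans are roughly consistent with a Putnam-style fourth-power tradeoff between duration and effort at fixed size and productivity. A minimal sketch; the helper and the exponent are illustrative assumptions, not taken from the deck:

```python
def effort_at_duration(ref_effort_h: float, ref_duration_m: float,
                       new_duration_m: float, exponent: float = 4.0) -> float:
    """Putnam-style tradeoff: effort scales ~ 1/duration^4
    when size and productivity are held constant."""
    return ref_effort_h * (ref_duration_m / new_duration_m) ** exponent

# Plan A from the slide: 6 months, 4,500 hours. Stretching to 7 months:
print(round(effort_at_duration(4500, 6, 7)))  # → 2429 hours, close to Plan B's 2,400
```

The steep exponent is why compressing a schedule by even one month can inflate effort dramatically, and why the experts' implicit duration assumption matters so much.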
12. Sogeti SEC
• Sizing, Estimating & Control
− Certified (COSMIC) Function Point Analysts
− Metrics consultants
• Responsible for metrics part of a quotation.
− Size: FPA/COSMIC
− Estimation: SEER-SEM / QSM / Sogeti tool / ISBSG
− Product: Methodical Estimation Report (scenarios)
− Pricing: EUR/FP
− Quality: Defects/FP
• Centers of Excellence: MS .NET, Java, Oracle,
mobile, Drupal, SharePoint, BI, etc.
13. Assignment 2005
• Build estimation instrument
− Gain time and effort in estimating bids
− Accurate enough to rely on
− Flexible:
◦ Estimate onshore / offshore and hybrids
◦ Calculate different test strategies
◦ Take into account complexity
◦ Implement Deming cycle (PDCA)
• Give scenarios for duration!
14. First version Estimating Wizard (2005)
• Try to grasp the duration / effort tradeoff in a
model
− Tuned with experience data
Hours/FP (average complexity)
Duration in months: 3½ 4 4½ 5 5½ 6 6½ 7 7½ 8
0–250 FP: 10.1 8.9 8.1 7.7 6.9
250–500 FP: 9.1 8.0 7.3 6.9 6.2
500–750 FP: 8.6 7.6 6.9 6.5 5.9
750–1000 FP: 8.3 7.3 6.6 6.3 5.6
1000–1250 FP: 8.1 7.1 6.5 6.2 5.5
1250–1500 FP: 7.9 6.9 6.3 6.0 5.4
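Applying such a matrix amounts to a rate lookup followed by the size-times-productivity multiplication. A minimal sketch using only the first rate in each band (the duration dimension is simplified away, and the helper is hypothetical, not the wizard itself):

```python
# Hours/FP per size band, from the 2005 wizard's "average complexity" table;
# only one rate per band is kept here for illustration.
RATES = [(250, 10.1), (500, 9.1), (750, 8.6), (1000, 8.3), (1250, 8.1), (1500, 7.9)]

def estimate_hours(size_fp: float) -> float:
    """Look up the hours/FP rate for the size band and multiply by size."""
    for upper_bound, hours_per_fp in RATES:
        if size_fp <= upper_bound:
            return size_fp * hours_per_fp
    raise ValueError("size above calibrated range")

print(estimate_hours(367))  # 367 FP * 9.1 h/FP = 3339.7 hours
```

Note how the rate drops as size grows: the model encodes an economy of scale per function point for larger projects.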
15. Estimating Wizard 2013
Input
• Functional design parameters
− Functional design: Yes
− Overlap: Yes, manually (5)
− Language: English
− Availability key users: Low
− Location: On site
• Build and test parameters
− Development tool: Java
− Construction: 30% onshore / 70% offshore
− Translation FD required: No
− System test approach: TMap Heavy
− System test strategy: scripting and design in NL, execution in India
− Tools/methodologies: 5 – Average
− Complexity: 6
− Development team: 5 – Average
− Reuse: 0 – None (5%)
• General parameters
− Size: 367 FP
− Start date: 01-05-13
− Project control: 2 hrs/week
− Risk surcharge: 10%
− Warranty: 4%
− Organization type: Banking
− Quality documentation: 6
− Non-functional req.: Average (0)
− Scenario interval: 2.0 weeks
16. EW – output
• Functional design – no schedule scenarios
− Duration in weeks: 17.6
− Design complete: 4-05-11
− Total effort: 1,975 hours
− Effort per FP: 2.53
− Effort cost: €208,531
− Additional cost: €14,815
− Total cost: €223,346
− Cost per FP: €286
− Average team size: 2.80
Data altered due to company security reasons
22. Estimation accuracy results
• Average time spent
− Expert: 36.1 hours
− EW (including FPA/COSMIC measurement): 22.5 hours

                     Expert Estimate   Est. Wizard Estimate
Effort accuracy
  Average                0.778               0.886
  St.dev.                0.319               0.207
  Median                 0.696               0.904
Duration accuracy
  Average                0.742               0.862
  St.dev.                0.405               0.272
  Median                 0.825               0.871
Cost accuracy
  Average                0.772               1.184
  St.dev.                0.316               0.423
  Median                 0.708               1.151
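The deck does not define the accuracy ratio explicitly, but since values below 1 are read as optimistic and values above 1 as pessimistic, it behaves like estimate divided by actual. A small sketch with made-up project figures:

```python
import statistics

def accuracy_stats(estimates, actuals):
    """Accuracy ratio per project = estimate / actual
    (< 1 means optimistic, > 1 means pessimistic)."""
    ratios = [e / a for e, a in zip(estimates, actuals)]
    return (statistics.mean(ratios),
            statistics.stdev(ratios),
            statistics.median(ratios))

# Hypothetical effort figures (hours) for three completed projects:
est = [2400, 3100, 1800]
act = [3000, 3400, 2500]
mean, sd, med = accuracy_stats(est, act)
print(f"avg {mean:.3f}, sd {sd:.3f}, median {med:.3f}")
```

Tracking this ratio per project is exactly the feedback loop the expert process lacks: it turns "we are usually optimistic" into a measured calibration factor.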
23. Results
• Expert estimates take a lot of time too!
− On average 60% more time than parametric estimates
− More than one expert has to read through all the
documentation
− Discussions and agreements take time
• Parametric estimates are more accurate!
− Effort and duration estimates on average still optimistic, but
less optimistic than expert estimates
− Cost estimate pessimistic, but still closer to actuals than
Expert estimate
• Experts might win the project, but the result will
be overruns!
24. Cost of high and low estimates
• Low estimates: non-linear extra costs
− Planning errors
− Team enlargement: more expensive, not faster
− Extra management attention / overhead
− Stress: more defects, lower maintainability
• High estimates: linear extra costs
− Extra hours will be used
25. Conclusions of the study
• Estimating Wizard
− Higher effort estimation accuracy
− Higher duration estimation accuracy
− Higher cost estimation accuracy (although ratio >1)
− Fewer hours spent than on expert estimates
− Standard WBS, historical data collection and parameter
calibration improve the maturity of the process
• Next steps
− Analyze why costs are overestimated
− Try to identify the projects where EW estimate is enough
− Use the results to convince project management and bid
management to use parametric estimating more
26. Do we have time for some other thoughts?
• What about the maturity in the Swedish market?
29. Where are the Swedish projects?
• Please submit project data to ISBSG !
• Independent, anonymous
• Free benchmark report
• ISBSG flyer available
www.isbsg.org
30. Thank you for your attention!
@haroldveendam
President of ISBSG – International Software Benchmarking Standards Group (www.isbsg.org)
Board member of NESMA – Netherlands Software Metrics Association (www.nesma.nl)
IAC member of COSMIC (www.cosmicon.com)
Harold.van.heeringen@sogeti.nl
Harold van Heeringen
Senior Consultant Software Metrics /Software Cost Engineer
Sogeti Sizing, Estimating & Control (SEC)