17. In chapter 4 of Peopleware, DeMarco and Lister discuss a study done in 1985 by researchers at the University of New South Wales. The study analyzed 103 actual industrial programming projects and assigned each project a value on a “weighted metric of productivity”. The researchers then compared the average productivity scores of projects grouped by how the projects’ estimates were arrived at. They found that programmers are more productive when working against their own estimates than against estimates created by their boss, or even estimates created jointly with their boss. The study also found that on projects where estimates were made by third-party systems analysts, average productivity was higher still. This last result was a bit of a surprise, ruling out the theory that programmers are more productive when trying to meet their own estimates because they have more vested in them. But the real surprise was that the highest average productivity was on those projects that didn’t estimate at all.
Editor's Notes
Recognised by the PMI and Agile communities as a way to add an error range to estimates. The idea is that you can use this as a guide when giving estimates: the further away you are from the estimated completion date, the greater the error range in your estimates. SOMETHING SMELLS. I want to take you through a few concepts from the world of science that might help show us why.
Optimism bias is the demonstrated systematic tendency for people to be over-optimistic about the outcome of planned actions. This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events. It is one of several kinds of positive illusion to which people are generally susceptible. The UK government explicitly acknowledges that optimism bias is a problem in planning and budgeting and has developed measures for dealing with it (HM Treasury 2003). The UK Department for Transport and Department of Health require project planners to use so-called “optimism bias uplifts”. Another great example is smokers’ perception of their likelihood of getting cancer.
The planning fallacy is the tendency to underestimate task-completion times: to underestimate the time, costs, and risks of future actions while at the same time overestimating the benefits of those same actions. According to this definition, the planning fallacy results not only in time overruns, but also in cost overruns and benefit shortfalls. Planners tend to underestimate time for sickness, vacation, meetings, and other “overhead” tasks, and not to plan projects to a level of detail that allows estimation of individual tasks, like placing one brick in one wall. Dan North has a good analogy about measuring the coast of Britain: the more detailed you try to make your measurement, the longer the coast will get, ad infinitum.
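Dan North’s analogy is the classic coastline paradox: the finer your ruler, the longer the measured coast. A Koch curve (a fractal stand-in for a real coastline) makes the effect concrete, since each time the ruler shrinks by 3x the measured length grows by a factor of 4/3:

```python
# Coastline paradox sketch using a Koch curve as a stand-in coastline:
# every refinement shrinks the ruler by 3x and multiplies the measured
# length by 4/3, so the length grows without bound.

def koch_length(base_length: float, refinements: int) -> float:
    """Measured length of a Koch curve after `refinements` ruler-shrinks."""
    return base_length * (4 / 3) ** refinements

for n in range(5):
    print(f"ruler 1/{3**n:>3}: measured length = {koch_length(1.0, n):.3f}")
```

The same thing happens when you decompose a project into ever-smaller tasks: each pass of refinement surfaces work you hadn’t counted, so the “measured” total keeps growing.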
A cognitive bias is the human tendency to make systematic errors in certain circumstances based on cognitive factors rather than evidence. These are evolutionary failings, e.g. our love of sugary foods even though they are bad for us.
Suggests that estimates get better as the project progresses. I would suggest it only shows that as the backlog of work decreases, there’s less left to go wrong.
The cone of uncertainty is a best-case scenario. In reality, things are changing all the time. Knowing that reality, how can you use the cone of uncertainty to help you provide useful estimates?
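One practical use of the cone is to turn a single-point estimate into a range whose width depends on project phase. A minimal sketch, using the commonly cited multipliers (after Boehm and McConnell); treat these as a guide, not measured data for any particular team:

```python
# Cone of uncertainty sketch: map a project phase to the commonly cited
# (low, high) estimate multipliers, then apply them to a point estimate.

CONE = {
    "initial concept":       (0.25, 4.0),
    "approved definition":   (0.5,  2.0),
    "requirements complete": (0.67, 1.5),
    "design complete":       (0.8,  1.25),
    "detailed design":       (0.9,  1.1),
}

def estimate_range(point_estimate_days: float, phase: str) -> tuple[float, float]:
    """Return the (low, high) range the cone implies at a given phase."""
    low, high = CONE[phase]
    return point_estimate_days * low, point_estimate_days * high

print(estimate_range(100, "initial concept"))  # (25.0, 400.0)
print(estimate_range(100, "design complete"))  # (80.0, 125.0)
```

Note what this buys you: at inception, a “100 day” estimate honestly communicated is “somewhere between 25 and 400 days”, which is a very different conversation from a single number.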
Mediocristan follows the bell curve. The central theme is that the Gaussian is centered around an average, and the chance of deviation from this average falls off quickly.
Donald Rumsfeld got a lot of flak for using this phrase. "The Black Swan: The Impact of the Highly Improbable" by Nassim Nicholas Taleb. Pareto's law is a type of fat-tail distribution. With a fat tail there is no real average: only events that are likely to occur, and events that are unlikely to occur but whose effects are enormous in comparison with the likely ones.
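You can see the difference between Mediocristan and a fat tail in a few lines of standard-library Python. In the Gaussian sample no single draw matters; in the Pareto sample (shape parameter close to 1) a handful of extreme draws dwarf everything else, so the “average” tells you very little:

```python
# Contrast a Gaussian sample with a fat-tailed Pareto sample, using only
# the standard library. The measure: what share of the sample total the
# single largest draw contributes.
import random

random.seed(42)

gauss = [random.gauss(100, 15) for _ in range(10_000)]    # Mediocristan
pareto = [random.paretovariate(1.1) for _ in range(10_000)]  # fat tail

def dominance(sample):
    """Share of the sample total contributed by the single largest draw."""
    return max(sample) / sum(sample)

print(f"Gaussian: largest draw is {dominance(gauss):.3%} of the total")
print(f"Pareto:   largest draw is {dominance(pareto):.3%} of the total")
```

In the Gaussian case the biggest draw is a vanishing fraction of the total; in the Pareto case it is orders of magnitude larger. Mapped back to projects: one black-swan task can cost more than the rest of the project combined, and no average-based estimate will see it coming.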
People are fallible: we forget, we don’t think of everything, and we are overly optimistic. “Ah, we know that now, we won’t do it next time” — but then there is always something else. Always. We make mistakes and learn from them, but what about the next team? We can tell them our mistakes, but they will have their own. This is not a problem though, as they’re only estimates anyway, right?
A Complex Adaptive System (CAS) is a dynamic network of many agents acting in parallel, constantly acting and reacting to what the other agents are doing. The control of a CAS tends to be highly dispersed and decentralized. The overall behavior of the system is the result of a huge number of decisions made every moment by many individual agents. A CAS behaves and evolves according to three key principles: order is emergent as opposed to predetermined (c.f. neural networks); the system’s history is irreversible; and the system’s future is often unpredictable. The basic building blocks of the CAS are agents. Agents scan their environment and develop schemata representing interpretive and action rules. These schemata are subject to change and evolution.
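A toy sketch can make the agent/schema vocabulary concrete. The rule set below is entirely made up for illustration, not any specific CAS model from the literature: each agent scans the other agents’ last actions and adapts both its action and its schema (here, a single imitation threshold):

```python
# Toy Complex Adaptive System: decentralized agents, each with an evolving
# schema (an imitation threshold), all reacting to everyone else's actions.
# The rules are invented for illustration only.
import random

random.seed(1)

class Agent:
    def __init__(self):
        self.action = random.choice([0, 1])  # current behavior
        self.threshold = random.random()     # schema: when to imitate

    def decide(self, share):
        # Act according to the current schema...
        self.action = 1 if share > self.threshold else 0
        # ...and let the schema itself evolve in response to the environment.
        self.threshold += 0.1 * (share - self.threshold)

agents = [Agent() for _ in range(50)]
for _ in range(100):
    # Every agent scans the same snapshot, then all act "in parallel".
    shares = [
        sum(b.action for b in agents if b is not a) / (len(agents) - 1)
        for a in agents
    ]
    for a, share in zip(agents, shares):
        a.decide(share)

print(sum(a.action for a in agents), "of", len(agents), "agents doing action 1")
```

No agent is in charge and no outcome is scripted; whatever pattern the population settles into (or fails to settle into) emerges from many small local decisions — which is exactly why such systems resist prediction, and why estimating work done inside one is so hard.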