3. RECENT EVIDENCE
• Obamacare (HealthCare.gov) - only 1% of people managed to enroll successfully
through the site in its first week of operation
• The Surrey Integrated Reporting Enterprise Network (SIREN): not fit for
purpose; the team was not able to finish the project. (15M GBP failure)
• Digital Media Initiative (BBC): by 2013 the project was judged obsolete (much
cheaper commercial off-the-shelf alternatives existed by then) and was
scrapped by BBC management. (98M GBP)
• Expeditionary Combat Support System (US Air Force): no significant
capabilities ready on time; it would have cost $1.1bn more just to reach 1/4 of
the original scope. (1.1bn USD)
http://en.wikipedia.org/wiki/List_of_failed_and_overbudget_custom_software_projects
5. RISKS ARE INEVITABLE IN SOFTWARE.
WHY???
• Misunderstanding of requirements
• Lack of top management commitment and support
• Lack of adequate user involvement
• Failure to gain user commitment
• Failure to manage end user expectation
• Changes to requirements
• Lack of an effective project management methodology
Top Ten Lists of Software Project Risks:
Evidence from the Literature Survey, 2011
6. WHY WE DON'T DO RISK MANAGEMENT
• Boring/Hard to do
• Does not work
• We should think positively toward the project goals
• The data needed to do risk management effectively is lacking.
• Agile handles risks as a part of methodology
• One person cannot manage risks effectively
• People minimize the need for risk management by the absence of evidence
(nothing bad has happened yet).
• We don't really know what a RISK is
8. TYPES OF RISKS
• known knowns – general software development risks
• known unknowns - project specific risks
• unknown unknowns - “Black Swans”
9. KNOWN KNOWNS
• Misunderstanding of requirements
• Lack of top management commitment and support
• Lack of adequate user involvement
• Failure to gain user commitment
• Failure to manage end user expectation
• Changes to requirements
• Lack of an effective project management methodology
10. KNOWN KNOWNS. WHAT CAN WE DO?
• Be prepared, they will come
• Identify using checklists (top 5 – DeMarco, top 10 – Boehm, taxonomy method)
• Have a risk response plan in place
• Put avoidance strategies into estimates
• Have contingency plans for mitigation
11. BOEHM CHECKLIST. TOP 10
• 1 Personnel Shortfalls - avoid
• 2 Unrealistic Schedules and Budgets - mitigate
• 3 Developing the wrong software functions - avoid
• 4 Developing the wrong user interface - avoid
• 5 Gold-plating - avoid
• 6 Continuing stream of requirements changes - mitigate
• 7 Shortfalls in externally-performed tasks – avoid, transfer
• 8 Shortfalls in externally-furnished components – avoid, transfer
• 9 Real-time performance shortfalls – avoid, mitigate
• 10 Straining computer science capabilities - avoid
13. KNOWN UNKNOWNS
• Project/product specific risks – new business area
• New technology (e.g. Hadoop)
• Team from Ukraine, cultural differences
• High speed/performance objectives
14. KNOWN UNKNOWNS. WHAT WE CAN DO
• Be prepared, they MAY come
• Identify using brainstorming, Crawford Slip method, Weekly Meetings,
Documentation analysis etc.
• Have a risk response plan in place
• Put avoidance strategies into estimates
• Have contingency plans/reserves for risk management
18. UNKNOWN UNKNOWNS
• 50% of your team decides to quit
• A virus corrupts the source code repo
• The server crashes with no backups
19. BLACK SWANS. AMOUNT VS DAMAGE
[Chart: share of risks (amount) vs. share of damage, split into Black Swans and other risks; y-axis 0–100%]
According to Tom Kendrick, the most severe 20% of risks are black swans.
Black swans bring 50% of the damage
20. UNKNOWN UNKNOWNS. WHAT WE CAN DO
• Be prepared, they may occur
• Have a reserve (100% of the initial risk estimates)
• Don't be shocked by the risk
• Have a SWAT team to handle it
• Enjoy
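The "100% reserve" rule above can be sketched as a small calculation: whatever you have set aside for the known risks, set aside the same amount again for unknown unknowns. The risk names and figures below are hypothetical, for illustration only:

```python
# Sketch of the "100% reserve" rule for unknown unknowns.
# All risk names and amounts are hypothetical.

known_risk_estimates = {
    "key developer leaves": 15_000,     # weighted cost if it happens
    "requirements churn": 25_000,
    "third-party API delays": 10_000,
}

known_reserve = sum(known_risk_estimates.values())
unknown_unknowns_reserve = known_reserve   # 100% of the initial risk estimates
total_reserve = known_reserve + unknown_unknowns_reserve

print(f"Known-risk reserve:     {known_reserve}")
print(f"Unknown-unknowns extra: {unknown_unknowns_reserve}")
print(f"Total reserve:          {total_reserve}")
```
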
21. WHERE ELSE CAN I LOOK FOR RISKS
[Pie chart: sources of project risk]
• Planning and Control – 34%
• Requirement – 21%
• User – 18%
• Team – 11%
• Organizational Environment – 11%
• Complexity – 5%
22. CONCLUSION
• Don’t be afraid of risks
• Identify as soon as possible
• Use checklists and the people around you for risk identification and management
I came to scare all of you positively charged people with risks….
But it is better to be prepared than unaware…
By the way, risks can be not only negative but also positive
intro
* why software projects fail...
- we do plan, some in more detail, some in less
- but in the end we either implement the wrong thing or deliver the project late
there are many risks and nothing can be done about that; we have to live with it
NAME ONE OF THE MAIN REASONS WHY PROJECTS FAIL?
So what is the cause…? They don't know how? They haven't learned yet? In the States? Come on, everything there follows a process, right?
The technology programme was initiated by the director of BBC Technology Ashley Highfield in 2008.[1] It aimed to streamline broadcast operations by moving to a fully digital, tapeless production workflow at a cost of £81.7 million. Forecast to deliver cost savings to the BBC of around £18 million, DMI was contracted out to the technology services provider Siemens with consulting by Deloitte. Among the production features to be provided by DMI were a media ingest system; a media asset management system, unifying audio, video and stills archival; an online storyboarding system; and metadata storage and sharing. A core part of the system was formed by using Cinegy, a production suite originally developed prior to the DMI project by the BBC and selected by Siemens in 2008.[4][5] The DMI Programme Director was television producer and entrepreneur Raymond P. Le Gué.
The Expeditionary Combat Support System (ECSS) was a failed enterprise resource planning software project undertaken by the United States Air Force (USAF) between 2005 and 2012. The goal of the project was to automate and streamline the USAF's logistics operations by, in part, consolidating and replacing over 200 separate legacy systems. Development of the system was originally contracted to the Oracle Corporation in 2005, and was later supervised by Computer Sciences Corporation.
Harvard Business Review
Although uncertainty is accelerating, it isn't affecting all industries the same way. That's because there are two primary types of uncertainty: demand uncertainty (will customers buy your product?) and technological uncertainty (can we make a desirable solution?). How much uncertainty your industry faces depends on the interaction of the two.
Demand uncertainty arises from the unknowns associated with solving any problem, such as hidden customer preferences.
Technological uncertainty results from unknowns regarding the technologies that might emerge or be combined to create a new solution.
The main causes are listed here:
http://mooc.ee/MTAT.03.243/2015_spring/uploads/Main/top-10.pdf
12 studies on the topic of risks. They show that the problem is not project complexity. It all comes from misunderstanding the requirements, the project not being seen as important, or from not asking users and not thinking about them.
On requirements, the Business Analysis Benchmark report says:
The companies using best requirements practices will estimate a project at $3 million and
better than half the time will spend $3 million on that project. Including all failures, scope
creep, and mistakes across the entire portfolio of projects, this group will spend, on average,
$3.63 million per project.
(The Business Analysis Benchmark, page 3 of 30)
• The companies using poor requirements practices will estimate a project at $3 million and will
be on budget less than 20% of the time. 50% of time, the overrun on the project both in time
and budget will be massive. Across the entire portfolio of successes and failures, this
company with poor requirements practices will
Not managing expectations / user commitment: Bloomberg….. Eikon. In 2010, Thomson Reuters took on the mighty Bloomberg with a new trading terminal of its own. It was to be a bloody fight, but it's already over.
In construction it's hard too, but they have learned from it, and we IT project managers haven't. They also have unique projects with unique risks.
Boring/Hard to do/Does not work: we do risk management every day, so it is intuitive (before crossing the road, we look around).
We should think positively toward the project goals. Any "manage for success"
approach based on making sure risks don't materialize just sets a project up for disaster when they
do. For any sensibly organized project, the risks are not incidental to the project goal; they come
with the terrain.
No data for risks: however, the major risks facing most projects are common to all IT projects.
The effect of uncertainty on objectives
Success of the project depends on project risk management
No effect on objectives – no risk
Risks are often confused with their causes and effects.
Causes: the need to use an unproven new technology, the lack of skilled personnel, or the fact that the organization has never done a similar project before.
Risks: the possibility that planned productivity targets might not be met, that interest or exchange rates might fluctuate, or the chance that client expectations may be misunderstood.
Effects: being early for a milestone, exceeding the authorized budget, or failing to meet contractually agreed performance targets.
A real-life risk example: because of an extremely slow Solaris machine, there is a probability that we will spend a great deal of time debugging in that environment, which will increase the effort/cost of that work and push back the delivery date.
As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.
Donald Rumsfeld, 2002
These are simple risks that we know about and that are always there. Checklists work well here.
What we can do:
We know about the risk and we understand its impact.
These are simple risks that we know about and that are always there. We know about them. We may not want to notice them, but we do know.
Checklists work well here.
What we can do:
1 Personnel Shortfalls: Staffing with top talent; job matching;
team-building; morale-building; cross-training; prescheduling
key people.
2 Unrealistic Schedules and Budgets: Detailed, multisource
cost and schedule estimation; design to cost; incremental
development; software reuse; requirements scrubbing.
3 Developing the wrong software functions: Organizational
analysis; mission analysis; operational concept formulation; user
surveys; prototyping; early users’ manuals.
4 Developing the wrong user interface: Prototyping; scenarios;
task analysis.
5 Gold-plating: Requirements scrubbing; prototyping; cost-benefit
analysis; design to cost.
6 Continuing stream of requirements changes: High change
threshold; information-hiding; incremental development (defer
changes to later increments).
7 Shortfalls in externally-performed tasks: Reference-checking;
pre-award audits; award-fee contracts; competitive design or
prototyping; team-building.
8 Shortfalls in externally-furnished components:
Benchmarking; inspections; reference checking; compatibility
analysis.
9 Real-time performance shortfalls: Simulation; benchmarking;
modelling; prototyping; instrumentation; tuning.
10 Straining computer science capabilities: Technical analysis;
cost-benefit analysis; prototyping; reference checking. THIS IS WHEN SOFTWARE DEVELOPMENT IS USED FOR PURPOSES IT WAS NOT MEANT FOR.
Hadoop – because we just want to try it.
194 questions
Product Engineering. (project risks) The technical aspects of the work to be accomplished.
Development Environment. (project risks) The methods, procedures, and tools used to produce the product.
Program Constraints. (business risks) The contractual, organizational, and operational factors within which the software is developed but which are generally outside
of the direct control of the local management.
-----Requirements
Requirements are poorly documented
Requirements have been baselined but continue to change
Requirements are poorly defined, and further definition expands the scope of the project
Additional requirements are added
Vaguely specified areas of the product are more time-consuming than expected
-----Development environment
Facilities are not available on time
Development tools are not chosen based on their technical merits and do not
provide the planned productivity
Learning curve for new development tool is longer or steeper than expected
End-users
End-user insists on new requirements
End-user ultimately finds product to be unsatisfactory, requiring redesign and
rework
Schedule creation
Schedule, resources, and product definition have all been dictated by the
customer or upper management and are not in balance
Schedule is optimistic, "best case" (rather than realistic, "expected case")
Schedule omits necessary tasks
We know about the risk, but we don't understand its impact.
If the burndown shows that risks are not decreasing, the team can run a dedicated iteration in which it only fixes risks.
Risk burndown chart
Our simple risk census here describes each risk, provides an estimate of how likely the risk is to occur, the impact to the schedule if the risk did occur, and then the expected exposure to the risk which is the probability multiplied by the size of the loss. I recommend creating a risk census during the first sprint planning meeting and then updating it quickly during subsequent planning meetings as new risks are identified or as the probabilities or sizes of known risks change.
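The census described above can be sketched in a few lines: each risk gets a probability and an impact, and exposure is the product of the two. The risks and numbers below are hypothetical, for illustration only:

```python
# Minimal risk census: exposure = probability * impact.
# All risks, probabilities, and impacts below are hypothetical.

risks = [
    {"risk": "new technology underperforms",  "probability": 0.3, "impact_days": 20},
    {"risk": "key requirement misunderstood", "probability": 0.2, "impact_days": 15},
    {"risk": "external component late",       "probability": 0.5, "impact_days": 10},
]

for r in risks:
    r["exposure_days"] = r["probability"] * r["impact_days"]

total_exposure = sum(r["exposure_days"] for r in risks)

# Re-estimate probabilities and impacts at each sprint planning meeting and
# recompute; plotting total_exposure per sprint gives the risk burndown chart.
print(f"Total exposure: {total_exposure:.1f} days")
```
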
A Black Swan is an improbable and unexpected event that has three characteristics. First, it takes us completely by surprise, typically because it’s outside of our models.
Second, a Black Swan has a disproportionately large impact. Many rare and surprising events happen that aren’t such a big deal.
Third, after a Black Swan, people have a tendency to say that they saw it coming.
If you have done your homework well and planned all the risk work, set aside that much again for unknown unknowns.