
Crowdsourced Probability Estimates: A Field Guide

Slides from Tony Martin-Vegue's presentation at SIRAcon 2018, February 7, 2018

"Crowdsourced Probability Estimates: A Field Guide"

Abstract:

Probability estimates are the cornerstone of any good risk assessment in which data is sparse or expensive to come by, and are often thought of as one of the best ways to supplement existing information with subject matter expertise. Many risk analysts, however, can run into issues when trying to integrate the opinions of many subject matter experts into a risk management program.

Some of these problems are: seemingly contradictory probability estimates, bias that can creep into results and the challenge of collecting and using large amounts of data.

This talk covers the presenter's own experience building a program within a company to crowdsource probability estimates from a varied group of subject matter experts, controlling for bias, weeding out those who aren't exactly experts, and scaling the program for large companies. Participants will be surprised to find that they already have many of the tools they need to get started, such as the ability to email surveys and simple models that create distributions from the many probability estimates they collect.
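
As an illustration of those "simple models," the sketch below turns a single expert's three-point estimate (minimum, most likely, maximum, the format collected on slide 32) into a probability distribution by sampling a triangular distribution. The triangular shape and the example numbers (Respondent 1's values from the deck) are illustrative assumptions, not necessarily the model used in the talk.

    import numpy as np

    # Hypothetical three-point estimate from a single expert, in percent.
    # The numbers are Respondent 1's values from the table on slide 32;
    # the triangular shape is an assumed, illustrative choice.
    expert_min, expert_mode, expert_max = 10, 25, 35

    rng = np.random.default_rng(seed=42)
    samples = rng.triangular(expert_min, expert_mode, expert_max, size=100_000)

    print(f"Mean estimate: {samples.mean():.1f}%")
    print(f"90% interval: {np.percentile(samples, 5):.1f}% to {np.percentile(samples, 95):.1f}%")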


Crowdsourced Probability Estimates: A Field Guide

  1. @tdmv #siracon2018 Crowdsourced Probability Estimates: a Field Guide
  2. I’m Bob. Best city ever! FinTech firm. Risk!
  3. Ransomware epidemic! What’s the risk?
  4. “More than 91 percent [of] clients victimized by ransomware”
  5. “The research shows that the ransomware epidemic affects over 91% of companies. What is the probability of this happening here?”
  6. 100%... really? Show me your workpapers. How many experts did you poll?
  7. Problem! Junk research. Cognitive bias. 1 expert.
  8. The Fix • Collect and vet research quality • Identify and control for bias • Crowdsource expert assessments
  9. Repeatable
  10. Assessment • Gauge the level of calibration among experts • Weigh the expert opinions and combine them (math or group) • Integrate into risk assessment
  11. Expert Judgement • Interdisciplinary • Acquired knowledge • Predictive judgements • Use sparingly
  12. “Expert elicitation should build on the best available research and analysis and be undertaken only when, given those, the state of knowledge will remain insufficient to support timely informed assessment and decision making.” - M. Granger Morgan
  13. “More than 91 percent [of] clients victimized by ransomware”1 • First Google result • “Clients” are MSPs • Probably not statistically significant (not disclosed) Source: Datto’s 2016 Global Ransomware Report http://pages.datto.com/rs/572-ZRG-001/images/DattoStateOfTheChannelRansomwareReport2016_RH.pdf
  14. Same content as slide 13, with the annotation “Survey”
  15. 1 infection last year
  16. Ransomware epidemic! Objection, Your Honor. Leading the witness!
  17. Literature Review • Common in Social Sciences • Cornerstone of any research project • Vet quality of sources
  18. Gathered and read 12 reports on ransomware • Varying from surveys to empirical studies to research • Excluded 6
  19. (no transcribed text)
  20. Controlling for Bias
  21. Availability Bias: what we read about vs. what actually happens in the cyber threat landscape
  22. Overconfidence Effect: “Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.” - Daniel Kahneman
  23. InfoSec Folklore Effect
  24. “60% of small companies that suffer a cyber attack are out of business within six months.” “80% of all cyber attacks originate from the inside.” “75 percent of companies have experienced a data breach in the past 12 months.”
  25. Eliciting Experts • Crowdsourcing • Identified some bias • Using 6 sources
  26. Finding Experts • 15 respondents (SIRA and FAIR Institute) • Self-selected InfoSec experts
  27. Perfectly Calibrated Seed Questions
  28. Control for Bias / Determine Calibration: seed questions (general trivia questions). Overconfident → discard or weigh lower; perfectly calibrated → use estimate; underconfident → discard or weigh lower.
  29. Calibration Test
  30. Tally the Responses • Convert the stated confidence percentages to decimals • Add them up – this is the “expected” number correct • Compare against the total number actually correct* (*From “How to Measure Anything in Cybersecurity Risk” by Doug Hubbard and Richard Seiersen; a worked sketch of this tally appears after the slide list.)
  31. Results:
      Name | Question score | Calibration score | Calibration
      Respondent 1 | 8/10 | 8.2 | Perfectly calibrated
      Respondent 2 | 7/10 | 8.2 | Slightly overconfident
      Respondent 3 | 8/10 | 8.6 | Perfectly calibrated
      Respondent 4 | 7/10 | 7.4 | Perfectly calibrated
      Respondent 5 | 8/10 | 7.7 | Perfectly calibrated
      Respondent 6 | 6/10 | 7.2 | Slightly overconfident
      Respondent 7 | 9/10 | 7.2 | Underconfident
      Respondent 8 | 6/10 | 7.8 | Overconfident
      Respondent 9 | 7/10 | 6.9 | Perfectly calibrated
      Respondent 10 | 5/10 | 6.7 | Overconfident
      Respondent 11 | 8/10 | 7.7 | Perfectly calibrated
      Respondent 12 | 5/10 | 7.4 | Overconfident
      Respondent 13 | 7/10 | 7.1 | Perfectly calibrated
      Respondent 14 | 8/10 | 7.1 | Perfectly calibrated
      Respondent 15 | 8/10 | 8.7 | Perfectly calibrated
  32. Probability Estimates:
      Respondent | Calibrated | Min | Mode | Max
      Respondent 1 | Yes | 10 | 25 | 35
      Respondent 2 | No | 27 | 32 | 34
      Respondent 3 | Yes | 15 | 35 | 65
      Respondent 4 | Yes | 1 | 5 | 36
      Respondent 5 | Yes | 1 | 2 | 65
      Respondent 6 | No | 20 | 25 | 40
      Respondent 7 | Yes | 10 | 20 | 60
      Respondent 8 | Yes | 1 | 50 | 100
      Respondent 9 | No | 27 | 30 | 34
      Respondent 10 | No | 25 | 31 | 35
      Respondent 11 | Yes | 0 | 5 | 40
      Respondent 12 | No | 5 | 10 | 20
      Respondent 13 | No | 1 | 5 | 20
      Respondent 14 | No | 5 | 35 | 80
      Respondent 15 | Yes | 20 | 30 | 40
  33. Diversity of Opinions
  34. Checklist for Vastly Differing Opinions • Are they calibrated? Discard the probability estimates; or coach on ranges and calibration; or integrate into the final assessment but weigh lower. • Misunderstood the question, research or assumptions? Follow up with the expert and review their understanding of the request; if there was a misunderstanding, ask for a reassessment. • Different world-view? Let the expert challenge your assumptions; consider multiple risk assessments.
  35. Source: Doran & Zimmermann 2009, Anderegg et al. 2011 and Cook et al. 2013.
  36. “Science is not a matter of majority vote. Sometimes it is the minority outlier who ultimately turns out to have been correct. Ignoring that fact can lead to results that do not serve the needs of decision makers.” - M. Granger Morgan
  37. Probability Estimates (the same table as slide 32, shown again)
  38. Methods for Combining • Behavioral: Delphi Technique, Nominal Group Technique, negotiation to reach a consensus • Mathematical: averaging (don’t use), Linear Opinion Pool, methods using Bayes (see the opinion-pool sketch after the slide list)
  39. All Respondents, Equal Weight
  40. All Respondents, Equal Weight
  41. All Respondents, Weighted on Calibration
  42. All Respondents, Weighted on Calibration
  43. Calibrated Only
  44. Calibrated Only
  45. Software for Combining • Free: Excalibur, R • Paid: ModelRisk, Crystal Ball, @Risk
  46. Happy! Thanks! Nice work!
  47. (no transcribed text)
  48. Thank You • Everyone who participated in my exercise • Wade Baker • Jay Jacobs • Cyentia Institute • SIRA • The FAIR Institute
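
Slide 30 describes the calibration tally from Hubbard and Seiersen: convert each stated confidence to a decimal, sum the decimals to get the “expected” number correct, and compare that against the number actually answered correctly (the two scores shown side by side on slide 31). A minimal Python sketch of that arithmetic follows; the sample answers and the ±1.5 classification band are illustrative assumptions, not the presenter's exact thresholds.

    # Calibration tally per slide 30 (method from Hubbard & Seiersen,
    # "How to Measure Anything in Cybersecurity Risk").
    def tally_calibration(confidences_pct, answers_correct, band=1.5):
        """Compare the expected number correct against the actual number correct."""
        # Sum of stated confidences (as decimals) = expected number correct
        # for a perfectly calibrated respondent.
        expected = sum(c / 100 for c in confidences_pct)
        actual = sum(answers_correct)
        if actual < expected - band:          # assumed threshold, for illustration
            label = "Overconfident"           # more certainty claimed than earned
        elif actual > expected + band:
            label = "Underconfident"          # more answers right than confidence implied
        else:
            label = "Perfectly calibrated"
        return expected, actual, label

    # Example: ten trivia seed questions answered by one respondent.
    confidences = [90, 80, 70, 60, 90, 80, 70, 90, 80, 90]  # stated confidence per question
    correct     = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]            # 1 = answered correctly
    expected, actual, label = tally_calibration(confidences, correct)
    print(f"Expected correct: {expected:.1f}, actually correct: {actual} -> {label}")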
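
Slides 38 through 44 combine the estimates mathematically and compare three schemes: all respondents with equal weight, all respondents weighted on calibration, and calibrated respondents only. The sketch below shows one way to build a linear opinion pool by Monte Carlo, treating each respondent's min/mode/max from slide 32 as a triangular distribution and drawing from respondents in proportion to their weights. The triangular assumption, the 2:1 calibration weighting, and the use of only the first five respondents are illustrative choices; the deck does not show the presenter's exact model.

    import numpy as np

    rng = np.random.default_rng(seed=7)

    # (calibrated?, min, mode, max) per respondent, from the slide 32 table
    # (first five respondents only, for brevity).
    respondents = [
        (True,  10, 25, 35),
        (False, 27, 32, 34),
        (True,  15, 35, 65),
        (True,   1,  5, 36),
        (True,   1,  2, 65),
    ]

    def opinion_pool(experts, weights, n=100_000):
        """Linear opinion pool via Monte Carlo: each draw comes from one expert's
        triangular distribution, with experts chosen in proportion to their weights."""
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        picks = rng.choice(len(experts), size=n, p=w)
        samples = np.empty(n)
        for i, (_, lo, mode, hi) in enumerate(experts):
            mask = picks == i
            samples[mask] = rng.triangular(lo, mode, hi, size=mask.sum())
        return samples

    # All respondents, equal weight (slides 39-40).
    equal = opinion_pool(respondents, [1] * len(respondents))
    # All respondents, weighted on calibration -- assumed 2:1 weighting (slides 41-42).
    weighted = opinion_pool(respondents, [2 if cal else 1 for cal, *_ in respondents])
    # Calibrated respondents only (slides 43-44).
    calibrated = [r for r in respondents if r[0]]
    cal_only = opinion_pool(calibrated, [1] * len(calibrated))

    for name, s in [("Equal weight", equal), ("Weighted", weighted), ("Calibrated only", cal_only)]:
        print(f"{name}: mean {s.mean():.1f}%, 90% interval {np.percentile(s, 5):.0f}-{np.percentile(s, 95):.0f}%")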
