1. Risk Analysis for Dummies
   Presented by Nick Leghorn
2. Credentials
   B.S., Security and Risk Analysis, The Pennsylvania State University
   Risk Analyst for a government contractor
   NSA Certified INFOSEC Professional
   Speaker at The Last HOPE: “The NYC Taxi System: Privacy Vs. Utility”
3. This talk is for…
   IT professionals
   Penetration testers
   Network security folk
   Anyone who needs to explain “risk”
4. WARNING
   The risk analysis process depends on the imagination, creativity and integrity of the individuals doing the analysis. The mere application of these techniques without appropriately talented staff does not ensure a proper and thorough risk analysis product.
5. NOTICE
   The data, charts and information contained within this presentation are completely notional and do not represent any real data. No sensitive or otherwise classified information is contained within this presentation.
   FBI, please don’t arrest me.
6. The Story of Nate and Cliff
7. What is “Risk”?
   Seriously.
   There are microphones, use them!
8. What is “Risk”?
   Any uncertainty about the future
   Technically can be both positive and negative
   Security questions focus only on negative outcomes
9. The Six Questions of Risk Management
   Risk Assessment:
   - What can happen?
   - How likely is it to happen?
   - What are the consequences if it happens?
   Risk Management:
   - What can be done?
   - What are the benefits, costs and risks of each option?
   - What are the impacts of each option on future options?
10. The Risk Equation
    For every event and outcome pair, risk is the combination of:
    - the probability of the event
    - the probability of the outcome given that event
    - the value of that event and outcome pair
    In its simplest quantitative form: Risk = P(event) × P(outcome | event) × Value(event, outcome), combined over every event and outcome pair.
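To make the risk equation concrete, here is a minimal Python sketch. The events, outcomes, probabilities and dollar values below are entirely invented for illustration; the only thing taken from the slide is the combination of P(event), P(outcome | event) and the value of the pair.

```python
# Each entry: (event, outcome, P(event), P(outcome | event), value of the pair).
# All figures are hypothetical.
pairs = [
    ("phishing email", "credentials stolen", 0.30, 0.10, 50_000),
    ("phishing email", "no compromise",      0.30, 0.90, 0),
    ("DoS attack",     "site down 1 day",    0.05, 0.60, 120_000),
    ("DoS attack",     "no outage",          0.05, 0.40, 0),
]

# Risk contribution of each (event, outcome) pair, then the total across all pairs.
for event, outcome, p_event, p_outcome, value in pairs:
    contribution = p_event * p_outcome * value
    print(f"{event} -> {outcome}: {contribution:,.2f}")

total_risk = sum(p_e * p_o * v for _, _, p_e, p_o, v in pairs)
print(f"Total expected loss: {total_risk:,.2f}")
```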
11. Scope
    Scope is the set of {protector, threat, asset}
12. Scope
    Asset
    - Something which provides a benefit to the possessor
    - Something which the protector is charged with safekeeping
    Protector
    - The entity charged with safekeeping of the asset
    - An entity where the loss of the asset would be harmful
    Threat
    - An entity with the desire to deny the asset to the protector
    - A force which could destroy, disrupt, or otherwise harm the asset
13. For Nate and Cliff…
    Protector: Nate and the NOC
    Threat: “Hackers”
    Asset: Company information
14. Back to the equation…
    Probability?
15.–17. Calculating probability
    “Of all the things that can happen, how likely is each one?”
    Think of the universe as a box: for a coin flip, the box divides into Heads, Tails, and “coin rolls away and is lost.”
    The size of each “box” is the probability.
    Strive for MECE (mutually exclusive, collectively exhaustive) outcomes.
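A small sketch of the “universe as a box” idea: a MECE set of outcomes should fill the whole box, i.e. their probabilities should sum to 1. The specific numbers, including the chance of the coin rolling away, are made up.

```python
# Hypothetical MECE outcome set for the coin-flip universe.
outcomes = {"heads": 0.4995, "tails": 0.4995, "coin rolls away and is lost": 0.001}

# The boxes must fill the universe: probabilities sum to 1.
assert abs(sum(outcomes.values()) - 1.0) < 1e-9
print(sum(outcomes.values()))
```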
18. “You must not say ‘never.’ That is a lazy slurring-over of the facts. Actually, [risk analysis] predicts only probabilities. A particular event may be infinitesimally probable, but the probability is always greater than zero.”
    Second Foundation (Isaac Asimov)
19. Calculating probability
    Past data
    - Events of concern / total events
    - 3 successful attacks / 30,000 attempts = 0.0001 probability
    “Binning your gut”
    - Low, Medium, High
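As a sketch of the two approaches on this slide, the snippet below estimates a probability from past data (using the 3-in-30,000 figure from the slide) and then bins it onto a Low/Medium/High gut scale. The bin cut-offs are arbitrary assumptions and would need to be agreed on per analysis.

```python
# Probability from past data: events of concern / total events.
successful_attacks = 3
total_attempts = 30_000
probability = successful_attacks / total_attempts   # 0.0001, as on the slide

def bin_probability(p):
    """Map a numeric probability onto an ordinal Low/Medium/High scale.
    The cut-offs here are illustrative only."""
    if p < 0.001:
        return "Low"
    elif p < 0.05:
        return "Medium"
    return "High"

print(probability, bin_probability(probability))   # 0.0001 Low
```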
20. Remember:
    Probability must be calculated for BOTH
    - Probability of an event
    - Probability of an outcome GIVEN that the event has taken place
21. Why does “valuation” matter?
    Some events are more concerning than others
    - Death in a car accident
    - Death in a plane crash
    Value of the (e,o) pair can be monetary, time based, goodwill based, whatever is of most concern
22.–30. The process
    (Diagram of the risk analysis process, built up step by step across slides 22–30.)
31. Method 1: The Simple Chart
    THIS IS NOT A “RISK MATRIX”!
32. Method 2: The Probabilistic Chart
    (Probability of event) * (Probability of outcome given event)
33. Method 3: Annualized Loss Expectancy
    (Probability from Method 2) * (Loss from event)
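A short sketch tying Methods 2 and 3 together: the combined probability from Method 2 multiplied by the loss from the event gives an annualized loss expectancy. All figures below are illustrative assumptions, not data from the deck.

```python
# Method 2: combined probability of the event and the outcome of concern.
p_event = 0.10                     # probability the event occurs in a year
p_outcome_given_event = 0.25       # probability of this outcome given the event
combined_probability = p_event * p_outcome_given_event

# Method 3: annualized loss expectancy = combined probability * loss from event.
loss_from_event = 2_000_000        # dollar loss if that outcome occurs
annualized_loss_expectancy = combined_probability * loss_from_event

print(f"ALE: ${annualized_loss_expectancy:,.0f}")   # ALE: $50,000
```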
34. Shortcuts and Methodologies
35. How to use a “Factor Based Model”
    “Factor Based Models” provide a formula for quick and easy assessment of a range of items and rank ordering of them.
    WARNING: This system only provides a RELATIVE ranking of the items listed.
36. How to use a “Factor Based Model”
    - Assign a range of numbers to each factor
      - Try to use even ranges of numbers (1-4)
      - Ensure that the higher the number, the more it points towards whatever the issue at hand is
    - Evaluate each factor using that range
    - Add up the combined score
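Below is a minimal sketch of a Factor Based Model scorer following the steps above. The factor names, ranges and ratings are placeholders, and as the previous slide warns, only the relative ordering of the totals is meaningful.

```python
def fbm_score(ratings, low=1, high=4):
    """Sum factor ratings after checking each falls inside the agreed range."""
    for factor, value in ratings.items():
        if not low <= value <= high:
            raise ValueError(f"{factor} rating {value} outside range {low}-{high}")
    return sum(ratings.values())

# Two hypothetical targets rated on the same factors with the same 1-4 scale.
targets = {
    "server room": {"criticality": 4, "accessibility": 2, "vulnerability": 3},
    "lobby kiosk": {"criticality": 1, "accessibility": 4, "vulnerability": 2},
}

# Rank order the items by combined score, most concerning first.
ranked = sorted(targets, key=lambda t: fbm_score(targets[t]), reverse=True)
print(ranked)   # ['server room', 'lobby kiosk']
```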
37. CARVER: Target Selection
    Criticality
    Accessibility
    Recoverability
    Vulnerability
    Effect
    Recognizability
38.–39. CARVER Analysis: The Next HOPE
    P: HOPE Staff | A: Enjoyment of attendees | T: Rogue attendee
    Scale: 1-6
    6 = Contributes highly to attack success probability
    1 = Does not contribute to attack success probability
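Using the same factor-based scoring idea with the CARVER factors and the 1-6 scale from this slide. The ratings below are invented for illustration, since the slide's actual chart is not part of this transcript.

```python
carver_factors = ["Criticality", "Accessibility", "Recoverability",
                  "Vulnerability", "Effect", "Recognizability"]

# Hypothetical ratings for one candidate target
# (6 = contributes highly to attack success probability, 1 = does not contribute).
ratings = {"Criticality": 5, "Accessibility": 3, "Recoverability": 2,
           "Vulnerability": 4, "Effect": 5, "Recognizability": 6}

total = sum(ratings[f] for f in carver_factors)
print(f"CARVER total: {total} (out of {6 * len(carver_factors)})")  # 25 out of 36
```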
40. EVIL DONE: Target Selection
    Exposed
    Vital
    Iconic
    Legitimate
    Destructible
    Occupied
    Near
    Easy
41. DSHARPP: Target Selection
    Demography
    Symbology
    History
    Accessibility
    Recuperability
    Population
    Proximity
42. CRAVED: Attractiveness of Assets
    Concealable
    Removable
    Available
    Valuable
    Enjoyable
    Disposable
43. MURDEROUS: Weapon Selection
    Multipurpose
    Undetectable
    Removable
    Destructive
    Enjoyable
    Reliable
    Obtainable
    Uncomplicated
    Safe
44. ESEER: Facilitation of Crime
    Easy
    Safe
    Excusable
    Enticing
    Rewarding
45. HOPE: Ease of Social Engineering
    Hour of the day
    Oversight by manager
    Pressure
    Encouragement
46. Scales
47. Scales are IMPORTANT
    Let’s assume an FBM of: A+B+C+D
    - A: 1-4 Vulnerability
    - B: $ of damages
    - C: Time to return to operation (seconds)
    - D: Lives lost
    For:
    - Ships?
    - Buildings?
    - Troops?
48. Types of scales
    Nominal
    - Binning, no order (apples, pears, oranges)
    Ordinal
    - Hierarchical, no calculations (High, medium, low)
    Interval
    - Hierarchy and calculations (1, 2, 4, 8, 16)
    Natural
    - Interval with countable items (deaths, $, time)
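To see why scales matter when factors are simply added, here is a sketch contrasting a naive A+B+C+D sum, in which raw dollar figures swamp everything else, with the same factors rescaled onto a common 0-1 range before summing. Rescaling is one possible fix and not something the slide prescribes, and all numbers are invented.

```python
# Hypothetical items scored on the mixed-scale factors from slide 47.
raw = {
    "ship A": {"vulnerability": 4, "damages_usd": 2_000_000, "downtime_s": 3_600},
    "ship B": {"vulnerability": 1, "damages_usd": 2_100_000, "downtime_s": 60},
}

# Naive sum: dominated entirely by the dollar figure.
naive = {name: sum(factors.values()) for name, factors in raw.items()}

def rescale(values):
    """Rescale a list of values onto 0-1 across the items being compared."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

names = list(raw)
factor_names = ["vulnerability", "damages_usd", "downtime_s"]
scaled = {f: rescale([raw[n][f] for n in names]) for f in factor_names}
normalized = {n: sum(scaled[f][i] for f in factor_names) for i, n in enumerate(names)}

print(naive)        # ship B "wins" purely on raw dollars
print(normalized)   # ship A ranks higher once the scales are comparable
```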
49. Let’s bring this all together
    Nate’s presentation
50. Risk Analysis of Corporate Systems
    Presented by Nate
51. Problem at Issue
    Attackers are attempting to penetrate our network to steal, destroy or alter corporate data
    The NOC has been tasked with securing against these attacks
52. Attacks over the last 3 years
53. Effects of attacks on other companies
    Andrews Co.
    - Victim of a penetration, customer data leaked
    - Loss of revenue from loss of goodwill: $2.4M
    - Revenue dedicated to fixing systems: $10M
    TNH Inc.
    - Victim of a lengthy Denial of Service attack
    - Loss of revenue from inability to do business: $30M
    - Revenue dedicated to upgrading systems: $12M
54. Recommendations
    Implement an IDS
    - Detects attacks
    - $10,000 to install, $1,000/year in upkeep
    Tighten firewall
    - Stops intruders
    - $5,000 to install, $500/year in upkeep
    Install WEP at POS facilities
    - Tightens security
    - $10 in equipment & $5 in labor per facility
    - ($10+$5)*50,000 = $750,000
    - No upkeep costs
55. Cost benefit analysis
    As the numbers above show, by spending $766,500 this year we can mitigate the possible effects of an attack which (on average) would cost $15M. Thus, the loss would be approximately $14,233,500 less than without the recommended upgrades.
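For reference, Nate's cost-benefit arithmetic reproduced in code, using only the figures from the two preceding slides. The $15M average attack cost is the slide's own assumption.

```python
# Costs from the Recommendations slide (install plus first year of upkeep).
ids_cost      = 10_000 + 1_000
firewall_cost = 5_000 + 500
wep_cost      = (10 + 5) * 50_000   # equipment + labor per facility, 50,000 facilities

total_spend = ids_cost + firewall_cost + wep_cost
average_attack_cost = 15_000_000

print(f"Total spend this year: ${total_spend:,}")                      # $766,500
print(f"Net mitigation vs. average attack: "
      f"${average_attack_cost - total_spend:,}")                        # $14,233,500
```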
56. Annualized Loss Expectancy
57. The End
    (Of the presentation within a presentation)
58. Remember these?
    Risk Assessment:
    - What can happen?
    - How likely is it to happen?
    - What are the consequences if it happens?
    Risk Management:
    - What can be done?
    - What are the benefits, costs and risks of each option?
    - What are the impacts of each option on future options?
59. Things to remember…
    Use common sense!
    - If something looks wrong, it usually is
    Scope the question
    - Don’t bite off more than you can chew
    Use proper scales
    Remember the 6 questions of risk
    FBMs are quick and easy, but be careful!
    Check your work!
    - Academic integrity BEFORE making managers happy
60. Questions?
    Full presentation (including slides, resources, audio & video):
    Blog.NickLeghorn.com
61. “You must not say ‘never.’ That is a lazy slurring-over of the facts. Actually, [risk analysis] predicts only probabilities. A particular event may be infinitesimally probable, but the probability is always greater than zero.”
    Second Foundation (Isaac Asimov)
