"Cognitive Traps in Security Planning"

  1. 1. Recognizing….& Avoiding Cognitive Traps in Security Planning (Are we safe?) © Ian MacVicar, CD, DSS, MDS, MA Intl Affairs, BA (Hon.), pcsc, plsc (ic.macv@gmail.com) pdsummit2016 27 April 2016 1
  2. 2. Aim To RECOGNIZE and AVOID cognitive traps in security (& other) plans. Relevance • Classical decision theory assumes that people make rational decisions. • Public/private institutions rely on individuals to make rational (i.e. non-emotionally biased) judgements on complex problems. • Information is (almost) NEVER complete and people are NOT as rational as they would like to believe, BUT…. • Presentation provides strategies on how to take (more) rational decisions. • Small and medium businesses are at just as much risk as large corporations. 2
  3. 3. Scope • Think about thinking – anticipatory governance, artificial intelligence, biases, bounded rationality, complicated vs. complex problems, intelligence analysis, marketing, policy vs. plan, technological limits, threats (keywords) • RECOGNIZE cognitive traps: • Extent of individual behavioral and institutional biases • Selected thinkers list (partial) with key definitions • AVOID cognitive traps through: • Pre-programmed responses to cognitive traps • Human, machine, procedural, conceptual approaches to critical thinking • Immediate Actions to suspicious inquiries 3
  4. 4. Definitions • BLUF – Bottom Line Up Front, the straight goods, or best assessment • Bounded rationality – the limits imposed by human cognitive capacity, the information available, and time. • Cognitive bias (trap) – systematic deviation from rationality in judgement. • Decision support tools – procedures or machines to assist problem solving. • Rationality – thought process based on reasoning using empirical (i.e. factual) evidence. • Rational Actor model – seeks to maximize utility; long-assumed norm in decision making in economics, evolutionary biology, political science • Satisficing = satisfactory + sufficient (Herbert A. Simon, 1956) – contrasted with maximizing in the sketch below 4
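A minimal Python sketch of the rational-actor vs. satisficing distinction defined above; the vendor list, the stand-in utility scores, and the `good_enough` threshold are invented for illustration only.

```python
import random

def maximize(options, evaluate):
    """Rational Actor model: examine every option and pick the best one."""
    return max(options, key=evaluate)

def satisfice(options, evaluate, good_enough):
    """Simon's satisficer: stop at the first option that is satisfactory
    and sufficient, accepting the limits of time and information."""
    for option in options:
        if evaluate(option) >= good_enough:
            return option
    return options[-1]  # nothing cleared the bar; keep the last one examined

vendors = [f"vendor_{i}" for i in range(100)]
score = lambda v: random.Random(v).random()  # stand-in utility score per vendor
print("maximizer picks :", maximize(vendors, score))
print("satisficer picks:", satisfice(vendors, score, good_enough=0.8))
```

The satisficer examines far fewer options, which is exactly the bounded-rationality trade-off the slide describes.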
  5. 5. Complicated vs. Complex Problems • Complicated problem – MANY (even thousands of) variables with shifting relevance, but whose properties are known; outcomes can be predicted. • (e.g. trans-oceanic aviation, landing on the moon). • Complex problem – evolving (or even unknown) variables, with partially or poorly understood properties, which can even be influenced by observation; hard to measure due to an emotive component. • (e.g. crime, security, social inequity, war) 5
  6. 6. Threats • Kinetic (i.e. physical) attack, such as arson, theft, active intruder. • Non-kinetic attack, i.e. phishing, or social engineering by email or telephone; generic hacking. • How: • Insider threat; • Failure to consider all factors; and, • Overconfidence in physical and cyber defences. (A screening sketch for suspicious inquiries follows below.) 6
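As one pre-programmed "immediate action" against the non-kinetic threats above, here is a hypothetical Python red-flag screen for an inbound request; the indicator phrases and the two-hit threshold are invented, not a vetted ruleset.

```python
# Hypothetical red-flag screen for an inbound request (email text or phone-call note).
RED_FLAGS = {
    "urgency":        ["immediately", "right away", "before close of business"],
    "authority":      ["per the director", "the ceo has approved", "legal requires"],
    "secrecy":        ["keep this confidential", "do not discuss"],
    "payment change": ["new bank account", "updated wire instructions"],
}

def screen_request(text):
    """Return the social-engineering indicators present in the request."""
    lowered = text.lower()
    return [name for name, phrases in RED_FLAGS.items()
            if any(phrase in lowered for phrase in phrases)]

msg = ("Per the Director, wire the retainer to our new bank account "
       "immediately and keep this confidential.")
hits = screen_request(msg)
if len(hits) >= 2:  # two or more indicators: slow down and verify out-of-band
    print("Suspicious inquiry - verify via a known phone number:", hits)
```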
  7. 7. Macro Problem - 9/11 Conspiracy 7
  8. 8. How could this have happened? • “The most important failure was one of imagination.” • The Report of the 9/11 Commission, Executive Summary, p.9. (2004). Was it simply this - or was it…. • Agency Adaptation Failure: • Amy Zegart (2007). Spying Blind: The CIA, the FBI, and the Origins of 9/11. • 23 cases of possible intervention in the conspiracy, 12 to CIA and 13 to FBI. • Lack of inter- and intra-agency cooperation. 8
  9. 9. National Security Failures – 40 Year Scale • Pentagon Papers – June 1971: • Daniel Ellsberg, 4.1K pgs, U.S.– Vietnam relations, 1945-1967 • WikiLeaks – Feb – Oct 2010: • Julian Assange et al. • U.S. Army Pvt. Chelsea (né Bradley) Manning • 734,885 files on Iraq/Afghanistan • Global Surveillance Disclosures – Jun 2013: • Edward Snowden, 1.5 - 1.77M documents • WikiLeaks – 2011 – 2015 (10M+ documents): • Apr 2011 – Guantanamo Bay files • Nov 2013 – Trans Pacific Partnership IP chapter • Jul 2015 – NSA tapping Angela Merkel’s phone 9
  10. 10. Commercial IT Security Failures • Stratfor (Strategic Forecasting, Inc.): • Dec 2011 – Anonymous claims to have obtained client list & credit cards (200 GB of data). • Feb 2012 – WikiLeaks publishes 5M emails provided by Anonymous • Ashley Madison – Aug 2015 • Impact Team reveals 20 GB of client info 10 STRATFOR
  11. 11. Commercial IT Security Failures • TJX, 2007 - 45.7M CC details • Target Corporation, 2014 – 40M CCs • Home Depot, 2014 • 53-56M CCs • At risk: $, IP, PII, reputations • From: Criminals, nation-states, hacktivists, industry rivals, insiders. • Known security challenges: • too much data, AND • not enough time. • Lesser-known challenges: • Cognitive traps: individual and institutional biases; • How to improve human aspects of Governance, Risk, and Compliance (GRC). 11
  12. 12. Overlap – Commerce & Government • Panama Papers - Apr 2016 (off-shore money client list) • 4th largest offshore law firm, 600 ppl in 42 nations • 11.5M documents, 2.6TB on 214K offshore companies leaked to ICIJ • 100K companies in British Virgin Islands alone, (pop. 28K) • 200 countries/territories, 12 current/former leaders • Named – Vladimir Putin, Nawaz Sharif, PM Pakistan; Ayad Allawi, former VP of Iraq; Petro Poroshenko, President of Ukraine; Alaa Mubarak, son of Egypt’s former President; FIFA president Gianni Infantino; UK PM David Cameron’s father • PM Iceland Sigmundur Davíð Gunnlaugsson resigns 5 Apr 16 12
  13. 13. Why? 13
  14. 14. Example of John Sileo www.sileo.com • “It won’t happen to me.” • Software company; 15 ppl, $2M revenue. • Victim of identity fraud by a woman in Florida; lost $300K to his business partner. • Modus operandi - authority, greed, humour, reciprocity, urgency. • Judge a security breach – or learn from it. • Now advises US Govt departments, big pharma, media, insurance agencies. 14
  15. 15. Overview of NS Businesses • Not only big companies are vulnerable. • SMEs make up 97.4% of the total businesses in NS (Stats Can 2009) 15
  16. 16. How many times have you…. • …heard a boss (or a colleague) say: “In my (i.e. vast) experience…” or “been there, done that…?” • seen someone rely on “SALT” or “SALY” when confronting a new problem? • seen an innovative presentation dismissed within seconds of delivery…? • heard a boss promote a solution that does not work, works only for a time, or under-delivers on initial promises…? • seen a colleague (or other department) fail to share information…and then impress the C-Suite, to your chagrin…? • seen a nation drop the ball…. (i.e. fail in security planning)? 16
  17. 17. Biases • Ambiguity effect – lack of information pushes decisions toward the certain option (e.g. a fixed-rate mortgage) • Anchor effect – rely on the first piece of information received (e.g. initial price) • Availability effect – easily recalled items “stick” (e.g. violent bad news) • Confirmation bias – have the solution before weighing all the facts • Echo – read/hear multiple reports of the same phenomenon using the same verbiage – all traceable to the same source • Groupthink – tendency of a group to converge on one solution • Halo effect – a positive impression of one aspect of a person or company causes discounting of negative factors (e.g. physical attractiveness) • Hindsight bias – “I knew it all along…” • Overconfidence – a leader who displays no self-doubt • Recency illusion - defer to the latest information received • Vividness bias – vivid details (blood, money, or data lost) carry disproportionate weight 17
  18. 18. Thinking about Thinking…. • Brain – original energy saver. • People tend to seek: • Coherence - based on repetition of an observable pattern OR the uniqueness of a problem • Common attributes similar to previous situations (inductive reasoning based on analogy) OR • particular aspects of the problem set (deduction) • ….they (often) display biases due to an early leap to a conclusion OR a sustained focus on specific factors. • 100+ known shortcuts, heuristics, biases…. 18
  19. 19. 19
  20. 20. Most Common Cognitive Biases • Fundamental Attribution Error (FAE) – also known as the correspondence bias (Baumeister & Bushman, 2010); the tendency to over-emphasize personality-based explanations for behaviours observed in others, while under-emphasizing the role and power of situational influences on the same behaviour. Jones and Harris’ (1967) classic study illustrates the FAE: despite being made aware that the target’s speech direction (pro-Castro/anti-Castro) was assigned to the writer, participants ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech represented such attitudes. • Confirmation bias – the tendency to search for or interpret information in a way that confirms one’s preconceptions; individuals may also discredit information that does not support their views. Related to cognitive dissonance, whereby individuals may reduce inconsistency by searching for information which re-confirms their views (Jermias, 2001, p. 146). • Self-serving bias – the tendency to claim more responsibility for successes than failures; may also manifest as a tendency to evaluate ambiguous information in a way beneficial to one’s interests. • Belief bias – when one’s evaluation of the logical strength of an argument is biased by belief in the truth or falsity of the conclusion. • Framing – using a too-narrow approach to, and description of, the situation or issue. • Hindsight bias – sometimes called the “I-knew-it-all-along” effect; the inclination to see past events as having been predictable. 20
  21. 21. Wikipedia “Listicle” • Preceding laundry list of biases is of limited inherent value. • Value is in knowing when, why, and where biases can arise, AND • How to counter them with individual, team, procedural analytic techniques, or with machine assisted analysis. 21
  22. 22. Language & Prejudice • S.I. Hayakawa. (1990 [1940]). Language in Thought and Action. • Pioneer in General semantics • Words are symbols used to approximate an experience, to describe an object, a concept, or – a person or a group. • Two-valued orientation (i.e. black & white, good vs. evil) as opposed to a multi-valued orientation which accepts grey. • “The map is not the territory” (A. Korzybski, 1922) 22
  23. 23. System 1 & System 2 • Daniel Kahneman. (2011). Thinking, Fast and Slow. • 2002 Nobel Laureate in Economic Sciences for Prospect Theory, i.e. judgment defies rational actor thinking, and seeks more to avoid loss than to risk gains. • Default is System 1, which conserves energy. 23
  24. 24. System 1 & System 2 Maintenance • System 1 – fight or flight, efficient • System 2 – analytic, burns calories • Practice turns System 2 tasks into System 1 tasks - as habits form. • Overcome System 1 overuse: • 1) SEEK multiple sources of evidence; • 2) VERIFY independent sources (see the corroboration sketch below); • 3) SLOW DOWN; • 4) WYSIATI (“What You See Is All There Is”) – don’t free-associate with System 1. 24 Complex, instinctive, or both?
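A toy Python sketch of steps 1 and 2 (SEEK and VERIFY): accept a claim only when it is carried by reports from independent origins, which also guards against the echo bias on the earlier slide. The source names, claim text, and two-source threshold are invented.

```python
def corroborated(reports, min_independent=2):
    """Accept a claim only if it appears in reports from at least
    `min_independent` distinct originating organisations."""
    origins = {report["origin"] for report in reports}
    return len(origins) >= min_independent

reports = [
    {"origin": "vendor_blog", "claim": "actor X is targeting retailers"},
    {"origin": "news_wire",   "claim": "actor X is targeting retailers"},
    {"origin": "vendor_blog", "claim": "actor X is targeting retailers"},  # echo of the first
]
print(corroborated(reports))  # True: two distinct origins; the third report is an echo
```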
  25. 25. Black Swans • Nassim Nicholas Taleb (2010 [2007]). The Black Swan: The Impact of the Highly Improbable. • Characterized by: rarity, extreme impact, retrospective predictability: • e.g. 9/11, 2008 market crash • Past does not predict the future: • e.g. 1001 days of a turkey’s life • Platonic fold – gap between what you know and what you think you know. • Randomness and uncertainty abound • Unknown unknowns 25
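Taleb's turkey shows why the past does not predict the future; a few lines of Python make the point (the feeding history and the day-1,001 outcome restate the thought experiment, they are not data).

```python
# Taleb's turkey: 1,000 days of feeding make day 1,001 look perfectly safe.
history = [1] * 1000  # 1 = fed and cared for, observed every day
confidence_in_tomorrow = sum(history) / len(history)
print(f"Inductive confidence after 1,000 days: {confidence_in_tomorrow:.0%}")  # 100%

day_1001 = 0  # Thanksgiving: the observed past said nothing about this outcome
print("Day 1,001:", "fed again" if day_1001 else "a Black Swan, from the turkey's point of view")
```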
  26. 26. Selective Attention - Focus • How many times do the players in white pass the ball? • (Anything else?) 26
  27. 27. Responses to cognitive traps Human, machine, procedural, conceptual 27
  28. 28. Human – “Curing” Analytic Pathologies • Morgan Jones. (1998). The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving. (2nd ed.) • Richards Heuer. (1999). Psychology of Intelligence Analysis. • Jeffrey Cooper. (2005). Curing Analytic Pathologies. • Richards Heuer & Randolph Pherson. (2011). Structured Analytic Techniques for Intelligence Analysis. • BLUF: • Externalize thought processes for an independent logic test • Systematic examination of all factors for relevance • Multiple sources of corroborating evidence are best 28
  29. 29. Human - Analogies • Possibly oldest form of logic. • Compare present circumstance to past source analogs (events). • Superficial similarities vs. structural similarities. • Only relevant analogies should be used, which are usually structural. • BUT experts tend to rely too much on specific features and patterns of behaviour: • Can be misled by novel circumstances 29
  30. 30. CIA Center for the Study of Intelligence • Richards J. Heuer, Jr. (1999). Psychology of Intelligence Analysis, retrieved from https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/PsychofIntelNew.pdf • Articles written for the CIA Directorate of Intelligence, 1978 - 1996. • Describes cognitive limits, biases, and traps. • Introduces Structured Analytic Techniques • Analysis of Competing Hypotheses (ACH) – a method to reduce bias when comparing multiple hypotheses (sketched below). 30
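A minimal Python sketch of the ACH idea: rate each piece of evidence against each hypothesis and favour the hypothesis with the fewest inconsistencies, rather than the one with the most apparent confirmations. The hypotheses, evidence items, and ratings below are invented for illustration.

```python
CONSISTENT, INCONSISTENT, NEUTRAL = "C", "I", "N"

# Each hypothesis is rated against every piece of evidence.
matrix = {
    "insider exfiltration": {"badge logs normal": INCONSISTENT,
                             "odd-hours VPN use": CONSISTENT,
                             "phishing email found": NEUTRAL},
    "external phishing":    {"badge logs normal": CONSISTENT,
                             "odd-hours VPN use": CONSISTENT,
                             "phishing email found": CONSISTENT},
}

def rank_hypotheses(matrix):
    """ACH ranks by how little evidence CONTRADICTS a hypothesis,
    not by how much appears to confirm it."""
    scores = {h: sum(1 for rating in evidence.values() if rating == INCONSISTENT)
              for h, evidence in matrix.items()}
    return sorted(scores.items(), key=lambda item: item[1])

for hypothesis, contradictions in rank_hypotheses(matrix):
    print(f"{hypothesis}: {contradictions} inconsistent item(s)")
```

Ranking by inconsistency is the step that works against confirmation bias: evidence that merely fits a favoured hypothesis does not help it win.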
  31. 31. Who is the complex problem solver? (Hint: System 1 or 2) 31
  32. 32. Bond or Holmes? • “Ninety percent of intelligence comes from open sources. The other ten percent, the clandestine work, is just the more dramatic. The real intelligence hero is Sherlock Holmes, not James Bond.” • Lieutenant General Sam Wilson, U.S. Army (Ret.), former Director Defense Intelligence Agency (DIA), 1997 32
  33. 33. Maria Konnikova, Ph.D. (b. 1985) • Mastermind: How to Think Like Sherlock Holmes (2013). • Dr. Watson – “sees,” rushes to judgment • Mr. Holmes – “observes” all aspects of a problem • Key concept – the brain attic has limited storage so furnish it carefully with useful skills and information. • How? • Start with a skeptical attitude, i.e. question all planning or decision-making assumptions – and record them. • Take time to ponder, e.g. Holmes’ “three-pipe problems,” go for a walk, take a shower, have a cuppa while staring at nothing. 33
  34. 34. Mastermind: How to Think Like Sherlock Holmes • Prioritize what you need to remember. • e.g. Selective access storage – memorize Google sites, authors, concepts, NOT all the information they contain. • Recognize multi-tasking as rapid task switching. • Consider all factors in a systematic fashion – weighing their initial relevance equally, leading to... • Establishing context for reducing the relevance of, or eliminating, factors from further consideration, • Leading to questions which confirm gaps or disprove hypotheses.... • “Rinse, repeat….” 34
  35. 35. Changing Ingrained Habits - Individual • Habits are sooo System 1…. • Charles Duhigg. (2012). The Power of Habit: Why We Do What We Do in Life and Business. • Cue, routine, reward. • Change the routine response. • Have a pre-planned routine to avoid failure at inflection points. • Delay gratification. • Build on “small wins.” 35
  36. 36. Changing Ingrained Habits - Corporate • Chip & Dan Heath. (2010). Switch: How to Change Things When Change Is Hard. • Analogy of Rider, Path and Elephant 36
  37. 37. Easier said than done! (emotion + instinct) 37
  38. 38. Machine-Procedural - CALM • Computer Assisted Learning Method • a.k.a. Computer Aided Lean Management • Management philosophy which uses computational software in an Integrated System Model (ISM) to reduce risk and drive out inefficiencies. • Integrates business assets, business processes, and machine learning. • e.g. Motorola and GE – Six Sigma; Toyota Production System; • Key – feedback and adaptation (caricatured in the sketch below) 38
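The "feedback and adaptation" loop at the heart of CALM can be caricatured in a few lines of Python: measure, compare to target, adjust, repeat. The defect rates and gain are made-up numbers, not a model of any real Integrated System Model.

```python
target_defect_rate = 0.01
defect_rate = 0.08
gain = 0.5  # how aggressively each cycle corrects the measured error

for cycle in range(1, 6):
    error = defect_rate - target_defect_rate
    defect_rate -= gain * error  # adaptation step driven by measured feedback
    print(f"cycle {cycle}: defect rate {defect_rate:.3f}")
```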
  39. 39. Machine – Watsons • 1997 – IBM Deep Blue computer beats Garry Kasparov in chess. • 2011 – Watson (description + caps/lims) • Developed to answer questions on the quiz show Jeopardy! • Q & A computer system that answers questions in natural language. • 200M pgs of structured/unstructured content consuming 4 TB of disk storage, including Wikipedia. • IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework. • Runs on the SUSE Linux Enterprise Server 11 OS using the Apache Hadoop framework to provide distributed computing. • Languages: Java, C++, and Prolog. • 2013 – Lung cancer treatment decisions at Sloan-Kettering Cancer Center, in conjunction with insurance company WellPoint. 39
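This is not IBM's DeepQA code, only a toy Python sketch of the pipeline shape the slide describes: generate candidate answers, score each against retrieved evidence, and answer only when confidence clears a threshold. The passages, candidates, and threshold are invented.

```python
def score(candidate, evidence):
    """Crude evidence score: fraction of passages mentioning the candidate."""
    hits = sum(candidate.lower() in passage.lower() for passage in evidence)
    return hits / len(evidence) if evidence else 0.0

def answer(candidates, evidence, threshold=0.3):
    """Pick the best-supported candidate; abstain if confidence is too low."""
    best = max(candidates, key=lambda c: score(c, evidence))
    confidence = score(best, evidence)
    return (best, confidence) if confidence >= threshold else (None, confidence)

evidence = ["Watson runs on SUSE Linux Enterprise Server.",
            "The system uses Apache Hadoop for distributed processing.",
            "Apache UIMA manages the unstructured content."]
print(answer(["SUSE Linux", "Windows"], evidence))  # ('SUSE Linux', 0.33...)
```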
  40. 40. Machine – Deep Mind • Google’s DeepMind unit AlphaGo AI program • Demis Hassabis, founder • Neural network self-training, pattern memorization • Defeats Go player Lee Sedol 4-1, in Seoul, South Korea, 9-14 Mar 16. • DeepMind Health launched “Clinician led technology” 1 Mar 16 in the U.K. Hassabis Sedol vs. AlphaGo 40
  41. 41. Futurist Giles Crouch post on AI caps/lims • AI is immature; limited to the application of strictly defined, complicated processes, such as: • Clerical & Process roles; • Insurance Agents; • Financial Planners; • 2015 - Fundamental shift in thinking as designers stopped trying to make AI work like a human mind, and instead let it work like a machine. • Bellwether – IBM Watson: will it succeed at complex thought? 41
  42. 42. IT Security Guru – Bruce Schneier 42 https://www.schneier.com/
  43. 43. Anticipatory Governance 101 - BLUF • Bottom Line Up Front (BLUF) – linear cause-and-effect problem solving using logic and pattern analysis (even machine-aided) appears to be failing! Why? • What can we do about it? • How can we collectively improve on anticipating consequences in 2016? 43
  44. 44. Anticipatory Governance (AG) • Leon Fuerth (2009; 2011; 2013); Leon Fuerth & Evan Faber (2012) • Complex adaptive problems evolve faster than society can cope; • Coping based on: 1) foresight; 2) networked governance; 3) adaptation • David Guston (2008; 2012; 2013). AG use in emerging technology, i.e., cloud computing, dual use bio-agents, nanotechnology. 44
  45. 45. Foresight • Need to hear “faint signals” of impending change, i.e. early recognition of clues – and where they are leading (Fuerth, 2012). • Non-government sources in academia, commerce, industry, social media - then fused with government policy analysis. • (e.g. bio-agents, nanotechnology, petroleum exploration, and social trends) • Internet forums and propaganda sites extolling the virtues of violent extremist organizations (VEOs) offer clues as to the structure of their social networks. 45
  46. 46. Networked Governance • Mission-based management using a strategies-to-task or end-to-end planning approach. • Mass collaboration has shown great promise in driving change through the Internet of Things. • “The curious shall inherit the future” (Tapscott, 2006) • Crowdsourcing – soliciting ideas or contributions from an online, and often anonymous, community. 46
  47. 47. Adaptation • Feedback from networked governance which permits adjustment from the initial understanding of an issue as it evolves due to internal and external interactions. • Nassim Nicholas Taleb. (2013). Antifragile: Things That Gain From Disorder. • Concludes that disorder can lead to better long-term solutions as some processes can gain strength through graduated adaptation to stress. • How can this adaptation occur? 47
  48. 48. Conclusions • Take-aways for staff: • RECOGNIZE, AVOID, and MITIGATE cognitive traps if explaining options to the C-Suite. • Take-aways for the C-Suite: • Avoid biases when taking security decisions. • Chief Security Officer in the C-Suite. • Take-aways for all: • Rational decisions cannot be assumed. • Bias influences decisions even AFTER considering decision support tool data. • Bias can be overcome through strategies. • Trust but VERIFY data (multiple sources) • Security policy, contingency plans, PRACTICE! 48
  49. 49. Conclusions • Basic security plan must consider: • Identify high-value assets; • Segment networks, business components (a segmentation check is sketched below); • Chief Security Officer in the C-Suite; • External audit to verify Governance, Risk, and Compliance; • Security policy (goals); • Security plan (SMART); • Recurrent habit-based training; • Vet 3rd parties; • Resilience plan (reset?) 49
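A hypothetical Python check tying the first two items of the plan together: every high-value asset should sit in a restricted segment rather than on the flat office network. Asset names and segment labels are invented for illustration.

```python
assets = [
    {"name": "customer_db",    "value": "high", "segment": "restricted"},
    {"name": "hr_records",     "value": "high", "segment": "office_lan"},  # violation
    {"name": "marketing_site", "value": "low",  "segment": "dmz"},
]

violations = [asset["name"] for asset in assets
              if asset["value"] == "high" and asset["segment"] != "restricted"]
print("High-value assets outside restricted segments:", violations or "none")
```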
