Abstract

This forecasting methodology identifies 68 indicators of terrorism and employs proven analytic techniques in a systematic process that safeguards against 36 of the 42 common warning pitfalls that experts have identified throughout history. The complete version of this research provides: 1) a step-by-step explanation of how to forecast terrorism, 2) an evaluation of the forecasting system against the 42 common warning pitfalls that have caused warning failures in the past, and 3) recommendations for implementation. The associated CD contains the website interface to this methodology for forecasting terrorist attacks. This methodology could be applied to any intelligence topic (not just terrorism) by simply changing the list of indicators. The complete version of this research is available in Forecasting Terrorism: Indicators and Proven Analytic Techniques, Scarecrow Press, Inc., ISBN 0-8108-5017-6, for which 100% of the author royalties are being donated to the non-profit National Memorial Institute for the Prevention of Terrorism (www.mipt.org) and the Joint Military Intelligence College Foundation, which supports the Defense Intelligence Agency's Joint Military Intelligence College (www.dia.mil/Jmic).
1. Introduction: Correcting Misconceptions

Important lessons have arisen from the study of intelligence warning failures, but some common misconceptions have prevented the Intelligence Community from recognizing and incorporating these lessons.

Analysis, Rather Than Collection, Is the Most Effective Way to Improve Warning. Efforts to improve warning normally focus on intelligence collection, rather than analysis (Kam 1988). That trend has continued since September 11th (Anonymous intelligence source 2002). However, warning failures are rarely due to inadequate intelligence collection, are more frequently due to weak analysis, and are most often due to decision makers ignoring intelligence (Garst 2000). Decision makers, however, ignore intelligence largely because the analytical product is weak (Heymann 2000).

Hiring Smart People Does Not Necessarily Lead to Good Analysis. Studies show that, "frequently groups of smart, well-motivated people . . . agree . . . on the wrong solution. . . . They didn't fail because they were stupid. They failed because they followed a poor process in arriving at their decisions" (Russo and Schoemaker 1989).

A Systematic Process Is the Most Effective Way to Facilitate Good Analysis. The nonstructured approach has become the norm in the Intelligence Community. A key misunderstanding in the debate over intuition versus structured technique is the assumption that an analyst must choose one or the other (Folker 2000). In fact, intuition and structured technique can be used together in a systematic process. "Anything that is qualitative can be assigned meaningful numerical values. These values can then be manipulated to help us achieve greater insight into the meaning of the data and to help us examine specific hypotheses" (Trochim 2002). Not only is it possible to combine intuition and structure in a system; research shows the combination is more effective than intuition alone. "Doing something systematic is better in almost all cases than seat-of-the-pants prediction" (Russo and Schoemaker 1989). Moreover, decision makers have called on the Intelligence Community to use methodology. "The Rumsfeld Commission noted that, '. . . an expansion of the methodology used by the IC [Intelligence Community] is needed.' . . . Keeping chronologies, maintaining databases and arraying data are not fun or glamorous. These techniques are the heavy lifting of analysis, but this is what analysts are supposed to do. If decision makers only needed talking heads, those are readily available elsewhere" (McCarthy 1998).
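Trochim's point about assigning numerical values to qualitative judgments can be illustrated with the 5-level "~probability" scale this methodology itself uses later; the averaging shown here is only an example of manipulating such values, not a step the paper prescribes at this point.

```python
# Illustrative sketch: qualitative likelihood labels mapped to the numeric
# values the paper associates with them, so they can be combined
# arithmetically. The averaging is our example, not the paper's rule.

LIKELIHOOD = {
    "almost certainly": 0.9,
    "probably": 0.7,
    "unknown": 0.5,
    "probably not": 0.3,
    "almost certainly not": 0.1,
}

def average_judgment(judgments):
    """Average a list of qualitative likelihood labels numerically."""
    values = [LIKELIHOOD[j] for j in judgments]
    return sum(values) / len(values)

print(round(average_judgment(["probably", "almost certainly", "unknown"]), 3))  # 0.7
```

Once judgments are numeric, they can be averaged, compared, and trended, which is exactly what the warning-level calculations in Section 2 do.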
Forecasting Terrorism: Indicators and Proven Analytic Techniques

Captain Sundri K. Khalsa, USAF
PO Box 5124
Alameda, California, 94501, United States
SundriKK@hotmail.com

Keywords: Terrorism, Predictive Analysis and Hypothesis Management, Multiple Competing Hypotheses, Question Answering, Structured Argumentation, Novel Intelligence from Massive Data, Knowledge Discovery and Dissemination, Information Sharing and Collaboration, Multi-INT/Fusion, All-Source Intelligence, Visualization, Fusion.
2. How to Forecast Terrorism: Abbreviated Step-by-Step Explanation of the Methodology

This forecasting system is based on indicators. The explanation of this methodology begins at the lowest level of indicators and then builds up to the big picture of countries within a region. The forecasting assessments of this methodology are maintained on a website display, which is available on the associated CD. Figure 1 shows a breakdown of the 3 primary types of warning picture views from the web homepage: 1) country list view, 2) target list view, and 3) indicator list view. Each potential-terrorist target is evaluated in a webpage hypothesis matrix based on the status of 68 indicators of terrorism. The indicators are updated in near-real-time with incoming raw intelligence reports/evidence. The methodology consists of 23 tasks and 6 phases of warning analysis, which are described very briefly in this paper. The 23 tasks include 14 daily tasks, 3 monthly tasks, 4 annual tasks, and 2 as-required tasks. The 14 daily tasks can be completed in 1 day because tasks have been automated wherever possible. Three types of analysts are required to operate this methodology: Raw Reporting Profilers, Indicator Specialists, and Senior Warning Officers.
2.1 Phase I: Define/Validate Key Elements of the Intelligence Problem (Using Indicators)

Task 1: Identify/Validate Indicators (Annually). Indicators are the building blocks of this forecasting system. Indicators are "those [collectable] things that would have to happen and those that would likely happen as [a] scenario unfolded" (McDevitt 2002). For a terrorist attack, those would be things like terrorist travel, weapons movement, terrorist training, target surveillance, and tests of security. This research has identified 68 indicators of terrorism encompassing terrorist intentions, terrorist capability, and target vulnerability, which are the three components of risk. The indicators are also identified as either quantitative (information that can be counted) or qualitative (information that cannot be counted, such as terrorist training). Only 7 of the 68 terrorism indicators are quantitative. For security reasons, many indicators are not shown here.
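As a minimal illustration (field names are ours, not the paper's), an indicator can be represented as a record carrying the attributes the methodology assigns it: which risk component it belongs to and whether it is quantitative or qualitative.

```python
# Hypothetical data model for an indicator. The example indicator names are
# taken from the paper's list; the record structure itself is an assumption.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    component: str      # "intentions", "capability", or "vulnerability"
    quantitative: bool  # only 7 of the 68 indicators are quantitative

indicators = [
    Indicator("terrorist travel", "intentions", False),
    Indicator("weapons movement", "intentions", False),
    Indicator("target surveillance", "intentions", False),
]

qualitative = [i for i in indicators if not i.quantitative]
print(len(qualitative))  # 3
```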
In Task 1 of this methodology, the leading counterterrorism experts meet on at least an annual basis to determine if indicators should be added to or removed from the list. A list of indicators should never be considered final and comprehensive. To determine if the list of indicators should be altered, analysts:

1. Review the raw intelligence reports filed under the Miscellaneous Indicators to determine if any kinds of significant terrorist activity have been overlooked.
2. Review U.S. collection capabilities to determine if the U.S. has gained or lost the capability to collect on any terrorist activities.
3. Review case studies of terrorist operations to identify changes in terrorist modus operandi and determine if terrorists are conducting any new activities against which U.S. intelligence can collect.
Task 2: Prioritize Indicators (Annually). Some indicators are more significant than others. For instance, among the indicators of terrorist intentions, weapons movement to the target area must take place before an attack, whereas an increase in terrorist propaganda is not a prerequisite. Therefore, weapons movement carries a higher significance/priority than increased propaganda. On at least an annual basis, the leading counterterrorism experts determine if the priority of any of the indicators needs to be adjusted on a scale of 1 through 3, according to definitions given in the complete explanation of this methodology.

Task 3: Develop/Validate Key Question Sets on Each Indicator (Annually). Since indicators are the foundation and building blocks of this methodology, the type of information required to assess the status of an indicator should be clearly defined and recorded. Thus, on at least an annual basis, the leading counterterrorism experts validate a list of key questions for each indicator. The question sets identify the key factors that experts have determined are necessary to assess the status of a given indicator. They also function as the list of prioritized Collection Requirements for intelligence collectors. The entire list of indicators and corresponding key question sets forms an intelligence collection plan against terrorism.

Task 4: Prioritize Questions in Key Question Sets (Annually). The leading counterterrorism experts also prioritize the key questions on a scale of 1 through 3, with 1 being the most significant. These priorities affect both intelligence collection priorities and analysts' assessments.
2.2 Phase II: Consolidate Information (Using Master Database)

Task 5: Intelligence Community Master Database Receives All Raw Intelligence Reports from Intelligence Collectors (Daily). The daily process begins with the requirement that all fifteen member organizations of the Intelligence Community, and other U.S. government organizations that may have terrorism-related information, forward all their raw intelligence reports (on all intelligence topics, not just terrorism) to an Intelligence Community Master Database. The major benefit of hindsight investigations after intelligence warning failures is that they consolidate all the information for the first time; this master database achieves that consolidation before the fact.
2.3 Phase III: Sort Information (Using Hypothesis Matrices)

Task 6: Enter All Terrorism-Related Raw Intelligence Reports into Terrorism Forecasting Database Under Appropriate Indicators, Key Questions, Targets, Countries, Terrorist Groups, and Other Data Profile Elements (Daily). A large group of junior analysts, called Raw Reporting Profilers, reads through the Intelligence Community's incoming raw intelligence reports (an estimated 2,500 per day that are already marked as terrorism related) and enters all of them into a Terrorism Forecasting Database according to the indicators, key questions, targets, countries, terrorist groups, and other terrorism forecasting-specific data profile elements identified in this methodology.

Task 7: Terrorism Forecasting Database Creates Potential-Target Hypothesis Matrices with Raw Intelligence Reports Filed by Indicators (Daily). After analysts enter raw intelligence reports into the Terrorism Forecasting Database, the computer program automatically creates corresponding Potential-Target Hypothesis Matrix webpages and displays hyperlinks to the reports under the appropriate indicator(s) within the hypothesis matrices.

Task 8: Terrorism Forecasting Database Feeds Raw Intelligence Reports into Appropriate Indicator Key Questions, Answers, & Evidence Logs within the Hypothesis Matrices (Daily). The master database also feeds the raw reports into the appropriate Indicator Key Questions, Answers, & Evidence Logs within the Hypothesis Matrices, as shown in figure 2.

Task 9: Assess Raw Intelligence Reports' Information Validity (Daily). The computer program combines Source Credibility and Information Feasibility/Viability according to rules in utility matrix logic to determine a report's Information Validity on a 5-level scale: 1) "Almost Certainly Valid (~90%)," color coded red on the website, 2) "Probably Valid (~70%)," orange, 3) "Probably Not Valid (~30%)," yellow, 4) "Almost Certainly Not Valid (~10%)," gray, and 5) "Unknown Validity (or ~50%)," black. Source Credibility and Information Feasibility/Viability are determined by analysts, who check a list of boxes for each in the database on a 5-level scale: 1) "Almost Certainly (~90%)," 2) "Probably (~70%)," 3) "Probably Not (~30%)," 4) "Almost Certainly Not (~10%)," and 5) "Unknown (or ~50%)."
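The utility-matrix mechanism can be sketched as a simple lookup. The combination rule below is an invented placeholder (the real rules are defined in the complete methodology); only the mechanism it demonstrates, two 5-level inputs combined into one 5-level output by fixed rules, reflects the text.

```python
# Hypothetical sketch of the Task 9 utility-matrix lookup. The placeholder
# rule here: "unknown" in either input yields unknown validity; otherwise
# take the more pessimistic (less valid) of the two inputs.

def information_validity(source_credibility: str, feasibility: str) -> str:
    """Combine two 5-level judgments into an Information Validity level."""
    if "unknown" in (source_credibility, feasibility):
        return "unknown"
    # Levels ordered from most to least valid.
    order = ["almost certainly", "probably", "probably not",
             "almost certainly not"]
    return max(source_credibility, feasibility, key=order.index)

print(information_validity("almost certainly", "probably not"))
# -> "probably not" under the placeholder rule
```

A real implementation would replace the rule with the full 5x5 table of expert-defined entries, but the lookup structure would be the same.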
2.4 Phase IV: Draw Conclusions (Using Intuitive and Structured Techniques)

Task 10: Assess Indicator Warning Levels (Daily). The computer program combines the Indicator Priority and an Indicator Activity Level to determine an Indicator Warning Level on a 5-level scale: 1) Critical (~90%), 2) Significant (~70%), 3) Minor (~30%), 4) Slight (~10%), and 5) Unknown (or ~50%). The Indicator Activity Level is determined by a second group of analysts using either rules in utility matrix logic (for the quantitative indicators) or the Indicator Key Questions, Answers, & Evidence Logs (for the qualitative indicators). These analysts are Indicator Specialists, the Counterterrorism Community's designated experts in determining an Indicator Activity Level.

From this point forward, all the warning level calculations are automated. The third group of analysts, Senior Warning Officers, is responsible for monitoring and approving all the warning levels that the computer application automatically produces and updates on the webpages.
Task 11: Assess Terrorist Intention Warning Level (Daily). Now that analysts have assessed an Indicator Warning Level for each of the 68 indicators of terrorist intentions, terrorist capability, and target vulnerability (the 3 components of risk), the computer can calculate a warning level for each of those 3 components. The computer calculates the Terrorist Intention Warning Level for a target, on the same 5-level scale (Critical through Unknown), from the Indicator Warning Levels of the active terrorist intention indicators, using averages and rules in utility matrix logic.
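The averaging half of this calculation can be sketched numerically using the ~probability values attached to each warning level. The snap-to-nearest-level step and the function name are our assumptions; the paper's accompanying utility-matrix rules are not reproduced.

```python
# Minimal sketch of the averaging step in Task 11. The mapping of the
# average back to a discrete level via nearest value is an illustrative
# choice, not the paper's published rule.

LEVEL_VALUE = {"critical": 0.9, "significant": 0.7,
               "minor": 0.3, "slight": 0.1, "unknown": 0.5}

def intention_warning_level(active_indicator_levels):
    """Average the active indicators' levels, then snap to the nearest level."""
    avg = (sum(LEVEL_VALUE[l] for l in active_indicator_levels)
           / len(active_indicator_levels))
    return min(LEVEL_VALUE, key=lambda l: abs(LEVEL_VALUE[l] - avg))

print(intention_warning_level(["critical", "significant", "minor"]))  # significant
```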
Task 12: Assess Terrorist Capability Warning Level (Daily). The computer determines the Terrorist Capability Warning Level for a given country, on the same 5-level scale, by taking the highest of all the Indicator Warning Levels of the active terrorist capability, lethal agent/technique indicators.

Task 13: Assess Target Vulnerability Warning Level (Daily). The computer program calculates a target's Vulnerability Warning Level, on the same 5-level scale, from the Indicator Warning Levels of the active target vulnerability indicators, using averages and rules in utility matrix logic.
Task 14: Assess Target Risk Warning Level (Daily). Now that analysts have a warning level for each of the 3 components of risk (terrorist intentions, terrorist capability, and target vulnerability), the computer can calculate a risk warning level for a given target. The computer program calculates the Target Risk Warning Level, on the same 5-level scale, by averaging the Terrorist Intention, Terrorist Capability, and Target Vulnerability Warning Levels.

Task 15: Assess Country Risk Warning Level (Daily). The computer program determines the Country Risk Warning Level by taking the highest Target Risk Warning Level in the country.
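Tasks 14 and 15 can be sketched together numerically. Helper names and targets are hypothetical, and treating "averaging" and "highest" as the arithmetic mean and numeric maximum over the ~probability values is an assumption for illustration.

```python
# Sketch of Tasks 14-15 using the ~probability values the paper attaches
# to each warning level.

LEVEL_VALUE = {"critical": 0.9, "significant": 0.7,
               "minor": 0.3, "slight": 0.1, "unknown": 0.5}

def target_risk(intention: str, capability: str, vulnerability: str) -> float:
    """Task 14: average the three risk-component warning levels."""
    return (LEVEL_VALUE[intention] + LEVEL_VALUE[capability]
            + LEVEL_VALUE[vulnerability]) / 3

def country_risk(target_risks: list) -> float:
    """Task 15: a country's risk is its highest target risk."""
    return max(target_risks)

embassy = target_risk("significant", "critical", "minor")  # hypothetical target
airbase = target_risk("minor", "slight", "minor")          # hypothetical target
print(round(country_risk([embassy, airbase]), 2))  # 0.63
```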
Task 16: Update/Study Trend Analysis of Indicator Warning Levels (Monthly). This explanation has now shown how the methodology determines the various warning levels and that they are displayed in 3 primary views: indicator list, target list, and country list. There is also a trend analysis for each view. The computer automatically captures the Indicator Warning Level as it stood for the majority of each month and plots it on a graph. Senior Warning Officers write analytical comments discussing warning failures and successes and how the methodology will be adjusted, if necessary.

Task 17: Update/Study Trend Analysis of Target Risk Warning Levels (Monthly). A target-oriented trend analysis is maintained in the same manner.

Task 18: Update/Study Trend Analysis of Country Risk Warning Levels (Monthly). Finally, a country-oriented trend analysis is maintained in the same manner.
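The monthly capture step can be sketched as follows; interpreting the level "as it stood for the majority of each month" as the statistical mode of the daily levels is our reading, and the data shape is assumed.

```python
# Sketch of the monthly trend capture in Task 16: one month of daily
# Indicator Warning Levels reduced to the level that held for most days.
from statistics import mode

daily_levels = ["minor"] * 12 + ["significant"] * 18 + ["critical"]
monthly_trend_point = mode(daily_levels)
print(monthly_trend_point)  # significant
```

Each monthly value becomes one point on the indicator's trend graph, which Senior Warning Officers then annotate.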
2.5 Phase V: Focus Collectors on Intelligence Gaps to Refine/Update Conclusions (Using Narratives that Describe What We Know, Think, and Need to Know)

Task 19: Write/Update Indicator Warning Narrative: What We Know, Think, & Need to Know (Daily). Thus far, the methodology has provided color-coded graphic representations of warning assessments, but narratives are necessary to explain the details behind each color-coded warning level. Narratives are provided for each of the 3 graphic views: indicators, targets, and countries. The key question set provides an outline of all the major points that the indicator narrative should address. The narrative begins with a description of what the analyst knows and thinks about the indicator, followed by a list of questions from the Indicator Key Questions, Answers, & Evidence Logs on what he does not know: the Intelligence Gaps for intelligence collectors.
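The gap-extraction step can be sketched as a filter (the data shape is assumed): any key question whose current answer is "Unknown (or ~50%)" becomes an Intelligence Gap, i.e., a collection request for the narrative.

```python
# Hypothetical sketch of Task 19's gap extraction. Question text and the
# log's record shape are invented for illustration.

key_question_log = [
    {"question": "Has surveillance of the target been reported?",
     "answer": "probably"},
    {"question": "Have weapons been moved into the area?",
     "answer": "unknown"},
    {"question": "Has the group tested security at the site?",
     "answer": "unknown"},
]

# Unanswered questions become Intelligence Gaps / collection requests.
intelligence_gaps = [q["question"] for q in key_question_log
                     if q["answer"] == "unknown"]
print(len(intelligence_gaps))  # 2
```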
Task 20: Write/Update Executive Summary for Target Warning Narrative: What We Know, Think, & Need to Know (Daily). The computer program combines the indicator narratives into a potential-target narrative. Senior Warning Officers write and maintain executive summaries for each potential-target narrative.

Task 21: Write/Update Executive Summary for Country Warning Narrative: What We Know, Think, & Need to Know (Daily). Finally, the computer program combines the potential-target executive summaries into a country narrative. Again, Senior Warning Officers are responsible for maintaining executive summaries for the country narratives.
2.6 Phase VI: Communicate Conclusions/Give Warning (Using Website Templates)

Task 22: Brief Decision Maker with Website Templates (As Required). The final task of a warning system is to convey the warning. Senior Warning Officers brief the warning levels to a decision maker using the website templates. The website templates are designed to convey the structured reasoning process behind each warning level.

Task 23: Rebrief Decision Maker with New Evidence in Website Templates (As Required). If a decision maker does not heed the warning and does not alter a security posture/FPCON after hearing the analyst's warning assessment, the analyst returns to the decision maker with new evidence and continues to press the assessment until the decision maker heeds the warning. Former Secretary of State Colin Powell describes this requirement eloquently: "[Analysts must] recognize that more often than not, I will throw them out, saying 'na doesn't sound right, get outta here.' What I need from my I&W system at that point is, 'That old bastard, I'm going to prove him wrong.' And go back and accumulate more information, come back the next day and give me some more and get thrown out again. Constantly come back . . . and persuade me that I better start paying attention" (Powell 1991). This is in line with the DCI's statement that the purpose of intelligence is "not to observe and comment, but to warn and protect" (DCI Warning Committee 2002).
3. Conclusion

Rather than face the War on Terrorism with the traditional intuition-dominated approach, this methodology offers a systematic forecasting tool that:

- Guards against nearly 81 percent of common warning pitfalls and, ultimately, improves the terrorism warning process.
- Coordinates analysts in a comprehensive, systematic effort.
- Automates many proven analytic techniques into a comprehensive system, which operates in near-real-time, saves time, saves manpower, and ensures accuracy in calculations and consistency in necessary, recurring judgments.
- Enables collection to feed analysis, and analysis to also feed collection, which is the way the intelligence cycle is supposed to work.
- Fuses interagency intelligence into a meaningful warning picture while still allowing for the compartmenting necessary to protect sensitive sources and methods.
- Provides a continuously updated analysis of competing hypotheses for each potential-terrorist target based on the status of the 68 indicators of terrorism.
- Is the first target-specific terrorism warning system; thus far, systems have only been country specific.
- Is the first terrorism warning system with built-in trend analysis.
- Combines threat (adversary intentions and adversary capability) with friendly vulnerability to determine risk and provide a truer risk assessment than typical intelligence analysis.
- Includes a CD that is the tool to implement this terrorism forecasting system.

Officials in the FBI and the Defense Intelligence Agency (DIA) characterized this terrorism forecasting system as "light-years ahead," "the bedrock for the evolving approach to terrorism analysis," and an "unprecedented forecasting model."
Declaration of Originality
This paper has not already been accepted by and is not cur-
rently under review for a journal or another conference, nor
will it be submitted for such during IA’s review period.
[Figure 1. The 3 Primary Warning Picture Views. From the website homepage, selecting a region of the world opens the Country List View; selecting a country opens the Target List View; selecting a potential target opens the Indicator List Views, with indicators of Terrorist Intentions, Terrorist Capability, or Target Vulnerability.]
References

Anonymous source. Mid-level intelligence professional at a national intelligence organization who wishes to remain anonymous. Interview by author, 10 July 2002.

Colin Powell on I&W: Address to the Department of Defense Warning Working Group. Distributed by the Joint Military Intelligence College, Washington, D.C., 1991. Videocassette.

Donald Rumsfeld, press conference, quoted in Mary O. McCarthy, "The Mission to Warn: Disaster Looms," Defense Intelligence Journal 7, no. 2 (Fall 1998): 21.

Folker, Robert D., Jr., MSgt, USAF. Intelligence Analysis in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods. Joint Military Intelligence College Occasional Paper, no. 7. Washington, D.C.: Joint Military Intelligence College, January 2000.

Garst, Ronald D. "Fundamentals of Intelligence Analysis." 5-7 in Intelligence Analysis ANA 630, no. 1, edited by Joint Military Intelligence College. Washington, D.C.: Joint Military Intelligence College, 2000.

Heymann, Hans, Jr. "The Intelligence-Policy Relationship." 53-62 in Intelligence Analysis ANA 630, no. 1, edited by Joint Military Intelligence College. Washington, D.C.: Joint Military Intelligence College, 2000.
[Figure 2. Indicator Key Questions, Answers, & Evidence Log (in Hypothesis Matrix). In tasks 19, 20, and 21, a question answered "Unknown (or ~50%)" automatically appears as a Collection Request on the appropriate Warning Narrative: What We Know, Think, & Need to Know.]
Kam, Ephraim. Surprise Attack: The Victim's Perspective. Cambridge, MA: Harvard University Press, 1988.

McDevitt, James J. Summary of Indicator-Based Methodology. Unpublished handout, n.p., n.d. Provided in January 2002 at the Joint Military Intelligence College.

National Warning Staff, DCI Warning Committee. "National Warning System." Handout provided in January 2002 at the Joint Military Intelligence College.

Russo, J. Edward, and Paul J. H. Schoemaker. Decision Traps: The Ten Barriers to Brilliant Decision-Making and How to Overcome Them. New York: Rockefeller Center, 1989.

Trochim, William M. K. "Qualitative Data." Research Methods Knowledge Base, Cornell University, 2002. trochim.human.cornell.edu/kb/qualdata.htm (accessed 31 May 2002).

Using R for Social Media and Sports AnalyticsUsing R for Social Media and Sports Analytics
Using R for Social Media and Sports AnalyticsAjay Ohri
 
Kush stats alpha
Kush stats alpha Kush stats alpha
Kush stats alpha Ajay Ohri
 
Analyze this
Analyze thisAnalyze this
Analyze thisAjay Ohri
 

Mehr von Ajay Ohri (20)

Introduction to R ajay Ohri
Introduction to R ajay OhriIntroduction to R ajay Ohri
Introduction to R ajay Ohri
 
Introduction to R
Introduction to RIntroduction to R
Introduction to R
 
Social Media and Fake News in the 2016 Election
Social Media and Fake News in the 2016 ElectionSocial Media and Fake News in the 2016 Election
Social Media and Fake News in the 2016 Election
 
Pyspark
PysparkPyspark
Pyspark
 
Download Python for R Users pdf for free
Download Python for R Users pdf for freeDownload Python for R Users pdf for free
Download Python for R Users pdf for free
 
Install spark on_windows10
Install spark on_windows10Install spark on_windows10
Install spark on_windows10
 
Ajay ohri Resume
Ajay ohri ResumeAjay ohri Resume
Ajay ohri Resume
 
Statistics for data scientists
Statistics for  data scientistsStatistics for  data scientists
Statistics for data scientists
 
National seminar on emergence of internet of things (io t) trends and challe...
National seminar on emergence of internet of things (io t)  trends and challe...National seminar on emergence of internet of things (io t)  trends and challe...
National seminar on emergence of internet of things (io t) trends and challe...
 
Tools and techniques for data science
Tools and techniques for data scienceTools and techniques for data science
Tools and techniques for data science
 
How Big Data ,Cloud Computing ,Data Science can help business
How Big Data ,Cloud Computing ,Data Science can help businessHow Big Data ,Cloud Computing ,Data Science can help business
How Big Data ,Cloud Computing ,Data Science can help business
 
Training in Analytics and Data Science
Training in Analytics and Data ScienceTraining in Analytics and Data Science
Training in Analytics and Data Science
 
Tradecraft
Tradecraft   Tradecraft
Tradecraft
 
Software Testing for Data Scientists
Software Testing for Data ScientistsSoftware Testing for Data Scientists
Software Testing for Data Scientists
 
Craps
CrapsCraps
Craps
 
A Data Science Tutorial in Python
A Data Science Tutorial in PythonA Data Science Tutorial in Python
A Data Science Tutorial in Python
 
How does cryptography work? by Jeroen Ooms
How does cryptography work?  by Jeroen OomsHow does cryptography work?  by Jeroen Ooms
How does cryptography work? by Jeroen Ooms
 
Using R for Social Media and Sports Analytics
Using R for Social Media and Sports AnalyticsUsing R for Social Media and Sports Analytics
Using R for Social Media and Sports Analytics
 
Kush stats alpha
Kush stats alpha Kush stats alpha
Kush stats alpha
 
Analyze this
Analyze thisAnalyze this
Analyze this
 

(Garst 2000).
Decision makers, however, ignore intelligence largely because analytical product is weak (Heymann 2000).

Hiring Smart People Does Not Necessarily Lead to Good Analysis. Studies show that, “frequently groups of smart, well-motivated people . . . agree . . . on the wrong solution. . . . They didn’t fail because they were stupid. They failed because they followed a poor process in arriving at their decisions” (Russo and Schoemaker 1989).

A Systematic Process Is the Most Effective Way to Facilitate Good Analysis. The nonstructured approach has become the norm in the Intelligence Community. A key misunderstanding in the debate over intuition versus structured technique is that an analyst must choose one or the other (Folker 2000). In fact, intuition and structured technique can be used together in a systematic process. “Anything that is qualitative can be assigned meaningful numerical values. These values can then be manipulated to help us achieve greater insight into the meaning of the data and to help us examine specific hypotheses” (Trochim 2002). Not only is it possible to combine intuition and structure in a system; research shows the combination is more effective than intuition alone. “Doing something systematic is better in almost all cases than seat-of-the-pants prediction” (Russo 1989). Moreover, decision makers have called on the Intelligence Community to use methodology. “The Rumsfeld Commission noted that, ‘. . . an expansion of the methodology used by the IC [Intelligence Community] is needed.’ . . . Keeping chronologies, maintaining databases and arraying data are not fun or glamorous. These techniques are the heavy lifting of analysis, but this is what analysts are supposed to do. If decision makers only needed talking heads, those are readily available elsewhere” (McCarthy 1998).

Forecasting Terrorism: Indicators and Proven Analytic Techniques
Captain Sundri K. Khalsa, USAF
PO Box 5124, Alameda, California 94501, United States
SundriKK@hotmail.com

Keywords: Terrorism, Predictive Analysis and Hypothesis Management, Multiple Competing Hypotheses, Question Answering, Structured Argumentation, Novel Intelligence from Massive Data, Knowledge Discovery and Dissemination, Information Sharing and Collaboration, Multi-INT/fusion, All Source Intelligence, Visualization, Fusion.
2. How to Forecast Terrorism: Abbreviated Step-By-Step Explanation of the Methodology

This forecasting system is based on indicators. The explanation of this methodology begins at the lowest level of indicators and then builds up to the big picture of countries within a region. The forecasting assessments of this methodology are maintained on a website display, which is available on the associated CD. Figure 1 shows a breakdown of the 3 primary types of warning picture views from the web homepage: 1) country list view, 2) target list view, and 3) indicator list view. Each potential-terrorist target is evaluated in a webpage hypothesis matrix based on the status of 68 indicators of terrorism. The indicators are updated near-real-time with incoming raw intelligence reports/evidence.

The methodology consists of 23 tasks and 6 phases of warning analysis, which are described very briefly in this paper. The 23 tasks include 14 daily tasks, 3 monthly tasks, 4 annual tasks, and 2 as-required tasks. The 14 daily tasks can be completed in 1 day because tasks have been automated wherever possible. Three types of analysts are required to operate this methodology: Raw Reporting Profilers, Indicator Specialists, and Senior Warning Officers.

2.1 Phase I: Define/Validate Key Elements of the Intelligence Problem (Using Indicators)

Task 1: Identify/Validate Indicators (Annually). Indicators are the building blocks of this forecasting system. Indicators are “those [collectable] things that would have to happen and those that would likely happen as [a] scenario unfolded” (McDevitt 2002). For a terrorist attack, those would be things like terrorist travel, weapons movement, terrorist training, target surveillance, and tests of security. This research has identified 68 indicators of terrorism encompassing terrorist intentions, terrorist capability, and target vulnerability, which are the three components of risk.
The indicators are also identified as either quantitative (information that can be counted) or qualitative (information that cannot be counted, such as terrorist training). Only 7 of the 68 Terrorism Indicators are quantitative. For security reasons, many indicators are not shown here.

In task 1 of this methodology, the leading counterterrorism experts meet on at least an annual basis to determine if indicators should be added to or removed from the list. A list of indicators should never be considered final and comprehensive. To determine if the list of indicators should be altered, analysts:

1. Review the raw intelligence reports filed under the Miscellaneous Indicators to determine if any kinds of significant terrorist activity have been overlooked.
2. Review U.S. collection capabilities to determine if the U.S. has gained or lost the capability to collect on any terrorist activities.
3. Review case studies of terrorist operations to identify changes in terrorist modus operandi and determine if terrorists are conducting any new activities against which U.S. intelligence can collect.

Task 2: Prioritize Indicators (Annually). Some indicators are more significant than others. For instance, among the indicators of terrorist intentions, weapons movement to the target area must take place before an attack, whereas an increase in terrorist propaganda is not a prerequisite. Therefore, weapons movement carries a higher significance/priority than increased propaganda. On at least an annual basis, the leading counterterrorism experts determine if the priority of any of the indicators needs to be adjusted on a scale of 1 through 3, according to the definitions in the complete explanation of this methodology.

Task 3: Develop/Validate Key Question Sets on Each Indicator (Annually).
Since indicators are the foundation and building blocks of this methodology, the type of information required to assess the status of an indicator should be clearly defined and recorded. Thus, on at least an annual basis, the leading counterterrorism experts validate a list of key questions for each indicator. The question sets identify the key factors that experts have determined are necessary to assess the status of a given indicator. They also function as the list of prioritized Collection Requirements for intelligence collectors. The entire list of indicators and corresponding key question sets form an intelligence collection plan against terrorism.

Task 4: Prioritize Questions in Key Question Sets (Annually). The leading counterterrorism experts also prioritize the key questions on a scale of 1 through 3, with 1 being the most significant. These priorities affect both intelligence collection priorities and analysts’ assessments.

2.2 Phase II: Consolidate Information (Using Master Database)

Task 5: Intelligence Community Master Database Receives All Raw Intelligence Reports from Intelligence Collectors (Daily). The daily process begins with the requirement that all fifteen Member Organizations of the Intelligence Community and other U.S. government organizations that may have terrorism-related information forward all their raw intelligence reports (on all intelligence topics, not just terrorism) to an Intelligence Community Master Database. The major benefit of hindsight investigation after intelligence warning failures is that it is the first time all the information has been consolidated.

2.3 Phase III: Sort Information (Using Hypothesis Matrices)

Task 6: Enter All Terrorism-Related Raw Intelligence Reports into Terrorism Forecasting Database Under Appropriate Indicators, Key Questions, Targets, Countries, Terrorist Groups, and Other Data Profile Elements (Daily). A large group of junior analysts,
called Raw Reporting Profilers, reads through the Intelligence Community’s incoming raw intelligence reports (an estimated 2,500 per day that are already marked as terrorism related) and enters all of them into a Terrorism Forecasting Database according to the indicators, key questions, targets, countries, terrorist groups, and other terrorism forecasting-specific data profile elements identified in this methodology.

Task 7: Terrorism Forecasting Database Creates Potential-Target Hypothesis Matrices with Raw Intelligence Reports Filed by Indicators (Daily). After analysts enter raw intelligence reports into the Terrorism Forecasting Database, the computer program automatically creates corresponding Potential-Target Hypothesis Matrix webpages and displays hyperlinks to the reports under the appropriate indicator(s) within the hypothesis matrices.

Task 8: Terrorism Forecasting Database Feeds Raw Intelligence Reports into Appropriate Indicator Key Questions, Answers, & Evidence Logs within the Hypothesis Matrices (Daily). The master database also feeds the raw reports into the appropriate Indicator Key Questions, Answers, & Evidence Logs within the Hypothesis Matrices, as shown in figure 2.

Task 9: Assess Raw Intelligence Reports’ Information Validity (Daily). The computer program combines Source Credibility and Information Feasibility/Viability according to rules in utility matrix logic to determine a report’s Information Validity on a 5-level scale: 1) “Almost Certainly Valid (~90%),” color coded red on the website; 2) “Probably Valid (~70%),” orange; 3) “Probably Not Valid (~30%),” yellow; 4) “Almost Certainly Not Valid (~10%),” gray; and 5) “Unknown Validity (or ~50%),” black.
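The utility-matrix combination in Task 9 can be illustrated with a short sketch. The paper does not publish the actual combination rules, so the rule below (take the more pessimistic of the two judgments) and the level names are assumptions for illustration only:

```python
# Hypothetical sketch of Task 9's utility-matrix logic: Source Credibility and
# Information Feasibility/Viability are each judged on the paper's 5-level
# scale, then combined into a report's Information Validity. The combination
# rule here (the more conservative judgment wins) is an assumption; the
# paper's actual rules are not published.

# Levels ordered from least to most probable, with "unknown" (~50%) in the middle.
LEVELS = ["almost_certainly_not", "probably_not", "unknown", "probably", "almost_certainly"]

COLORS = {  # website color coding described in Task 9
    "almost_certainly": "red",
    "probably": "orange",
    "probably_not": "yellow",
    "almost_certainly_not": "gray",
    "unknown": "black",
}

def information_validity(source_credibility: str, feasibility: str) -> str:
    """Combine the two analyst judgments into one Information Validity level."""
    return LEVELS[min(LEVELS.index(source_credibility), LEVELS.index(feasibility))]
```

With this rule a highly credible source reporting a barely feasible event still yields low validity, which matches the conservative spirit of warning analysis; any real implementation would encode the experts' full utility matrix instead.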
Source Credibility and Information Feasibility/Viability are determined by analysts, who check a list of boxes for each in the database on a 5-level scale: 1) “Almost Certainly (~90%),” 2) “Probably (~70%),” 3) “Probably Not (~30%),” 4) “Almost Certainly Not (~10%),” and 5) “Unknown (or ~50%).”

2.4 Phase IV: Draw Conclusions (Using Intuitive and Structured Techniques)

Task 10: Assess Indicator Warning Levels (Daily). The computer program combines the Indicator Priority and an Indicator Activity Level to determine an Indicator Warning Level on a 5-level scale: 1) Critical (~90%), 2) Significant (~70%), 3) Minor (~30%), 4) Slight (~10%), and 5) Unknown (or ~50%). The Indicator Activity Level is determined by a second group of analysts using either rules in utility matrix logic (for the quantitative indicators) or the Indicator Key Questions, Answers, & Evidence Logs (for the qualitative indicators). These analysts are Indicator Specialists, the Counterterrorism Community’s designated experts in determining an Indicator Activity Level.

From this point forward, all the warning level calculations are automated. The third group of analysts, Senior Warning Officers, is responsible for monitoring and approving all the warning levels that the computer application automatically produces and updates on the webpages.

Task 11: Assess Terrorist Intention Warning Level (Daily). Now that analysts have assessed an Indicator Warning Level for each of the 68 indicators of terrorist intentions, terrorist capability, and target vulnerability (the 3 components of risk), the computer can calculate a warning level for each of those 3 components.
The computer calculates the Terrorist Intention Warning Level for a target, on the same 5-level scale, from the Indicator Warning Levels of the active terrorist intention indicators using averages and rules in utility matrix logic.

Task 12: Assess Terrorist Capability Warning Level (Daily). The computer determines the Terrorist Capability Warning Level for a given country by taking the highest of all the Indicator Warning Levels of the active terrorist capability, lethal agent/technique indicators.

Task 13: Assess Target Vulnerability Warning Level (Daily). The computer program calculates a target’s Vulnerability Warning Level from the Indicator Warning Levels of the active target vulnerability indicators using averages and rules in utility matrix logic.

Task 14: Assess Target Risk Warning Level (Daily). Now that analysts have a warning level for each of the 3 components of risk (terrorist intentions, terrorist capability, and target vulnerability), the computer can calculate a risk warning level for a given target. The computer program calculates the Target Risk Warning Level by averaging the Terrorist Intention, Terrorist Capability, and Target Vulnerability Warning Levels.

Task 15: Assess Country Risk Warning Level (Daily). The computer program determines the Country Risk Warning Level by taking the highest Target Risk Warning Level in the country.

Task 16: Update/Study Trend Analysis of Indicator Warning Levels (Monthly).
This explanation has now shown how the methodology determines various warning levels, and that they are displayed in 3 primary views: indicator list, target list, and country list. There is also trend analysis for each view. The computer automatically captures the Indicator Warning Level as it stood for the majority of each month and plots it on a graph. Senior Warning Officers write analytical comments discussing warning failures and successes and how the methodology will be adjusted, if necessary.
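The roll-up arithmetic of Tasks 11 through 15 above can be sketched in a few lines. Treating each warning level as its approximate probability is an assumption for illustration, and the paper's utility-matrix adjustments to the averages are omitted:

```python
# Illustrative sketch (not the authors' published rules) of the Task 11-15
# warning-level roll-ups, treating each level as its approximate probability:
# Critical ~0.90, Significant ~0.70, Unknown ~0.50, Minor ~0.30, Slight ~0.10.

CRITICAL, SIGNIFICANT, UNKNOWN, MINOR, SLIGHT = 0.90, 0.70, 0.50, 0.30, 0.10

def average(levels):
    return sum(levels) / len(levels)

def target_risk(intention, capability, vulnerability):
    """Each argument is a list of active Indicator Warning Levels for one target."""
    intention_level = average(intention)            # Task 11 (utility-matrix rules omitted)
    capability_level = max(capability)              # Task 12: highest lethal agent/technique indicator
    vulnerability_level = average(vulnerability)    # Task 13 (utility-matrix rules omitted)
    # Task 14: average the three components of risk.
    return average([intention_level, capability_level, vulnerability_level])

def country_risk(target_risks):
    return max(target_risks)                        # Task 15: highest target risk in the country
```

Note the asymmetry in the sketch: capability and country risk take the worst case (a single viable lethal technique or a single high-risk target dominates), while intention, vulnerability, and overall target risk are blended.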
Task 17: Update/Study Trend Analysis of Target Risk Warning Levels (Monthly). A target-oriented trend analysis is also maintained in the same manner.

Task 18: Update/Study Trend Analysis of Country Risk Warning Levels (Monthly). Finally, a country-oriented trend analysis is maintained in the same manner.

2.5 Phase V: Focus Collectors on Intelligence Gaps to Refine/Update Conclusions (Using Narratives that Describe What We Know, Think, and Need to Know)

Task 19: Write/Update Indicator Warning Narrative: What We Know, Think, & Need to Know (Daily). Thus far, the methodology has provided color-coded graphic representations of warning assessments, but narratives are necessary to explain the details behind each color-coded warning level. Narratives are provided for each of the 3 graphic views: indicators, targets, and countries. The key question set provides an outline of all the major points that the indicator narrative should address. The narrative begins with a description of what the analyst knows and thinks about the indicator, followed by a list of questions from the Indicator Key Questions, Answers, & Evidence Logs on what he does not know: the Intelligence Gaps for intelligence collectors.

Task 20: Write/Update Executive Summary for Target Warning Narrative: What We Know, Think, & Need to Know (Daily). The computer program combines the indicator narratives into a potential-target narrative. Senior Warning Officers write and maintain executive summaries for each potential-target narrative.

Task 21: Write/Update Executive Summary for Country Warning Narrative: What We Know, Think, & Need to Know (Daily). Finally, the computer program combines the potential-target executive summaries into a country narrative. Again, Senior Warning Officers are responsible for maintaining executive summaries for the country narratives.
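The automatic Collection Requests described in Task 19 (and shown in figure 2) can be sketched as a simple filter over an Indicator Key Questions, Answers, & Evidence Log. The data shape and the example questions below are assumptions for illustration, not taken from the paper:

```python
# Hypothetical sketch of Task 19's intelligence-gap generation: any key question
# whose current answer stands at "Unknown (or ~50%)" is surfaced as a Collection
# Request, ordered by the priorities assigned in Task 4 (1 = most significant).

def collection_requests(key_question_log):
    """key_question_log: iterable of (priority, question, answer) tuples."""
    gaps = [(priority, question)
            for priority, question, answer in key_question_log
            if answer == "unknown"]
    return [question for priority, question in sorted(gaps)]

# Illustrative log entries (hypothetical questions, not the paper's classified sets).
log = [
    (2, "Has surveillance of the target been observed?", "probably"),
    (1, "Have weapons been moved to the target area?", "unknown"),
    (3, "Has propaganda naming the target increased?", "unknown"),
]
```

Calling `collection_requests(log)` here would surface the two unanswered questions, with the priority-1 weapons-movement question first, mirroring how unanswered questions flow onto the Warning Narratives as tasking for collectors.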
2.6 Phase VI: Communicate Conclusions/Give Warning (Using Website Templates)

Task 22: Brief Decision Maker with Website Templates (As Required). The final task of a warning system is to convey the warning. Senior Warning Officers brief the warning levels to a decision maker using the website templates. The website templates are designed to convey the structured reasoning process behind each warning level.

Task 23: Rebrief Decision Maker with New Evidence in Website Templates (As Required). If a decision maker does not heed the warning and does not alter a security posture/FPCON after hearing the analyst’s warning assessment, the analyst returns to the decision maker with new evidence to press the assessment until the decision maker heeds the warning. Former Secretary of State Colin Powell describes this requirement eloquently: “[Analysts must] recognize that more often than not, I will throw them out, saying ‘na, doesn’t sound right, get outta here.’ What I need from my I&W system at that point is, ‘That old bastard, I’m going to prove him wrong.’ And go back and accumulate more information, come back the next day and give me some more and get thrown out again. Constantly come back . . . and persuade me that I better start paying attention” (Powell 1991). This is in line with the DCI’s statement that the purpose of intelligence is “not to observe and comment, but to warn and protect” (DCI Warning Committee 2002).

3. Conclusion

Rather than face the War on Terrorism with the traditional intuition-dominated approach, this methodology offers a systematic forecasting tool that:

- Guards against 36 of the 42 common warning pitfalls (about 86 percent) and, ultimately, improves the terrorism warning process.
- Coordinates analysts in a comprehensive, systematic effort.
- Automates many proven analytic techniques into a comprehensive system, which is near-real-time, saves time, saves manpower, and ensures accuracy in calculations and consistency in necessary, recurring judgments.
- Enables collection to feed analysis, and analysis to also feed collection, which is the way the intelligence cycle is supposed to work.
- Fuses interagency intelligence into a meaningful warning picture while still allowing for the compartmenting necessary to protect sensitive sources and methods.
- Provides a continuously updated analysis of competing hypotheses for each potential-terrorist target based on the status of the 68 indicators of terrorism.
- Is the first target-specific terrorism warning system; thus far, systems have only been country specific.
- Is the first terrorism warning system with built-in trend analysis.
- Combines threat (adversary intentions and adversary capability) with friendly vulnerability to determine risk and provide a truer risk assessment than typical intelligence analysis.
- Includes a CD that is the tool to implement this terrorism forecasting system.

Officials in the FBI and the Defense Intelligence Agency (DIA) characterized this terrorism forecasting system as “light-years ahead,” “the bedrock for the evolving approach to terrorism analysis,” and an “unprecedented forecasting model.”

Declaration of Originality

This paper has not already been accepted by and is not currently under review for a journal or another conference, nor will it be submitted for such during IA’s review period.
[Figure 1. The 3 Primary Warning Picture Views. From the website homepage, select a region of the world (Country List View), then a country (Target List View), then a potential target (Indicator List Views for indicators of Terrorist Intentions, Terrorist Capability, or Target Vulnerability).]

[Figure 2. Indicator Key Questions, Answers, & Evidence Log (in Hypothesis Matrix). Callout: in tasks 19, 20, and 21, a question answered “Unknown (or ~50%)” automatically appears as a Collection Request on the appropriate Warning Narrative: What We Know, Think, & Need to Know.]

References

A source, mid-level intelligence professional at a national intelligence organization, who wishes to remain anonymous. Interview by author, 10 July 2002.

Colin Powell on I&W: Address to the Department of Defense Warning Working Group. Distributed by the Joint Military Intelligence College, Washington, D.C., 1991. Videocassette.

Donald Rumsfeld, press conference, quoted in Mary O. McCarthy, “The Mission to Warn: Disaster Looms,” Defense Intelligence Journal 7, no. 2 (Fall 1998): 21.

Folker, Robert D., Jr., MSgt, USAF. Intelligence Analysis in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods. Joint Military Intelligence College Occasional Paper, no. 7. Washington, D.C.: Joint Military Intelligence College, January 2000.

Garst, Ronald D. “Fundamentals of Intelligence Analysis.” 5-7 in Intelligence Analysis ANA 630, no. 1, edited by Joint Military Intelligence College. Washington, D.C.: Joint Military Intelligence College, 2000.

Heymann, Hans, Jr. “The Intelligence—Policy Relationship.” 53-62 in Intelligence Analysis ANA 630, no. 1, edited by Joint Military Intelligence College. Washington, D.C.: Joint Military Intelligence College, 2000.

Kam, Ephraim. Surprise Attack: The Victim’s Perspective. Cambridge, MA: Harvard University Press, 1988.

McDevitt, James J. Summary of Indicator-Based-Methodology. Unpublished handout, n.p., n.d. Provided in January 2002 at the Joint Military Intelligence College.

National Warning Staff, DCI Warning Committee. “National Warning System.” Handout provided in January 2002 at the Joint Military Intelligence College.

Russo, J. Edward, and Paul J. H. Schoemaker. Decision Traps: The Ten Barriers to Brilliant Decision-Making and How to Overcome Them. New York: Rockefeller Center, 1989.

Trochim, William M. K. “Qualitative Data.” Cornell University: Research Methods Knowledge Base, 2002. trochim.human.cornell.edu/kb/qualdata.htm (31 May 2002).