5. Cognitive Security: Online Information Harms
• Disinformation: The deliberate attempt to influence perception and decision-making by presenting information that is incomplete, incorrect, or out of context.
• Misinformation: Unwittingly propagating misleading or incorrect information. See “useful idiots.”
• Malinformation: The attempt to influence perception by leaking ostensibly true information that
may be out of context.
• Hate Speech: Abusive or threatening expression that attacks a person or group on the basis of identity.
7. Disinformation Risk Landscaping
Mis/disinformation is everywhere:
Where do you put your resources?
● Detection, mitigation, response
● People, technologies, time, attention
● Connections
Manage the risks, not the artifacts
● Attack surfaces, vulnerabilities, potential
losses / outcomes
● Risk assessment, reduction, remediation
● Risks: How bad? How big? To whom?
8. Landscapes
Information Landscape
• Information seeking
• Information sharing
• Information sources
• Information voids
Threat Landscape
• Motivations
• Sources / starting points
• Effects
• Misinformation narratives
• Hateful speech narratives
• Crossovers
• Tactics and techniques
• Artifacts
Response Landscape
• Monitoring organisations
• Countering organisations
• Coordination
• Existing policies
• Technologies
• etc
9. Disinformation Actors
Persistent Manipulators
• Advanced teams: Internet Research Agency; China, Iran teams etc
• For-profit website networks: antivax websites, pink slime sites, “stolen” US election sites
• Nation-state media: Sputnik, Russia Today
Service Providers
• Disinformation as a Service: factories; ex-marketing, spam etc
• Ad-hoc paid teams: EBLA (Ghana), PeaceData (USA)
Opportunists
• Wares sellers: clicks, T-shirts, books etc
• Groups: conspiracy groups, extremists
• Individuals: attention-seekers, jokers etc
10. Response Actors
Disinformation SOCs
• Large actors: ISAOs, platforms, other large actors
• Event-specific: war rooms, agencies
Disinformation Teams
• Disinformation “desk”: in an existing SOC, or a standalone unit
• Investigators: journalists, academics, independent journalists
Other Responders
• Policymakers, law enforcement, corporations, influencers, nonprofits, educators, individual researchers, concerned citizens
13. CONNECT RESPONSE EFFORTS
• Hundreds of groups, large and small, are working on different pieces and approaches
• Help them identify and connect with one another
• Facilitate collaboration and communication
15. COGSOCS: Cognitive Security SOCs
• Inform: Summarise and share information about ongoing incidents
• Neutralise: Disinformation incident response: triage, takedown, escalation.
• Prevent: Collate disinformation indicators of compromise (IoCs) and vulnerabilities; supply to
organisations.
• Support: Assess the possibility of direct attack, and ways to be ready for that.
• Clearinghouse: Collate and share incident data, including with organisations focusing on
response and counter-campaigns.
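The Prevent and Clearinghouse functions both involve collating indicators and supplying them to other organisations. A minimal sketch of what a shareable disinformation-IoC record might look like; the schema and field names are illustrative assumptions, not a standard (production systems typically extend formats such as STIX):

```python
# Sketch of a disinformation IoC record a CogSOC might collate and share.
# All field names are illustrative assumptions, not an agreed schema.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class DisinfoIndicator:
    indicator_id: str           # internal tracking ID (hypothetical format)
    kind: str                   # "domain", "account", "narrative", "hashtag"
    value: str                  # the observable itself
    incident_ids: list = field(default_factory=list)
    confidence: str = "medium"  # "low" | "medium" | "high"

def to_shareable_json(indicators):
    """Serialise a batch of indicators for distribution to partner orgs."""
    return json.dumps([asdict(i) for i in indicators], indent=2)

batch = [
    DisinfoIndicator("IOC-0001", "domain", "example-fakenews.test",
                     incident_ids=["INC-42"], confidence="high"),
]
print(to_shareable_json(batch))
```

JSON keeps the exchange format human-readable for small responder groups that lack full threat-intel tooling.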
16. CogSOC Top-level Activities
Risk Mitigation
• Secure system: simulations, red teaming, penetration testing, team exercises
• Check compliance: compliance analysis
Enablement
• Foundation work: data engineering, information frameworks, politics, training
Real-time Operations
• Incident response: discover, investigate, respond to threats
• Research: threat intelligence, deeper investigations
22. Planning
AMITT Blue: Countermeasures Framework
Phases (AMITT kill chain):
• Planning: Strategic Planning, Objective Planning
• Preparation: Develop People, Develop Networks, Microtargeting, Develop Content, Channel Selection
• Execution: Pump Priming, Exposure, Go Physical, Persistence
• Evaluation: Measure Effectiveness
Example countermeasures, grouped as on the slide:
• Prebunking; humorous counter-narratives; mark content with ridicule / decelerants; expire social media likes / retweets; influencer disavows misinfo; cut off banking access; dampen emotional reaction; remove / rate-limit botnets; social media amber alert; etc.
• Have a disinformation response plan; improve stakeholder coordination; make civil society more vibrant; red-team disinformation and design mitigations; enhanced privacy regulation for social media; platform regulation; shared fact-checking database; repair broken social connections; pre-emptive action against disinformation team infrastructure; etc.
• Media literacy through games; tabletop simulations; make information provenance available; block access to disinformation resources; educate influencers; buy out troll-farm employees / offer jobs; legal action against for-profit engagement farms; develop compelling counter-narratives; run competing campaigns; etc.
• Find and train influencers; counter-social-engineering training; ban incident actors from funding sites; address truth in narratives; marginalise and discredit extremist groups; ensure platforms are taking down accounts; name and shame disinformation influencers; denigrate funding recipient / project; infiltrate in-groups; etc.
• Remove old and unused accounts; unravel Potemkin villages; verify projects before posting fund requests; encourage people to leave social media; deplatform message groups and boards; stop offering press credentials to disinformation outlets; free open library sources; social media source removal; infiltrate disinformation platforms; etc.
• Fill information voids; stem the flow of advertising money; buy more advertising than disinformation creators; reduce political targeting; co-opt disinformation hashtags; mentorship (elders, youth, credit); hijack content and link to information; honeypot social community; corporate research funding full disclosure; real-time updates to fact-check database; remove non-relevant content from special-interest groups; content moderation; prohibit images in political channels; add metadata to original content; add warning labels on sharing; etc.
• Rate-limit engagement; redirect searches away from disinfo; honeypot (fake engagement system); bot to engage and distract trolls; strengthen verification methods; verified IDs to comment or contribute to polls; revoke whitelist / verified status; microtarget likely targets with counter-messages; train journalists to counter influence moves; tool transparency and literacy in followed channels; ask media not to report false info; repurpose images with counter-messages; engage payload and debunk; debunk / defuse fake expert credentials; don’t engage with payloads; hashtag jacking; etc.
• DMCA takedown requests; spam domestic actors with lawsuits; seize and analyse botnet servers; poison monitoring-and-evaluation data; bomb link shorteners with calls; add random links to network graphs
25. Disinformation SOC: Organisation Boundaries
[Diagram: zones marked “monitoring”, “action”, and “responsible for” show where the CogSOC sits, alongside the infosec SOC, relative to internet domains, social media platforms, the organization’s platforms, lawmakers, the organization’s business units, the organization’s communities, and the media.]
27. CogSOC Internal Organization: Tiers
• Tier 1 (Triage): scanning systems, triaging alerts, gathering data, starting tickets
• Tier 2 (Incident Response): analysis, remediation, tactical response
• Tier 3 (SMEs): threat hunting, deep analysis, strategic response
• Tier 4 (Management): business connections; plans, audits, organization
Inputs: platform alerts, social media, external alerts
Outputs: tickets, responses, reports, crisis plan
Connections: business units, partners & responders
Knowledge bases:
• Disinformation knowledge: artifacts, narratives, actors, segments etc
• Specialist knowledge: politics, industry, marketing etc
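The tier structure can be pictured as a simple alert-to-ticket pipeline: Tier 1 turns raw alerts into tickets, and tickets escalate to higher tiers by severity. Everything below (names, thresholds, the 1–5 severity scale) is an illustrative assumption, not a description of real CogSOC tooling:

```python
# Toy sketch of tiered triage: Tier 1 creates tickets from alerts,
# higher severities escalate to Tier 2/3. Thresholds are invented.
from dataclasses import dataclass

@dataclass
class Ticket:
    ticket_id: int
    source: str      # "platform alert", "social media", "external alert"
    summary: str
    severity: int    # 1 (low) .. 5 (critical), an assumed scale
    tier: int = 1

def triage(alerts):
    """Tier 1: turn raw alerts into tickets, dropping obvious noise."""
    tickets = []
    for n, alert in enumerate(alerts, start=1):
        if alert["severity"] < 2:   # below threshold: no ticket
            continue
        tickets.append(Ticket(n, alert["source"], alert["summary"],
                              alert["severity"]))
    return tickets

def escalate(ticket):
    """Route a ticket to the lowest tier able to handle its severity."""
    if ticket.severity >= 5:
        ticket.tier = 3   # SMEs: threat hunting, strategic response
    elif ticket.severity >= 3:
        ticket.tier = 2   # incident response: analysis, remediation
    return ticket

alerts = [
    {"source": "platform alert", "summary": "coordinated hashtag push", "severity": 4},
    {"source": "social media", "summary": "single joke post", "severity": 1},
]
tickets = [escalate(t) for t in triage(alerts)]  # only the first alert survives
```

The point of the sketch is the shape of the flow, not the rules: real escalation criteria would come from the risk assessment, not hard-coded thresholds.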
28. Resource Allocation and Measurement
● You can’t manage what you can’t measure
○ Backed by disinformation and response measurement
● Resource allocation and depletion on both sides
○ Strategic objectives
○ People, process, technology, time, money, attention, reach, etc
○ We can learn a lot from games
● Extending capacity
○ Surge capacity
○ Automation: using ML to take the strain during heavy loads
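One way to picture automation taking the strain during a surge: rank the backlog so the highest-risk alerts are handled first. The scoring rule below is a toy stand-in for an ML model, and all weights and field names are invented for illustration:

```python
# Surge-capacity sketch: a priority queue that feeds analysts the
# highest-scoring alerts first. The score is a toy rule, not a model.
import heapq

def score(alert):
    """Toy risk score: reach times a severity weight (weights invented)."""
    return alert["reach"] * {"low": 1, "medium": 3, "high": 9}[alert["severity"]]

def surge_queue(alerts):
    """Yield alerts in descending score order using a max-heap."""
    heap = [(-score(a), i, a) for i, a in enumerate(alerts)]  # i breaks ties
    heapq.heapify(heap)
    while heap:
        _, _, alert = heapq.heappop(heap)
        yield alert

backlog = [{"reach": 100, "severity": "medium"}, {"reach": 5, "severity": "high"}]
print(next(surge_queue(backlog))["reach"])  # → 100
```

A generator keeps the queue lazy: analysts pull the next-worst item only when they have capacity, which is the point of surge management.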
29. Pillars of a SOC
• People
  • Enough people to make a difference, in time
  • Enough connections / levers to make a difference
• Culture
  • Safety processes: mental health and opsec
• Process
  • Understand disinformation, understand threat response
  • Fast, lightweight processes
• Technology
  • Speed: supporting analysis, storage etc
  • Sharing: get data to responders in ways they understand (whatever works)
31. Information Landscape
• Traditional Media
• Newspapers
• Radio - including community radio
• TV
• Social Media
• Facebook
• WhatsApp
• Twitter
• YouTube / Telegram / etc
• Others
• Word of mouth
32. Threat Landscape
• Motivations
• Geopolitics mostly absent
• Party politics (internal, inter-party)
• Actors
• Activities
• Manipulate faith communities
• Discredit election process
• Discredit/discourage journalists
• Attention (more drama)
• Risks / severities
• Sources
• WhatsApp
• Blogs
• Facebook pages
• Online newspapers
• Media
• Routes
• Hijacked narratives
• WhatsApp to blogs, and vice versa
• WhatsApp forwarding
• Facebook to WhatsApp
• Social media to traditional media
• Social media to word of mouth
34. Response Landscape (Needs / Work / Gaps)
Risk Reduction
● Media and influence
literacy
● Information landscaping
● Other risk reduction
Monitoring
● Radio, TV, newspapers
● Social media platforms
● Tips
Analysis
● Tier 1 (creates tickets)
● Tier 2 (creates
mitigations)
● Tier 3 (creates reports)
● Tier 4 (coordination)
Response
● Messaging
○ prebunk
○ debunk
○ counternarratives
○ amplification
● Actions
○ removal
○ other actions
● Reach
35. Responder Behaviours
● C00009: Educate high profile influencers on best practices
● C00008: Create shared fact-checking database
● C00042: Address truth contained in narratives
● C00030: Develop a compelling counter narrative (truth based)
● C00093: Influencer code of conduct
● C00193: Promotion of a “higher standard of journalism”
● C00073: Inoculate populations through media literacy training
● C00197: Remove suspicious accounts
● C00174: Create a healthier news environment
● C00205: Strong dialogue between the federal government and the
private sector to encourage better reporting
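The C-codes above are AMITT Blue countermeasure identifiers. A responder tool might keep them in a plain lookup table; the helper below is a sketch built only from the entries listed on this slide, not part of any official tooling:

```python
# Lookup table for the AMITT Blue C-codes listed on this slide.
COUNTERS = {
    "C00008": "Create shared fact-checking database",
    "C00009": "Educate high profile influencers on best practices",
    "C00030": "Develop a compelling counter narrative (truth based)",
    "C00042": "Address truth contained in narratives",
    "C00073": "Inoculate populations through media literacy training",
    "C00093": "Influencer code of conduct",
    "C00174": "Create a healthier news environment",
    "C00193": "Promotion of a higher standard of journalism",
    "C00197": "Remove suspicious accounts",
    "C00205": "Strong dialogue between government and private sector",
}

def describe(code):
    """Return the countermeasure name for a C-code, or a clear miss."""
    return COUNTERS.get(code, f"unknown countermeasure: {code}")

print(describe("C00042"))  # → Address truth contained in narratives
```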
36. Practical Resource Allocation
• Tagging needs and groups with AMITT labels
• Building collaboration mechanisms to reduce lost tips and repeated collection
• Designing for future potential surges
• Automating repetitive jobs to reduce load on humans
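Tagging needs and groups with AMITT labels can start as simple keyword matching over free-text descriptions. The keyword-to-code map below is a made-up example; a real tagger would draw on the full AMITT technique and countermeasure lists:

```python
# Naive AMITT tagger: attach a code whenever its keyword appears in a
# description. The keyword map is an invented illustration.
AMITT_KEYWORDS = {
    "fact-check": "C00008",
    "media literacy": "C00073",
    "counter narrative": "C00030",
    "remove account": "C00197",
}

def tag_item(description):
    """Attach every AMITT tag whose keyword appears in the text."""
    text = description.lower()
    return sorted(tag for kw, tag in AMITT_KEYWORDS.items() if kw in text)

print(tag_item("Group building a shared fact-check portal"))  # → ['C00008']
```

Shared labels like these are what make the collaboration mechanisms work: two groups that tag their work the same way can spot overlaps and hand off tips.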
37. THANK YOU
Sara-Jayne “SJ” Terp @bodaceacat
Dr. Pablo Breuer @Ngree_H0bit
https://cogsec-collab.org/
https://threet.consulting/