SJ Terp | The business of cognitive security | NYU, Dec 2021
Cognitive Security is Infosec applied to disinformation

“Cognitive security is the application of information security principles, practices, and tools to misinformation, disinformation, and influence operations. It takes a socio-technical lens to high-volume, high-velocity, and high-variety forms of ‘something is wrong on the internet’.”

Cognitive security can be seen as a holistic view of disinformation from a security practitioner’s perspective.

“Cognitive Security is the application of artificial intelligence technologies, modeled on human thought processes, to detect security threats.” - XTN

“Cognitive Security (COGSEC) refers to practices, methodologies, and efforts made to defend against social engineering attempts - intentional and unintentional manipulations of and disruptions to cognition and sensemaking.” - cogsec.org
Cognitive Security is risk management
Confidentiality, integrity, availability
■ Confidentiality: data should only be visible to people who are authorized to see it
■ Integrity: data should not be altered in unauthorized ways
■ Availability: data should be available to be used

Possession, authenticity, utility
■ Possession: controlling the data media
■ Authenticity: accuracy and truth of the origin of the information
■ Utility: usefulness of the data (e.g. data becomes useless if the encryption key is lost)
Image: Parkerian Hexad, from https://www.sciencedirect.com/topics/computer-science/parkerian-hexad
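The two attribute triads above work as a checklist when assessing an information asset. A minimal illustrative sketch of that idea (the class and scoring scheme are my own, not from the talk):

```python
from dataclasses import dataclass, fields

# Hypothetical sketch: scoring an asset against the six Parkerian hexad
# attributes named on the slide. The boolean pass/fail scheme is
# illustrative; real assessments would grade severity.
@dataclass
class HexadAssessment:
    confidentiality: bool  # only authorized people can see the data
    integrity: bool        # data has not been altered in unauthorized ways
    availability: bool     # data is available to be used
    possession: bool       # we still control the data media
    authenticity: bool     # the origin of the information is accurate
    utility: bool          # data is usable (e.g. encryption key not lost)

    def violated(self):
        """Return the names of attributes that failed the assessment."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

# Example: a leaked but unaltered dataset fails confidentiality only.
leak = HexadAssessment(False, True, True, True, True, True)
print(leak.violated())  # ['confidentiality']
```

The point of the hexad over plain CIA is visible in the last three fields: an attacker can harm you without touching confidentiality, integrity, or availability at all.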
Behaviour models: AMITT Red, AMITT Blue
AMITT Red tactic stages:
● Planning: Strategic Planning, Objective Planning
● Preparation: Develop People, Develop Networks, Microtargeting, Develop Content, Channel Selection
● Execution: Pump Priming, Exposure, Go Physical, Persistence
● Evaluation: Measure Effectiveness

AMITT Blue counters (the original slide arranges these in a grid against the red tactic stages) include:
● Prebunking; humorous counternarratives; mark content with ridicule / decelerants; expire social media likes/retweets; influencer disavows misinfo; cut off banking access; dampen emotional reaction; remove / rate limit botnets; social media amber alert; etc.
● Have a disinformation response plan; improve stakeholder coordination; make civil society more vibrant; red team disinformation, design mitigations; enhanced privacy regulation for social media; platform regulation; shared fact checking database; repair broken social connections; pre-emptive action against disinformation team infrastructure; etc.
● Media literacy through games; tabletop simulations; make information provenance available; block access to disinformation resources; educate influencers; buy out troll farm employees / offer jobs; legal action against for-profit engagement farms; develop compelling counternarratives; run competing campaigns; etc.
● Find and train influencers; counter-social engineering training; ban incident actors from funding sites; address truth in narratives; marginalise and discredit extremist groups; ensure platforms are taking down accounts; name and shame disinformation influencers; denigrate funding recipient / project; infiltrate in-groups; etc.
● Remove old and unused accounts; unravel Potemkin villages; verify project before posting fund requests; encourage people to leave social media; deplatform message groups and boards; stop offering press credentials to disinformation outlets; free open library sources; social media source removal; infiltrate disinformation platforms; etc.
● Fill information voids; stem flow of advertising money; buy more advertising than disinformation creators; reduce political targeting; co-opt disinformation hashtags; mentorship: elders, youth, credit; etc.
● Hijack content and link to information; honeypot social community; corporate research funding full disclosure; real-time updates to factcheck database; remove non-relevant content from special interest groups; content moderation; prohibit images in political channels; add metadata to original content; add warning labels on sharing; etc.
● Rate-limit engagement; redirect searches away from disinfo; honeypot: fake engagement system; bot to engage and distract trolls; strengthen verification methods; verified IDs to comment or contribute to polls; revoke whitelist / verified status; etc.
● Microtarget likely targets with counter messages; train journalists to counter influence moves; tool transparency and literacy in followed channels; ask media not to report false info; repurpose images with counter messages; engage payload and debunk; debunk / defuse fake expert credentials; don’t engage with payloads; hashtag jacking; etc.
● DMCA takedown requests; spam domestic actors with lawsuits; seize and analyse botnet servers; poison monitoring and evaluation data; bomb link shorteners with calls; add random links to network graphs
https://github.com/cogsec-collaborative/AMITT
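Part of the operational value of AMITT is that, like MITRE ATT&CK, it can be held as machine-readable data that tools query. A hedged sketch, with only a few illustrative red/blue pairings (the IDs and mappings here are illustrative, not the official catalogue, which lives in the GitHub repo above):

```python
# Illustrative data layout for an AMITT-style behaviour model:
# red tactic stages map to technique names, and blue counters are
# indexed by the red technique they oppose. Pairings are examples only.
AMITT_RED = {
    "Planning": ["Strategic Planning", "Objective Planning"],
    "Preparation": ["Develop People", "Develop Networks", "Microtargeting",
                    "Develop Content", "Channel Selection"],
    "Execution": ["Pump Priming", "Exposure", "Go Physical", "Persistence"],
    "Evaluation": ["Measure Effectiveness"],
}

AMITT_BLUE = {
    # red technique -> candidate blue counters (illustrative pairings)
    "Exposure": ["Prebunking", "Humorous counternarratives"],
    "Develop Networks": ["Remove / rate limit botnets"],
}

def counters_for(technique):
    """Look up candidate blue-team counters for a red-team technique."""
    return AMITT_BLUE.get(technique, [])

print(counters_for("Exposure"))  # ['Prebunking', 'Humorous counternarratives']
```

Holding the model as data rather than a slide means a disinformation SOC can tag incidents with red techniques and automatically surface the matching counters.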
Business questions
● Is there a market here?
● Does the market pay enough to
sustain businesses?
● Where’s the money coming from?
● What’s it paying for?
● Who is already in this space?
● Who is likely to move into this
space?
● Who is the customer base?
● What features and restrictions do
we have?
FOLLOW THE MONEY
Where is the money in a threat landscape?
• Motivations
• Geopolitics mostly absent
• Party politics (internal, inter-party)
• Actors
• Political parties
• Nationstates
• Entrepreneurs
• Activities
• Manipulate faith communities
• Discredit election process
• Discredit/discourage journalists
• Attention (more drama)
• Potential harms / severities
• Assassination
• Voting reduction
• Sources
• WhatsApp
• Blogs
• Facebook pages
• Online newspapers
• Media
• Routes
• Hijacked narratives
• WhatsApp to blogs, and vice versa
• WhatsApp forwarding
• Facebook to WhatsApp
• Social media to traditional media
• Social media to word of mouth
Disinformation Actors
Persistent
Manipulators
Advanced teams
• Internet Research Agency
• China, Iran teams etc
For-profit website networks
• Antivax websites
• Pink slime sites
• “Stolen” US election sites
Nationstate media
• Sputnik
• Russia Today
Service
Providers
Disinformation as a Service
• Factories
• Ex-marketing, spam etc
Ad-Hoc paid teams
• EBLA Ghana
• PeaceData USA
Opportunists
Wares Sellers
• Clicks
• T-shirts
• Books etc.
Groups
• Conspiracy groups
• Extremists
Individuals
• Attention-seekers
• Jokers etc
Disinformation as a service
“Doctor Zhivago’s services were priced very specifically, as seen below:
● $15 for an article up to 1,000 characters
● $8 for social media posts and commentary up to 1,000 characters
● $10 for Russian to English translation up to 1,800 characters
● $25 for other language translation up to 2,000 characters
● $1,500 for SEO services to further promote social media posts and traditional media articles, with a time frame of 10 to 15 days
Raskolnikov, on the other hand, had less specific pricing:
● $150 for Facebook and other social media accounts and content
● $200 for LinkedIn accounts and content
● $350–$550 per month for social media marketing
● $45 for an article up to 1,000 characters
● $65 to contact a media source directly to spread material
● $100 per 10 comments for a given article or news story”
Image: https://www.recordedfuture.com/disinformation-service-campaigns/
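The itemised price list lends itself to simple budget arithmetic, which is part of what makes disinformation-as-a-service attractive to buyers. A sketch using the Doctor Zhivago prices quoted above (the order format and function name are illustrative):

```python
# Prices taken from the Doctor Zhivago DaaS menu quoted above (USD).
PRICES = {
    "article_1000_chars": 15,
    "social_post_1000_chars": 8,
    "ru_en_translation_1800_chars": 10,
    "other_translation_2000_chars": 25,
    "seo_promotion": 1500,
}

def campaign_cost(order):
    """Total price of an order expressed as {item: quantity}."""
    return sum(PRICES[item] * qty for item, qty in order.items())

# e.g. 10 articles, 100 social posts, and one SEO push:
# 10*15 + 100*8 + 1500 = 2450
print(campaign_cost({"article_1000_chars": 10,
                     "social_post_1000_chars": 100,
                     "seo_promotion": 1}))  # 2450
```

The arithmetic makes the asymmetry concrete: a multi-week influence push costs a few thousand dollars, far below the cost of defending against it.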
DaaS examples
Internet Research Agency
● Russian “troll farm”
● Well organised
● Ex-marketing background
● Not quite official
Satellite organisation: EBLA
● Cut-out organisation based in Ghana
● Kids round a kitchen table model
Troll farms in the Philippines
● PR experts plus younger social media influencers
● Based in the Philippines because of an English-speaking workforce already used to call-center and content-moderation work
PR firms, various locations
● US-based: operating in other countries (Venezuela, Bolivia etc)
● MAS Agency (Ukraine-based PR firm)
● Saudi digital marketing firm
Image: https://en.wikipedia.org/wiki/Internet_Research_Agency
Response Actors
Disinformation SOCs
Large actors
• ISAOs
• Platforms
• Other large actors
Event-specific
• War rooms
• Agencies
Disinformation
Teams
Disinformation “desk”
• In existing SOC
• Standalone unit
Investigators
• Journalists
• Academics
• Independent journalists
Other Responders
Policymakers
Law enforcement
Corporations
Influencers
Nonprofits
Educators
Individual researchers
Concerned citizens
Example Response Landscape (Needs / Work / Gaps)
Risk Reduction
● Media and influence literacy
● Information landscaping
● Other risk reduction
Monitoring
● Radio, TV, newspapers
● Social media platforms
● Tips
Analysis
● Tier 1 (creates tickets)
● Tier 2 (creates
mitigations)
● Tier 3 (creates reports)
● Tier 4 (coordination)
Response
● Messaging
○ prebunk
○ debunk
○ counternarratives
○ amplification
● Actions
○ removal
○ other actions
● Reach
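The monitoring-to-response flow above can be sketched as a tiny pipeline. Only the tier roles (Tier 1 creates tickets, Tier 2 creates mitigations) come from the slide; the Incident type and function names are hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a disinformation SOC pipeline: monitored
# sources feed Tier 1 triage, which feeds Tier 2 mitigation.
@dataclass
class Incident:
    source: str      # e.g. "social media platforms", "tips", "radio/TV"
    narrative: str
    tickets: list = field(default_factory=list)
    responses: list = field(default_factory=list)

def tier1_triage(incident: Incident) -> Incident:
    """Tier 1: turn a monitored item into a ticket."""
    incident.tickets.append(f"ticket: {incident.narrative}")
    return incident

def tier2_mitigate(incident: Incident) -> Incident:
    """Tier 2: attach a messaging response (prebunk, debunk, etc.)."""
    incident.responses.append("debunk")
    return incident

incident = tier2_mitigate(tier1_triage(Incident("tips", "fake cure claim")))
print(incident.responses)  # ['debunk']
```

Even this toy version shows the shape of the gap analysis: each stage of the pipeline is a place where a team, a tool, or funding may be missing.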
Fiveby: Adapting supply chain risk management
● Seattle-based risk consultancy
● Techniques rooted in fraud risk assessment
● Aimed at platforms and other online businesses
Image: https://www.fiveby.com/wp-content/uploads/2021/05/Fiveby_disinformation_whitepaper_032921_final-1.pdf