Institute for Security, Technology, and Society

COGNITIVE HACKING
AND THE
VALUE OF INFORMATION
George Cybenko
Annarita Giani
Paul Thompson
Thayer School of Engineering
and
Institute for Security Technology Studies
Dartmouth College
Hanover, NH
Overview
• Definition
• Comparison with related concepts
• Examples
• Information Theory Model
• Applications
• Countermeasures
• Future work
Definition
COGNITIVE HACKING
A networked information system attack
that relies on changing human users'
perceptions and corresponding behaviors
in order to be successful.

Requires the use of an information system - not true for all social engineering.

Requires a user to change some behavior - not true for all hacking.

Exploits our growing reliance on networked information sources.
Related concepts
• Propaganda
• Social Engineering (e.g., an email exchange asking for a password; a telephone call asking for a SSN; the Mitnick attack)
• Advertising
• Semantic Hacking
• Computer Security
• Information Warfare

[Venn diagram relating these concepts: autonomous attacks (virus, worm, physical attacks, vulnerability scanners) contrasted with cognitive attacks, overt (web page hacking, misinformation) and covert (spoofing); computer-security mechanisms shown include PKI, policy, authentication protocols, firewalls, auditing, and TCP wrappers.]
Types
Hacking with the goal of modifying user behavior. Each case is classified by MODE (autonomous, cognitive overt, or cognitive covert) and by GOAL; the numbers refer to the numbered examples below.

GOAL                                           EXAMPLES
Theft of services                              8, 15
Theft of information                           4
Fraud, financial                               1, 2, 3, 4, 5
Fraud, non-financial                           6, 7
Perception management, political               10, 11, 14, 15, 17
Perception management, commercial or private   6, 9
Self-aggrandizement                            12, 13, 15
White hat hack                                 13, 16
1. NEI Webworld pump and dump
2. Jonathan Lebed case
3. Fast-trades.com website pump and dump
4. PayPal.com
5. EMULEX
6. Non-financial fraud: search engine optimization
7. Non-financial fraud: CartoonNetwork.com
8. Bogus virus patch report
9. Usenet perception management
10. Hamas site
11. Ariel Sharon site
12. New York Times site
13. Yahoo site
14. Afghanistan-related web sites
15. Fluffi Bunni declares Jihad
16. CNN site
17. WTO site
Example (1)
On 7 October 2001, a spoofed CNN web page reported "Singer Britney Spears Killed in Car Accident."
Due to a bug in CNN’s software, when people at the spoofed site clicked
on the “E-mail This” link, the real CNN system distributed a real CNN email to recipients with a link to the spoofed page.
With each click at the bogus site, the real site’s tally of most popular
stories was incremented for the bogus story.
Allegedly this hoax was started by a researcher who sent the spoofed
story to three users of AOL’s Instant Messenger chat software.
Within 12 hours more than 150,000 people had viewed the spoofed page.
Example (2)
THE-REV | SPLURGE
Sm0ked crew is back and better than ever!
"Well, admin I'm sorry to say by you have just got sm0ked by splurge. Don't be scared though, everything will be all right, first fire your current security advisor . . ."
In February 2001 the New York Times web site was defaced by a hacker identified as "splurge" from a group called "Sm0ked Crew", which a few days previously had defaced sites belonging to Hewlett-Packard, Compaq, and Intel.
Models of Cognitive Hacking - Information Theory

Possible frameworks:
• Horse race
• Stock portfolio
• Theory of the firm

A cognitive hacker might lure an indecisive gambler (investor) into staking money on false prospects. Here it is useful to understand how sensitive the doubling rate W is to the probabilities p and the odds o, and to tamper with the data in order to convince the gambler that the race is:
a. worth playing
b. worth playing in a certain way
Horse race model

A horse race is a system defined by the following ingredients (see Cover and Thomas, "Elements of Information Theory"):
• there are n horses in a race;
• each horse i is assigned a probability p_i of winning the race;
• each horse i is assigned odds o_i, signifying that a gambler who bets b_i dollars on horse i wins b_i o_i dollars in case of victory (and suffers a total loss in case of defeat);
• W is the "doubling rate": after k plays, the expected value of the gambler's assets is 2^{Wk}.

W(b, p, o) = \sum_{i=1}^{n} p_i \log_2 (b_i o_i)
Horse race model analysis

For two horses with win probabilities p and 1 - p, betting proportionally (b = (p, 1 - p)):

W(p, o_1, o_2) = p \log_2(p o_1) + (1 - p) \log_2((1 - p) o_2)

\frac{\partial W}{\partial p}(p, o_1, o_2) = \log_2\left(\frac{p o_1}{(1 - p) o_2}\right)

\frac{\partial W}{\partial o_1}(p, o_1, o_2) = \frac{p}{o_1 \ln 2}

\frac{\partial W}{\partial o_2}(p, o_1, o_2) = \frac{1 - p}{o_2 \ln 2}

[Surface plots of W(p, 1, o_2), W(p, 5, o_2), and W(p, 500, o_2) as functions of p and o_2.]
Apply the model to the Emulex exploit
• Mark Jakob shorted 3,000 shares of Emulex stock at $72 and $92.
• The price rose to $100.
• Jakob lost almost $100,000.
He retaliated with a cognitive hack:
• He sent a false press release to Internet Wire Inc.
• It claimed Emulex Corporation was being investigated by the SEC.
• It claimed the company was forced to restate its 1998 and 1999 earnings.
In the model's terms, he manipulated the perceived odds: o_1 ("a successful new product release is imminent") versus o_2 ("the company is under investigation"), earning $236,000.
Stock Portfolio Analysis
• A better model for cognitive hacking, e.g., for the Emulex example
• Not yet developed - future work
Theory of the Firm
• Market analysis and the assumption of perfect, costless information are inadequate to describe firms
• More efficient, automated information flow lowers both transaction and organization costs
• Information systems are susceptible to cognitive hacking
Possible Countermeasures
• Single source
  - Authentication of source
  - Information "trajectory" modeling
  - Ulam games
• Multiple sources
  - Source reliability via collaborative filtering and reliability reporting
  - Byzantine Generals models
  - Detection of collusion by information sources
  - Linguistic analysis, e.g., determination of common authorship
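As one concrete illustration of the last countermeasure, common-authorship detection can be approximated with simple stylometrics. The sketch below is my own illustration (not the authors' tool): it compares character-trigram frequency profiles with cosine similarity; a real system would use richer features and proper statistical testing.

```python
from collections import Counter
from math import sqrt

def trigram_profile(text):
    """Character-trigram frequency profile: a crude stylistic fingerprint."""
    text = " ".join(text.lower().split())  # normalize case and whitespace
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine_similarity(a, b):
    """Cosine similarity between two frequency profiles, in [0, 1]."""
    dot = sum(count * b[gram] for gram, count in a.items())
    norm_a = sqrt(sum(c * c for c in a.values()))
    norm_b = sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical messages: two in a similar defacement style, one unrelated.
msg1 = "Don't be scared though, everything will be all right."
msg2 = "Don't be scared, everything is going to be all right."
msg3 = "The quarterly earnings will be restated next week."
print(cosine_similarity(trigram_profile(msg1), trigram_profile(msg2)))
print(cosine_similarity(trigram_profile(msg1), trigram_profile(msg3)))
```

Texts by the same author tend to share trigram distributions, so the first similarity should come out higher than the second; thresholds would have to be calibrated on known-author corpora.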
Future work
• Working with the Securities and Exchange Commission - Office of Internet Enforcement
• Development of software tools to:
  - Detect misinformation
  - Detect common authorship