Using Ontologies to Quantify Attack Surfaces
Mr. Michael Atighetchi, Dr. Borislava Simidchieva, Dr. Fusun Yaman (Raytheon BBN Technologies)
Dr. Thomas Eskridge, Dr. Marco Carvalho (Florida Institute of Technology)
Captain Nicholas Paltzer (Air Force Research Laboratory)
Distribution Statement “A” (Approved for Public Release, Distribution Unlimited). This material is based upon work supported by the Air Force Research Laboratory under Contract No. FA8750-14-C-0104.
Cyber Operations Officer (COO) Context
[Diagram: a cyber operations officer manages static and dynamic cyber defenses, including moving target defenses, that protect critical services; operational cyber missions use those services, attacks are stopped, and missions and systems require acceptable goodput for users.]
• Problem: Defense selection and configuration is a poorly understood, non-quantifiable process
  – Add defenses that provide little value or even increase the attack surface
  – Introduce unacceptable overhead
  – Cause unintended side effects when combining multiple defenses
• Objective: Provide tools enabling automated security quantification of distributed systems with a focus on architectural patterns
  – Model key concepts related to cyber defense
  – Provide algorithms to quantify and minimize attack surfaces
  – Focus on Moving Target Defense
Systematic Quantification of Defense Postures
[Diagram: the current state of the art of cyber C2, where attacks reach the networked system, contrasted with the proposed approach using reasoning and characterization, where attacks against the server are stopped.]
Attack Surface Reasoning (ASR)
• Objective: Measure attack surfaces for security quantification
  – Establish appropriate metrics for quantifying different attack surfaces
  – Incorporate mission security and cost measurements
  – Address usability issues through representative and composite measures of effectiveness
• Technical Achievements
  – Models for attack surfaces that include systems, defenses, and attack vectors to enable quantitative characterization of attack surfaces
  – Metrics for characterizing the attack surface of a dynamic, distributed system at the application, operating system, and network layers
  – Algorithms for evaluating the effectiveness of defenses and minimizing attack surfaces
[Diagram: software artifacts from mission-critical systems & apps and moving target defenses (MTDs) are transformed into unified models (system, defense, attack, adversary, mission, metric); metrics and algorithms compute attack vectors and metric instances, supporting what-if analysis, characterization, minimization, and deployment exploration through a cyber planning tool.]
Modeling Approach
• Express a configuration C as a collection of OWL models
– C = {system, defense, attack, adversary, mission, metrics}
– Ontology openly available at https://ds.bbn.com/projects/asr.html
• Focus on interactions between distributed components
– Adversaries tend to take advantage of weak seams
• Make as few assumptions about adversaries as possible
– Minimize “garbage in, garbage out” problems
• Leverage extensible knowledge representation frameworks with powerful query languages
  – Ontologies expressed in OWL
  – Models can be queried with SPARQL (see the sketch below)
• Automate model creation when possible
– Increase consistency and minimize cost of manual model creation
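As a concrete illustration of the last two points, the sketch below loads configuration models with rdflib and runs a SPARQL query over them. The file names and the class/property names (DataFlow, hasSource, hasTarget, crossesTrustBoundary) are placeholders for illustration and are not taken from the published ASR ontology.

```python
# Minimal sketch: load OWL/RDF models with rdflib and query them with SPARQL.
# All names below (files, classes, properties) are illustrative placeholders,
# not the identifiers used in the published ASR ontology.
from rdflib import Graph

g = Graph()
g.parse("system_model.ttl", format="turtle")   # hypothetical system model
g.parse("defense_model.ttl", format="turtle")  # hypothetical defense model

# Find data flows that cross a trust boundary: the "weak seams" between
# distributed components that adversaries tend to target.
query = """
PREFIX asr: <http://example.org/asr#>
SELECT ?flow ?src ?dst WHERE {
  ?flow a asr:DataFlow ;
        asr:hasSource ?src ;
        asr:hasTarget ?dst ;
        asr:crossesTrustBoundary true .
}
"""
for flow, src, dst in g.query(query):
    print(f"{flow}: {src} -> {dst}")
```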
Systems Model
• Capture the relevant aspects of systems
• Based on Microsoft’s STRIDE dataflow model
  – Process
    • DLLs, EXEs, services
  – External Entity
    • People, other systems
  – Data Flow
    • Network flow, function call
  – Data Store
    • File
    • Database
  – Trust Boundary
    • Process boundary
    • File system
• Extensions
  – Hierarchical layering
  – Inclusion of specific concepts to make models more understandable
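A minimal sketch of how these STRIDE-style element types might be represented as plain data before being serialized into the OWL system model; the class and field names are assumptions for illustration, not the ontology's actual terms.

```python
# Illustrative sketch of the STRIDE-style element types in the systems model.
# Class and field names are assumptions; the real model expresses these
# concepts as OWL classes and properties.
from dataclasses import dataclass, field

@dataclass
class Process:           # e.g., a DLL, EXE, or service
    name: str
    host: str

@dataclass
class DataStore:         # e.g., a file or database
    name: str
    host: str

@dataclass
class DataFlow:          # e.g., a network flow or function call
    source: str
    target: str
    protocol: str

@dataclass
class TrustBoundary:     # e.g., a process boundary or file system
    name: str
    members: list = field(default_factory=list)

# Toy example: a client process talking to a server process across a boundary.
client = Process("browser.exe", host="workstation")
server = Process("httpd", host="webserver")
flow = DataFlow(source=client.name, target=server.name, protocol="https")
dmz = TrustBoundary("dmz", members=[server.name])
```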
Attack Model
• Microsoft STRIDE: 6 attack types
  – S = Spoofing
  – T = Tampering
  – R = Repudiation
  – I = Information Disclosure
  – D = Denial of Service
  – E = Elevation of Privilege
• MITRE CAPEC (Common Attack Pattern Enumeration and Classification): >500 attack patterns
• MITRE CWE (Common Weakness Enumeration): >943 weaknesses
• The attack model draws on these taxonomies to express high-level attack steps
Attack Step Model
Example AttackStepDefinition: see the illustrative sketch below.
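A hypothetical illustration of the kind of precondition/effect structure an AttackStepDefinition might carry; every field name and value here is an assumption for illustration, not the project's actual definition.

```python
# Hypothetical AttackStepDefinition, expressed as data for illustration only.
# Field names, the CWE reference, and the duration distribution are assumptions.
attack_step_definition = {
    "name": "ExploitRemoteService",
    "strideCategory": "ElevationOfPrivilege",
    "preconditions": [
        "adversary canReach ?process via ?dataFlow",   # reachability over a modeled data flow
        "?process hasWeakness CWE-787",                # process exposes a known weakness class
    ],
    "effects": [
        "adversary controls ?process",                 # adversary gains a new foothold
    ],
    "durationDistribution": {"type": "exponential", "meanHours": 4.0},
}
```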
Current Set of Modeled Attack Steps
Adversary Model
• Captures assumptions we make about adversaries
  – Starting position
  – Overall objective of the attack
• Quantification experiments assess attack surfaces across many different adversary models
• To increase efficiency of attack vector finding, knowledge of adversarial workflows can be expressed in AttackVectorTemplates
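A small sketch of the two assumptions an adversary model captures, plus an AttackVectorTemplate constraining the search; all names and values are illustrative assumptions.

```python
# Hypothetical adversary model: where the adversary starts and what it wants.
adversary = {
    "startingPosition": "external_network",         # foothold assumed at the start
    "objective": "compromise:customer_database",    # overall objective of the attack
}

# Hypothetical AttackVectorTemplate: a known adversarial workflow used to prune
# the attack-vector search to plausible step orderings.
attack_vector_template = [
    "GainFootholdOnPerimeterHost",
    "PivotToInternalNetwork",
    "AccessDataStore",
]

# A quantification experiment would sweep many such adversary models and
# recompute the attack-surface metrics for each one.
```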
Defense Model
• Express the security provided and cost incurred by cyber defenses
• Defense models may add new entities to system models (new data flows, processes, etc.)
• Current set of modeled defenses includes three types of MTDs
  – Time-bounding observables (e.g., IPHopping)
  – Masquerading (OS Masquerading)
  – Time-bounding footholds (e.g., continuous restart via Watchdogs)
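The sketch below illustrates the point that a defense model both adds entities to the system model and annotates cost; the IP-hopping parameters, overhead figures, and the dict-based model representation are assumptions for illustration, not measured characterizations.

```python
import copy

# Illustrative defense transform: applying an IP-hopping MTD to a data flow
# adds a new process to the system model and records the defense's security
# parameter (hop period) and cost (latency/throughput overhead).
def apply_ip_hopping(system_model: dict, flow_id: str, hop_period_s: int = 30) -> dict:
    model = copy.deepcopy(system_model)
    model.setdefault("processes", []).append(
        {"name": "ip-hopping-agent", "host": "gateway"}   # new entity added by the defense
    )
    model.setdefault("defenses", []).append({
        "type": "IPHopping",
        "protects": flow_id,
        "hopPeriodSeconds": hop_period_s,    # bounds how long an observed address stays valid
        "latencyOverheadPct": 3,             # cost side of the defense model (assumed values)
        "throughputOverheadPct": 1,
    })
    return model
```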
Mission Model
• Missions are simply modeled as a subset of data flows together with information security and cost requirements
  – Security requirements are expressed as Confidentiality, Integrity, Availability
  – Cost requirements are expressed as % change of latency and throughput
• Missions (and their individual flows) can be in three distinct modes
  – Pass, degraded, fail
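A minimal sketch of deriving a mission's mode from the modes of its critical flows; the aggregation rule (any failed flow fails the mission, any degraded flow degrades it, otherwise it passes) is an assumption for illustration.

```python
# Sketch: derive the mission mode from its critical flows' modes.
# This aggregation rule is an assumption, not necessarily the one used by ASR.
def mission_mode(flow_modes):
    if "fail" in flow_modes:
        return "fail"
    if "degraded" in flow_modes:
        return "degraded"
    return "pass"

print(mission_mode(["pass", "degraded", "pass"]))  # -> degraded
```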
Metrics Model
• Attack surface metrics are themselves expressed through a model
• Cover both the system and mission levels, for both the security and cost dimensions
Attack Surface Indexes
Security (integer): Aggregate Security Index (ASI)
• Attacker Workload: minimum length of attack vectors
• Coverage over known attacks: number of attack vectors
• Coverage over unknown attacks: number of entry points and exit points
• Probabilistic time-to-fail: duration distributions of attack vectors and estimated probability of attack success

Cost (integer): Aggregate Cost Index (ACI)
• Latency: overhead on critical flows
• Throughput: overhead on critical flows

Mission (Pass|Degraded|Fail): Aggregate Mission Index (AMI)
• Latency & Throughput: resource use on critical flows
• Confidentiality|Integrity|Availability: required security on critical flows
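The sketch below computes the security-side quantities listed above from a set of attack vectors and the model's entry and exit points; how these quantities are weighted into the single integer ASI is not shown and would be an assumption.

```python
# Security-side quantities behind the ASI, computed from attack vectors (each
# a list of attack steps) and the model's entry/exit points. The weighting
# that combines them into one integer index is intentionally left out.
def security_quantities(attack_vectors, entry_points, exit_points):
    return {
        "attackerWorkload": min(len(v) for v in attack_vectors),   # shortest attack vector
        "coverageKnownAttacks": len(attack_vectors),               # number of attack vectors
        "coverageUnknownAttacks": len(entry_points) + len(exit_points),
    }

vectors = [
    ["foothold", "pivot", "exfiltrate"],
    ["foothold", "exfiltrate"],
]
print(security_quantities(vectors, entry_points={"ssh", "https"}, exit_points={"dns"}))
```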
Quantification Methodology
1. Wrap Defense
2. Scan System into Model
3. Characterize Defense
4. Quantify Attack Surface (ASI, ACI, AMI)
5. Validate Attack Vectors

[Diagram: the networked system is scanned into a system model inside a virtual experimentation environment; defenses are wrapped and characterized, and each candidate configuration is quantified against the undefended baseline (ASI 123, ACI 0, Mission Fail), for example ASI -5 / ACI +12 / Fail, ASI +13 / ACI +3 / Degraded, and ASI +23 / ACI +5 / Pass; attack vectors are then validated on the resulting networked system.]

Experimentation:
• System auto-scan
• Defense cost characterization
• Attack vector validation

Analytics:
• Cost and security metrics
• Attack vector finding
• Attack surface minimization
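As a small worked example of step 4, the sketch below reports each candidate configuration's indexes as deltas against the undefended baseline, using the numbers from the diagram above; the slide does not identify which defense produced which row, so the candidate labels are placeholders, and the reading of the first row as the baseline is an assumption.

```python
# What-if comparison from step 4: candidate configurations reported relative
# to the undefended baseline. Deltas come from the diagram above; candidate
# names are placeholders.
baseline = {"ASI": 123, "ACI": 0, "mission": "Fail"}

candidates = {
    "candidate A": {"dASI": -5,  "dACI": +12, "mission": "Fail"},
    "candidate B": {"dASI": +13, "dACI": +3,  "mission": "Degraded"},
    "candidate C": {"dASI": +23, "dACI": +5,  "mission": "Pass"},
}

for name, c in candidates.items():
    print(f"{name}: ASI {baseline['ASI'] + c['dASI']} ({c['dASI']:+d}), "
          f"ACI {baseline['ACI'] + c['dACI']} ({c['dACI']:+d}), mission {c['mission']}")
```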
Experimental Results
• Generated models of tens of hosts and a small number of defenses and attack steps
• Deployed scanning capabilities on the BBN network and on a virtualized network at a customer location, and automatically generated system models from live systems
• Explored runtime complexity of attack vector finding and metrics computation algorithms using a random model generator
[Plot "Analysis Time": analysis time in seconds (0 to 400) versus number of hosts (100 to 500), for configurations with 3 and 6 attack steps]
Conclusion and Next Steps
• We created a framework for quantifying attack surfaces using semantic models
  – Our ontologies are openly available at https://ds.bbn.com/projects/asr.html
  – We hope you will try them out and provide feedback!
• Next Steps
  – Automate defense deployment exploration within a system through a genetic search algorithm
  – Include metrics to capture interaction effects between multiple cyber defenses
  – Expand scenarios to enterprise-scale regimes
  – Extend the set of modeled cyber defenses beyond MTDs
    • Proxy overlay networks, deception, reactive defenses
Contacts
• Mr. Michael Atighetchi, [email protected]
• Dr. Borislava Simidchieva, [email protected]
• Dr. Fusun Yaman, [email protected]
• Dr. Thomas Eskridge, [email protected]
• Dr. Marco Carvalho, [email protected]
• Captain Nicholas Paltzer, [email protected]