E040 Digital Data Quality and Accuracy

Product Accuracy and Data Quality
Unclassified
National Geospatial-Intelligence College
Lesson Objectives
• Define Accuracy
• The Importance of Accuracy
• Accuracy vs. Precision
• Error Types & Sources
• Managing Error & Data Quality
• Metadata
• Product Accuracies
• Accuracy Concepts Vignette
(Optional)
UNCLASSIFIED
Objectives
When finished, the student will be able to:
• Define Accuracy
• Discuss the importance of Accuracy
• Explain how to manage error
• Define Metadata
• List the Accuracy of the following
products:
– Hard copy 1:50K TLMs
– 1:50K CADRG
– DTED Level 2
– VMAP Level 2
UNCLASSIFIED
Why are You Here?
• You are here to prevent this:
DCI Statement on the Belgrade Chinese Embassy
Bombing
House Permanent Select Committee on Intelligence Open Hearing
“Mr. Chairman, Dr. Hamre and I are here today
to explain how a series of errors led to the
unintended bombing of the Chinese Embassy in
Belgrade on May 7th. We will try to describe to
the best of our ability – in this open, public
session – the causes of what can only be
described as a tragic mistake. It was a major
error. I cannot…”
UNCLASSIFIED
Accuracy Defined
• Accuracy can mean many things
– Is the road really 2-lane, all-weather?
– How old is the information?
– Where did the information come from?
– How accurate are the coordinates you read?
– Does it show everything? What is missing?
– Do the two maps fit together?
– How much detail is there?
• We are going to look at a variety of ways to
categorize accuracy
• It does not always mean just a ground
distance!!
UNCLASSIFIED
Accuracy Defined
• Accuracy:
– How close a recorded location comes to its true value
– The degree to which information on a map or in a digital
database matches a control value
– Conformance to a recognizable standard
– The ability of a measurement to match the actual value
of the quantity being measured
The closeness of the best estimated value
obtained by measurement to the “true” value
of the quantity measured
UNCLASSIFIED
Accuracy VS. Precision
• Accuracy is NOT Precision
– Precision is the “Closeness with which repeated
measurements made under similar conditions are
grouped together”
– Precision is also “how exactly a location is specified
without any reference to its true value”
UNCLASSIFIED
Accuracy Defined
Accuracy vs. Precision
[Figure: target diagrams illustrating "Accuracy and Precision," "Precision without Accuracy," and "No Accuracy, no Precision"]
Accuracy Defined
• When we describe the positional accuracy of a
geospatial product, anything from a paper map to
the fanciest digital product, we use two terms:
– Absolute Accuracy refers to how closely a ‘well defined’
point shown on a geospatial product comes to its real
world location
– Relative Accuracy refers to how well the distance and
direction between two ‘well defined’ points on a map
represent the real world distance and direction between
the two objects
• Both can be described in terms of horizontal or
vertical components
• Both refer only to positional accuracy
UNCLASSIFIED
Accuracy Defined
[Figures: Absolute Accuracy illustrated on a 1:50,000 map; Relative Accuracy illustrated on a 1:250,000 map]
UNCLASSIFIED // LIMITED DISTRIBUTION
Accuracy Defined
• Accuracy statements are meaningless
without a statement of the measurement
certainty...that is, the statistical probability
that the accuracy statement will be true
for a well-defined feature
– Statistics are used to quantify only RANDOM
error
• Always look for this statement of
certainty!
• ±50m @ 90%
• ±22m @ 95%
UNCLASSIFIED
Why is Accuracy
Important?
• Availability of Geospatial Information
• Move to Rapid Response & Decision Making
• Reliance on technology
– Desktop Geospatial Viewing Tools: everyone is
empowered with geospatial information — everyone is a
consumer
• False Assumptions: “Since digital data products
were created using computers, they have a
higher level of quality than standard products”
• Reality: There can be just as many or more
inherent errors within digital products; therefore,
the user must check to see if the information is fit
for the intended purpose
UNCLASSIFIED
Error Types & Sources
Types of Errors
• Blunders
– Mistakes, human errors
• Systematic Errors
– Repeated errors of similar magnitude
• Random Errors
– Small errors of differing magnitude & direction for each measurement
UNCLASSIFIED
Error Types & Sources
Blunders
• Large mistake, typically
“human errors”
– Chinese Embassy Bombing
• Not always obvious
– WD546321 vs. WD546312
• Generally caused by
carelessness
• Prevent Blunders by following
SOPs and paying attention
UNCLASSIFIED
[Image: newspaper headline, "Chinese Embassy Bombed by NATO"]
Error Types & Sources
Systematic Errors
• Typically an error that repeatedly has the
same magnitude and direction
– Bombs have similar miss distance
– Calibrated device that is out of calibration
• Prevent systematic errors by first identifying the cause of the error and then reconstructing the system to prevent it
– Adjust the aim point for wind
– Calibrate the device
UNCLASSIFIED
Error Types & Sources
Random Errors
• Unavoidable! They CANNOT be prevented!
• Typically follow a "Bell Curve"
– Equal likelihood of sign (+ or -)
– Often have small values
• Key is to minimize them using redundant measurements of coordinates ("averaging") – see the sketch below
• If you use a product for its intended
purpose, the random errors have been
minimized to allow that use!
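A minimal sketch (illustrative values only, not tied to any NGA product) of why redundant measurements help: averaging several noisy readings of the same coordinate shrinks the random component.

    import random

    def average_fix(true_value, sigma, n_readings):
        """Average n noisy readings of the same coordinate.

        Random error is simulated as zero-mean Gaussian noise (the 'bell
        curve'); the residual error of the mean shrinks roughly as 1/sqrt(n).
        """
        readings = [true_value + random.gauss(0.0, sigma) for _ in range(n_readings)]
        return sum(readings) / len(readings)

    random.seed(1)
    true_easting = 334_000.0  # hypothetical easting in meters
    single = average_fix(true_easting, sigma=10.0, n_readings=1)
    averaged = average_fix(true_easting, sigma=10.0, n_readings=25)
    print(f"error of a single reading: {abs(single - true_easting):.1f} m")
    print(f"error of 25-reading mean : {abs(averaged - true_easting):.1f} m")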
UNCLASSIFIED
Error Types & Sources
Error Sources
• Inherent:
– Errors present in the source materials
• Operational:
– Errors produced during data capture, processing, and output
• User:
– Errors produced during use of the product
UNCLASSIFIED
Error Types & Sources
Inherently Sourced Errors
• These errors are present in the source you
are using from the moment you start using it
– Displaying an image of the 3D world in 2
dimensions (the orange peel effect)
– Generalization on a map (size & location of
buildings)
– Data displayed on a map (only primary/secondary
roads)
• They are caused by a variety of events
– Data Collection
• Interpretation, Sensor Anomalies, Product Specifications
– Data Input
• Scanning Errors, Manual Entry Errors, Importing Errors
UNCLASSIFIED
Remember the Source
• Many NGA digital products are derived
from paper source (vice photo-source
centerline) data. For example:
– Compressed ARC Digitized Raster Graphics
(CADRGs)
– Vector Map (VMap) Series
• Digitized products are no more accurate
than source maps and charts!
• Some ADRG & CADRG products have pixels in DD, but the underlying maps show MGRS grids!
UNCLASSIFIED
Numerical Display
• Accuracy of coordinate depends upon source and
method
• No connection between “number of digits after
the decimal point” and accuracy (sometimes
software displays too many digits beyond the
decimal point)
Coordinate     A               B
Latitude       23.12126°N      23.12°N
Longitude      126.57213°W     126.57°W
Elevation      317.490 ft.     317.5 ft.
We cannot tell which coordinate is more accurate!!
UNCLASSIFIED
Error Types & Sources
Beware! Inherent Sources of Data Error
• Many NGA digital products are derived from
paper source (vice photo-source centerline
data). For example:
• ARC Digitized Raster Graphics (ADRGs)
• Vector Map (VMap) Series
• Digitized products are no more accurate than
source maps and charts!
• Some software tools will allow you to display
(and measure) coordinates to more “decimal
places” than is warranted by the accuracy of
the digital product!
• Qualified data is not erroneous; know what is suitable for the intended purpose!
UNCLASSIFIED
Error Types & Sources
Operationally Sourced Errors
• These errors arise from the manipulation of the
data or source product
– When you print out a map from PowerPoint
– When you round off a coordinate
• They arise from a variety of causes:
– Data Storage
• Insufficient spatial precision, insufficient numerical precision
– Data Manipulation
• Enhancement, Geocoding, Mosaics, Merges, Analysis
– Data Output
• Scale changes, Perspective distortions
UNCLASSIFIED
Intended Error in Spatial
Simplification
Feature Generalization and Data Currency
[Figure: Map (1978) vs. Imagery (1997) comparison]
Most geographic information involves some purposeful simplification of the phenomenon being represented in order to fit the desired measurement framework; data may also be omitted altogether.
UNCLASSIFIED
Error Types & Sources
User Sourced Errors
• Interpretation of Results
– Interpreting more precision than is there
– Errors in spectral classification
• Use of the results
– Applying bad information to the problem
UNCLASSIFIED
Common User Error
• Accuracy of coordinate depends upon
source and method
• No connection between “number of digits
after the decimal point” and accuracy
(sometimes software displays too many
digits beyond the decimal point)
• Your scaling template is not a precision
instrument
• Most people can't read positions from a
map any more accurately than about 1
millimeter
UNCLASSIFIED
Error Types & Sources
[Figure: Map (1978) vs. Imagery (1997) comparison]
UNCLASSIFIED // LIMITED DISTRIBUTION
Compounding Error
• Inaccuracy, imprecision, and error may be
compounded in a GIS that employs many
data sources through:
– Propagation
• one error leads to another in data creation
– Cascading
• bad information will skew subsequent solutions when
combined into new layers
UNCLASSIFIED
Managing Error
• The best we can do is minimize the error sources and document/assess the remainder
– Develop appropriate classification schemes
– Use consistent data collection strategies and/or
appropriate sampling techniques
– Challenge assumptions and processing methodologies
– Document data (such as its age, source,
original purpose, scale, format, collection
strategy)
Managing Error
• Error CANNOT be eliminated
• Error must be managed with:
– Training
– Supervision
– Understanding system, data, and product
capabilities and limitations (i.e. understanding
the intended purpose for each).
• METADATA - The “data about data” that
tells us the data suitability and allows us
to make informed decisions about its use
UNCLASSIFIED
Metadata
Executive Order 12906
• 11 April 1994: requires all data producers
to document data products with metadata
• Metadata must adhere to the Federal
Geographic Data Committee’s (FGDC)
Content Standards for Digital Geospatial
Metadata
• Before EO 12906, users could not find out:
– Why was this data collected?
– What was collected?
– Who collected it?
– How did they collect it?
– When did they collect it?
UNCLASSIFIED
FGDC Standards
Metadata Sections
1. Identification Information
2. Data Quality Information
3. Spatial Data Organization Information
4. Spatial Reference Information
5. Entity & Attribute Information
6. Distribution Information
7. Metadata Reference Information
8. Citation Information
9. Time Period of Content Information
10. Contact Information
UNCLASSIFIED
Metadata
• Positional Accuracy
– Horizontal & vertical accuracy statements,
both absolute & relative
• Currency
– Time period of the data or time of completion
of the product
• Logical Consistency
– Information on the fidelity of the data set
• Resolution
– The quantities that define the position of a
point on the Earth’s surface
UNCLASSIFIED
Metadata
• Lineage
– Source information & processing methods
• Completeness
– Information about omissions, selection criteria
& generalizations
• Attribute Accuracy
– A quantitative & qualitative assessment of the
quality of the attribute information
UNCLASSIFIED
Hard Copy Products
• Map & Charts
• Basis of understanding all other product
accuracies
• Most commonly used NGA products base their accuracy on these standards – understanding these will make the rest easier to follow!
• The Metadata comes from the marginalia
& training
UNCLASSIFIED
Map Accuracy (Horizontal)
90% Circular Map Accuracy Standards
(CMAS)
• Traditionally used to describe absolute
horizontal map accuracy
• 90% of all well-defined features will fall within a circle whose size is specified in the MIL-STD for each different scale product
UNCLASSIFIED
Map Accuracy (Horizontal)
1. 90% of all well-defined points located within 0.5
mm (.02") of their geographic position with
respect to the prescribed datum (ATC/Targeting)
2. 90% of all well-defined points located within 1.0
mm (.04") of their geographic position
(JOG/TLM/Nautical Charts>1:300,000)
3. 90% of all well-defined points located within 2.0
mm (.08") of their geographic position (ONC/TPC)
4. Less than 90% of all well-defined points located
within 2.0 mm (.08") of their geographic position
(City Graphics)
So what does that circle equate to on the
ground?
UNCLASSIFIED
Map Accuracy (Horizontal)
• Example for a 1:50,000 TLM
The 90% CMAS for this map is
1mm
(category 2 from the
previous slide)
That means that this intersection
has a 90% chance of being within
1mm of where it actually belongs
on the map
• At map scale, 1mm covers 50m on the ground.
Because it is a 1mm radius circle it is ±50m in
any direction at 90% confidence or
±50m@90%.
UNCLASSIFIED
Circular (Horizontal) Error
1:50,000 TLM – 50 m absolute accuracy (90% Circular Error)
[Figure: circular error distribution p(x) about the estimate of true position (not to scale)]
– σ = Standard Deviation (39.35%) = 23.3 m
– CEP = Circular Error Probable (50%) = 27.4 m
– 2σ = 2 x Standard Deviation (86.47%) = 46.6 m
– CE = Circular Error (NIMA's 90% Standard) = 50 m
– NATO = North Atlantic Treaty Organization's 95% Standard = 57 m
UNCLASSIFIED
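The values above all describe the same error distribution at different confidence levels. A small sketch, assuming the standard circular-normal multipliers (assumed here, though they do reproduce the slide's numbers), that recovers sigma from the 90% CE and derives the other levels:

    # Assumed circular-normal multipliers: distance = k * sigma
    K = {
        "CEP (50%)": 1.1774,
        "2 sigma (86.47%)": 2.0,
        "CE (90%)": 2.1460,
        "NATO (95%)": 2.4477,
    }

    ce_90 = 50.0                   # 1:50,000 TLM absolute horizontal accuracy
    sigma = ce_90 / K["CE (90%)"]  # ~23.3 m
    print(f"sigma (39.35%): {sigma:.1f} m")
    for name, k in K.items():
        print(f"{name}: {k * sigma:.1f} m")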
Map Accuracy (Horizontal)
“Rule of Thumb” for estimating product
accuracy
Chart/Map       Scale          Accuracy Descriptor   Distance (m)
City Graphic    1:25,000       > 2 mm                > 50
TLM             1:50,000       1 mm                  50
JOG             1:250,000      1 mm                  250
TPC             1:500,000      2 mm                  1000
ONC             1:1,000,000    2 mm                  2000

Accuracy = (Scale X Accuracy Descriptor)/1,000
Ex: Accuracy = (250,000 X 1)/1,000 = 250m for a JOG
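A minimal sketch of the rule of thumb above; the function name is illustrative and the descriptors come straight from the table:

    def rule_of_thumb_accuracy(scale_denominator, descriptor_mm):
        """Accuracy (m) = (Scale x Accuracy Descriptor) / 1,000."""
        return scale_denominator * descriptor_mm / 1_000

    # Accuracy descriptors (mm at map scale) from the table above
    products = {
        "TLM 1:50,000": (50_000, 1),
        "JOG 1:250,000": (250_000, 1),
        "TPC 1:500,000": (500_000, 2),
        "ONC 1:1,000,000": (1_000_000, 2),
    }
    for name, (scale, mm) in products.items():
        print(f"{name}: ~ +/-{rule_of_thumb_accuracy(scale, mm):,.0f} m @ 90%")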
UNCLASSIFIED
Map Accuracy (Vertical)
90% Linear Error (LE)
• Traditionally used to describe absolute
vertical map accuracy
• 90% of all contour lines are accurate to within a certain contour interval as specified in the MIL-STD for each product
UNCLASSIFIED
Map Accuracy (Vertical)
1. 90% of all contours are accurate to within
one-half the basic contour interval
(ATC/Targeting)
2. 90% of all contours are accurate to within
one basic contour interval (JOG/TLM/TPC)
3. 90% of all contours are accurate to within two basic contour intervals (ONC/JNC/GNC)
4. Less than 90% of all contours are
accurate to within two basic contour
intervals (City Graphics)
UNCLASSIFIED
Map Accuracy (Vertical)
• Example for a 1:50,000 TLM with 10m Contour
Intervals
The 90% LE for this map is 1
contour interval (category 2 from
the previous slide)
That means that the elevation
depicted has a 90% chance of
being within 1 contour interval of
where it actually belongs on the
map
• At map scale, 1 contour interval is 10m in elevation. That means that the true elevation at that point is ±10m (up or down) at 90% confidence, or ±10m@90%.
UNCLASSIFIED
Linear (Vertical) Error
DTED1 Elevation Post – ±30 m absolute accuracy (90% Linear Error)
[Figure: linear error distribution (not to scale)]
– σ = Standard Deviation (68.27%) = 18.2 m
– LEP = Linear Error Probable (50%) = 12.3 m
– LE = Linear Error (NIMA's 90% Standard) = 30 m
– NATO = North Atlantic Treaty Organization's 95% Standard = 35.7 m
– 2σ = 2 x Standard Deviation (95.45%) = 36.5 m
Note: All of these state the same accuracy.
UNCLASSIFIED
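As with the circular case, a small sketch assuming the standard one-dimensional Gaussian multipliers (assumed here; they reproduce the slide's numbers) for a DTED1 post with a 30 m LE:

    # Assumed linear-normal multipliers: distance = k * sigma
    K = {
        "LEP (50%)": 0.6745,
        "sigma (68.27%)": 1.0,
        "NATO (95%)": 1.96,
        "2 sigma (95.45%)": 2.0,
    }

    le_90 = 30.0             # DTED1 absolute vertical accuracy
    sigma = le_90 / 1.6449   # LE (90%) = 1.6449 * sigma -> ~18.2 m
    for name, k in K.items():
        print(f"{name}: {k * sigma:.1f} m")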
Map Accuracy (Vertical)
• Quick Reference for Linear Error (LE)
Category   Product         LE Interval
1          Targeting       0.5
2          JOG/TLM         1
3          ONC/TPC         2
4          City Graphics   2
• Remember – this is for major contour
intervals, NOT supplemental intervals!!!
UNCLASSIFIED
Compounding Error
• Inaccuracy, imprecision, and error may be
compounded when you use geospatial
information that employs many data
sources
• This happens through:
– Propagation
• One error leads to another in data creation
– Cascading
• Bad information will skew subsequent solutions when
combined into new layers
UNCLASSIFIED
Compounding Error
GPS vs Map
A Common Problem
[Figure: map coordinate determined by terrain association vs. GPS coordinate plotted on the map; ±22 m GPS error]
UNCLASSIFIED // LIMITED DISTRIBUTION
Raster Maps
Positional Accuracy
• To create CADRG, NGA/GPC scanned in
Hard Copy Maps
• The same positional accuracy standards
that applied to those hard copy products
still apply to them when they are a
computer file
– Horizontal Accuracy
– Vertical Accuracy
UNCLASSIFIED
Imagery Accuracy
• Two “Types” of Imagery
– Non-Geodetically Controlled Imagery (“Happy Snaps”)
– Geodetically Controlled Imagery:
• Rectified/Orthorectified Imagery (e.g. CIB)
• Accuracy varies depending on the designed/intended use of the imagery (+/- m to +/- km):
– Number, Distribution, and Accuracy of Control Points
– Resolution of Imagery
– Processing Done to Imagery
• Accuracy can vary tremendously!
UNCLASSIFIED
Raster Imagery
• Imagery can have a variety of positional accuracies associated with it!
– The accuracy of a product is heavily dependent on the intended use and the production methods
[Figure: two images of differing positional accuracy compared side by side]
UNCLASSIFIED
Accuracy Differences
UNCLASSIFIED
Understanding Data
Currency
• Describes data age ranges; refers to the
date at which the data was either
introduced or modified in the database
• It is rare that all components of a data
layer have the same date of origin
• Usually found in the Data Quality Table:
– Example fields: creation_date and revision_date
UNCLASSIFIED
Raster Maps
Currency
• CADRG Currency is the same as that of the
source map!!
UNCLASSIFIED // LIMITED DISTRIBUTION
Raster Maps
Logical Consistency
• Within a map it is the same as the source map or chart
• When map sheets are tiled together there may be
problems – but they are the same as you would have if
you taped together a series of maps
• Examples: rivers, power lines
UNCLASSIFIED // LIMITED DISTRIBUTION
Completeness
• It is desirable that the whole of a study
area should have a uniform cover of
information
• If there is only partial coverage, the data is
not complete & requires decisions on its
usefulness:
– may have to collect more data
– may have to fill in with remotely sensed data
– may have to generalize detailed data to match
less detailed areas
UNCLASSIFIED
Raster Maps
Completeness
• Is everything you need shown/available?
UNCLASSIFIED // LIMITED DISTRIBUTION
Raster Imagery
• Controlled Image Base
– All of it is produced to the same standard
– Absolute horizontal accuracy of 90%
– Spatial Resolution depends on the product
• CIB5 has a spatial resolution of 5m
• CIB10 has a spatial resolution of 10m
– Currency varies depending on world events
(How often the CIB is updated and redistributed)
UNCLASSIFIED
Raster Imagery
• Digital Point Positioning Data Base
(DPPDB)
– All of it is produced to the same standard
– Absolute horizontal and vertical accuracy is
classified
– Spatial Resolution is also classified
– Currency varies depending on world events
UNCLASSIFIED
Matrix
• Digital Terrain Elevation Data
– All of it is produced to the same standard
– Absolute horizontal accuracy of ±50m@90%
– Absolute vertical accuracy of ±30m@90%
– Spatial Resolution depends on the product
• DTED 0 has a post spacing of 30 arc seconds, ~1km
• DTED 1 has a spatial resolution of 3 arc seconds, ~100m
• DTED 2 has a spatial resolution of 1 arc second, ~30m
– Currency varies depending on world events
– DTED collected via the SRTM mission
• Absolute horizontal accuracy of ±20m@90%
• Absolute vertical accuracy of ±16m@90%
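A rough sketch of where the ~1 km / ~100 m / ~30 m figures come from, assuming a spherical Earth of radius 6,371 km and spacing measured along a meridian (spacing in longitude shrinks toward the poles):

    import math

    EARTH_RADIUS_M = 6_371_000  # mean spherical radius (assumption)

    def post_spacing_m(arc_seconds):
        """Approximate ground distance of one post spacing along a meridian."""
        return EARTH_RADIUS_M * math.radians(arc_seconds / 3600)

    for level, arcsec in (("DTED 0", 30), ("DTED 1", 3), ("DTED 2", 1)):
        print(f"{level}: {arcsec} arc sec ~ {post_spacing_m(arcsec):.0f} m")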
UNCLASSIFIED
Matrix
• Digital Bathymetric Data Base – Variable
Resolution (DBDB-V)
– No defined positional accuracy standard
– Absolute horizontal & vertical accuracy
equivalent to the supported chart scale
– Spatial Resolution – 4 Resolutions
• 5 arc minute post spacing (1:4,000,000 scale)
~10km
• 2 arc minute post spacing (1:1,000,000 scale) ~4km
• 1 arc minute post spacing (1:1,000,000 scale) ~2km
• 30 arc second post spacing (1:500,000 scale) ~1km
– Currency varies depending on world events
UNCLASSIFIED
Vector
Remember the Accuracy Categories:
• Positional Accuracy
• Resolution
• Currency
• Lineage
• Logical Consistency
• Attribute Accuracy
• Completeness
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map (UVMAP)
• To create VMAP/UVMAP, NGA digitized features
from Hard Copy Maps (Cartographically
Sourced)
• The same Positional Accuracy standards that
applied to those hard copy products still apply
to them when they are a computer file
– Horizontal Accuracy (CMAS)
– Vertical Accuracy (LE)
• Remember that we are talking about features
on a map – they have been Generalized!
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Spatial Resolution
• VMap 0
– 1:1,000,000 Scale
• VMap 1
– 1:250,000 Scale
• VMap 2
– 1:50,000 Scale
• UVMAP
– Variable Scale
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Currency
• For Vector Products, currency refers to the
date at which the data was either
introduced or modified in the database
• It is rare, but possible, that VMAP/UVMAP
component data layers have different
dates of origin!!
– Currency information can be found in either
the Data Quality coverage or in the
Lineage.doc
UNCLASSIFIED
Vector
Understanding Data Currency
Data Set
1972
Forest Types
1995
Soils
Streams
1997
Roads
1964
Elevation
1981
Understanding the currency of the information
found in the database is important for analysis and
decision making!
UNCLASSIFIED
Vector
Vector Map (VMap) & Urban Vector Map
(UVMAP)
Lineage
• Information about the source of the
product and the production methods and
specifications used during production
• “A short synopsis of the data’s life”
• For VMAP/UVMAP this will list the source
Map or chart with edition number
UNCLASSIFIED
Understanding Data
Lineage
• Information of data source materials and data
production process
• “a short synopsis of the data’s life”
• can be found in the DQT (Data Quality Table) or
the Lineage.doc and contains:
– processing tolerances
– interpretation rules applied to source materials
– basic production procedures
– key decisions made by the data producer during the production stage
UNCLASSIFIED
Lineage.doc File
• Example: City of Konjic - UVMAP
This file documents the lineage characteristics of the Konjic UVMap database. It provides information supplementary to the Data Quality Table (DQT), a standard VPF table at the library level that, along with the dqarea.aft in the DQ Coverage, is the main repository for information about the source data. This file also contains general information about development techniques common to all Konjic library coverages. The source used for this library was DMA City Graphic. Series: M903 Edition: 2-DMA Printed: 10/01/1995. Etc...
UNCLASSIFIED
Vector
• UVMAP Lineage.Doc example
UNCLASSIFIED // LIMITED DISTRIBUTION
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Logical Consistency
• Covers the internal consistency of the
representation. Verifies the relationships
that should be present
• “How well the data relates to the other
data in the database”
• When VMAP data sets are tiled together
there may be problems – but they are the
same as you would have if you taped
together a series of maps
UNCLASSIFIED
Vector
[Figure: VMap data at a tile boundary]
UNCLASSIFIED // LIMITED DISTRIBUTION
Vector
[Figure: tile boundary – feature classifications should be consistent across the boundary]
UNCLASSIFIED // LIMITED DISTRIBUTION
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Completeness
• In a perfect world, the whole study area
will have a uniform, complete cover of
information
• Completeness refers to not only the
amount of data (feature completeness),
but also the richness of the attribution
(attribute completeness)
• VMAP/UVMAP has 10 layers of information
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Feature Completeness
• A percentage value that indicates how completely the features were captured according to a capture specification
• The feature completeness value is usually
contained in the Data Quality Table
Features, such as buildings,
should be complete according
to the capture specification.
UNCLASSIFIED // LIMITED DISTRIBUTION
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Attribute Completeness
• Refers to the completion of the assignment of attribute values to the features within a database (according to a capture specification)
• Rated 100% if all relevant attributes of a
feature were captured according to a capture
specification
• The attribute completeness value is usually
contained in the Data Quality Table
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map (UVMAP)
Attribute Accuracy
• Covers the correspondence of the non-spatial
elements
• Data attribute accuracy describes the
accuracy/reliability of the data capture for an
attribute
• Attribute types are Quantitative or Qualitative
• The attribute accuracy is expressed differently based on the type of attribute
UNCLASSIFIED
Thematic Accuracy
• The degree to which the description of a
feature in a digital database matches its
actual description in the real world
– described another way, thematic accuracy
refers to the likelihood that a given pixel or
feature will be placed in its correct category
• Thematic accuracy can be thought of as
the “correctness” of data while thematic
precision can be thought of as the
“descriptiveness” of data (i.e. numbers of
attributes)
UNCLASSIFIED
Vector
Vector Map (VMAP) & Urban Vector Map
(UVMAP)
Attribute Accuracy
• Quantitative and Qualitative accuracy:
– Quantitative: expressed as the standard deviation of the attribute value (e.g., ±1 meter)
– Qualitative: expressed as a reliability percentage (e.g., a Road = 90%)
• The Data Attribute accuracy value may exist
in the Data Quality table or as attributes at
lower levels of the database structure
UNCLASSIFIED
Quantitative vs. Qualitative
• Quantitative Attributes: These are numerical attributes (e.g., height or length values).
• Qualitative Attributes: These are text or code attributes (e.g., NAME = "Potomac River").
UNCLASSIFIED
GSC   Slope gradient
0     Unknown
1     0 - 3%
2     > 3 to 10%
3     > 10 to 20%
4     > 20 to 30%
5     > 30 to 45%
6     > 45%
Example: GSC is a qualitative attribute
that may contain numerical code values.
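A tiny illustrative sketch (the mapping mirrors the table above; the dictionary name is hypothetical) of decoding such a coded qualitative attribute:

    # Hypothetical lookup for the GSC (slope gradient) coded attribute above
    GSC_SLOPE = {
        0: "Unknown",
        1: "0 - 3%",
        2: "> 3 to 10%",
        3: "> 10 to 20%",
        4: "> 20 to 30%",
        5: "> 30 to 45%",
        6: "> 45%",
    }
    print(GSC_SLOPE[3])  # "> 10 to 20%"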
Data Quality Coverage
• The Data Quality (DQ) coverage contains lines and areas for visual display of data quality issues. They are linked to attribute tables containing data quality info.
• The data quality coverage is similar to the Tileref coverage in appearance and provides information about the source material when queried.
UNCLASSIFIED
Data Quality Coverage
• Example data quality coverage polygons
(data voids) with an associated attribute table
UNCLASSIFIED
Vector
Foundation Feature Data
– Positional Accuracy
• Absolute horizontal of ±25m (50m for vegetation and
boundaries) @ 90%
• Absolute vertical of ±10m @ 90%
– Resolution
• 1:50,000 equivalent
– Currency - varies
– Lineage
• Image sourced
– Logical Consistency
• Only inconsistent when current source information forces
it
– Attribute Accuracy – As per the specification MIL-PRF
89049
– Completeness
• 7 thematic coverages
UNCLASSIFIED
Vector
Digital Nautical Chart (DNC)
– Positional Accuracy
• Absolute horizontal
– DNC General > ±500 m @ 90%
– DNC Coastal > ±1000 m @ 90%
– DNC Approach > ±200 m @ 90%
– DNC Harbor > ±100 m @ 90%
• Absolute vertical of > ±2 contour intervals @ 90%
– Resolution varies
• DNC General 1:500,000 and smaller
• DNC Coastal 1:50,000 – 1:300,000
• DNC Approach 1:150,000 and smaller
• DNC Harbor 1:50,000 and larger
UNCLASSIFIED
Vector
Digital Nautical Chart (DNC)
– Currency - varies
– Lineage
• Cartographically sourced
– Logical Consistency - Varies
– Attribute Accuracy – As per the specification
MIL-PRF 89023
– Completeness
• 12 thematic coverages
UNCLASSIFIED
Vector
Tactical Ocean Data Level 0 (TOD0)
– Positional Accuracy
• Absolute horizontal accuracy of ±250m @ 90%
– Resolution
– Currency
– Lineage
– Logical Consistency
– Attribute Accuracy – As per the specification MIL-PRF 89049
– Completeness
• 4 thematic coverages
UNCLASSIFIED
Vector
Digital Topographic Data (DTOP)
– Positional Accuracy
• Absolute horizontal of ±50m @ 90%
• Absolute vertical of ±20m @ 90%
– Resolution - 1:50,000 equivalent
– Currency
– Lineage – Image Sourced
– Logical Consistency
– Attribute Accuracy
– Completeness – 14 Thematic Layers
UNCLASSIFIED
[Figure: CE (horizontal) and LE (vertical) components of positional error]
UNCLASSIFIED
Accuracy Concepts for Precise Positioning
Precise Positioning:
Overview
• Fundamentals:
– Presence of Errors
– Absolute vs. Relative Accuracy
– Components of Error
– Accuracy vs. Precision
– Digits After the Decimal Point
• Types of Error
– Error Statistics
– Propagation of Errors
UNCLASSIFIED
Precise Positioning:
Types of Error
• Target Location Error:
– Error present before use of weapon
– Error in target coordinates
– Depends on source and method used to derive
coordinates
• Weapon Navigation Error:
– Errors introduced during operation of the
weapon
– Depends on design of navigation system,
including INS and use of GPS signal
UNCLASSIFIED
Precise Positioning: Target
Location Error (TLE)
[Figure: offset between the derived target coordinates and the true target location]
UNCLASSIFIED
Precise Positioning: Target
Location Error (TLE)
• Check source and method:
– Who derived the coordinate?
– How did they do it?
– What coordinate system did they use?
– What product did they use?
– Is the source product accurate enough?
• This will tell you:
– Accuracy
– Format (e.g., DD, DM, DMS, UTM, MGRS)
– Coordinate Datum
UNCLASSIFIED
Precise Positioning: Weapon
Navigation Error (WNE)
[Figure: offset between the estimated weapon location and the true weapon location]
UNCLASSIFIED
Precise Positioning: Target
Location Error (TLE)
(90% Error Cylinder)
[Figure: 90% error cylinder, CE (horizontal) and LE (vertical), relating the derived coordinate to the intended coordinate]
UNCLASSIFIED
Precise Positioning: Weapon
Navigation Error (WNE)
(50% Error Cylinder)
[Figure: 50% error cylinder, CEP (horizontal) and LEP (vertical), relating the estimated weapon location to the true weapon location]
UNCLASSIFIED
Precise Positioning:
Computing Combined Accuracy?
Computing the Total Delivery Error
• Step 1: Convert Statistics
– Why? …Cannot directly mix a 50% statistic
with a 90% statistic
– Convert 50% to equivalent 90% or vice versa
– Note: Changing the statistic does not change
the “level of accuracy”
• Step 2: Combine Errors (Error
Propagation)
– Not simple addition of errors!
– Result is Total Delivery Error (TDE)
UNCLASSIFIED
Precise Positioning:
Converting Statistics
Circular Errors (Horizontal)
From        To          Multiply by
CEP (50%)   CE (90%)    1.82
CE (90%)    CEP (50%)   0.55

Linear Errors (Vertical)
From        To          Multiply by
LEP (50%)   LE (90%)    2.44
LE (90%)    LEP (50%)   0.41
UNCLASSIFIED
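A minimal sketch of the conversions above; the factors come from the table and the function names are illustrative:

    # Conversion factors from the table above
    CEP_TO_CE = 1.82   # 50% circular -> 90% circular
    CE_TO_CEP = 0.55
    LEP_TO_LE = 2.44   # 50% linear -> 90% linear
    LE_TO_LEP = 0.41

    def cep_to_ce(cep_m):
        """Convert a 50% CEP (m) to the equivalent 90% CE (m)."""
        return cep_m * CEP_TO_CE

    def lep_to_le(lep_m):
        """Convert a 50% LEP (m) to the equivalent 90% LE (m)."""
        return lep_m * LEP_TO_LE

    print(cep_to_ce(15))  # 27.3 m CE (90%)
    print(lep_to_le(18))  # ~43.9 m LE (90%)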
Precise Positioning:
Combining Errors
Total Delivery Error (TDE)

TDE = √(TLE² + WNE²)

where: TLE = Target Location Error
       WNE = Weapon Navigation Error

TDE (m) for combinations of TLE and WNE (m):

WNE \ TLE    0    10    20    30    40
  0          0    10    20    30    40
 10         10    14    22    32    41
 20         20    22    28    36    45
 30         30    32    36    42    50
 40         40    41    45    50    57

Example: TLE = 20, WNE = 30, TDE = 36
UNCLASSIFIED
Precise Positioning:
Example
• Given:
– TLE = 20 m (90% CE); 30 m (90% LE)
– WNE = 15 m (50% CEP); 18 m (50% LEP)
• Find TDE:
– Convert WNE CEP / LEP into CE / LE
• 15 m CEP X 1.82 = 27.3 m CE (90%)
• 18 m LEP X 2.44 = 43.9 m LE (90%)
– Compute TDE
TDE (Hor.) = √[(20 m)² + (27.3 m)²] = 33.8 m CE (90%)
TDE (Vert.) = √[(±30 m)² + (±43.9 m)²] = ±53.2 m LE (90%)
Note: Accuracy values are not representative of actual capabilities!
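A sketch that reproduces the worked example, assuming the root-sum-square combination and the conversion factors shown earlier; the names are illustrative:

    import math

    def total_delivery_error(tle_90, wne_90):
        """Root-sum-square of TLE and WNE, both already at the 90% level."""
        return math.hypot(tle_90, wne_90)

    # Given values from the example
    tle_ce, tle_le = 20.0, 30.0    # 90% CE / LE
    wne_cep, wne_lep = 15.0, 18.0  # 50% CEP / LEP

    wne_ce = wne_cep * 1.82        # 27.3 m CE (90%)
    wne_le = wne_lep * 2.44        # ~43.9 m LE (90%)

    print(f"TDE horizontal: {total_delivery_error(tle_ce, wne_ce):.1f} m CE (90%)")  # ~33.8
    print(f"TDE vertical  : {total_delivery_error(tle_le, wne_le):.1f} m LE (90%)")  # ~53.2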
UNCLASSIFIED
Precise Positioning: Total
Delivery Error (TDE)
(90% Error Cylinder)
[Figure: 90% error cylinder, CE (horizontal) and LE (vertical), relating the impact point to the intended coordinate]
UNCLASSIFIED
Compounding Error
El Dorado Canyon
Situation
(a) Enemy Forces. In 1986, the US had determined that the country of Libya was sponsoring training for international terrorist organizations around the world. Political negotiations with the country's dictator, Muammar Qaddafi, failed to stop this training.
(b) Friendly Forces
(1) The initial planning by DIA (on 1:250,000 JOG)
(2) Planners at Lakenheath (on 1:50,000 TLM)
(3) APPS
(4) City Graphics
Mission
US personnel determined that military intervention was the only way to deal with the situation. F-111s out of RAF Lakenheath, UK were tasked to execute precision air strikes on a single building.
UNCLASSIFIED
Compounding Error
El Dorado Canyon
• 1:250,000 - ED 50
– 334,100 mE, 3,641,200 mN
• 1:50,000 - ED 50
– 334,000 mE, 3,641,450 mN
• City Graphic - ED 50
– 333,970 mE, 3,641,410 mN
• APPS - WGS
– 333,901 mE, 3,641,250 mN
• APPS coordinate with Datum converted - ED 50
– 333,977 mE, 3,641,438 mN
UNCLASSIFIED
Compounding Error
• Target Coordinates From APPS
– 333,977 mE, 3,641,438 mN
Product              Intended Uses                Target Coordinate           Error difference   Allowable Error
1:250K JOG           Navigate, Locate             334,100 mE, 3,641,200 mN    259 m              ±250 m @90%
1:50K TLM            Land Combat                  334,000 mE, 3,641,450 mN    25 m               ±50 m @90%
1:25K City Graphic   Close Combat, Evacuations    333,970 mE, 3,641,410 mN    29 m               ±50 m @90%
UNCLASSIFIED
Accuracy
• Errors can never be completely eliminated; the best we can do is minimize the error sources and document/assess the remaining error
• When you use a product consider:
– Positional Accuracy
– Resolution
– Currency
– Lineage
– Logical Consistency
– Attribute Accuracy
– Completeness
UNCLASSIFIED
Accuracy
• Develop appropriate classification schemes
• Use consistent data collection strategies
and/or appropriate sampling techniques
• Challenge assumptions and processing
methodologies
• Document data (such as its age, source,
original purpose, scale, format, collection
strategy)
UNCLASSIFIED
Summary
• What is accuracy?
• Why is accuracy important?
• What is Metadata?
UNCLASSIFIED
Questions??
Questions?
Perguntas?
UNCLASSIFIED