
Specifying commercial & consumer fingerprint systems
How commercial fingerprint systems fail
Type of failure and severity class:

During enrollment
• Unable to enroll a user: Denial of service
• Unable to enroll a finger: Nuisance

During authentication
• Accept an imposter: Security breach
• Reject a valid user: Denial of service
• Reject a single touch: Nuisance
To be useful, system specifications must be based on the failure modes.
Specifying commercial fingerprint system performance

Systems must be tested and specified as complete systems
• The components of the system should be tested separately for development and design validation.
• However, combining independent component specifications into a system spec is rarely useful.
  – Inaccurate at best, totally misleading at worst.

Controlling the test conditions is critical to testing
• Test population demographics is the single most significant factor.
  – Age, occupation, health, and race are significant.
  – Results typically vary by a factor of 10 in either direction due to demographics.
• Environmental conditions
  – Humidity and temperature are significant.
  – Expect major differences between summer and winter.

System specifications must apply all of the key metrics simultaneously
• because system configuration can trade off improvement in one metric for degradation of others.
Causes of fingerprint system failure

[Diagram: parallel enrollment and authentication pipelines, each Sensor → Processor → Storage. Failure causes are annotated along the pipelines: low-quality data capture, weak feature extraction, weak template generation, templates that are too small, and weak matching.]
Specifying fingerprint component performance
[Diagram: Sensor → Processor → Storage pipeline. The Ability to Acquire (ATA) characterizes the sensor; FAR and FRR characterize the processing and matching stages.]
Ability to Acquire (ATA)
• A measure of the fingerprint sensor's ability to generate high-quality, repeatable data across the range of finger types and conditions to be encountered by the system.
• In practice, sensors (and systems) must be tested literally side by side, on the same fingers, at the same time. This minimizes the effect of the huge variability in fingers over time; a single finger varies significantly from hour to hour.
• The ATA is generally represented as a relative success rate (e.g., 99.99%) for the specific test population demographics and specific test condition ranges.
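As a rough illustration (not from the original slides), the sketch below computes an ATA figure from side-by-side capture trials; the trial record layout and the quality flag are hypothetical.

```python
# Minimal sketch: estimating Ability to Acquire (ATA) from capture trials.
# The trial record format is an illustrative assumption, not part of the
# original specification.

from dataclasses import dataclass

@dataclass
class CaptureTrial:
    subject_id: str     # person whose finger was presented
    finger: str         # e.g. "right index"
    quality_ok: bool    # True if the sensor produced usable, repeatable data

def ability_to_acquire(trials: list[CaptureTrial]) -> float:
    """Fraction of capture attempts that yielded usable data for this
    test population and these test conditions."""
    if not trials:
        raise ValueError("no trials recorded")
    successes = sum(1 for t in trials if t.quality_ok)
    return successes / len(trials)

# Example: 9,999 good captures out of 10,000 attempts -> ATA = 99.99%
trials = [CaptureTrial("s1", "right index", True)] * 9_999 + \
         [CaptureTrial("s2", "left thumb", False)]
print(f"ATA = {ability_to_acquire(trials):.2%}")
```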
Specifying fingerprint component performance
False Accept Rate (FAR) and False Reject Rate (FRR)
• In fingerprint systems, these are a measure of the processing software's ability to accurately and repeatably distinguish one finger from another.
• FAR and FRR are not independent. You must have both simultaneously to have meaningful data.
• The classic definitions, for a selected scoring threshold, are:
  – FAR = # of mistaken matches / total # of non-match cases tested
  – FRR = # of mistaken non-matches / total # of true match cases tested
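To make the threshold dependence concrete, here is a small sketch (illustrative only; the convention that higher scores mean "more similar" is an assumption) that computes FAR and FRR from labeled comparison scores at a chosen threshold.

```python
# Illustrative sketch: FAR and FRR at one scoring threshold.
# Assumes higher scores mean "more similar" (a convention, not from the slides).

def far_frr(genuine_scores, imposter_scores, threshold):
    """FAR = mistaken matches / non-match cases tested
       FRR = mistaken non-matches / true match cases tested"""
    false_accepts = sum(1 for s in imposter_scores if s >= threshold)
    false_rejects = sum(1 for s in genuine_scores if s < threshold)
    return false_accepts / len(imposter_scores), false_rejects / len(genuine_scores)

# Sweeping the threshold shows the FAR/FRR trade-off -- raising it lowers FAR
# but raises FRR, which is why both must be quoted for the same operating point.
genuine = [0.91, 0.85, 0.42, 0.88, 0.79]
imposter = [0.10, 0.35, 0.62, 0.22, 0.05]
for t in (0.3, 0.5, 0.7):
    far, frr = far_frr(genuine, imposter, t)
    print(f"threshold={t:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
```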
Tailoring the biometric performance specification

The definitions of the key metrics differ from one application to the next
• For example, consider the contrast between these two applications:
  – In fingerprint PC login applications, falsely rejecting a single finger touch is considered a nuisance reject, provided the user is accepted on a subsequent touch. If after several finger touches the user is still not accepted, he has been Denied Service.
  – In fingerprint sensors on guns, failure to verify a single finger touch is a Denial of Service, due to the real-time nature of gun fights.

The allowable failure rates vary from one application to the next
• For example:
  – In debit card user authentication, an imposter accept rate of 1 in 10,000 may be appropriate, and a nuisance reject rate of 1 in 10 may be acceptable, provided no Denial of Service situations occur.
  – In entertainment park season pass user authentication, an imposter accept rate of 1 in 10 may be acceptable, while nuisance reject rates must be kept below 1 in 1,000.
The key system metrics -- and an example set of definitions for a PC login application

During enrollment
• Unable to enroll a finger (Nuisance): 2 attempts to enroll the finger fail
• Unable to enroll a user (Denial of service): 2 different (reasonably selected) fingers are unable to enroll (note 1)

During verification
• Accept an imposter (Security breach): the imposter is accepted at least once, when allowed 3 tries on each of 2 fingers (note 2)
• Reject a valid user (Denial of service): unable to verify either of 2 enrolled fingers with 3 tries each (note 2)
• Reject a single finger touch (Nuisance): this is the classic system false reject rate

Note 1: Assumes that the user does not select fingers known to be damaged.
Note 2: Assumes that the system will monitor sequential failures, determine that an attack is in progress after 6 sequential failures, and go into a defensive mode (e.g., lock out the account).
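As an illustration of the sequential-failure monitoring assumed in note 2, here is a minimal sketch; the class name, the lockout action, and the reset-on-success behavior are assumptions rather than part of the original definition.

```python
# Minimal sketch of the sequential-failure monitor assumed in note 2.
# The reset-on-success rule and the lockout action are illustrative assumptions.

class SequentialFailureMonitor:
    def __init__(self, max_failures: int = 6):
        self.max_failures = max_failures   # note 2: attack assumed after 6 failures
        self.consecutive_failures = 0
        self.locked_out = False

    def record_attempt(self, verified: bool) -> bool:
        """Record one verification attempt; return True if the account is now locked."""
        if self.locked_out:
            return True
        if verified:
            self.consecutive_failures = 0  # assumption: a success clears the counter
        else:
            self.consecutive_failures += 1
            if self.consecutive_failures >= self.max_failures:
                self.locked_out = True     # defensive mode, e.g. lock out the account
        return self.locked_out

# An imposter allowed 3 tries on each of 2 fingers hits the limit on the 6th failure.
monitor = SequentialFailureMonitor()
for attempt in range(6):
    locked = monitor.record_attempt(verified=False)
print("locked out:", locked)   # True
```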
Justification for the definitions suggested for the PC login application

Enroll: If one finger cannot be enrolled, but a second finger can, it is considered a nuisance. Inability to enroll either of 2 fingers (when starting up a new PC, for example) is likely to cause user frustration and is judged to be a denial of service.

Imposter accept: The system is assumed to be unattended, so an imposter could try to fool it as many times, and with as many different fingers, as the system will let him. As with a password system, if a sufficiently long sequence of failures occurs, the system assumes it is under attack and initiates protective measures to prevent further attempts to penetrate the system. Hence, if the imposter succeeds before protective measures are initiated, a breach of security occurs.

Reject a valid user: This event is defined to occur when a valid user experiences a sequence of failures long enough that the system takes defensive action. This may include disabling a user account or a physical device, and generally requires intervention by a system administrator to reset. Hence the event is a denial of service.

Reject a single finger touch: This is considered a nuisance in the same sense that a failed mouse click is a nuisance. In an interactive environment, the user corrects the situation immediately by touching the sensor again.
An example system specification for a PC login application

During enrollment
• Unable to enroll a user (Denial of service): < 1/100k
• Unable to enroll a finger (Nuisance): < 1/100

During authentication
• Accept an imposter, specified as a FAR (Security breach): < 1/10k
• Reject a valid user, specified as a FRR (Denial of service): < 1/100k
• Reject a single touch, specified as a FRR (Nuisance): < 1/50

A single specification (e.g., imposter accept rate) without the other 4 specifications is meaningless due to trade-offs.
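One way to make such a specification checkable is to record all five limits as data and test them together; the sketch below is illustrative only, and the field names and the sample measured rates are assumptions.

```python
# Illustrative only: the PC-login spec from this slide expressed as data,
# with a simple check against measured rates. Field names and the sample
# measurements are assumptions, not from the original document.

PC_LOGIN_SPEC = {
    # failure mode: (severity, maximum allowed rate)
    "unable_to_enroll_user":   ("denial of service", 1 / 100_000),
    "unable_to_enroll_finger": ("nuisance",          1 / 100),
    "accept_imposter":         ("security breach",   1 / 10_000),
    "reject_valid_user":       ("denial of service", 1 / 100_000),
    "reject_single_touch":     ("nuisance",          1 / 50),
}

def meets_spec(measured_rates: dict) -> bool:
    """A system passes only if every metric is within its limit simultaneously."""
    return all(measured_rates[name] <= limit
               for name, (_severity, limit) in PC_LOGIN_SPEC.items())

# Hypothetical measurement: good imposter-accept rate but too many nuisance rejects.
measured = {
    "unable_to_enroll_user":   1 / 200_000,
    "unable_to_enroll_finger": 1 / 150,
    "accept_imposter":         1 / 20_000,
    "reject_valid_user":       1 / 120_000,
    "reject_single_touch":     1 / 30,     # exceeds the 1/50 limit -> fails
}
print(meets_spec(measured))   # False: one out-of-spec metric fails the whole system
```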
Understanding authentication: imposter accept rates

How would a particular system with a single-touch FAR of 1/10k perform in the PC login application discussed on the previous slide?
• Imposter accept is defined here to allow 3 tries on each of 2 fingers. An expected performance range can be estimated (see the sketch below):
  – Assuming the two fingers are independent but the three tries for each finger are not gives the best case, where the imposter accept rate would be 1/10k * 2 = 1/5k.
  – Assuming every try on every finger is independent gives the worst case, where the imposter accept rate would be 1/10k * 6 = 3/5k, or roughly 1/1.7k.

This illustrates the importance of implementing sequential failure limits in unattended fingerprint identification systems.
• In the extreme case, if no limit is imposed, an imposter could try his 10 fingers 3 times each.
• For the example system discussed above, with a single-touch FAR of 1/10k:
  – The best case is an imposter accept rate of 1/10k * 10 = 1/1k.
  – The worst case is an imposter accept rate of 1/10k * 30 = 3/1k, or roughly 1/330.
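The arithmetic above uses the simple n * FAR approximation. A small sketch (illustrative, not from the slides) reproduces these bounds and also shows the exact value under full independence:

```python
# Illustrative sketch of the bounds used above, for a single-touch FAR of 1/10k.
# best case:  only the fingers count as independent chances -> fingers * FAR
# worst case: every try is an independent chance            -> fingers * tries * FAR
# (both use the n * FAR approximation, valid when FAR is small)

def imposter_accept_bounds(far: float, fingers: int, tries_per_finger: int):
    best = fingers * far
    worst = fingers * tries_per_finger * far
    exact_worst = 1 - (1 - far) ** (fingers * tries_per_finger)  # full independence, exact
    return best, worst, exact_worst

for fingers, tries in [(2, 3), (10, 3)]:        # PC-login policy vs. no lockout limit
    best, worst, exact = imposter_accept_bounds(1 / 10_000, fingers, tries)
    print(f"{fingers} fingers x {tries} tries: "
          f"best 1/{1/best:.0f}, worst ~1/{1/worst:.0f} (exact 1/{1/exact:.0f})")
```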
Understanding authentication: nuisance reject and denial of service rates

It is not possible to estimate denial of service rates from a system's measured single-finger false reject rate.
• Denial of service rates must be explicitly measured, as sequences of touches.

This is primarily because the mechanisms causing nuisance rejects and those causing denial of service are usually different, and the single-finger reject rate is usually dominated by nuisance rejects.

Typical causes of nuisance rejects in commercial systems:
• Inaccurate finger positioning
  – The user is typically more careful on the second touch
• Long skin settling time (note 1), e.g. during cold weather
• Skin distortion during placement (most pronounced in the elderly)

Note 1: In typical access control systems, the system must decide to accept or reject a finger within 1 or 2 seconds. However, in cold weather the finger skin tends to become stiffer, especially in people with dry or callused skin, and it may take 5 or more seconds for the skin to settle down flat on the fingerprint sensor. Lifting and replacing the finger once or twice during the settling period does not significantly affect the settling process.
Understanding enrollment: denial of service

Denial of service rates cannot be directly estimated from single-finger failure-to-enroll measurements.
• Enrollments of several fingers from the same person are not statistically independent events.
• The mechanisms that prevent a single finger from enrolling are often different from those that prevent a person from enrolling any of his fingers.

Enrollment denial of service should be measured explicitly, as sequences of failures over several tries using several fingers of the same person (a measurement sketch follows below).
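As an illustration only, the sketch below scores enrollment denial of service per person rather than per finger, using the earlier PC-login definition (a user is denied service when none of the reasonably selected fingers enrolls); the data layout is an assumption.

```python
# Illustrative sketch: measuring enrollment denial of service per person,
# not per finger. Data layout is an assumption; the rule (a user fails when
# no selected finger enrolls) follows the PC-login definition used earlier.

# Each person maps to the fingers they tried; each finger maps to the
# list of its enrollment attempt results.
results = {
    "user_a": {"right index": [False, True]},                  # enrolled on 2nd try
    "user_b": {"right index": [False, False],
               "left index":  [False, False]},                 # no finger enrolled
    "user_c": {"right index": [True]},
}

def finger_enrolled(attempts: list) -> bool:
    return any(attempts)                    # finger enrolls if any attempt succeeds

def denied_service(person_fingers: dict) -> bool:
    return not any(finger_enrolled(a) for a in person_fingers.values())

dos_rate = sum(denied_service(f) for f in results.values()) / len(results)
print(f"enrollment denial-of-service rate: {dos_rate:.2f}")    # 0.33 for this sample
```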
Conclusions

Specifying commercial biometric system performance requires careful consideration of the system's operational procedures.

System-level biometric performance metrics mean different things in different classes of systems.
• The definitions must be tailored to the application.

System-level biometric performance typically cannot be reliably estimated from the biometric performance of the components.
• System-level performance should be explicitly tested.