
THE PROFESSION

Biometrics and the Threat to Civil Liberties
Margaret L. Johnson, Stanford University

In the post-9/11 world, various
government agencies have proposed or built several data systems
that significantly affect civil liberties. As system designers and developers, we might not be aware of how
the decisions we make when implementing such systems could threaten
civil liberties. Thus, we need mechanisms or procedures to help us make
technical decisions that respect human
rights. Biometrics is an area in which
this need is especially important.
WHAT IS BIOMETRICS?
Biometrics refers to the automatic
identification or verification of living
persons using their enduring physical
or behavioral characteristics. Many
body parts, personal characteristics,
and imaging methods have been suggested and used for biometric systems:
fingers, hands, faces, eyes, voices, signatures, typing styles, DNA, and so on.
The body parts most often used in current applications are fingerprints and
facial characteristics.
Biometric systems process raw data
to extract a biometric template—a
small set of data that can be uniquely
derived given a biometric feature.
Various algorithms process biometric
data to produce a template. For example, in a face-recognition system, facialgeometry algorithms work by defining
a reference line—for example, the line
joining the pupils of the eyes—and
using it to measure the distance and
angle of various facial features relative
to this reference. Templates are easier
to process and store than the original
raw data.
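
As a minimal sketch of this idea, assuming a face detector has already located a few named landmarks, the following Python fragment builds a small template of distances and angles measured relative to the inter-pupil reference line. The landmark names and the choice of features are hypothetical illustrations, not any vendor's algorithm.

```python
import math

def extract_template(landmarks):
    """Build a tiny facial-geometry template from named (x, y) landmarks.

    Distances are normalized by the inter-pupil distance, and angles are
    measured relative to the line joining the pupils, so the template is
    roughly invariant to image scale and in-plane rotation.
    """
    lx, ly = landmarks["left_pupil"]
    rx, ry = landmarks["right_pupil"]
    ref_len = math.hypot(rx - lx, ry - ly)        # inter-pupil distance
    ref_angle = math.atan2(ry - ly, rx - lx)      # reference-line angle

    template = []
    for name in ("nose_tip", "mouth_left", "mouth_right", "chin"):
        x, y = landmarks[name]
        template.append(math.hypot(x - lx, y - ly) / ref_len)    # relative distance
        template.append(math.atan2(y - ly, x - lx) - ref_angle)  # angle vs. reference line
    return template
```
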
Biometric systems fall into two categories: authentication and identification, with authentication systems being
far more common. To be authenticated
by a system, a subject presents a password or a token such as an ID card,
along with a live biometric sample such
as a fingerprint. The system accesses a
record based on the token, then compares the sample’s biometric data with
the record’s sample to authenticate the
subject’s identity.
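
A minimal sketch of this one-to-one flow, assuming an in-memory store of enrolled templates, a toy distance function, and an arbitrary threshold, might look like this:

```python
def template_distance(a, b):
    """Sum of absolute differences between two equal-length templates.
    A deliberately simple stand-in for a real matching algorithm."""
    return sum(abs(x - y) for x, y in zip(a, b))

def authenticate(token_id, live_template, enrolled, threshold=0.5):
    """One-to-one verification: fetch the single record named by the
    token, then compare the live sample against that record only."""
    record = enrolled.get(token_id)
    if record is None:
        return False                 # no enrolled record for this token
    return template_distance(live_template, record) <= threshold
```
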
Authentication systems are reliable
and efficient if the subject base is small
and the biometric readers are accurate
and durable. Airports, prisons, and
companies that need secure access use
systems such as these.
Implementing identification systems
is more difficult. To be identified by a
system, a subject provides biometric
data, and the system must find a record
based on that data only—which can
require a search of the entire database.
Performing this search takes a long time
and even then will only rarely result in a
single-record match. This means that the
system must perform additional filtering.
Keep in mind that these searches are
not text-based. Because biometric data
is pattern-based, finding a hit requires
specialized algorithms that focus on
finding specific patterns in certain
aspects of the data.

FACE-RECOGNITION SYSTEM
Applying this background to some
biometric systems examples makes it
easier to understand how implementation decisions can pose a threat to civil
liberties. Consider the timely example
of an airport passenger identification
system containing a database that
stores the facial data of known criminals and terrorists in a watch list. This
system uses special cameras to scan the
faces of passengers as it looks for individuals whose facial data match
records in its database. If the system
finds a match, it dispatches a security
guard to bring the person to a security
checkpoint for further investigation.
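
For a watch-list scenario like this one, the search is one-to-many: with no token to narrow things down, the system compares the scanned face against every record and then applies additional filtering to the candidates it finds. The sketch below reuses the hypothetical template_distance helper from the earlier verification example; the threshold and the filtering rule (keep only the closest few candidates for human review) are likewise assumptions.

```python
def identify(probe_template, watch_list, threshold=0.5, max_hits=5):
    """One-to-many identification: scan the entire watch list, keep every
    record whose distance falls under the threshold, then filter further
    because a clean single-record match is rare."""
    candidates = []
    for record_id, stored_template in watch_list.items():
        d = template_distance(probe_template, stored_template)
        if d <= threshold:
            candidates.append((d, record_id))

    # Additional filtering: keep only the closest few candidates, which a
    # security guard would then investigate at the checkpoint.
    candidates.sort()
    return [record_id for _, record_id in candidates[:max_hits]]
```
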
Is such a system feasible? Experimental systems have been implemented,
most notably in Boston’s Logan International Airport, but such systems do
not yet meet expectations. At Logan,
where 10 of the September 11th terrorists boarded flights that were subsequently hijacked, face-recognition systems exhibited a failure rate of 38.6 percent during testing. According to press
reports, the systems failed to detect volunteers playing potential terrorists.
Face-recognition technology is not
yet robust enough to be used this way,
but given the development rate in this
area, identification systems using it will
likely be implemented soon. Three
primary impediments must, however,
be overcome first:
• Excessive false positive rate. A false
positive occurs when a subject’s
biometric data incorrectly matches
that of a watch list member.
• Uncontrolled environmental and
subject conditions. Samples taken
in an airport are noisy in that the
light is uneven, shadows can partially cover the face, the image
may not be frontal, the subject
may be wearing a disguise, and so
on. These variations make matching more difficult.
• Watch list size. This is an important
limiting factor because every time
the database size doubles, accuracy
decreases by two to three percentage
points overall (P.J. Phillips et al.,
Face Recognition Vendor Test 2002,
National Institute of Standards and
Technology, 2003); a rough projection
of this effect appears just after this list.
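
As a rough, purely illustrative projection of that rule of thumb, the snippet below subtracts a fixed number of percentage points per doubling of the database. The baseline accuracy and the watch-list sizes are invented; the point is only how quickly the numbers erode as the list grows.

```python
import math

def projected_accuracy(base_accuracy, base_size, size, drop_per_doubling=2.5):
    """Subtract a fixed number of percentage points for every doubling
    of the database, per the rule of thumb cited above."""
    return base_accuracy - drop_per_doubling * math.log2(size / base_size)

# Hypothetical baseline: 85 percent accuracy on a 1,000-record watch list.
for size in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{size:>9,d} records: about "
          f"{projected_accuracy(85.0, 1_000, size):.0f}% accuracy")
```
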
IMPACT ON CIVIL LIBERTIES
An identification system based on
face-recognition technology poses several threats to civil liberties. First, false
positives must be investigated, which
impinges on the privacy of innocent
people.
In biometric systems, the degree of
similarity between templates required
for a positive match depends on a decision threshold, a user-defined system
parameter. The user can specify high
security, in which case innocent subjects
might be caught when the system casts
a broader net. Alternatively, the user
might specify low security, in which
case terrorists could escape. Setting this
parameter thus directly affects the false
positive rate, which in turn directly
affects subjects’ privacy.
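
A small, entirely hypothetical experiment shows how the decision threshold trades one error for the other. The similarity scores below are invented; lowering the threshold casts the broader net described above and flags more innocent travelers, while raising it lets more watch-list members slip through.

```python
# Invented similarity scores (higher means closer to some watch-list record).
watch_list_member_scores = [0.91, 0.84, 0.78, 0.65, 0.59]
innocent_traveler_scores = [0.72, 0.61, 0.55, 0.48, 0.40, 0.33, 0.21]

def error_counts(threshold):
    """Count misses (false negatives) and false alarms (false positives)
    at a given decision threshold."""
    false_negatives = sum(s < threshold for s in watch_list_member_scores)
    false_positives = sum(s >= threshold for s in innocent_traveler_scores)
    return false_negatives, false_positives

for threshold in (0.8, 0.7, 0.6, 0.5):
    fn, fp = error_counts(threshold)
    print(f"threshold {threshold:.1f}: {fn} watch-list members missed, "
          f"{fp} innocent travelers flagged")
```

No setting makes both numbers zero; choosing the operating point is as much a policy decision as a technical one.
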
Another important civil liberty issue
involves the potential for biometric systems to locate and physically track airline passengers. People being scanned
and possibly tracked may not be aware
of the system and thus cannot control
it. The US Constitution’s Fourth Amendment guards against illegal searches
and seizures by the government. Article
12 of the United Nations’ Universal
Declaration of Human Rights, adopted
in 1948, guards against interference
with privacy, family, or home. Thus, a
case could be made that if a government agency installs and maintains a
face-recognition system at an airport,
data collected and used without a subject’s consent could represent a civil liberties violation.
WHO ARE THE DECISION MAKERS?
Obviously, system designers and
developers must be aware of their
work’s civil liberty implications. In the
example I’ve described, many technical decisions could, if made in ignorance of these issues, threaten civil
liberties. For example, the security-level parameter that lets a user define
the false-positive rate can be implemented in several ways. Internally, the
parameter controls how closely biometric data must match to represent a
hit. A system designer or developer will
decide which aspects of the biometric
data to use and establish the ranges of
acceptability. Because each of these
decisions affects the false-positive rate
in ways the user cannot control, they
affect the civil liberties of the subjects
the system processes.
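
One way such designer decisions can be buried inside an implementation is sketched below: a hypothetical mapping from the user-visible security level to internal per-feature tolerance bands. The feature names and values are invented; in the watch-list setting described here, the "high security" setting casts the broader net, so its tolerances are the widest.

```python
# Hypothetical mapping chosen by the designer, invisible to the user.
TOLERANCES = {
    "high":   {"eye_distance": 0.08, "nose_angle": 0.20, "jaw_width": 0.12},
    "medium": {"eye_distance": 0.04, "nose_angle": 0.10, "jaw_width": 0.06},
    "low":    {"eye_distance": 0.02, "nose_angle": 0.05, "jaw_width": 0.03},
}

def is_hit(live, stored, security_level="medium"):
    """A sample counts as a hit only if every designer-chosen feature
    falls within the tolerance band for the selected security level."""
    tol = TOLERANCES[security_level]
    return all(abs(live[f] - stored[f]) <= tol[f] for f in tol)
```

Which features appear in that table, and how wide the bands are, never surface in the user interface, yet they determine how many innocent people get flagged.
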
The camera technology chosen provides another potential threat. Suppose
a designer decides which camera the
system should use based solely on the
project’s requirements, without considering whether the camera is small
and unobtrusive or large and obvious.
This decision can affect the likelihood
that subjects will be aware that the system is collecting their biometric data.
Lack of consent implies lack of control
over how a private company or a government agency might use a person’s
biometric data.
Finally, the question of how to store
the collected biometric data arises. It’s
common practice to store this data for
an extended time after collection. If a
disaster occurs, the data would be helpful in any ensuing investigation. A
designer creating a database to store the
biometric data makes decisions about
accessibility, security, and data organization, all of which define who can
access the data and how it can be used.
The stored data provides a record of
the subject’s location at a particular
time and can be used for tracking.
CRITICAL ISSUES
More serious issues arise in the
implementation of certain authentication systems. Consider another system
that might be used in airports: To get
past the security checkpoint, all passengers must provide a fingerprint.
Each passenger also presents an ID
such as a driver’s license. This data is
entered into a system, which then uses
the passenger’s ID to search a database
of US citizens and their fingerprints. If the data matches, the passenger is allowed to pass; if the data
does not match, or if the person does
not have a record in the database, officials detain the passenger for further
investigation.
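
At its core this is another one-to-one comparison, but one that depends on a query to a central repository. A minimal sketch, with an invented record layout and a placeholder fingerprint matcher, follows; where that repository lives, how the query travels to it, and what happens to the live sample afterward are left open, and those are exactly the decisions at issue.

```python
def screen_passenger(id_number, live_fingerprint, citizen_db, match, threshold=0.8):
    """Look up the passenger's record by ID in the central repository and
    compare the live fingerprint with the stored one. `match` is a
    placeholder comparison function returning a similarity in [0, 1]."""
    record = citizen_db.get(id_number)
    if record is None:
        return "detain"     # no record in the database: further investigation
    if match(live_fingerprint, record["fingerprint"]) >= threshold:
        return "pass"
    return "detain"
```
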
This type of authentication system
presumes a communication mechanism to a host computer and a central
repository of biometric data. The
implementation of such a system represents both the most serious technical
challenges in biometrics and the most
serious threats to civil liberties.
A database with biometric data presents a natural target for theft and
malicious and fraudulent use. If criminals intercept someone’s biometric
data—either by breaking into a transmission or by stealing it from the database—they can either replicate the
sample itself or the template produced
from a successfully matched sample. If
the thieves can ascertain whose data is
associated with the ciphertext, they can
even steal encrypted data. Armed with
these capabilities, criminals can steal
identities. Identity theft is much harder
to correct than theft in current token-based systems. Given the difficulty in
identifying compromised records, a
successfully attacked system is not only
useless, it’s dangerous.
Further, although anyone who loses
a driver’s license can replace it easily,
someone whose fingerprints have been
stolen cannot obtain new ones. This
adds a new dimension to identity theft,
which represents one of the most serious civil liberty violations.
Implementing a large-scale authentication system requires making a multitude of technical decisions concerning
security and database safeguards.
Many of these decisions affect civil liberties in that they define the system’s
level of security and safety. It often
comes down to a tradeoff between system performance and system security.
Who will decide on that tradeoff, and
what criteria will they use?
Many computing professionals agree that technological limitations make
implementing large-scale biometric systems too risky at this time. This
consensus is not stopping private companies and the US government from
moving forward with such implementations, however.

Under the new US-VISIT program started in January 2004, all foreigners
who enter the US on visas must have their hands and faces scanned
digitally. In addition, starting later this year, new passports will be
issued that bear a chip containing biometric data. By October 2004, all
countries whose nationals can enter the United States without a visa—
including western European countries, Japan, and Australia—must begin
issuing passports that contain biometric data.

What can we do to raise the sensitivity of future system designers and
developers to the social impact of the systems they create? Stanford
University offers an Ethics and Social Responsibility course that
addresses these issues in a scenario-based format. Students participate
in role-playing in real-world situations to help them understand the
effects of their decisions. Also, the ACM-IEEE Computing Curricula 2001
discusses the need for a required course on social and professional
responsibility, along with short, relevant modules presented in other
courses. Such courses are becoming increasingly critical as the systems
we build become more intrusive and dangerous.

What can we do to raise the awareness of practicing designers and
developers? Perhaps currently used software design and development
methodologies can be enhanced to include checkpoints that allow
consideration of social and legal issues.

How developers design, build, protect, and maintain a biometric system
will determine its effectiveness and the
degree to which it poses a threat to
civil liberties. As application designers
and developers, we must understand
the tremendous effect our decisions
and actions can have on society as a
whole. ■
Margaret L. Johnson is a senior lecturer in computer science at Stanford
University. Contact her at johnson@cs.stanford.edu.
Editor: Neville Holmes, School of Computing, University of Tasmania, Locked Bag
1-359, Launceston 7250; neville.holmes@utas.edu.au