Chapter 3 Fundamentals of Geographic Information and Cartography

Chapter 8
Remote Sensing
Chapter Overview
Remote sensing is the collection of data without directly measuring the object; it relies on reflected or emitted electromagnetic radiation (EMR). EMR can be emitted by the sun and sensed, for example, by photographic film, or it can be sent by a transmitter, as with radar, and the returned energy sensed. Remote sensing has become a key means of data collection for a number of reasons. Above all, it allows for the systematic and accurate collection of geographic information.
Remote sensing is defined very broadly in this chapter: a measurement of an object's characteristics from a distance using reflected or emitted electromagnetic energy. This definition means remote sensing includes all kinds of photography, aerial imagery, satellite sensors, and any kind of laser. Remote sensing involves different types of sensor technologies, ranging from photographic emulsions to digital chips. It also involves a vast array of storage media, from photographic film to computer files. As you can imagine, this broad definition means remote sensing overlaps with a number of other fields. This is indeed true and important. For example, the discipline of surveying has changed enormously with the introduction of laser-based distance-finding technology.
The reason for defining remote sensing so broadly is that it is a very important geographic
information technology. Remote sensing, in general, offers three advantages over other forms of
data collection and geographic information. First, it makes it much easier to systematically
recognize things and events over a large area. Second, it makes it easier and less costly to revise
most maps. Third, digital remote sensing images can be used directly by other applications.
There are some caveats to these advantages that you will find out about in this chapter. This
chapter is purely introductory in nature and will skim over many of the crucial details and
physics, but you should end up with a solid understanding of what remote sensing involves and
what some of key issues and applications for remote sensing are.
Figure 1 Different sensor types. Passive sensors use only reflected EMR. Active sensors use emitted EMR.
Principles
Electromagnetic radiation
Following the definition for this chapter, any understanding of remote sensing, regardless of the sensor technology, storage media, or application, starts with understanding electromagnetic radiation. First off, remote sensing's detection of EMR has three characteristics:
1. It only detects EMR from the surface of an object, although some sensors allow for penetration.
2. There is no contact between the sensor and the object.
3. All remote sensing measurements use reflected energy (usually from the sun) or emitted energy (for example, from a radar station or from plants).
Figure 2 Emitted and reflected electromagnetic energy
The EMR detected by remote sensing technologies varies. It depends on the desired application
as well as on the cost of different remote sensing data collections.
Figure 3 The electromagnetic spectrum showing common examples (Based on:
http://landsat.gsfc.nasa.gov/education/compositor/)
Spectral Signature
The EMR emitted or reflected by a thing or event varies. These differences are the basis for
distinguishing things and events. The reflections and emissions of a particular thing or event can
be associated with a particular spectral signature that is used to identify where these things and
events are located in a remote sensing image.
Figure 4 Examples of spectral signatures. Note that a micrometer is 10⁻⁶ meter (based on image from http://rst.gsfc.nasa.gov/Intro/Part2_5.html)
A spectral signature also varies by time of day, season, weather conditions, moisture levels in the soil, wind, and a number of other factors. The physics involved in addressing these differences is critical to the success of remote sensing. It is also very complex, but you need to be aware of the differences and of a commonplace solution. This solution is called 'ground-truthing' and involves having people in the field before, during, or after data collection who may take similar
sensor measurements or observations. These measurements and observations can be used later to
verify the remote sensing image or data and possibly define correction parameters for adjusting
the remotely sensed data to correspond to ground observations. Needless to say, this is highly
complex and requires very well-trained specialists to assess these factors and detect patterns in
the remote sensing data.
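The idea of matching a pixel to a spectral signature can be sketched as a nearest-signature comparison. The reflectance values below are invented for illustration; real signatures come from spectral libraries or ground-truth measurements, and operational classifiers are far more sophisticated.

```python
# Hypothetical sketch: assign a pixel to the reference signature
# (water, bare soil, vegetation) whose band reflectances are closest.
# Signature values are made up for this example.

# Reference reflectances per band (e.g., green, red, near-IR), on a 0..1 scale.
SIGNATURES = {
    "water":      (0.05, 0.03, 0.02),
    "bare soil":  (0.15, 0.20, 0.25),
    "vegetation": (0.10, 0.06, 0.45),
}

def classify(pixel):
    """Return the label of the signature nearest to the pixel's
    reflectances (minimum sum of squared differences across bands)."""
    def distance(signature):
        return sum((p - s) ** 2 for p, s in zip(pixel, signature))
    return min(SIGNATURES, key=lambda label: distance(SIGNATURES[label]))

print(classify((0.09, 0.07, 0.40)))  # high near-IR: vegetation-like pixel
```

Ground-truthing fits naturally into such a scheme: field measurements supply or correct the reference signatures against which pixels are compared.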
Bands
The detection of patterns is helped by the use of different ranges, or bands, of EMR in sensing technology. Each band, as these ranges are commonly called, refers to a particular range of wavelengths for that sensor. The bands available for a particular sensor depend greatly on the purpose of the sensor and its technical characteristics. Some sensors have only a few bands in a narrow range of the total EMR spectrum; others cover a much broader range. For example, Landsat 7 has seven bands:
Band 1 0.45-0.52 µm Blue-Green
Band 2 0.52-0.60 µm Green
Band 3 0.63-0.69 µm Red
Band 4 0.76-0.90 µm Near IR
Band 5 1.55-1.75 µm Mid-IR
Band 6 10.40-12.50 µm Thermal IR
Band 7 2.08-2.35 µm Mid-IR
The following figure (Figure 5) shows the different bands and how they can be combined for an application.
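The band table above can also be read as a simple lookup structure, for instance to find which Landsat 7 band covers a given wavelength. This is only an illustrative sketch built from the table, not part of any Landsat software.

```python
# The Landsat 7 band table as a lookup: band number -> (low, high, name),
# with wavelengths in micrometers, taken from the table above.

LANDSAT7_BANDS = {
    1: (0.45, 0.52, "Blue-Green"),
    2: (0.52, 0.60, "Green"),
    3: (0.63, 0.69, "Red"),
    4: (0.76, 0.90, "Near IR"),
    5: (1.55, 1.75, "Mid-IR"),
    6: (10.40, 12.50, "Thermal IR"),
    7: (2.08, 2.35, "Mid-IR"),
}

def band_for_wavelength(um):
    """Return (band number, name) of the band covering wavelength `um`,
    or None if no band covers it (the ranges leave gaps)."""
    for number, (low, high, name) in LANDSAT7_BANDS.items():
        if low <= um <= high:
            return number, name
    return None

print(band_for_wavelength(0.65))   # -> (3, 'Red')
print(band_for_wavelength(11.0))   # -> (6, 'Thermal IR')
```

Note that the bands do not cover the spectrum contiguously; wavelengths between the ranges (say, 1.0 µm) fall in none of them.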
Figure 5 Illustration of different bandwidths used by Landsat 7 (Source:
http://landsat.gsfc.nasa.gov/education/compositor/)
Another widely used satellite, SPOT 5, offers a different set of bandwidths.
[insert table ch8-table1-spot bands.doc]
[insert table ch8-table2-landsat bands.doc]
Resolution
Remote sensing distinguishes between spatial and temporal resolution. Spatial resolution is the size of the smallest unit recognized by the sensor; temporal resolution is the frequency with which a satellite revisits the same place. Spatial resolution is usually given as a distance measurement. For example, most SPOT sensors have a resolution of 10 meters; some have a higher resolution of 2.5 meters. The resolution does not mean that an object of that size can be consistently detected and identified. Various atmospheric and situational characteristics play into this, and you might rather want to think of resolution simply as the length of one side of the raster cells detected by the remote sensing technology. A raster cell is often also referred to as a pixel.
Figure 6 Comparison of spatial resolutions (Source:
http://www.csc.noaa.gov/products/sccoasts/html/rsdetail.htm)
Temporal resolution depends greatly on the spatial resolution of the sensing technology. High-spatial-resolution sensors record a great amount of data for a small area, requiring much longer to return to a place than low-spatial-resolution sensors. For example, Landsat, with 30 m spatial resolution, revisits a place only once every 16 days. The Advanced Very High Resolution Radiometer (AVHRR) has a spatial resolution of 1.1 km and revisits a place once every day.
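The trade-off can be made concrete with a back-of-the-envelope calculation: for a fixed scene size, the pixel count (and thus the data volume) grows with the square of the resolution. The 100 km scene size below is an arbitrary choice for the example; the resolutions are those quoted above.

```python
# Rough illustration of the spatial/temporal trade-off: pixel count per
# scene grows with the square of resolution improvement.

def pixels_per_scene(scene_km, resolution_m):
    """Number of pixels needed to cover a square scene of side scene_km
    at a given cell size in meters."""
    side = (scene_km * 1000) / resolution_m
    return side * side

landsat = pixels_per_scene(100, 30)    # Landsat: 30 m cells
avhrr = pixels_per_scene(100, 1100)    # AVHRR: 1.1 km cells

print(f"Landsat pixels for a 100 km scene: {landsat:,.0f}")
print(f"AVHRR pixels for the same scene:   {avhrr:,.0f}")
print(f"ratio: {landsat / avhrr:,.0f}x more data at 30 m")
```

The roughly 1,300-fold difference in data volume is one reason a high-resolution sensor cannot simply image the whole Earth every day.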
Types of Sensors
The discussion of principles focused on satellite-based remote sensing technology. This is only part of the available remote sensing technologies. The same technologies used for satellites, or adaptations thereof, are often used for remote sensing from airplanes, helicopters, and, in some cases, hand-held devices.
Photography
Photography is the most common remote sensing technology because cameras are so commonplace. In fact, some of the first military remote sensing satellites used cameras with film in the 1960s. The film was dropped out of the satellite in a special heat-resistant re-entry container with a parachute and picked up out of the air by an airplane. Satellites still use cameras, but most of the images are now captured and stored digitally. Satellite sensor technologies detecting EMR in this range are often called panchromatic. The resolutions of photographic images are very high, and the main issue in determining spatial and temporal resolution is cost. Of course, many governments and companies use aerial photography as a means of data collection. Collected using ground reference points and calculations to remove subtle changes in the airplane's movements, two aerial photographs made simultaneously can be used to make a stereoscopic image. Stereoscopic images are a very useful type of remote sensing because, when viewed with some additional equipment, they allow most people to distinguish heights and elevation changes. A single photographic image which also has the effects of elevation change removed (called planimetric) is called an orthophoto and is georeferenced to a coordinate system.
Infrared
Usually when we refer to photographic remote sensing we mean recording EMR in the visible wavelength spectrum, but this can be broadened to include infrared. This can be done with a chemical applied to photographic film (called an emulsion) or by using digital devices built and calibrated to detect this part of the EMR spectrum.
Multispectral
The data collected and images made with Landsat, SPOT, and similar sensing technologies are known as multispectral because of their different bands. The variability of multispectral remote sensors opens up a vast number of application possibilities.
Hyperspectral
This type of sensor technology collects more than 16 bands simultaneously. For example, Hyperion collects 220 bands from blue to shortwave infrared in equal steps (from 0.4 to 2.5 µm) with a 30 meter spatial resolution. Flying in formation with Landsat 7, images from Hyperion can be used easily with Landsat 7 images and data.
Radar
Radar is an important remote sensing sensor type. Its ability to penetrate cloud cover and into the ground makes it very useful for applications in areas with frequent cloud cover and
for geological work.
Laser (LiDAR)
Not used on satellites, but on planes, helicopters, and from the ground, LIght Detection And Ranging (LiDAR) uses laser-generated light pulses in the same way radar uses radio waves. LiDAR is a highly accurate and cost-effective means of collecting elevation data. Because of its speed, hand-held units are now being introduced to quickly scan an area, e.g., a crime or accident scene.
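The ranging principle LiDAR shares with radar reduces to simple arithmetic: a pulse travels to the target and back at the speed of light, so the one-way distance is half the round-trip time multiplied by c. A minimal sketch (real LiDAR processing also handles multiple returns, scan geometry, and calibration):

```python
# Time-of-flight ranging, the principle behind LiDAR (and radar):
# distance = speed of light * round-trip time / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_pulse(round_trip_seconds):
    """One-way distance in meters from a pulse's round-trip travel time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A return received 10 microseconds after emission puts the target
# roughly 1.5 km away.
print(f"{range_from_pulse(10e-6):.1f} m")
```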
Applications
Images acquired by satellites have been used to produce local, regional, national, and global composite multispectral mosaics. They have been used in countless applications, including monitoring timber losses in the U.S. Pacific Northwest, establishing urban growth, and measuring forest cover. Remote sensing images have also been used to support military operations, locate mineral deposits, monitor strip mining, and assess natural changes due to fires and insect infestations.
Data collection in general
Thinking about remote sensing in the most general sense, we can easily distinguish types of data collection by platform and by sensor technology. If the remote sensing is based on satellite images or data, in most cases we are likely to have multispectral, hyperspectral, or radar images or data. If it is airplane based, then we are more likely to have aerial photography, multispectral, or LiDAR images or data. If it is ground based, then we are most likely to find photography, multispectral, or LiDAR images and data. These rules of thumb have exceptions, of course, and will change as certain types of sensor technology and remote sensing systems become cheaper. They are simply helpful in seeing the relationship between costs, types of data, and application types. Applications in smaller areas tend to use airplane-based or ground-based sensor technologies; larger areas tend towards satellite-based remote sensing.
Coastal monitoring
An important application area is coastal monitoring. Because of the key role of dynamic processes in coastal erosion, coastal monitoring applications tend to use remote sensing sources that can repeat their observations often. Aerial photography and LiDAR data may be suitable for smaller areas if the area is generally cloud-free; multispectral satellite images and data may be useful for larger areas; and radar may be used for large areas or areas with frequent cloud cover.
Figure 7 Multispectral sensors produce data and imagery to help monitor and model complex
coastal changes (Source: http://earthasart.gsfc.nasa.gov/images/netherla_hires.jpg)
Global change
With an increase in average temperatures world-wide, the study of changes to glaciers and Arctic and Antarctic ice fields has benefited greatly from the use of remote sensing images and data. The frequency of observations helps scientists keep track of changes to ice fields and even icebergs in the water. Detailed observations, combined with measurements on the ground, help
researchers monitor minute changes in ice fields. Made available online to other researchers,
these measurements, images, and data have become a crucial part of a key area of global change
research.
Figure 8 Atmospheric monitoring is a crucial part of global change research, which sensors on
the TIMED spacecraft are especially designed to observe (Source: RST
http://rst.gsfc.nasa.gov/Intro/Part2_1a.html)
Figure 9 A composite of different multispectral data to produce a 'picture-like' image of the
world (Source: http://earthobservatory.nasa.gov/Newsroom/BlueMarble/)
Urban Dynamics
Because of the frequency of observation, satellite-based remote sensing images and data have proven to be very useful in documenting and assessing the growth of large cities around the world and distinguishing changes and processes. Urban dynamics are complex, but individual changes in a single area can be compared to assess the impacts of various policies and urban planning programs. These data, and models developed to understand past growth, can also be used to make predictions of future growth and to assess alternative policy and planning proposals.
Figure 10 Aerial imagery (here from a digitized aerial photograph) can show
a great amount of detail (Source: USGS)
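The kind of comparison used in urban dynamics studies can be illustrated with a minimal change-detection sketch: two classified grids of the same area, from different dates, are compared cell by cell to flag where the land-cover class changed. The grids and class labels below are invented for the example.

```python
# Minimal change detection between two classified images of the same
# area. Each cell holds a land-cover class; the grids are made up.

EPOCH_1990 = [
    ["forest", "forest", "urban"],
    ["forest", "crops",  "urban"],
]
EPOCH_2000 = [
    ["forest", "urban",  "urban"],
    ["crops",  "crops",  "urban"],
]

def changed_cells(before, after):
    """Return (row, col, old_class, new_class) for every cell whose
    classification differs between the two epochs."""
    changes = []
    for r, (row_b, row_a) in enumerate(zip(before, after)):
        for c, (b, a) in enumerate(zip(row_b, row_a)):
            if b != a:
                changes.append((r, c, b, a))
    return changes

for change in changed_cells(EPOCH_1990, EPOCH_2000):
    print(change)  # e.g. (0, 1, 'forest', 'urban')
```

Counting or mapping such changed cells over successive epochs is one simple way to quantify urban expansion.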
Precision Farming
Detailed remote sensing images and data, from a variety of platforms, are used by farmers to reduce the application of fertilizers and pesticides and to apply them more efficiently. Agricultural factors including plant health, plant cover, and soil moisture can be monitored with remote sensing data. By combining remote sensing images and data from different sources, the deficiencies of one remote sensing system can be made up. For instance, Landsat provides multispectral data on average only once every 16 days for any place in the continental US and is impaired by cloud coverage, even partially cloudy weather. By using radar data, scientists have been able to help farmers keep track of changing soil and plant conditions more frequently, which is especially critical during particular phases of plant growth, e.g., pollination.
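One widely used indicator of plant health derived from multispectral data is the Normalized Difference Vegetation Index (NDVI), computed from red and near-infrared reflectance (Landsat 7 bands 3 and 4). Healthy vegetation reflects strongly in the near infrared, so its NDVI approaches 1. The reflectance values below are illustrative only:

```python
# NDVI = (NIR - red) / (NIR + red); values range from -1 to 1.
# Reflectance inputs are on a 0..1 scale and are made up for the example.

def ndvi(red, near_ir):
    """Normalized Difference Vegetation Index from red and near-IR
    reflectance; returns 0.0 when both inputs are zero."""
    if near_ir + red == 0:
        return 0.0
    return (near_ir - red) / (near_ir + red)

print(f"healthy crop:  {ndvi(red=0.06, near_ir=0.50):.2f}")
print(f"stressed crop: {ndvi(red=0.10, near_ir=0.30):.2f}")
print(f"bare soil:     {ndvi(red=0.25, near_ir=0.30):.2f}")
```

Mapping NDVI across a field highlights patches of stressed vegetation where fertilizer, water, or pesticide can be targeted.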
Figure 11 Center pivot irrigation systems create red circles of healthy vegetation in this image
of croplands near Garden City, Kansas (Source:
http://earthasart.gsfc.nasa.gov/images/garden_hires.jpg)
Web Resources
One of the most consumer-friendly remote sensing-based web applications (registration required
for full access)
http://www.keyhole.com/
Information about Landsat 7
http://landsat.gsfc.nasa.gov/
NASA provides many fascinating images at this web site
http://visibleearth.nasa.gov/
Documentation of wetland destruction using animations
http://svs.gsfc.nasa.gov/vis/a000000/a002200/a002210/index.html
This website offers in depth discussion of everything related to remote sensing with an emphasis
on Landsat, but covering other sensor technologies in great detail
http://rst.gsfc.nasa.gov/
For information about SPOT satellites
http://www.spot.com/html/SICORP/_401_.php
Another source for information about Landsat satellites
http://landsat.gsfc.nasa.gov/
An excellent interactive tutorial on various aspects of remote sensing
http://satftp.soest.hawaii.edu/space/hawaii/
A tutorial introduction to LiDAR
http://www.ghcc.msfc.nasa.gov/sparcle/sparcle_tutorial.html
Review Questions
1. What does the term LiDAR stand for?
2. What does the term panchromatic stand for?
3. What often prevents the wider use of remote sensing?
4. What is the oldest commercial satellite system that is still in use?
5. What are some general characteristics of using remote sensing data?
6. How are surveying and remote sensing growing together?
7. What were some of the first applications for radar-based remote sensing?
8. What is the highest-resolution panchromatic remote sensing now available?
9. When was remote sensing first used?
10. How is remote sensing data usually stored?
Chapter Readings
Gibson, Paul J. Introductory Remote Sensing: Principles and Concepts. London and New York: Routledge, 2000.
Sabins, Floyd F. Remote Sensing: Principles and Interpretation. 3rd ed. New York: W. H. Freeman and Co., 1997.
Lillesand, Thomas M., Ralph W. Kiefer, and Jonathan W. Chipman. Remote Sensing and Image Interpretation. 5th ed. New York: Wiley, 2004.
Conway, Eric D. An Introduction to Satellite Image Interpretation. Baltimore: Johns Hopkins University Press, 1997.
Chapter Glossary
Note: These entries are largely based on the glossary from NASA’s Remote Sensing Tutorial
(RST) http://rst.gsfc.nasa.gov/AppD/glossary.html.
Landsat - a series of US satellites that acquire multispectral images.
Oblique photograph - photograph acquired with a camera directed at an angle between horizontal
and vertical orientations.
Orthophotograph - a vertical aerial photograph from which the distortions due to varying
elevation, tilt, and surface topography have been removed.
Radar - acronym for radio detection and ranging.
SPOT - Système Probatoire d'Observation de la Terre, French remote sensing satellite system.
Stereoscope - binocular optical device for viewing overlapping images or diagrams. The left eye
sees only the left image, and the right eye sees only the right image. When configured correctly,
the viewer sees the images in three dimensions.
Supervised classification - information analysis technique in which the operator provides
information that the computer uses to assign pixels to categories.
Unsupervised classification - information analysis technique in which the computer assigns
pixels to categories with no instructions from the operator.