ACET 450 REMOTE SENSING
Remote Sensing Platforms & Sensors
1. What is a Sensor / What is a Platform?
In order for a sensor to collect and record energy reflected or emitted from a target or
surface, it must be placed on a stable platform. Platforms for remote sensors may be
situated on the ground, on an aircraft or balloon (or some other platform within the
Earth's atmosphere), or on a spacecraft or satellite outside of the Earth's atmosphere.
A satellite platform is put into orbit around the Earth and carries a
sensor for data collection and transmission.
Remote sensing instruments (i.e. sensors) can be placed on a variety of platforms to
view and image targets. Ground-based and aircraft platforms may be used, as well as
satellites. Satellites have several unique characteristics which make them particularly
useful for remote sensing of the Earth's surface.
2. Satellite Orbit
An orbit is the path a satellite follows around the Earth. There are different categories of orbits, as follows:
• Geostationary Orbits
A geostationary (GEO) orbit is a geosynchronous orbit in which the satellite is always
in the same position with respect to the rotating Earth, and therefore views the same
portion of the Earth's surface at all times. For this to happen, the satellite is at an
altitude of approximately 35,786 km, because that altitude produces an orbital period
(the time for one orbit) equal to the period of rotation of the Earth (23 hrs, 56 mins,
4.09 secs). By orbiting at the same rate and in the same direction as the Earth, the
satellite appears stationary (synchronous with respect to the rotation of the Earth).
Geostationary satellites provide a "big picture" view, enabling coverage of weather
events. This is especially useful for monitoring severe local storms and tropical cyclones.
Because a geostationary orbit must lie in the same plane as the Earth's rotation (the
equatorial plane), it provides distorted images of the polar regions with poor spatial
resolution.
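
As a check on this altitude, the required orbit can be derived from Kepler's third law, T = 2π√(a³/μ). The short Python sketch below performs the calculation; the gravitational parameter and Earth radius are standard reference constants, not values from these notes.

    import math

    # Standard reference constants (not from these notes): Earth's
    # gravitational parameter and mean equatorial radius.
    MU_EARTH = 3.986004418e14   # m^3 / s^2
    R_EARTH = 6378.137e3        # m

    # Earth's rotation period (sidereal day): 23 hrs, 56 mins, 4.09 secs.
    T = 23 * 3600 + 56 * 60 + 4.09   # s

    # Kepler's third law: T = 2*pi*sqrt(a^3 / mu), solved for the
    # semi-major axis a, then converted to altitude above the surface.
    a = (MU_EARTH * T ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
    altitude_km = (a - R_EARTH) / 1000
    print(f"Geostationary altitude: {altitude_km:.0f} km")   # ~35786 km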
• Near-polar Orbits
A polar orbit has an inclination of about 90 degrees to the equator, i.e. it follows a
basically north-south path which, in conjunction with the Earth's west-to-east rotation,
allows the satellite to cover most of the Earth's surface over a certain period of time.
These are near-polar orbits, so named for the inclination of the orbit relative to a line
running between the North and South poles. On every pass around the Earth, the
satellite passes over (or close to) both the north and south poles. Therefore, as the
Earth rotates to the east underneath the satellite, which is travelling north and south,
the sensor can cover the entire Earth's surface.
• Sun-synchronous Orbit
This is like a near-polar orbit, but in a sun-synchronous orbit the satellite passes over
the same part of the Earth at the same local time each day. This can make
communication and various forms of data collection very convenient, e.g. measuring
the air quality of a city at the same time every day.
Getting data at the same time each day ensures consistent illumination conditions when
acquiring images in a specific season over successive years, or over a particular area
over a series of days. This is very important for monitoring changes between images,
as they do not have to be corrected for different illumination conditions.
3. Swath
As a satellite revolves around the Earth, the sensor "sees" a certain portion of the
Earth's surface. The area on the Earth's surface viewed by the sensor is called the
swath. Imaging swaths for spaceborne sensors generally vary between tens and
hundreds of kilometres wide. As the satellite orbits the Earth from pole to pole, its
east-west position would not change if the Earth did not rotate. However, because the
Earth rotates from west to east, the satellite swath covers a new area with each
consecutive pass.
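
To see how large that per-pass shift is, the sketch below estimates how far the equator moves under the satellite during one orbit; the 100-minute orbital period is an assumed, typical low-Earth-orbit value, not a figure from these notes.

    # Estimate how far the Earth's surface at the equator moves eastward
    # under the satellite during one orbit, i.e. how far west the swath
    # shifts between consecutive passes.
    EQUATOR_CIRCUMFERENCE_KM = 40_075   # Earth's equatorial circumference
    SIDEREAL_DAY_S = 86_164             # Earth's rotation period in seconds
    orbital_period_s = 100 * 60         # assumed ~100-minute orbit

    shift_km = EQUATOR_CIRCUMFERENCE_KM * orbital_period_s / SIDEREAL_DAY_S
    print(f"Swath shifts ~{shift_km:.0f} km west per orbit at the equator")
    # ~2790 km for this assumed orbit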
4. Spatial Resolution
Generally, the further away the sensor is from the target being imaged (i.e. the greater
the altitude of the sensor), the larger the area it covers and the less detailed the
information it can obtain about the target.
Spatial resolution is the smallest object/feature that a sensor can detect. If we say a
sensor has a spatial resolution of 30 m, it means that each pixel (picture element)
represents an area on the ground of 30 m by 30 m.
Pixels are the smallest units of an image; they are normally square and each
represents a certain area of the image.
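
As a quick worked example, the sketch below relates pixel size to ground coverage; the 30 m pixel size comes from the text above, while the image dimensions are assumed purely for illustration.

    # Relate spatial resolution (pixel size) to the ground area covered.
    pixel_size_m = 30                    # 30 m spatial resolution (from the text)
    width_px, height_px = 1000, 1000     # assumed image dimensions

    ground_width_km = width_px * pixel_size_m / 1000
    ground_height_km = height_px * pixel_size_m / 1000
    print(f"One pixel covers {pixel_size_m} m x {pixel_size_m} m on the ground")
    print(f"The full image covers {ground_width_km:.0f} km x {ground_height_km:.0f} km")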
5. Spectral Resolution
Spectral resolution describes the ability of a sensor to define fine wavelength intervals.
The finer the spectral resolution, the narrower the wavelength range for a particular
channel or band. This is very important because the higher the spectral resolution of a
sensor, the more distinctions that can be made between surface materials and objects.
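
The sketch below illustrates this with two hypothetical sensors covering the same spectral region; all band edges are invented for illustration and do not describe any real instrument.

    # Compare the band widths of two hypothetical sensors over the same
    # 0.60-0.70 um region. All band edges are invented for illustration.
    broadband = {"red": (0.60, 0.70)}                          # one wide band
    narrowband = {f"band_{i + 1}": (0.60 + 0.02 * i, 0.62 + 0.02 * i)
                  for i in range(5)}                           # five narrow bands

    for name, bands in (("broadband", broadband), ("narrowband", narrowband)):
        width = max(hi - lo for lo, hi in bands.values())
        print(f"{name}: {len(bands)} band(s), each ~{width:.2f} um wide")
    # The narrowband sensor has the finer spectral resolution: its
    # narrower bands allow finer distinctions between surface materials.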
6. Temporal Resolution
Temporal resolution can be defined as the time taken for a sensor to complete one orbit
cycle. In other words, it is the time taken for a sensor to repeat coverage of the same
area.
7. Radiometric Resolution
The radiometric resolution of a sensor describes its ability to discriminate very slight
differences in energy (number of bits for each pixel). The finer the radiometric resolution
of a sensor, the more sensitive it is to detecting small differences in energy.
The maximum number of brightness levels available depends on the number of bits
used in representing the energy recorded. Imagery data are represented by positive
digital numbers that range from 0 up to one less than a selected power of 2 (i.e. 0 to
2ⁿ − 1 for n-bit data).
For example:
• 2-bit is 2² = 4
Therefore, a range of 4 Digital Numbers (DNs) is used, i.e.
0 (black)
1 (tone of grey)
2 (tone of grey)
3 (white)
• 4-bit is 2⁴ = 16
Similarly, a range of 16 DNs is used, from 0 to 15. 0 is black, 15 is white and the rest
are tones of grey.
• 8-bit is 2⁸ = 256
Similarly, a range of 256 DNs is used, from 0 to 255. 0 is black, 255 is white and the
rest are tones of grey.
Image data are generally displayed in a range of grey tones, with black representing a
digital number of 0 and white representing the maximum value (for example, 255 in 8-bit
data). By comparing a 2-bit image with an 8-bit image below, we can see that there
is a large difference in the level of detail discernible depending on their radiometric
resolutions.
[Figure: comparison of a 2-bit image and an 8-bit image]
The human eye can differentiate between only twenty to thirty tones of grey, while some
sensors can detect many more (e.g. 8-bit = 2⁸ = 256).
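
To tie the bit-depth arithmetic together, the sketch below prints the DN range at each radiometric resolution and requantises a few 8-bit values down to 2 bits; the bit-shift approach is an illustrative choice, not a method from these notes.

    # DN range available at each radiometric resolution.
    for bits in (2, 4, 8):
        levels = 2 ** bits
        print(f"{bits}-bit: 2^{bits} = {levels} DNs, "
              f"from 0 (black) to {levels - 1} (white)")

    # Requantise an 8-bit DN (0-255) to 2 bits (0-3) by dropping the six
    # least significant bits, as when displaying 8-bit data at 2-bit depth.
    def requantise_8_to_2(dn: int) -> int:
        return dn >> 6

    for dn in (0, 64, 128, 255):
        print(f"8-bit DN {dn:3d} -> 2-bit DN {requantise_8_to_2(dn)}")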