
Remote Sensing

Jane M Read, Chad Chambers, and Marla Torrado, Department of Geography, Maxwell School of Syracuse University, Syracuse,
NY, United States
© 2020 Elsevier Ltd. All rights reserved.
This article is a revision of the previous edition article by J. M. Read, M. Torrado, volume 9, pp 335–346, © 2009 Elsevier Ltd.

Glossary
Band An image layer containing reflectance data recorded for a specified range of wavelengths.
Electromagnetic spectrum The range of wavelengths of electromagnetic radiation.
Georeference Relate locations on an image or map to real-world coordinates.
Hyperspectral Contains many (up to hundreds) narrow wavelength bands; very fine spectral resolution.
LIDAR (light detection and ranging) Sensor emits laser light and records (= light detection) the time it takes for the pulse to
return to the sensor (= ranging). Involves hundreds of thousands of pulses that can be used to detect shapes and structure of
objects and create digital elevation models.
Multispectral Contains more than one layer (band) of reflectance data; usually 3–10 wide bands.
Orthophoto An aerial photograph that has been geometrically corrected to remove distortions across the image. Orthophotos
can be used like a map to make accurate distance measurements.
Pixel (picture element) The smallest unit of a remotely sensed image.
Raster Spatial data structure based on dividing an area into a grid of pixels.
Sun-synchronous orbit Follows the sun and crosses the equator at the same local time each day.

Nomenclature
nm nanometer 10⁻⁹ m
µm micrometer 10⁻⁶ m

History of Remote Sensing

Modern-day remote sensing can be traced back to 1858, when the French photographer and balloonist Gaspard-Félix Tournachon
(nicknamed Nadar) took the first aerial photograph from a balloon over Paris. Fifty years later, Wilbur Wright was the first to
snap an aerial photograph from an airplane. There followed a highly successful era of aerial photography. Since then, remote
sensing evolved from aircraft-based, analog, visible-wavelength photographic systems capturing small areas of earth’s surface
and accessible to a few specialists; to aircraft-, UAV-, and space-based, digital, multi- and hyper-spectral systems covering the entire
earth’s surface. The ubiquity of personally owned computers makes these datasets available to most people who have access to the
internet.
The history of remote sensing follows the now familiar path of development of technology, from military/national security uses
through commercial and recreational applications. During World War I, aerial photography was used for reconnaissance and
surveillance missions. Then, during the 1920s and 1930s, it was adopted primarily, but not exclusively, by government agencies
for mapping applications, such as topographic mapping, soil and geologic mapping, and forest and agricultural inventories.
Personnel were trained, and the science of photogrammetry (making accurate measurements from photographs) rapidly developed.
By 1934, the American Society for Photogrammetry (later renamed the American Society for Photogrammetry and Remote Sensing)
formed, and with World War II came innovative uses of photography employing infrared and microwave radiation. Applications of
aerial photography expanded beyond mapping to the assessment of conditions of different land-cover types, such as crop health,
improved characterization of vegetation, assessment and prediction of crop yields and nutrient uptake for precision farming,
soil conditions, and environmental hazards mapping.
Space-based remote sensing began in 1960, when a group of US government agencies put TIROS-1
(Television and Infrared Observation Satellite) into orbit as the first experimental weather satellite. The only
other space satellite programs at that time were classified military programs, such as the United States’ Corona system, which
was operational until 1972 and was declassified in 1995. In the absence of other data sources, piloted spacecraft provided important
photographs for civilian purposes, although none were suitable for scientific analysis. In the 1960s, the United States’ National
Aeronautics and Space Administration (NASA) launched a remote sensing research program named Earth Resources Technology
Satellites (ERTS), which later became known as "Landsat" (land satellite). Landsat-1 was the first satellite to capture
digital multispectral data of much of the earth’s surface, with predictable and repetitive coverage available for civilian uses. As digital

data became available, digital image processing, quantitative methods, and other algorithms were developed; perhaps the most
useful of these being automated classification algorithms, indices, and statistics. Today, many raster-based GIS analyses incorporate
digital remotely sensed data.
In addition to national and international space satellite programs, such as those of the United States, Canada, India, Brazil,
Japan, Russia, the European Space Agency, and others, many commercial systems are operational, with still more planned and
under development on an ongoing basis for different applications. Hyperspectral systems were first developed in the 1980s, allow-
ing for capture of fine detail about earth’s surface through spectroscopy. In the 1990s, satellites that capture data from the entire
surface of the earth were put into orbit, and in 1999 IKONOS became the first commercial high-resolution satellite sensor to be launched, providing
much finer detail than was previously available. Today, RADAR (radio detection and ranging) and LIDAR (light detection and
ranging) systems are increasingly being used for mapping terrain and other landscape features.
There are many social science applications of earth resources satellites, including urban mapping, public health, rural and urban
planning, precision agriculture, land-use analyses, resource use, transportation, disaster management, demographic and economic
studies, and studies of human–environment interactions. Internet applications, such as virtual globes (e.g., Google Earth and Micro-
soft’s Virtual Earth), data clearinghouses, and other online visualization and mapping tools (e.g., The United States Geological
Survey’s (USGS) National Map; the USGS Global Visualization Viewer, GloVis; and USGS Earth Explorer, EE), provide the lay
user with access to vast amounts of archived and near real-time remotely sensed data.

Relationship of Remote Sensing to Other Geospatial Technologies

While many remote sensing operations involve visual interpretation of imagery, digital image processing is commonly employed
for classification and other analyses. In addition, global positioning system (GPS) technologies are frequently used to assist with
georeferencing remotely sensed imagery, which assigns real-world coordinates to an image. Remotely sensed data are an important
source of environmental data, and are often imported into GIS for analysis and integration with other datasets. Thus, remote sensing
specialists often use GIS in combination with remotely sensed imagery, GPS, and other spatial data in their research.
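
A minimal sketch of what georeferencing accomplishes, assuming a simple north-up image with square pixels (all parameter values below are illustrative, not from a real image):

```python
# Affine georeferencing sketch: map pixel indices to real-world map coordinates.
# A north-up image is fully described by the map coordinates of its upper-left
# corner and its pixel size.

def pixel_to_map(row, col, upper_left_x, upper_left_y, pixel_size):
    """Convert a pixel (row, col) to real-world map coordinates."""
    x = upper_left_x + col * pixel_size  # x increases eastward with column
    y = upper_left_y - row * pixel_size  # y decreases southward with row
    return x, y

# Example: a 30 m grid anchored at an arbitrary, hypothetical UTM coordinate.
print(pixel_to_map(row=100, col=250,
                   upper_left_x=400000.0, upper_left_y=4770000.0,
                   pixel_size=30.0))
# -> (407500.0, 4767000.0)
```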

Electromagnetic Radiation

All objects at temperatures above absolute zero emit radiation, and they also reflect radiation from other sources. The measurement and interpretation of emitted and
reflected electromagnetic radiation from earth’s surface represents the foundation of remote sensing. In remote sensing, radiation
is typically measured and categorized based on wavelength using a logarithmic scale. The range of wavelengths is known as the
electromagnetic spectrum (Fig. 1). Remote sensing uses physical principles of radiation to determine characteristics about an object
emitting or radiating at a specific wavelength. Most remote sensing employs passive systems that capture reflected sunlight or
radiation emitted by earth's surface. When remote sensing scientists talk about visible, infrared (IR), and microwave radiation, we are referring to arbitrarily
designated regions of the electromagnetic spectrum (Fig. 1). In remote sensing we measure reflected solar radiation in the
ultraviolet, visible (i.e., visible to the human eye), and near-IR wavelengths; both reflected solar radiation and emitted radiation in the mid-IR; and
emitted radiation only in the far-IR (in the form of thermal energy) and microwave wavelengths. In contrast, active systems
such as RADAR and LIDAR generate and measure their own returns of electromagnetic radiation.
Energy interacts with the atmosphere and the earth's surface. When electromagnetic radiation travels through the atmosphere, it
is scattered, absorbed, and refracted. Satellites capture light that has traveled twice through the atmosphere (from the sun to earth’s
surface, and back from earth’s surface to the sensor) (Fig. 2). Some atmospheric scattering and absorption of light are wavelength
dependent, and determine how much energy of a specific wavelength is able to pass through the atmosphere and arrive at the
sensor. Shorter wavelengths, for instance, tend to be scattered more by the atmosphere than longer wavelengths, and as a result
are more affected by haze and clouds.
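
For clear-sky (Rayleigh) scattering, scattering strength falls off roughly with the fourth power of wavelength. A quick back-of-the-envelope calculation (band centers are approximate) shows why blue light is affected so much more than red:

```python
# Rayleigh scattering scales roughly with 1 / wavelength**4, so shorter (blue)
# wavelengths are scattered far more strongly than longer (red) ones.
blue, red = 0.45, 0.65  # approximate band-center wavelengths in micrometers
ratio = (red / blue) ** 4
print(f"Blue light is scattered roughly {ratio:.1f}x more than red light")
# -> roughly 4.4x
```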
The degree to which energy is reflected, absorbed, and/or transmitted when it interacts with earth surface features depends not
only on the feature itself but also on the wavelength of the energy. In the case of reflected energy, the amount of reflected light, known as spectral
reflectance, depends on surface roughness as it relates to wavelength and the angle of energy incidence. Each of the earth's surface
features has a distinct spectral reflectance, as features reflect, absorb, or transmit energy differently at different wavelengths. Due to atmospheric and
other (e.g., temporal) effects on spectral reflectances, it is not possible to define a precise spectral reflectance value for any one object
under normal remote sensing conditions. As a result, remote sensing analysts think in terms of spectral response patterns. It is
through these spectral response patterns that different objects and conditions can be distinguished (Fig. 3). Fig. 4 illustrates
how the spectral responses of different features vary with wavelength. As we can see, the deep water of Onondaga Lake
(A) absorbs more incident energy in NIR and longer wavelengths (as evidenced by its darker appearance) than the shallower water
of the Inner Harbor (B). Healthy vegetation, on the other hand (as demonstrated by the spectral response of the park at C), absorbs
blue and red visible light, and reflects strongly in the green and NIR wavelengths. Note also the differences in the responses at
different wavelengths of the mall buildings and their surrounding parking lot at D (Fig. 4).

Figure 1 The electromagnetic spectrum, showing arbitrary divisions used in remote sensing.

Figure 2 Energy pathways and components of a typical remote sensing system.


Figure 3 Typical spectral reflectance curves for vegetation, soil, and water. Permission pending from Lillesand, T., Kiefer, R., and Chipman, J. (2004). Remote Sensing and Image Interpretation, fifth ed. Wiley, New York.

Figure 4 Selected bands of a Landsat Thematic Mapper image of an area of the City of Syracuse, New York (2000), demonstrating the different
spectral responses of land-cover types: A = deep water at the south end of Onondaga Lake; B = shallower water in the Inner Harbor; C = vegetation;
D = mall buildings and surrounding parking lot. Dark shades represent low reflectivity; light shades represent high reflectivity. Panels show Band 1
(blue, 30 m resolution), Band 4 (near infrared, 30 m), Band 6 (thermal, 120 m), and Band 7 (mid-infrared, 30 m).

 90  91  90  86  82  78  78  82
 89 100  94  85  79  78  82  86
 91 102 100  89  80  75  81  84
 97  89  81  79  83  82  77  78
 92  81  78  86  98 101  80  78
 83  77  87  94 114 117  79  76
 80  82  91  84  86 100  99  90
 77  85  81  66  66  88 104 105

Figure 5 (A) An array of brightness values stored as individual pixels (shown above), and (B) a corresponding image shown in grayscale (low to high values range
from dark to light shades).

Characteristics of Remotely Sensed Data and Data Collection


A remote sensing system is composed of three basic components: a light source (usually the sun's energy), a sensor flown on a platform
to capture and measure the reflected or emitted light, and an analog or digital recording medium (Fig. 2). Platforms are usually
aircraft, satellites, or unpiloted aerial vehicles.
Traditional aerial photography is flown in an airplane equipped with a camera and special lens designed to refract and focus the
light onto photographic film. The aircraft’s altitude determines an image’s spatial resolution (see below), and by flying overlapping
flight lines, a mosaic of stereoscopic aerial photographs can be recorded for an area.
In contrast, digital sensors record photons as brightness values (also known as digital numbers or digital counts) for a specific
range of wavelengths. Each image layer is composed of an array of individual pixels which have unique brightness values (Fig. 5).
The flying height of the sensor determines the instantaneous field of view (IFOV), which relates to the area of ground being sensed
(ground resolution); this defines the size of the pixel, or its finest level of detail (Fig. 6). With multispectral data, different regions of
the electromagnetic spectrum are recorded simultaneously, generating a series of separate bands (layers) that collectively make up
an image (Fig. 6).
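
A minimal sketch of this data structure, using random stand-in values rather than real sensor data, shows how a multispectral image can be held as a three-dimensional array of pixel brightness values:

```python
import numpy as np

# A multispectral image as a stack of bands: rows x columns x bands,
# where each band is a 2-D layer of brightness values (digital numbers).
rows, cols, n_bands = 400, 400, 4  # e.g., blue, green, red, near-IR
image = np.random.randint(0, 256, size=(rows, cols, n_bands), dtype=np.uint8)

green = image[:, :, 1]        # one band = one 2-D layer of pixels
print(green.shape)            # (400, 400)
print(green[0, 0])            # the brightness value of a single pixel
```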

Resolution
Resolution refers to the amount of detail, or the ability to distinguish between different objects, in a remotely sensed image. Resolution
is determined by the characteristics of the remote sensing system as a whole; this includes the characteristics of the sensor and its
specific orbit parameters. There are four types of resolution that determine the detail captured: spatial, spectral, radiometric, and
temporal.

Figure 6 Schematic showing the relationship between the Instantaneous Field of View (IFOV) of the sensor and the pixels and different bands of
a multispectral image.

Spatial resolution determines the smallest object that can be detected and is defined by the instantaneous field of view (IFOV), or
ground resolution (Fig. 6). This is most often referred to in terms of pixel size. For instance, the Advanced Very High Resolution
Radiometer (AVHRR) may be considered a coarse-resolution sensor at a pixel size of 1.1 km × 1.1 km; Landsat's Thematic Mapper
(TM), with 30 m × 30 m pixels, medium-resolution; and the IKONOS sensor's 4 m × 4 m pixels or the WorldView2 sensor's
1.85 m × 1.85 m pixels, fine spatial resolution (Table 1). Fig. 7 illustrates how the finer spatial resolution of IKONOS data makes
them more applicable for smaller features; IKONOS clearly delineates differences between the parking garage and university sports
dome building, whereas the Landsat-TM data can only identify the general shape of the sports dome. Resolution is relative, however;
some researchers working with 2 cm × 2 cm pixels might not consider a 4 m × 4 m pixel to be so fine. Social scientists interested in
detecting individual buildings in an urban setting would need to select a sensor with either a high or very high spatial resolution. For
instance, Microsoft's building footprint database published in 2018 was created by analyzing imagery with 1 ft × 1 ft pixels to extract
building shapes for over 125 million buildings covering the entire US. IKONOS is considered high-resolution data, whereas very
high spatial resolution (VHR) sensors include QuickBird, GeoEye-1, and the WorldView (1, 2, 3, 4) series. However, if one is
interested in studying a vast area of urban versus rural land cover, a coarser sensor, such as Landsat-OLI/TIRS, would be more
appropriate.
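
A rough sketch of the relationship between flying height, IFOV, and pixel size; the IFOV value below is an illustrative choice that approximates the 30 m bands of Landsat TM:

```python
# Ground resolution is approximately the sensor's angular IFOV (in radians)
# multiplied by the flying height. Values here are illustrative.
def ground_resolution(ifov_rad, altitude_m):
    return ifov_rad * altitude_m

# An IFOV of ~42.5 microradians viewed from a 705 km orbit yields ~30 m pixels.
print(ground_resolution(ifov_rad=42.5e-6, altitude_m=705_000))  # -> ~29.96 m
```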
Spectral resolution is defined by the number and width of the spectral bands recorded, as well as the sensor's ability (sensitivity) to
record photons across the entire bandwidth. Narrower bands provide finer spectral resolution. IKONOS has 4
bands; Landsat-OLI/TIRS has 11 bands; and WorldView3 has 28 bands; hyperspectral sensors, however, typically have over 200
very narrow bands.
Radiometric resolution refers to the number of potential brightness values that can be stored for a pixel. Finer radiometric reso-
lutions store and display more subtle differences in brightness values than lower resolutions, allowing finer detail of features to be
recorded. Landsat-TM scenes are typically stored as 8-bit data; this means that there are 256 possible brightness values per pixel in
the scene. On the other hand, IKONOS, QuickBird, and WorldView4, for example, have a 16-bit radiometric resolution, which
means that they can store 65536 possible values per pixel.
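
The arithmetic behind these figures is simply powers of two, as this short sketch shows:

```python
# Radiometric resolution: an n-bit pixel can store 2**n distinct brightness values.
for bits in (8, 16):
    print(f"{bits}-bit data: {2 ** bits} possible values per pixel")
# -> 8-bit data: 256 possible values per pixel
# -> 16-bit data: 65536 possible values per pixel
```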
Temporal resolution refers to a sensor's repeat cycle, or how frequently it revisits a location to capture an image. This is important
for applications that seek to uncover change through time. Temporal resolution is especially important in cases of natural disasters
where frequent updates are needed. Each sensor has a different temporal resolution. For example, the Moderate Resolution Imaging
Spectroradiometer (MODIS) sensor images the entire earth every 1 to 2 days, whereas Landsat revisits the same place once
every 16 days (Table 1). Some commercial sensors revisit at a much more rapid pace; WorldView4, for example, can revisit a location
in as little as 1 day (Table 1). Some sensors also have pointing capabilities, which can increase their revisit frequency.
Each remote sensing system has its own unique set of resolutions (spatial, spectral, radiometric, and temporal), and there are technological
trade-offs between the different types of resolution. The amount of detail in an image also depends on the characteristics, configuration,
and context of the objects in the scene being imaged. For instance, two land-cover classes with similar spectral reflectance
patterns may not be distinguishable, regardless of how the remote sensing system is configured. Or mixed pixels (pixels that represent
more than one land-cover type or object) may limit an analysis. Although a higher spatial resolution may result in fewer mixed
pixels, finer resolution may also resolve more features, thereby leading to more mixed pixels. Thus, the level of detail captured in an
image depends on several different factors relating to the remote sensing system and to the scene itself.

Digital Image Processing and Analysis


While visual display of remotely sensed data can provide a powerful tool for many applications by aiding analysis of analog or
digital data, the speed at which operations can be performed confers distinct advantages on digital image processing. Many different
operations can be performed on digital imagery. When an image is processed, its values are transformed by passing each pixel
through a procedure that produces a new image with new pixel values. Some operations are simple; others are more complex.
Digital image processing often involves several steps, including a) preprocessing operations (such as georeferencing, correction
of atmospheric, sun angle, and sensor effects, or conversion from brightness values to radiances); b) feature extraction (methods to
highlight and isolate specific features in an image); c) data fusion (merging different types of data to take advantage of the charac-
teristics of both data types); and d) other digital analyses, such as classification.
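
As an illustration of the conversion step mentioned in (a), the sketch below applies a linear gain/offset rescaling from brightness values to at-sensor radiance; the coefficients are hypothetical stand-ins, since real values come from the sensor's metadata:

```python
import numpy as np

# Convert raw digital numbers (DN) to radiance with a linear rescaling:
# radiance = gain * DN + offset. Gain and offset here are illustrative only.
def dn_to_radiance(dn, gain, offset):
    return gain * dn.astype(np.float64) + offset

dn = np.array([[90, 91], [89, 100]], dtype=np.uint8)   # brightness values (cf. Fig. 5)
radiance = dn_to_radiance(dn, gain=0.01, offset=-0.1)  # hypothetical coefficients
print(radiance)
```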

Image Display

Images are displayed based on their digital numbers; generally, the higher the brightness value, the brighter the display color. In
grayscale, the display ranges from white through several shades of gray to black. A
computer monitor has three color guns (red, green, and blue), each of which traditionally displays up to 256 brightness levels.
Image processing software allows the user to display up to three bands at any one time by assigning each band to a different color
gun (Fig. 8). By selecting different band combinations, it is possible to render an image based on the characteristics of objects being
sensed through their different spectral pattern responses, and emphasize particular land cover/land use features (Fig. 8). Image
enhancement functions may improve visual interpretation by altering the digital display of values to make subtle differences in
brightness values more obvious.
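
A minimal sketch of this band-to-color-gun assignment, assuming a hypothetical 4-band image array ordered blue, green, red, near-IR, and using a simple percentile stretch as the enhancement:

```python
import numpy as np
import matplotlib.pyplot as plt

def stretch(band):
    """Linear 2-98 percentile contrast stretch to the 0-1 display range."""
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo), 0, 1)

# Stand-in data; a real image would be read from a sensor product instead.
image = np.random.randint(0, 2048, size=(200, 200, 4)).astype(float)

# A false-color composite: near-IR on the red gun makes vegetation render red.
false_color = np.dstack([stretch(image[:, :, 3]),   # near-IR -> red gun
                         stretch(image[:, :, 2]),   # red     -> green gun
                         stretch(image[:, :, 1])])  # green   -> blue gun
plt.imshow(false_color)
plt.show()
```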

Table 1 Characteristics of selected remote sensors.

Sensor system | Sensors | No. of bands | Nominal pixel size | Orbit height (km) | Repeat cycle (days) | Nominal swath width (km)

High resolution sensors
WorldView4 (2016-) | Panchromatic; Multispectral | 5 | 0.31 m (pan); 1.24 m (multispectral) | 617 | 1 | 13.1
WorldView3 (2014-) | Panchromatic; Multispectral; SWIR; CAVIS | 28 | 0.31 m (pan); 1.24 m (multispectral); 3.7 m (SWIR); 30 m (CAVIS) | 617 | <1 | 13.1
WorldView2 (2009-) | Panchromatic; Multispectral | 9 | 0.46 m (pan); 1.85 m (multispectral) | 770 | 1.1 | 16.4
GeoEye-1 (2008-) | Panchromatic; Multispectral | 6 | 41 cm (pan); 1.64 m (multispectral) | 681 | <3 | 15.3
WorldView1 (2007-) | Panchromatic | 1 | 50 cm | 496 | 1.7 | 17.7
QuickBird (2001-) | Panchromatic; Multispectral | 1 (pan); 4 (multispectral) | 0.61 m (pan); 2.40 m (multispectral) | 450 | 1–3.5 | 272–435
IKONOS (1999-) | Panchromatic; Multispectral | 1 (pan); 4 (multispectral) | 1 m (pan); 4 m (multispectral) | 681 | <11 | 11
SPOT 1 (1986–90), SPOT 2 (1990-), SPOT 3 (1993-) | 2× High-Resolution Visible (HRV) | 4 | 1 × 10 m (pan); 3 × 20 m (xS) | 822 | 26 | 60–80
SPOT 4 (1998-) | 2× High-Resolution Visible and Infrared (HRVIR) | 4 | 1 × 10 m (pan); 3 × 20 m (xS) | 822 | 26 | 60–80
SPOT 4 (1998-) | Vegetation 1 | 4 | 1000 m | 822 | 1–2 | 2250
SPOT 5 (2002-) | 2× High Resolution Geometric (HRG) | 6 | 2 × 5 m (pan); 3 × 10 m (xS); 1 × 20 m (NIR) | 822 | 26 | 60–80
SPOT 5 (2002-) | 1× High Resolution Stereoscopic (HRS) | 1 | 5–10 m (pan) | 822 | 26 | 60–120
SPOT 5 (2002-) | Vegetation 2 | 4 | 1000 m | 822 | 1–2 | 2250
NASA/JPL (1989-) | Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) | 210 | 4–20 m depending on altitude | – | – | 11
(1989-) | Compact Airborne Spectrographic Imager (CASI) | 288 | varies depending on altitude | – | – | –
NASA/JPL | Thermal Infrared Multispectral Scanner (TIMS) | 6 | 7.6 m | – | – | –
Japanese Earth Resources Satellite (JERS-1) (1992–1998) | Optical Sensor (OPS); Synthetic aperture radar (SAR) | 7 | 18 m | 570 | 44 | 75

Moderate resolution sensors
Landsat 1 (1972–78), Landsat 2 (1975–82) | Return Beam Vidicon (RBV); Multispectral Scanner (MSS) | 3 (RBV); 4 (MSS) | 80 m (RBV); 56 m × 79 m (MSS) | 900 | 18 | 185
Landsat 3 (1978–83) | Return Beam Vidicon (RBV); Multispectral Scanner (MSS) | 1 (RBV); 5 (MSS) | 30 m (RBV); 56 m × 79 m (MSS) | 900 | 18 | 185
Landsat 4 (1982–93), Landsat 5 (1984-) | Multispectral Scanner (MSS); Thematic Mapper (TM) | 4 (MSS); 7 (TM) | 56 m × 79 m (MSS); TM bands 1–5 and 7: 30 m, band 6 (thermal): 120 m | 705 | 16 | 185
Landsat 7 (1999-) | Enhanced Thematic Mapper (ETM+) | 8 | bands 1–5 and 7: 30 m; band 8 (pan): 15 m; band 6 (thermal): 60 m | 705 | 16 | 185
Landsat 8 (2013-) | Operational Land Imager (OLI); Thermal Infrared Sensor (TIRS) | 11 | bands 1–7 and 9: 30 m; band 8 (pan): 15 m; bands 10 and 11 (thermal infrared): 100 m | 705 | 16 | 185 (cross-track) × 180 (along-track)
Indian Remote Sensing (IRS)-1A (1988-) and IRS-1B (1991-) | LISS-I; LISS-II | 4 (LISS-I); 4 (LISS-II) | 72.5 m (LISS-I); 36.25 m (LISS-II) | 904 | 22 | 148 (LISS-I); 146 (LISS-II)
Indian Remote Sensing (IRS)-1C (1995-) and IRS-1D (1997-) | LISS-III; Panchromatic; Wide Field Sensor (WiFS) | 4 (LISS-III); 1 (pan); 1 (WiFS) | 3 × 23 m and 1 × 70 m (LISS-III); 5.8 m (pan); 188 m (WiFS) | 817 | 24 | 142–148 (LISS-III); 70 (pan); 774 (WiFS)
RADARSAT-1 (1995-) | Synthetic aperture radar (SAR) | 4 | 8–100 m | 798 | 24 | 45–510
EO-1 (2000-) | Hyperion | 220 | 30 m | 705 | 16 | 7.5

Coarse resolution sensors
EOS Terra (1999-) and Aqua (2002-) | Moderate Resolution Imaging Spectro-Radiometer (MODIS) | 36 | 250–1000 m | 705 | 2 | 2330
NOAA missions 6–17 (1979–2002) | Advanced Very High Resolution Radiometer (AVHRR) | 5 | 1.1 km | 833–870 | 4–9 | 2400
Sea-viewing Wide Field-of-view Sensor (SeaWiFS) (1997) | Local Area Coverage (LAC); Global Area Coverage (GAC) | 8 | 1.13 km (LAC); GAC subsampled every 4th line and 4th pixel | 705 | varies (LAC); 2 (GAC) | 2800

Pan, panchromatic; xS, multispectral; NOAA, National Oceanographic and Atmospheric Administration; NASA, National Aeronautics and Space Administration; JPL, Jet Propulsion Laboratory.
Sources: Campbell, J. (1996). Introduction to Remote Sensing, second ed. Guilford Press, New York; ERDAS (1999). ERDAS Field Guide, fifth ed. ERDAS Inc., Georgia; DigitalGlobe, About DigitalGlobe: World-class technology, retrieved March 2019 from https://www.digitalglobe.com/company/about-us; GeoImage website, retrieved October 2006 from http://www.geoimage.com.au/; Geospatial Data and Information System, SPOT satellite imagery, retrieved October 2006 from http://www.geovar.com/spot.htm; Lillesand, T., Kiefer, R., Chipman, J. (2004). Remote Sensing and Image Interpretation, fifth ed. Wiley, New York; Lindborg, C. (updated April 20, 2000). IRS (Indian Remote Sensing), retrieved October 2006 from http://www.fas.org/spp/guide/india/earth/irs.htm; MODIS Web, About MODIS components (curator: Brandon Maccherone), retrieved October 2006 from http://ladsweb.nascom.nasa.gov/; NASA Earth Observing-1 homepage, retrieved March 2007 from http://eo1.gsfc.nasa.gov; NOAA Coastal Services Center (updated June 30, 2006). About LIDAR data, retrieved March 2007 from http://www.csc.noaa.gov/products/sccoasts/html/tutlid.htm; NOAA Satellite and Information Service (updated March 27, 2007). Advanced Very High Resolution Radiometer (AVHRR), retrieved October 2006 from http://noaasis.noaa.gov/NOAASIS/ml/avhrr.html; SeaWiFS Project homepage, An overview of SeaWiFS and the SeaStar spacecraft: spacecraft description, retrieved October 2006 from http://oceancolor.gsfc.nasa.gov/SeaWiFS/SEASTAR/SPACECRAFT.html; SPOT Image webpage, retrieved March 2007 from http://www.spotimage.fr; USGS, Landsat missions: Landsat 8, retrieved March 2019 from https://www.usgs.gov/land-resources/nli/landsat/landsat-8?qt-science_support_page_related_con=0#qt-science_support_page_related_con; USGS (updated April 6, 2005). Thermal Infrared Multispectral Scanner (TIMS) sensor/instrument, retrieved March 2007 from http://eosims.cr.usgs.gov:5725/sensor_documents/tims_sensor.html#1.

Figure 7 (A) 1 m pan-sharpened IKONOS data of the City of Syracuse, New York, US (2001); (B) 30 m Landsat Thematic Mapper data of the same
area (2000): A = sports dome, B = parking garage, C = Syracuse University buildings.

Figure 8 IKONOS data of the City of Syracuse, New York, US (2001): (A) bands 3, 2, and 1 displayed on the red (R), green (G), and blue (B) color
guns, respectively; (B) bands 4, 3, and 2 displayed on RGB, respectively; (C) bands 1, 4, and 2 displayed as RGB, respectively, where band 1 = blue,
band 2 = green, band 3 = red, band 4 = near-IR.

Visual Image Interpretation

Visual image interpretation is extremely important in remote sensing, and in some cases may represent the only use of the data.
Visual interpretation may be incorporated into the beginning or intermediate stages of a more complex set of analyses. Each image
(whether aerial photograph or other digital sensor image) has its own particular pattern of brightness values, tone (lightness/dark-
ness), texture, shadows, spatial feature patterns, and contextual characteristics. By considering all of these attributes, an interpreter
can identify, classify, quantify (make measurements), or assess conditions of objects or regions.

Classification

Remotely sensed data are often used in classification analyses, whereby individual pixel values are classified into meaningful cate-
gories. There are a variety of ways to classify a digital image based on its spectral, spatial (texture, proximity, etc.), or temporal
(changes through time) information. Supervised and unsupervised classifications are the two most commonly used automated classification
methods, and they are inherently different. Supervised classifications leverage the operator's a priori knowledge
of the study area to drive the classification process; this assumes that the remote sensing specialist has complete knowledge of locations
on the ground and can properly assign pixel values to a land-cover class. In contrast, unsupervised classifications use the statistical
distribution of pixel values to assign pixels to statistical classes; these classes are then interpreted by the remote
sensing analyst. A hybrid classification combines parts of unsupervised and supervised methods. Classification results in a new
image that contains unique land-cover categories. Accuracy is assessed by comparing the resulting classification to reference
data; a classification error matrix (Fig. 9) is commonly reported, sometimes with Kappa statistics, which assess the result against
the possibility of it being generated randomly.
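
A minimal sketch of the unsupervised approach, clustering pixels by their spectral values with k-means (random stand-in data; the band count and number of classes are arbitrary):

```python
import numpy as np
from sklearn.cluster import KMeans

rows, cols, bands = 100, 100, 4
image = np.random.rand(rows, cols, bands)     # stand-in multispectral image

# Treat each pixel as a point in spectral space: one row per pixel.
pixels = image.reshape(-1, bands)
classes = KMeans(n_clusters=5, n_init=10).fit_predict(pixels)

# The result is a new image of statistical class labels, which the analyst
# must then interpret and relabel as meaningful land-cover categories.
classified = classes.reshape(rows, cols)
print(np.unique(classified))                  # -> [0 1 2 3 4]
```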

Classification accuracy assessment report

                       Reference data
Classified data      Vegetation  Impervious  Water   Row total   User's accuracy
Vegetation              112          24         0       136      112/136 = 82.35%
Impervious               24         100        40       164      100/164 = 60.97%
Water                     0           0        40        40       40/40 = 100%
Column total            136         124        80       340
Producer's accuracy  112/136 =   100/124 =  40/80 =
                       82.35%      80.65%    50.00%

Overall accuracy = 252/340 = 74.12%; Kappa = 0.7359

Figure 9 Typical accuracy assessment error matrix.
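
The accuracy measures reported in such a matrix follow directly from its diagonal, row sums, and column sums; the sketch below computes them for the matrix shown in Fig. 9:

```python
import numpy as np

# Error matrix: rows = classified data, columns = reference data (Fig. 9).
matrix = np.array([[112,  24,  0],    # vegetation
                   [ 24, 100, 40],    # impervious
                   [  0,   0, 40]])   # water

overall = matrix.trace() / matrix.sum()             # correct pixels / all pixels
users = matrix.diagonal() / matrix.sum(axis=1)      # per class, by row
producers = matrix.diagonal() / matrix.sum(axis=0)  # per class, by column

print(f"Overall accuracy: {overall:.2%}")                        # -> 74.12%
print("User's accuracy (%):", np.round(users * 100, 2))          # -> [82.35 60.98 100.]
print("Producer's accuracy (%):", np.round(producers * 100, 2))  # -> [82.35 80.65 50.]
```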



In the social sciences, classifications are important in characterizing and mapping environments such as urban, suburban, and
agricultural land uses. Other forms of analyses social scientists might apply to their studies include various automated change detec-
tion methods that compare raw, transformed, or classified imagery from one time to another. These can be useful for studying and
evaluating historical to present-day changes; rapid or near-real-time change, such as emergency response; and real-time disaster
monitoring and management. Other useful functions include the calculation of indices based on pixel values from one or more bands.
For example, the Normalized Difference Vegetation Index (NDVI) is commonly used to identify healthy vegetation using red and
NIR band values. NDVI has many potential uses; for example, it can be useful in characterizing changing crop health, which could,
in turn, assist in predicting famine or human migration patterns in a particular place.
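
A minimal sketch of the NDVI calculation itself; the reflectance values below are illustrative stand-ins:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red), computed per pixel. Healthy vegetation
# reflects strongly in the near-IR and absorbs red light, so it scores high.
red = np.array([[0.08, 0.30], [0.10, 0.25]])   # illustrative red reflectances
nir = np.array([[0.50, 0.35], [0.45, 0.28]])   # illustrative near-IR reflectances

ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))
# Values near +1 suggest dense, healthy vegetation; bare soil and built
# surfaces fall nearer 0, and water is typically negative.
```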

Remote Sensing Data Types


Aerial Photographs
Aerial photography, generally flown from an airplane, is still widely used in the creation of topographic maps worldwide; it is
also a relatively inexpensive and accessible data source. Photography can provide black-and-white, color, or color-IR data in either
film or digital form. All photographs are captured with some inherent geometric distortion; however, those distortions can be cor-
rected to produce an orthophoto. The USGS makes available digital orthophoto quadrangles that correspond to USGS 7.5-min
topographic sheets, and is a good repository for downloading US archival photography (and other remotely sensed datasets).
Because aerial photography predates satellite imagery, it represents a valuable source of historic landscape data. Land-use studies
often rely on panchromatic aerial photographs for compiling land-use histories. Another advantage to aerial photography is that
oftentimes it is flown at relatively low altitudes, thereby capturing fine detail, such as buildings, stands of trees, roads, bodies of
water, etc., which renders it highly useful for visual interpretation.

Multispectral Earth Observation Sensors


Most multispectral satellite observation sensors orbit the earth at altitudes of approximately 400–900 km (refer to Table 1 for details of
select systems). Geostationary satellites, which carry communications and meteorological sensors, remain in a fixed position above
the earth at much higher altitudes, whereas sun-synchronous satellites are placed in orbits that follow the sun. Satellite sensors
confer the advantages of sensing large areas in a single pass, repetitive and predictable passes, and coverage of all or most of the
earth.
The US government’s Landsat program (Table 1) far surpassed initial expectations. When the first Landsat satellite was launched
in 1972 it was not known whether the system would work. Landsats 1–3 carried the Return Beam Vidicon (RBV) and Multispectral
Scanner (MSS); the MSS, which had been put on board as an afterthought, was the more successful sensor of the two. Landsats 4 and
5 carried MSS and the Thematic Mapper (TM) sensor; Landsat 6 carried the Enhanced Thematic Mapper (ETM) but failed on launch.
Although Landsat 7 carries the ETM+, its successor, Landsat 8, is equipped with two sensors: an Operational Land Imager (OLI) and
a Thermal Infrared Sensor (TIRS). By 2007, data from Landsat 5 were only being collected over North America, and Landsat 7 was
functioning with a scan line corrector error, which resulted in missing data. The USGS has recognized the importance of continuing
the Landsat program, which has provided a continuous stream of land data for nearly 50 years; more than 6 million scenes have been
archived to date. Because Landsats 5 and 7 were not designed to last past 2010, Landsat 8 was launched in 2013, and is still presently
orbiting earth. Landsat 9 is anticipated to launch by late fall 2020. Just like the Landsat 8 mission, Landsat 9 will be equipped with both
OLI and TIRS sensors, though they will be named OLI-2 and TIRS-2.
Another important sensor is the National Oceanographic and Atmospheric Administration's (NOAA) Advanced Very High
Resolution Radiometer (AVHRR), which was launched in 1978 on a polar orbiting satellite (Table 1). Although this sensor
was designed to collect meteorological data, it has also been used successfully for sensing land features. Indeed, the widely used
NDVI was originally developed for AVHRR to provide global coverage of earth's vegetation. Medium-resolution multispectral
systems include the French earth observation system, SPOT, first launched in 1986, and the Indian Remote Sensing (IRS)
instrument (Table 1).
In 1999 the first commercial high-resolution sensor, IKONOS (with four 4 m multispectral bands and a 1 m panchromatic
band), was launched, and 2 years later QuickBird followed suit. Many other high-resolution sensors have since been operational-
ized, with still more planned and under development on an ongoing basis for different applications. Through companies like Planet
Labs (San Francisco, CA) and their constellation of Dove satellites, it is now possible to capture imagery of the whole surface of the
Earth every day.

Other Sensors
Other classes of sensor include hyperspectral sensors, which collect almost a continuum of very narrow bands of the electromagnetic
spectrum to provide very detailed spectral information about features. A typical hyperspectral sensor may have 200+ bands
(Table 1). Hyperspectral data are especially useful in studying atmospheric composition, as well as detailed agricultural applications
such as determining leaf composition, soil moisture content, and salinity.
RADAR data are acquired by the sensor sending out known wavelengths of energy and recording the time it takes for the
energy to return. RADAR has the advantage that it can penetrate ("see") through clouds. RADAR data are good for mapping
topography, drainage, and other clearly defined landforms. In the social sciences the most common uses of this type of data
are elevation mapping and analyses of land-cover change. LIDAR is another active sensor that works similarly to
RADAR, except that it uses beams of light. Very detailed profiles of three-dimensional features can be built up using LIDAR
data. Some of its uses include mapping elevation/topography, urban mapping, highway mapping applications, assessment
of forests and protected areas, monitoring of contaminants such as air pollution, and land-cover changes.
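
The ranging principle shared by RADAR and LIDAR is a simple time-of-flight calculation, sketched below:

```python
# The sensor emits a pulse and times its return; the distance to the target is
# (speed of light * travel time) / 2, halved because the pulse travels out and back.
C = 299_792_458.0  # speed of light in m/s

def range_from_echo(travel_time_s):
    return C * travel_time_s / 2.0

# A pulse returning after ~6.67 microseconds corresponds to a range of ~1 km.
print(f"{range_from_echo(6.67e-6):.0f} m")  # -> ~1000 m
```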
Finally, thermal sensors, sensitive in the mid- and far-IR and microwave regions of the spectrum, sense surface temperatures and
thermal properties of features. They can be useful for identifying and mapping objects, and generally serve better as a qualitative
interpretation tool than as a source of precise temperature measurements. In the social sciences, thermal remotely sensed data have been
used to study urban heat island effects, as well as monitoring of forest fires.

Unpiloted Aerial Vehicles
Unpiloted Aerial Vehicles (UAVs, also known as unmanned aerial vehicles and drones) have recently gained traction in remote
sensing. UAVs can capture several different types of data depending on the sensors they carry, including aerial imagery, point cloud
data, and videography. Point cloud data can be processed to extract 3D models of the earth's surface, such as digital elevation models (DEMs).
Unlike most satellite sensors, UAVs can be flown at varying heights and times and in difficult-to-access places, allowing finely detailed
data to be captured with precise locational and temporal characteristics at relatively low cost. This makes them ideal for smaller
scale urban and environmental applications, some of which include forestry studies, resource management, archeology, land
surveys, geomorphology, disaster response and management, and public safety.
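
A rough sketch of how a point cloud can be reduced to a simple DEM, binning stand-in points into raster cells and keeping the lowest return per cell as a crude proxy for bare ground (the cell size and statistic are arbitrary choices):

```python
import numpy as np

# Random stand-in point cloud: x, y in a 100 m x 100 m area, z up to 20 m.
points = np.random.rand(10_000, 3) * [100.0, 100.0, 20.0]
cell = 5.0                                    # DEM cell size in meters

cols = (points[:, 0] // cell).astype(int)     # raster column for each point
rows = (points[:, 1] // cell).astype(int)     # raster row for each point
dem = np.full((20, 20), np.nan)               # 20 x 20 grid of 5 m cells

for r, c, z in zip(rows, cols, points[:, 2]):
    if np.isnan(dem[r, c]) or z < dem[r, c]:
        dem[r, c] = z                         # keep the lowest return per cell

print(dem.shape, np.nanmin(dem), np.nanmax(dem))
```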

Conclusion

Remote sensing has developed into a technology that is now available to a wide audience, providing valuable spatial environ-
mental information to support a wide range of social science applications. Limitations in using remotely sensed data in the
social sciences continue to derive from the nature of remotely sensed data themselves. The increase in the number, capabilities,
and coverage of commercial satellite sensors is reducing some of the logistical issues of data availability, accessibility, cost,
and ease of use, yet these issues still exist for many users worldwide. The recent increase in popularity of UAVs for commercial,
scientific, social, humanitarian, and recreational image capture, together with improvements in software for analyzing the data,
is rapidly removing many of those limitations. While remotely sensed data cannot provide direct information about social
conditions of individuals, households, communities, or global systems, they can provide information about the nature and
spatial configuration of the physical environment in which those entities operate. Remote sensing can also be useful in
providing indicators of, and predictions for, social processes that manifest themselves in the physical environment, such as
urban revitalization projects, or processes that develop as a result of changes in the physical environment, such as famine
and out-migration as a result of drought. We are rapidly approaching the time when the entire surface of the earth will be
imaged continuously and at fine detail. As more remote sensing systems are developed, and data availability and access to
user-friendly software continue to improve, applications of remotely sensed data in the social sciences are likely to continue
to expand.

See Also: Geographic Information Systems and Cartography; Geographic Information Science and Systems; Geovisualization; Global Positioning/GPS;
Qualitative Geographic Information Systems.

Further Reading

Campbell, J., 2011. Introduction to Remote Sensing, fifth ed. Guilford Press, New York.
Congalton, R.G., Green, K., 2019. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices, third ed. CRC Press, Boca Raton, FL.
Crawford, C., Loveland, T., Masek, J., Wulder, M., 2018. An Overview of USGS-NASA Landsat Science Team Activities during 2018. NASA, The Earth Observer.
Fox, J., Rindfuss, R., Walsh, S., Mishra, V., 2003. People and the Environment: Approaches for Linking Household and Community Surveys to Remote Sensing and GIS. Kluwer
Academic Publishers, Norwell.
Keranen, K., Kolvoord, R., 2016. Making Spatial Decisions Using GIS and Lidar: A Workbook. ESRI Press.
Lillesand, T., Kiefer, R., Chipman, J., 2015. Remote Sensing and Image Interpretation, seventh ed. Wiley, New York.
Liverman, D., Moran, E.F., Rindfuss, R.R., Stern, P.C. (Eds.), 1998. People and Pixels: Linking Remote Sensing and Social Science. National Academy Press, Washington, D.C.
McCoy, R.M., 2004. Field Methods in Remote Sensing. Guilford Press, New York.
Wallace, T., Watkins, D., Schwartz, J., 2018. A Map of Every Building in America. The New York Times, October 12, 2018.
Williams, D.L., Goward, S., Arvidson, T., 2006. Landsat: yesterday, today, and tomorrow. Photogramm. Eng. Remote Sens. 72, 1171–1178.
Relevant journals: IEEE Transactions on Geoscience and Remote Sensing; International Journal of Remote Sensing; Photogrammetric Engineering and Remote Sensing; Remote Sensing of Environment.

Relevant Websites

National Aeronautics and Space Administration (NASA) Global Change Master Directory. http://gcmd.gsfc.nasa.gov/: information on satellite and earth science data.
European Space Agency (ESA). http://www.esa.int/esaCP/index.html: information on European remote sensing programs and activities.
United States Geological Survey (USGS) Earth Resources Observation and Science (EROS) data center. https://www.usgs.gov/centers/eros/: information and access to aerial and
satellite data of the United States. Go here for links to the USGS Global Visualization Viewer (GloVis) and Earth Explorer to search EROS data center archives.
USGS Landsat. https://www.usgs.gov/land-resources/nli/landsat/: comprehensive information on the US Landsat program.
USGS Earth Explorer. https://earthexplorer.usgs.gov/: open access to remotely sensed data.
American Society for Photogrammetry and Remote Sensing (ASPRS). www.asprs.org/.
International Society for Photogrammetry and Remote Sensing (ISPRS). http://www.isprs.org/.
Eyes Aloft. http://www.womp-int.com/story/2013vol07/story025.htm: more information related to UAVs and drone data.
D. Satellites. https://petapixel.com/2014/11/23/planet-labs-working-photograph-whole-earth-every-day-army-tiny-satellites/: more information related to the future directions of
small satellites.
