Remote Sensing of the Environment
An Earth Resource Perspective
John R. Jensen
Second Edition
ISBN 978-1-29202-170-6
Pearson Education Limited
Edinburgh Gate
Harlow
Essex CM20 2JE
England and Associated Companies throughout the world
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted
in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without either the
prior written permission of the publisher or a licence permitting restricted copying in the United Kingdom
issued by the Copyright Licensing Agency Ltd, Saffron House, 6–10 Kirby Street, London EC1N 8TS.
All trademarks used herein are the property of their respective owners. The use of any trademark
in this text does not vest in the author or publisher any trademark ownership rights in such
trademarks, nor does the use of such trademarks imply any affiliation with or endorsement of this
book by such owners.
Table of Contents
14. In Situ Spectral Reflectance Measurement (John R. Jensen) 591
Appendix—Sources of Remote Sensing Information (John R. Jensen) 601
Index 605
Remote Sensing of the Environment

Scientists observe nature, make measurements, and then attempt to accept or reject hypotheses concerning these phenomena. The data collection may take place directly in the field (referred to as in situ or in-place data collection) or at some remote distance from the subject matter (referred to as remote sensing of the environment).

One form of in situ data collection involves the scientist going out in the field and questioning the phenomenon of interest. For example, a census enumerator may go door to door, asking people questions about their age, sex, education, income, etc. These data are recorded and used to document the demographic characteristics of the population.

From Chapter 1 of Remote Sensing of the Environment: An Earth Resource Perspective, Second Edition. John R. Jensen. Copyright © 2007 by Pearson Education, Inc. Published by Pearson Prentice Hall. All rights reserved.
Figure 1 In situ (in-place) data are collected in the field. a) A scientist is collecting leaf-area-index (LAI) measurements of soybeans (Glycine max L. Merrill) using a ceptometer that measures the number of “sunflecks” that pass through the vegetation canopy. The measurements are made just above the canopy and on the ground below the canopy. The in situ LAI measurements may be used to calibrate LAI estimates derived from remote sensor data. b) Spectral reflectance measurements from vegetation are being collected using a spectroradiometer located approximately 1 m above the canopy. The in situ spectral reflectance measurements may be used to calibrate the spectral reflectance measurements obtained from a remote sensing system.
Data collection by scientists in the field or by instruments placed in the field provides much of the data for physical, biological, and social science research. However, it is important to remember that no matter how careful the scientist is, error may be introduced during the in situ data-collection process.

First, the scientist in the field can be intrusive. This means that unless great care is exercised, the scientist can actually change the characteristics of the phenomenon being measured during the data-collection process. For example, a scientist could lean out of a boat to obtain a surface-water sample from a lake. Unfortunately, the movement of the boat into the area may have stirred up the water column in the vicinity of the water sample, resulting in an unrepresentative, or biased, sample. Similarly, a scientist collecting a spectral reflectance reading could inadvertently step on the sample site, disturbing the vegetation canopy prior to data collection.

Scientists may also collect data in the field using biased procedures. This introduces method-produced error. It could involve the use of a biased sampling design or the systematic, improper use of a piece of equipment. Finally, the in situ data-collection measurement device may be calibrated incorrectly. This can result in serious measurement error.

Intrusive in situ data collection, coupled with human method-produced error and measurement-device miscalibration, all contribute to in situ data-collection error. Therefore, it is a misnomer to refer to in situ data as ground truth data. Instead, we should simply refer to it as in situ ground reference data, acknowledging that it contains error.

Remote Sensing Data Collection

Fortunately, it is also possible to collect information about an object or geographic area from a distant vantage point using remote sensing instruments (Figure 2). Remote sensing data collection was originally performed using cameras mounted in suborbital aircraft. Photogrammetry was defined in the early editions of the Manual of Photogrammetry as:

…

Figure 2 A remote sensing instrument collects information about an object or phenomenon within the instantaneous-field-of-view (IFOV) of the sensor system without being in direct physical contact with it. The remote sensing instrument may be located just a few meters above the ground and/or onboard an aircraft or satellite platform. (Diagram labels: object, area, or materials within the ground-projected IFOV; D, diameter of the ground-projected IFOV.)

But where did the term remote sensing come from? The actual coining of the term goes back to an unpublished paper in the early 1960s by the staff of the Office of Naval Research Geography Branch (Pruitt, 1979; Fussell et al., 1986). Evelyn L. Pruitt was the author of the paper. She was assisted by staff member Walter H. Bailey. Aerial photo interpretation had become very important in World War II. The space age was just getting under way with the 1957 launch of Sputnik (U.S.S.R.), the 1958 launch of Explorer 1 (U.S.), and the collection of photography from the then secret CORONA program initiated in 1960 (Table 1). In addition, the Geography Branch of ONR was expanding its research using instruments other than cameras (e.g., scanners, radiometers) and into regions of the electromagnetic spectrum beyond the visible and near-infrared regions (e.g., thermal infrared, microwave). Thus, in the late 1950s it had become apparent that the prefix “photo” was being stretched too far in view of the fact that the root word, photography,
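The ground-projected IFOV sketched in Figure 2 lends itself to a quick illustrative calculation: for a nadir-viewing sensor with an instantaneous field of view of β radians flown at altitude H, the small-angle approximation gives a ground-resolution-element diameter of D = β × H. The sensor values below are illustrative assumptions, not figures from the text.

```python
def ground_ifov_diameter(ifov_mrad: float, altitude_m: float) -> float:
    """Approximate diameter (m) of the ground-projected IFOV for a
    nadir-viewing sensor, using the small-angle relation D = beta * H."""
    beta_rad = ifov_mrad / 1000.0  # convert milliradians to radians
    return beta_rad * altitude_m

# Illustrative values: a 2.5 mrad scanner flown at 3,000 m above ground level
d = ground_ifov_diameter(2.5, 3000.0)
print(round(d, 1))  # 7.5 (meters): each measurement averages a 7.5-m spot
```

Halving the altitude halves the ground spot, which is one reason the same instrument can serve both near-ground and airborne data collection.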
literally means “to write with [visible] light” (Colwell, 1997). Evelyn Pruitt (1979) wrote:

    The whole field was in flux and it was difficult for the Geography Program to know which way to move. It was finally decided in 1960 to take the problem to the Advisory Committee. Walter H. Bailey and I pondered a long time on how to present the situation and on what to call the broader field that we felt should be encompassed in a program to replace the aerial photointerpretation project. The term ‘photograph’ was too limited because it did not cover the regions in the electromagnetic spectrum beyond the ‘visible’ range, and it was in these nonvisible frequencies that the future of interpretation seemed to lie. ‘Aerial’ was also too limited in view of the potential for seeing the Earth from space.

The term remote sensing was promoted in a series of symposia sponsored by ONR at the Willow Run Laboratories of the University of Michigan in conjunction with the National Research Council throughout the 1960s and early 1970s, and has been in use ever since (Estes and Jensen, 1998).

Maximal/Minimal Definitions

Numerous other definitions of remote sensing have been proposed. In fact, Colwell (1984) suggests that “one measure of the newness of a science, or of the rapidity with which it is developing, is to be found in the preoccupation of its scientists with matters of terminology.” Some have proposed an all-encompassing maximal definition:

    Remote sensing is the acquiring of data about an object without touching it.

Such a definition is short, simple, general, and memorable. Unfortunately, it excludes little from the province of remote sensing (Fussell et al., 1986). It encompasses virtually all remote sensing devices, including cameras, optical-mechanical scanners, linear and area arrays, lasers, radar systems, sonar, seismographs, gravimeters, magnetometers, and scintillation counters.

Others have suggested a more focused, minimal definition of remote sensing that adds qualifier after qualifier in an attempt to make certain that only legitimate functions are included in the term’s definition. For example:

    Remote sensing is the noncontact recording of information from the ultraviolet, visible, infrared, and microwave regions of the electromagnetic spectrum by means of instruments such as cameras, scanners, lasers, linear arrays, and/or area arrays located on platforms such as aircraft or spacecraft, and the analysis of acquired information by means of visual and digital image processing.

Robert Green at NASA’s Jet Propulsion Lab (JPL) suggests that the term remote measurement might be used instead of remote sensing because data obtained using the new hyperspectral remote sensing systems are so accurate (Robbins, 1999). Each of the definitions is correct in an appropriate context. It is useful to briefly discuss components of these remote sensing definitions.

Remote Sensing: Art and/or Science?

Science: A science is defined as a broad field of human knowledge concerned with facts held together by principles (rules). Scientists discover and test facts and principles by the scientific method, an orderly system of solving problems. Scientists generally feel that any subject that humans can study by using the scientific method and other special rules of thinking may be called a science. The sciences include 1) mathematics and logic, 2) physical sciences, such as physics and chemistry, 3) biological sciences, such as botany and zoology, and 4) the social sciences, such as geography, sociology, and anthropology (Figure 3). Interestingly, some persons do not consider mathematics and logic to be sciences. But the fields of knowledge associated with mathematics and logic are such valuable tools for science that we cannot ignore them. The human race’s earliest questions were concerned with “how many” and “what belonged together.” They struggled to count, to classify, to think systematically, and to describe exactly. In many respects, the state of development of a science is indicated by the use it makes of mathematics. A science seems to begin with simple mathematics to measure, then works toward more complex mathematics to explain.

Remote sensing is a tool or technique similar to mathematics. Using sophisticated sensors to measure the amount of electromagnetic energy exiting an object or geographic area from a distance and then extracting valuable information from the data using mathematically and statistically based algorithms is a scientific activity (Fussell et al., 1986). Remote sensing functions in harmony with other geographic information sciences (often referred to as GIScience), including cartography, surveying, and geographic information systems (GIS) (Curran, 1987; Clarke, 2001; Jensen, 2005). Dahlberg and Jensen (1986) and Fisher and Lindenberg (1989) suggested a model where there is interaction
Remote Sensing of the Environment
Physical
Sciences
Mathematics
and Logic
Surveying
Activity
Remote
Biological GIS Sensing Social
Sciences Sciences
Stage 1 Stage 2 Stage 3 Stage 4
Time
Cartography
Figure 4 The developmental stages of a scientific discipline
(Wolter, 1975; Jensen and Dahlberg, 1983).
5
Remote Sensing of the Environment
6
Remote Sensing of the Environment
… valuable information from the imagery. It is a fact that some image analysts are superior to other image analysts because they: 1) understand the scientific principles better, 2) are more widely traveled and have seen many landscape objects and geographic areas, and/or 3) can synthesize scientific principles and real-world knowledge to reach logical and correct conclusions. Thus, remote sensing image interpretation is both an art and a science.

Information About an Object or Area

Sensors can obtain very specific information about an object (e.g., the diameter of an oak tree crown) or the geographic extent of a phenomenon (e.g., the polygonal boundary of an entire oak forest). The electromagnetic energy emitted or reflected from an object or geographic area is used as a surrogate for the actual property under investigation. The electromagnetic energy measurements must be turned into information using visual and/or digital image processing techniques.

The Instrument (Sensor)

Remote sensing is performed using an instrument, often referred to as a sensor. The majority of remote sensing instruments record electromagnetic radiation (EMR) that travels at a velocity of 3 × 10^8 m s–1 from the source, directly through the vacuum of space or indirectly by reflection or reradiation to the sensor. The EMR represents a very efficient high-speed communications link between the sensor and the remote phenomenon. In fact, we know of nothing that travels faster than the speed of light. Changes in the amount and properties of the EMR become, upon detection by the sensor, a valuable source of data for interpreting important properties of the phenomenon (e.g., temperature, color). Other types of force fields may be used in place of EMR, such as acoustic (sonar) waves (e.g., Dartnell and Gardner, 2004). However, the majority of remotely sensed data collected for Earth resource applications is the result of sensors that record electromagnetic energy.

Distance: How Far Is Remote?

Remote sensing occurs at a distance from the object or area of interest. Interestingly, there is no clear distinction about how great this distance should be. The intervening distance could be 1 cm, 1 m, 100 m, or more than 1 million m from the object or area of interest. Much of astronomy is based on remote sensing. In fact, many of the most innovative remote sensing systems and visual and digital image processing methods were originally developed for remote sensing of extraterrestrial landscapes such as the moon, Mars, Io, Saturn, Jupiter, etc. This text, however, is concerned primarily with remote sensing of the terrestrial Earth, using sensors that are placed on suborbital air-breathing aircraft or orbital satellite platforms placed in the vacuum of space.

Remote sensing and digital image processing techniques can also be used to analyze inner space. For example, an electron microscope can be used to obtain photographs of extremely small objects on the skin, in the eye, etc. An x-ray instrument is a remote sensing system where the skin and muscle are like the atmosphere that must be penetrated, and the interior bone or other matter is the object of interest.

Remote Sensing Advantages and Limitations

Remote sensing has several unique advantages as well as some limitations.

Advantages

Remote sensing is unobtrusive if the sensor is passively recording the electromagnetic energy reflected from or emitted by the phenomenon of interest. This is a very important consideration, as passive remote sensing does not disturb the object or area of interest.

Remote sensing devices are programmed to collect data systematically, such as within a single 9 × 9 in. frame of vertical aerial photography or a matrix (raster) of Landsat 5 Thematic Mapper data. This systematic data collection can remove the sampling bias introduced in some in situ investigations (e.g., Karaska et al., 2004).

Remote sensing science is also different from cartography or GIS because these sciences rely on data obtained by others. Remote sensing science can provide fundamental, new scientific information. Under controlled conditions, remote sensing can provide fundamental biophysical information, including x,y location; z elevation or depth; biomass; temperature; and moisture content. In this sense, remote sensing science is much like surveying, providing fundamental information that other sciences can use when conducting scientific investigations. However, unlike much of surveying, the remotely sensed data can be obtained systematically over very large geographic areas rather than just single-point observations. In fact, remote sensing–derived information is now critical to the successful modeling of numerous natural (e.g., water-supply estimation; eutrophication studies; nonpoint source pollution) and cultural (e.g., land-use conversion at the urban fringe; water-demand estimation; population estimation) processes (Walsh et al., 1999; Stow et al., 2003; Nemani et al., 2003; Karaska et al., 2004). A good
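The 3 × 10^8 m s–1 velocity quoted above explains why the sensor-to-phenomenon link is effectively instantaneous at every distance the text mentions, from 1 m above a canopy to orbital altitude. A quick calculation makes the point; the 705-km altitude below is an illustrative Earth-observing orbit, not a figure from the text.

```python
C = 3.0e8  # speed of light in m/s, the value used in the text

def travel_time_ms(distance_m: float) -> float:
    """One-way travel time of electromagnetic radiation, in milliseconds."""
    return distance_m / C * 1000.0

# From 1 m, 100 m, and an illustrative 705-km orbital altitude:
for d in (1.0, 100.0, 705_000.0):
    print(f"{d:>10.0f} m -> {travel_time_ms(d):.6f} ms")
```

Even from orbit the signal arrives in a few milliseconds, so the delay never limits Earth-resource data collection.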
example is the digital elevation model that is so important in many spatially-distributed GIS models (Clarke, 2001). Digital elevation models are now produced mainly from stereoscopic imagery, light detection and ranging (LIDAR) (e.g., Maune, 2001; Hodgson et al., 2003b; 2005), radio detection and ranging (RADAR) measurements, or interferometric synthetic aperture radar (IFSAR) imagery.

Limitations

Remote sensing science has limitations. Perhaps the greatest limitation is that it is often oversold. Remote sensing is not a panacea that will provide all the information needed to conduct physical, biological, or social science research. It simply provides some spatial, spectral, and temporal information of value in a manner that we hope is efficient and economical.

Human beings select the most appropriate remote sensing system to collect the data, specify the various resolutions of the remote sensor data, calibrate the sensor, select the platform that will carry the sensor, determine when the data will be collected, and specify how the data are processed. Human method-produced error may be introduced as the remote sensing instrument and mission parameters are specified.

Powerful active remote sensor systems that emit their own electromagnetic radiation (e.g., LIDAR, RADAR, SONAR) can be intrusive and affect the phenomenon being investigated. Additional research is required to determine how intrusive these active sensors can be.

Remote sensing instruments may become uncalibrated, resulting in uncalibrated remote sensor data. Finally, remote sensor data may be expensive to collect and analyze. Hopefully, the information extracted from the remote sensor data justifies the expense. Interestingly, the greatest expense in a typical remote sensing investigation is for well-trained image analysts, not remote sensor data.

… War, the war in Bosnia, and the war on terrorism. Many of the accomplishments are summarized in Table 1. Basically, military contracts to commercial companies resulted in the development of sophisticated electrooptical multispectral remote sensing systems and thermal infrared and microwave (radar) sensor systems. While the majority of the remote sensing systems may have been initially developed for military reconnaissance applications, the systems are also heavily used for monitoring the Earth’s natural resources.

The remote sensing data-collection and analysis procedures used for Earth resource applications are often implemented in a systematic fashion that can be termed the remote sensing process. The procedures in the remote sensing process are summarized here and in Figure 5:

• The hypothesis to be tested is defined using a specific type of logic (e.g., inductive, deductive) and an appropriate processing model (e.g., deterministic, stochastic).

• In situ and collateral data necessary to calibrate the remote sensor data and/or judge its geometric, radiometric, and thematic characteristics are collected.

• Remote sensor data are collected passively or actively using analog or digital remote sensing instruments, ideally at the same time as the in situ data.

• In situ and remotely sensed data are processed using a) analog image processing, b) digital image processing, c) modeling, and d) n-dimensional visualization.

• Metadata, processing lineage, and the accuracy of the information are provided and the results communicated using images, graphs, statistical tables, GIS databases, Spatial Decision Support Systems (SDSS), etc.

It is useful to review the characteristics of these remote sensing process procedures.
• Formulate Hypothesis (if appropriate)
• Select Appropriate Logic
  - Inductive and/or Deductive
  - Technological
• Select Appropriate Model
  - Deterministic
  - Empirical
  - Knowledge-based
  - Process-based
  - Stochastic
• In Situ Measurements
  - Field (e.g., x,y,z from GPS, biomass, reflectance)
  - Laboratory (e.g., reflectance, leaf area index)
• Collateral Data
  - Digital elevation models
  - Soil maps
  - Surficial geology maps
  - Population density, etc.
• Remote Sensing
  - Passive analog: frame camera, videography
  - Passive digital: frame camera; scanners (multispectral, hyperspectral); linear and area arrays (multispectral, hyperspectral)
  - Active: microwave (RADAR), laser (LIDAR), acoustic (SONAR)
• Analog (Visual) Image Processing
  - Using the Elements of Image Interpretation
• Digital Image Processing
  - Preprocessing (radiometric correction, geometric correction)
  - Enhancement
  - Photogrammetric analysis
  - Parametric, such as maximum likelihood
  - Nonparametric, such as artificial neural networks
  - Nonmetric, such as expert systems, decision-tree classifiers, machine learning
  - Hyperspectral analysis
  - Change detection
  - Modeling (spatial modeling using GIS data; scene modeling)
  - Scientific geovisualization (1, 2, 3, and n dimensions)
• Hypothesis Testing
  - Accept or reject hypothesis
• Image Metadata
  - Sources
  - Processing lineage
• Accuracy Assessment
  - Geometric, Radiometric, Thematic, Change detection
• Analog and Digital Images
  - Unrectified, Orthoimages, Orthophotomaps, Thematic maps, GIS databases, Animations, Simulations
• Statistics
  - Univariate, Multivariate
• Graphs
  - 1, 2, and 3 dimensions

Figure 5 Scientists generally use the remote sensing process when extracting information from remotely sensed data.
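Among the digital image processing options listed in Figure 5, a parametric classifier such as maximum likelihood assigns each pixel to the class whose assumed Gaussian spectral distribution makes the observed brightness value most probable. A minimal one-band sketch, with made-up training statistics (real classifiers use many bands and full covariance matrices):

```python
import math

# Hypothetical per-class training statistics (mean, std) for one spectral band
classes = {
    "water":      (20.0, 4.0),
    "vegetation": (90.0, 12.0),
    "bare soil":  (140.0, 15.0),
}

def log_likelihood(x: float, mean: float, std: float) -> float:
    """Log of the Gaussian density N(mean, std^2) evaluated at x."""
    return -math.log(std * math.sqrt(2.0 * math.pi)) - (x - mean) ** 2 / (2.0 * std ** 2)

def classify(pixel_value: float) -> str:
    """Assign the pixel to the class with the highest likelihood."""
    return max(classes, key=lambda c: log_likelihood(pixel_value, *classes[c]))

print(classify(25.0))   # water
print(classify(100.0))  # vegetation
```

The nonparametric and nonmetric alternatives in Figure 5 (neural networks, decision trees) drop the Gaussian assumption but serve the same pixel-labeling role.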
… the data or appreciate the vertical or oblique perspective of the terrain recorded in the imagery.

Scientists who use remote sensing, on the other hand, are usually trained in the scientific method, a way of thinking about problems and solving them. They use a formal plan that has at least five elements: 1) stating the problem, 2) forming the research hypothesis (i.e., a possible explanation), 3) observing and experimenting, 4) interpreting data, and 5) drawing conclusions. It is not necessary to follow this formal plan exactly.

The scientific method is normally used in conjunction with environmental models that are based on two primary types of logic:

• deductive logic
• inductive logic

Models based on deductive and/or inductive logic can be further subdivided according to whether they are processed deterministically or stochastically (Jensen, 2005). Some scientists extract new thematic information directly from remotely sensed imagery without ever explicitly using inductive or deductive logic. They are just interested in extracting information from the imagery using appropriate methods and technology. This technological approach is not as rigorous, but it is common in applied remote sensing. The approach can also generate new knowledge.

Remote sensing is used in both scientific (inductive and deductive) and technological approaches to obtain knowledge. There is debate as to how the different types of logic used in the remote sensing process yield new scientific knowledge (e.g., Fussell et al., 1986; Curran, 1987; Fisher and Lindenberg, 1989; Dobson, 1993; Skidmore, 2002).
Identification of In situ and Remote Sensing Data Requirements

If a hypothesis is formulated using inductive and/or deductive logic, a list of variables or observations is identified that will be used during the investigation. In situ observation and/or remote sensing may be used to collect information on the most important variables.

Scientists using remote sensing technology should be well trained in field and laboratory data-collection procedures. For example, if a scientist wants to map the surface temperature of a lake, it is usually necessary to collect some accurate empirical in situ lake-temperature measurements at the same time the remote sensor data are collected. The in situ observations may be used to 1) calibrate the remote sensor data, and/or 2) perform an unbiased accuracy assessment of the final results (Congalton and Green, 1998). Remote sensing textbooks provide some information on field and laboratory sampling techniques. The in situ sampling procedures, however, are learned best through formal courses in the sciences (e.g., chemistry, biology, forestry, soils, hydrology, meteorology). It is also important to know how to collect socioeconomic and demographic information accurately in urban environments based on training in human geography, sociology, etc.

Most in situ data are now collected in conjunction with global positioning system (GPS) x, y, z data (Jensen and Cowen, 1999). Scientists should know how to collect the GPS data at each in situ data-collection station and how to perform differential correction to obtain accurate x, y, z coordinates (Rizos, 2002).

Collateral Data Requirements

Many times collateral data (often called ancillary data), such as digital elevation models, soil maps, geology maps, political boundary files, and block population statistics, are of value in the remote sensing process. Ideally, the spatial collateral data reside in a GIS (Clarke, 2001).

Remote Sensing Data Requirements

Once we have a list of variables, it is useful to determine which of them can be remotely sensed. Remote sensing can provide information on two different classes of variables: biophysical and hybrid.

Biophysical Variables: The first general group includes biophysical variables, meaning that the remotely sensed data can provide fundamental biological and/or physical (biophysical) information directly, generally without having to use other surrogate or ancillary data. For example, a thermal infrared remote sensing system can record the apparent temperature of a rock outcrop by measuring the radiant energy exiting its surface. Similarly, it is possible to conduct remote sensing in a very specific region of the spectrum and identify the amount of water vapor in the atmosphere. It is also possible to measure soil moisture content directly using microwave remote sensing techniques (Engman, 2000). NASA’s Moderate Resolution Imaging Spectrometer (MODIS) can be used to measure absorbed photosynthetically active radiation (APAR) and leaf area index (LAI). The precise x, y location and height (z) of an object can be extracted directly from stereoscopic aerial photography, overlapping satellite imagery (e.g., SPOT), light detection and ranging (LIDAR) data, or interferometric synthetic aperture radar (IFSAR) imagery.

Table 2 is a list of selected biophysical variables that can be remotely sensed, along with useful sensors to acquire the data. Great strides have been made in remotely sensing many of these biophysical variables. They are important to the national and international effort under way to model the global environment (Jensen et al., 2002; Asrar, 2004).

Hybrid Variables: The second general group of variables that can be remotely sensed includes hybrid variables, created by systematically analyzing more than one biophysical variable. For example, by remotely sensing a plant’s chlorophyll absorption characteristics, temperature, and moisture content, it might be possible to model these data to detect vegetation stress, a hybrid variable. The variety of hybrid variables is large; consequently, no attempt is made to identify them. It is important to point out, however, that nominal-scale land use and land cover are hybrid variables. For example, the land cover of a particular area on an image may be derived by evaluating several of the fundamental biophysical variables at one time [e.g., object location (x, y), height (z), tone and/or color, biomass, and perhaps temperature]. So much attention has been placed on remotely sensing this hybrid nominal-scale variable that the interval- or ratio-scaled biophysical variables were largely neglected until the mid-1980s. Nominal-scale land-use and land-cover mapping are important capabilities of remote sensing technology and should not be minimized. Many social and physical scientists routinely use such data in their research. However, there is now a dramatic increase in the extraction of interval- and ratio-scaled biophysical data that are incorporated into quantitative models that can accept spatially distributed information.
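The calibration role of in situ measurements described above, such as pairing in situ lake temperatures with sensor values collected at the same time, is often carried out as a simple least-squares fit. A minimal sketch follows; every number in it is fabricated for illustration.

```python
# Least-squares calibration of raw sensor values against coincident
# in situ measurements. All values below are fabricated for illustration.

sensor = [105.0, 118.0, 131.0, 144.0, 160.0]   # raw sensor brightness values (DN)
in_situ = [18.2, 19.9, 21.6, 23.3, 25.4]       # in situ lake temperature, deg C

n = len(sensor)
mean_x = sum(sensor) / n
mean_y = sum(in_situ) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(sensor, in_situ)) / \
        sum((x - mean_x) ** 2 for x in sensor)
intercept = mean_y - slope * mean_x

# The fitted line converts any sensor value to an estimated temperature:
estimate = slope * 125.0 + intercept
print(f"T = {slope:.4f} * DN + {intercept:.2f}; DN 125 -> {estimate:.1f} deg C")
```

The same paired in situ data, held out from the fit, can then serve the second role the text mentions: an unbiased accuracy assessment of the final map.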
Remote Sensing of the Environment
Table 2. Selected biophysical and hybrid variables and potential remote sensing systems used to obtain the information.

Topography/Bathymetry
- Digital Elevation Model (DEM): GPS, stereoscopic aerial photography, LIDAR, SPOT, RADARSAT, IKONOS, QuickBird, OrbView-3, Shuttle Radar Topography Mission (SRTM), Interferometric Synthetic Aperture Radar (IFSAR)
- Digital Bathymetric Model (DBM): SONAR, bathymetric LIDAR, stereoscopic aerial photography

Vegetation
- Pigments (e.g., chlorophyll a and b): Color aerial photography, Landsat ETM+, IKONOS, QuickBird, OrbView-3, Orbimage SeaWiFS, Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Moderate Resolution Imaging Spectrometer (MODIS), ENVISAT, airborne hyperspectral (e.g., AVIRIS, HyMap, CASI)
- Canopy structure and height: Stereoscopic aerial photography, LIDAR, RADARSAT, IFSAR
- Biomass derived from vegetation indices; leaf area index (LAI); absorbed photosynthetically active radiation; evapotranspiration: Color-infrared (CIR) aerial photography, Landsat (TM, ETM+), IKONOS, QuickBird, OrbView-3, Advanced Very High Resolution Radiometer (AVHRR), Multiangle Imaging Spectroradiometer (MISR), airborne hyperspectral systems (e.g., AVIRIS, HyMap, CASI)

Surface Temperature (land, water, atmosphere)
- ASTER, AVHRR, GOES, Hyperion, MISR, MODIS, SeaWiFS, airborne thermal infrared

Surface Roughness
- Aerial photography, ALMAZ, ERS-1 and 2, RADARSAT, Intermap Star 3i, IKONOS, QuickBird, ASTER, ENVISAT ASAR

Atmosphere
- Aerosols (e.g., optical depth): MISR, GOES, AVHRR, MODIS, CERES, MOPITT, MERIS
- Clouds (e.g., fraction, optical thickness): GOES, AVHRR, MODIS, MISR, CERES, MOPITT, UARS, MERIS
- Precipitation: Tropical Rainfall Measuring Mission (TRMM), GOES, AVHRR, SSM/I, MERIS
- Water vapor (precipitable water): GOES, MODIS, MERIS
- Ozone: MODIS

Water
- Color; surface hydrology; suspended minerals; chlorophyll/gelbstoffe; dissolved organic matter: Color and CIR aerial photography, Landsat (TM, ETM+), SPOT, IKONOS, QuickBird, OrbView-3, ASTER, SeaWiFS, MODIS, airborne hyperspectral systems (e.g., AVIRIS, HyMap, CASI), AVHRR, GOES, bathymetric LIDAR, MISR, CERES, Hyperion, TOPEX/POSEIDON, MERIS
Table 2 (continued).

Volcanic Effects
- Temperature, gases: ASTER, MISR, Hyperion, MODIS, airborne hyperspectral systems

Land Use
- Commercial, residential, transportation, etc.; cadastral (property); tax mapping: Very high spatial resolution panchromatic, color and/or CIR stereoscopic aerial photography, high spatial resolution satellite imagery (<1 × 1 m: IKONOS, QuickBird, OrbView-3), SPOT (2.5 m), LIDAR, high spatial resolution hyperspectral systems (e.g., AVIRIS, HyMap, CASI)

Land Cover
- Agriculture, forest, urban, etc.: Color and CIR aerial photography, Landsat (MSS, TM, ETM+), SPOT, ASTER, AVHRR, RADARSAT, IKONOS, QuickBird, OrbView-3, LIDAR, IFSAR, SeaWiFS, MODIS, MISR, MERIS, hyperspectral systems (e.g., AVIRIS, HyMap, CASI)

Vegetation
- Stress: Color and CIR aerial photography, Landsat (TM, ETM+), IKONOS, QuickBird, OrbView-3, AVHRR, SeaWiFS, MISR, MODIS, ASTER, MERIS, airborne hyperspectral systems (e.g., AVIRIS, HyMap, CASI)
Remote Sensing Data Collection

…matrix (raster) of brightness values obtained using a scanner, linear array, or area array]. A selected list of some of the most important remote sensing systems is presented in Table 3.
Table 3. Spectral, spatial, and temporal resolution characteristics of selected remote sensing systems.

Suborbital Sensors
- Panchromatic film (black & white): 0.5 – 0.7 μm; spatial resolution variable; temporal resolution variable
- Color film: 0.4 – 0.7 μm; variable; variable
- Color-infrared film: 0.5 – 0.9 μm; variable; variable
- Digital frame cameras (CCD): blue, green, red, and near-infrared bands; 0.25 – 5 m; variable
- CASI-1500: 288 bands, 0.40 – 1.0 μm; variable; variable
- AVIRIS (Airborne Visible Infrared Imaging Spectrometer): 224 bands, 0.40 – 2.5 μm; 2.5 or 20 m; variable
- Intermap Star-3i X-band radar: one microwave band; variable; variable

Satellite Sensors
- NOAA-9 AVHRR LAC: red, near-infrared, and three thermal infrared bands; 1,100 m; 14.5 passes/day
- NOAA-K, L, M: red, near-infrared, two middle-infrared, and two thermal infrared bands; 1,100 m; 14.5 passes/day
- Landsat Multispectral Scanner (MSS): green, red, and two near-infrared bands; 79 m; 16 – 18 days
- Landsat 4 and 5 Thematic Mapper (TM): blue, green, red, near-infrared, two middle-infrared, and one thermal infrared band; 30 m (120 m thermal); 16 days
- Landsat 7 Enhanced TM Plus (ETM+), multispectral: blue, green, red, near-infrared, two middle-infrared, and one thermal infrared band; 30 m (60 m thermal); 16 days
- Landsat 7 ETM+, panchromatic: 0.52 – 0.90 μm; 15 m; 16 days
- SPOT 4 HRV, multispectral: green, red, and near-infrared bands; 20 m; pointable
- SPOT 4 HRV, panchromatic: 0.51 – 0.73 μm; 10 m; pointable
- GOES series (East and West): visible band (0.52 – 0.72 μm) and four thermal infrared bands; 700 m; every 0.5 hr
- European Remote Sensing satellites (ERS-1 and 2): VV-polarization C-band (5.3 GHz) microwave; 26 – 28 m
- Canadian RADARSAT (several modes): HH-polarization C-band (5.3 GHz) microwave; 9 – 100 m; 1 – 6 days
- Shuttle Imaging Radar (SIR-C): three microwave bands; 30 m; variable
- Sea-viewing Wide Field-of-view Sensor (SeaWiFS): three blue, two green, one red, and two near-infrared bands; 1,130 m; 1 day
- MODIS (Moderate Resolution Imaging Spectrometer): 36 bands, 0.405 – 14.385 μm; 250, 500, and 1,000 m; 1 – 2 days
- ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer): three bands, 0.52 – 0.86 μm, at 15 m every 5 days; six bands, 1.6 – 2.43 μm, at 30 m every 16 days; five bands, 8.12 – 11.6 μm, at 90 m every 16 days
- MISR (Multiangle Imaging SpectroRadiometer): nine CCD cameras in four bands (440, 550, 670, 860 nm); 275 and 1,100 m; 1 – 2 days
- NASA TOPEX/Poseidon: TOPEX radar altimeter (18, 21, 37 GHz) and POSEIDON single-frequency radiometer (13.65 GHz); 315,000 m; 10 days
- Space Imaging IKONOS, multispectral: blue, green, red, and near-infrared bands; 4 m; pointable
- Space Imaging IKONOS, panchromatic: 0.45 – 0.90 μm; 1 m; pointable
- DigitalGlobe QuickBird, multispectral: blue, green, red, and near-infrared bands; 2.44 m; pointable
- DigitalGlobe QuickBird, panchromatic: 0.45 – 0.90 μm; 0.61 m; pointable
t = temporal information, i.e., when, how long, and how often the data were acquired;

θ = set of angles that describe the geometric relationships between the radiation source (e.g., the Sun), the terrain target of interest (e.g., a corn field), and the remote sensing system;

P = polarization of back-scattered energy recorded by the sensor; and

Ω = radiometric resolution (precision) at which the data (e.g., reflected, emitted, or back-scattered radiation) are recorded by the remote sensing system.

It is useful to briefly review characteristics of the parameters associated with Equation 1 and how they influence the nature of the remote sensing data collected.

Spectral Information and Resolution

Most remote sensing investigations are based on developing a deterministic relationship (i.e., a model) between the amount of electromagnetic energy reflected, emitted, or back-scattered in specific bands or frequencies and the chemical, biological, and physical characteristics of the phenomena under investigation (e.g., a corn field canopy). Spectral resolution is the number and dimension (size) of specific wavelength intervals (referred to as bands or channels) in the electromagnetic spectrum to which a remote sensing instrument is sensitive.

Multispectral remote sensing systems record energy in multiple bands of the electromagnetic spectrum. For example, in the 1970s and early 1980s, the Landsat Multispectral Scanners (MSS) recorded remotely sensed data of much of the Earth that is still of significant value for change detection studies. The bandwidths of the four MSS bands are displayed in Figure 6a (band 1 = 500 – 600 nm; band 2 = 600 – 700 nm; band 3 = 700 – 800 nm; and band 4 = 800 – 1,100 nm). The nominal size of a band may be large (i.e., coarse), as with the Landsat MSS near-infrared band 4 (800 – 1,100 nm), or relatively smaller (i.e., finer), as with the Landsat MSS band 3 (700 – 800 nm). Thus, Landsat MSS band 4 detectors recorded a relatively large range of reflected near-infrared radiant flux (300 nm wide) while the MSS band 3 detectors recorded a much reduced range of near-infrared radiant flux (100 nm wide).

The four multispectral bandwidths associated with the Positive Systems ADAR 5500 digital frame camera are shown for comparative purposes (Figure 6a, c, and d). The camera's bandwidths were refined to record information in more specific regions of the spectrum (band 1 = 450 – 515 nm; band 2 = 525 – 605 nm; band 3 = 640 – 690 nm; and band 4 = 750 – 900 nm). There are gaps between the spectral sensitivities of the detectors. Note that this digital camera system is also sensitive to reflected blue wavelength energy.

The aforementioned terminology is typically used to describe a sensor's nominal spectral resolution. However, it is difficult to create a detector that has extremely sharp bandpass boundaries such as those shown in Figure 6a. Rather, the more precise method of stating bandwidth is to look at the typical Gaussian shape of the detector sensitivity, such as the example shown in Figure 6b. The analyst then determines the Full Width at Half Maximum (FWHM). In this hypothetical example, the Landsat MSS near-infrared band 3 under investigation is sensitive to energy between 700 and 800 nm.

A hyperspectral remote sensing instrument typically acquires data in hundreds of spectral bands (Goetz, 2002). For example, the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) has 224 bands in the region from 400 to 2,500 nm spaced just 10 nm apart based on the FWHM criteria (Clark, 1999; NASA, 2006). An AVIRIS hyperspectral datacube of a portion of the Savannah River Site near Aiken, SC, is shown in Figure 7. Ultraspectral remote sensing involves data collection in many hundreds of bands.

Certain regions or spectral bands of the electromagnetic spectrum are optimal for obtaining information on biophysical parameters. The bands are normally selected to maximize the contrast between the object of interest and its background (i.e., object-to-background contrast). Careful selection of the spectral bands might improve the probability that the desired information will be extracted from the remote sensor data.

Spatial Information and Resolution

Most remote sensing studies record the spatial attributes of objects on the terrain. For example, each silver halide crystal in an analog aerial photograph and each picture element in a digital remote sensor image is located at a specific location in the image and associated with specific x,y coordinates on the ground. Once rectified to a standard map projection, the spatial information associated with each silver halide crystal or pixel is of significant value because it allows the remote sensing–derived information to be used with other spatial
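The FWHM procedure described above can be sketched numerically. This is a minimal illustration, not an actual sensor calibration: the Gaussian centre wavelength and width below are invented so that the band resembles the hypothetical 700 – 800 nm MSS band 3 example.

```python
# Sketch: estimate the Full Width at Half Maximum (FWHM) of a detector
# whose spectral sensitivity has the typical Gaussian shape.  The centre
# wavelength and sigma below are illustrative values only.
import math

def gaussian_response(wavelength_nm, centre_nm, sigma_nm):
    """Relative detector sensitivity (peak = 1.0) at a given wavelength."""
    return math.exp(-0.5 * ((wavelength_nm - centre_nm) / sigma_nm) ** 2)

def fwhm_nm(centre_nm, sigma_nm, step_nm=0.1):
    """Scan outward from the peak until sensitivity drops below 50%."""
    half_width = 0.0
    while gaussian_response(centre_nm + half_width, centre_nm, sigma_nm) >= 0.5:
        half_width += step_nm
    return 2.0 * half_width  # the response is symmetric about the centre

# For a Gaussian, FWHM = 2 * sqrt(2 * ln 2) * sigma (about 2.3548 * sigma),
# so this sigma yields a band roughly 100 nm wide, like MSS band 3.
sigma = 100.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))
print(f"FWHM = {fwhm_nm(750.0, sigma):.0f} nm")
```

The scan mirrors what an analyst does graphically in Figure 6b: find the wavelengths where sensitivity crosses 50 percent of the maximum and take their separation as the bandwidth.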
Figure 6 a) The spectral bandwidths of the four Landsat Multispectral Scanner (MSS) bands (green, red, and two near-infrared) compared with the bandwidths of an ADAR 5500 digital frame camera. b) The true spectral bandwidth is the width of the Gaussian-shaped spectral profile at Full Width at Half Maximum (FWHM) intensity (Clark, 1999). This example has a spectral bandwidth of 0.1 μm (100 nm) between 700 and 800 nm. c) If desired, it is possible to collect reflected energy in a single band of the electromagnetic spectrum (e.g., 750 – 900 nm). This is a 1 × 1 ft spatial resolution ADAR 5500 near-infrared image. d) Multispectral sensors collect data in multiple bands of the spectrum (images courtesy of Positive Systems, Inc.).
data in a GIS or spatial decision support system (Jensen et al., 2002).

There is a general relationship between the size of an object or area to be identified and the spatial resolution of the remote sensing system. Spatial resolution is a measure of the smallest angular or linear separation between two objects that can be resolved by the remote sensing system. The spatial resolution of aerial photography may be measured by 1) placing calibrated, parallel black and white lines on tarps that are placed in the field, 2) obtaining aerial photography of the study area, and 3) computing the number of resolvable line pairs per millimeter in the photography. It is also possible to determine the spatial resolution of imagery by com-
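The line-pair measurement just described can be turned into a ground distance with simple arithmetic. The sketch below uses the common rule that one resolvable line pair on the film, scaled up by the photo scale, gives a ground resolved distance; the 40 lp/mm resolution and 1:40,000 scale are illustrative numbers, not values from the text.

```python
# Sketch: convert aerial-film resolution, measured in resolvable line pairs
# per millimeter (lp/mm), into the ground distance spanned by one line pair.
# The 40 lp/mm and 1:40,000 photo scale below are illustrative only.

def ground_distance_per_line_pair_m(lp_per_mm, scale_denominator):
    """Ground size (m) of one resolvable line pair on the photograph."""
    # One line pair occupies 1/lp_per_mm millimeters on the film; multiply
    # by the scale denominator and convert millimeters to meters.
    return scale_denominator / (lp_per_mm * 1000.0)

print(ground_distance_per_line_pair_m(40, 40_000))  # 1.0 m on the ground
```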
[Figure 8 panels: the original image resampled to simulated spatial resolutions of 0.5 × 0.5 m, 10 × 10 m, 20 × 20 m, 40 × 40 m, and 80 × 80 m, with an enlarged view of the instantaneous field of view.]
Figure 8 Imagery of residential housing near Mechanicsville, N.Y., obtained on June 1, 1998, at a nominal spatial resolution of 0.3 ×
0.3 m (approximately 1 × 1 ft) using a digital camera (courtesy of Litton Emerge, Inc.). The original data were resampled to
derive the imagery with the simulated spatial resolutions shown.
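The resampling used to derive the coarser panels in Figure 8 can be approximated by block-averaging. This is a sketch under simple assumptions (mean aggregation, invented 4 × 4 pixel values), not the actual procedure used to produce the figure.

```python
# Sketch: simulate a coarser spatial resolution by averaging non-overlapping
# blocks of fine-resolution pixels.  The 4 x 4 array and factor of 2 are
# illustrative values only.

def degrade_resolution(image, factor):
    """Average factor x factor pixel blocks (image is a list of rows)."""
    rows = len(image) - len(image) % factor       # trim ragged edges
    cols = len(image[0]) - len(image[0]) % factor
    coarse = []
    for r in range(0, rows, factor):
        coarse_row = []
        for c in range(0, cols, factor):
            block = [image[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            coarse_row.append(sum(block) / len(block))
        coarse.append(coarse_row)
    return coarse

fine = [[10, 20, 30, 40],
        [10, 20, 30, 40],
        [50, 60, 70, 80],
        [50, 60, 70, 80]]
print(degrade_resolution(fine, 2))  # [[15.0, 35.0], [55.0, 75.0]]
```

Each coarse pixel integrates the brightness of several fine pixels, which is why detail such as individual houses disappears as the simulated instantaneous field of view grows.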
Temporal Information and Resolution

One of the valuable things about remote sensing science is that it obtains a record of Earth landscapes at a unique moment in time. Multiple records of the same landscape obtained through time can be used to identify processes at work and to make predictions.

The temporal resolution of a remote sensing system generally refers to how often the sensor records imagery of a particular area. The temporal resolution of the sensor system shown in Figure 9 is every 16 days. Ideally, the sensor obtains data repetitively to capture unique discriminating characteristics of the object under investigation (Haack et al., 1997). For example, agricultural crops have unique phenological cycles in each geographic region. To measure specific agricultural variables, it is necessary to acquire remotely sensed data at critical dates in the phenological cycle (Johannsen et al., 2003). Analysis of multiple-date imagery provides information on how the variables are changing through time. Change information provides insight into processes influencing the development of the crop (Jensen et al., 2002). Fortunately, several satellite sensor systems such as SPOT, IKONOS, ImageSat, and QuickBird are pointable, meaning that they can acquire imagery off-nadir. Nadir is the point directly below the spacecraft. This dramatically increases the probability that imagery will be obtained during a growing season or during
[Figure 10 plot: spatial resolution versus temporal resolution of selected sensors (GOES, AVHRR, MODIS, ASTER, IRS-1C, Landsat MSS, Thematic Mapper, SPOT, IKONOS, ImageSat, DigitalGlobe, OrbView-3, and aerial photography) in relation to applications such as weather prediction and emergency response.]

Figure 10 There are spatial and temporal resolution considerations that must be made for certain applications (Color Plate 1).
RADAR imagery is an especially useful application of polarized energy.

Angular Information

Remote sensing systems record very specific angular characteristics associated with each exposed silver halide crystal or pixel (Barnsley, 1999). The angular characteristics are a function of (Figure 13a):

• the location in a three-dimensional sphere of the illumination source (e.g., the Sun for a passive system or the sensor itself in the case of RADAR, LIDAR, and SONAR) and its associated azimuth and zenith angles,

• the orientation of the terrain facet (pixel) or terrain cover (e.g., vegetation) under investigation, and

• the location of the suborbital or orbital remote sensing system and its associated azimuth and zenith angles.

There is always an angle of incidence associated with the incoming energy that illuminates the terrain and an angle of exitance from the terrain to the sensor system. This bidirectional nature of remote sensing data collection is known to influence the spectral and polarization characteristics of the at-sensor radiance, L, recorded by the remote sensing system.

A goniometer can be used to document the changes in at-sensor radiance, L, caused by changing the position of the sensor and/or the source of the illumination (e.g., the Sun) (Figure 13b). For example, Figure 13c presents three …

Angular information is central to the use of remote sensor data in photogrammetric applications. Stereoscopic image analysis is based on the assumption that an object on the terrain is remotely sensed from two angles. Viewing the same terrain from two vantage points introduces stereoscopic parallax, which is the foundation for all stereoscopic photogrammetric and radargrammetric analysis (Light and Jensen, 2002).

Suborbital (Airborne) Remote Sensing Systems

High-quality photogrammetric cameras mounted onboard aircraft continue to provide aerial photography for many Earth resource applications. For example, the U.S. Geological Survey's National Aerial Photography Program (NAPP) systematically collected 1:40,000-scale black-and-white or color-infrared aerial photography of much of the United States every 5 to 10 years. Some of these photogrammetric data are now being collected using digital frame cameras. In addition, sophisticated remote sensing systems are routinely mounted on aircraft to provide high spatial and spectral resolution remotely sensed data. Examples include hyperspectral sensors such as NASA's AVIRIS, the Canadian Airborne Imaging Spectrometer (CASI), and the Australian HyMap hyperspectral system. These sensors can collect data on demand when disaster strikes (e.g., oil spills or floods) if cloud-cover conditions permit. There are also numerous radars, such as Intermap's Star-3i, that can be flown on aircraft day and night and in inclement weather. Unfortunately, suborbital remote sensor data are usually expensive to acquire per km2. Also, atmospheric turbulence can cause the data to have severe geometric distortions that can be difficult to correct.
Figure 13 a) The concepts and parameters of the bidirectional reflectance distribution function (BRDF). A target is bathed in irradiance (dEi) from a specific Sun zenith and azimuth angle, and the sensor records the radiance (dLr) exiting the target of interest at a specific azimuth and zenith angle. b) The Sandmeier Field Goniometer collecting smooth cordgrass (Spartina alterniflora) BRDF measurements at North Inlet, SC. Spectral measurements are made at a Sun zenith angle of θi and Sun azimuth angle of ϕi and a sensor zenith angle of view of θr and sensor azimuth angle of ϕr. A GER 3700 spectroradiometer, attached to the moving sled mounted on the zenith arc, records the amount of radiance leaving the target in 704 bands at 76 angles (Sandmeier, 2000; Schill et al., 2004). c) Comparison of hourly three-dimensional plots of BRDF data for smooth cordgrass (Spartina alterniflora) collected at 8 a.m., 9 a.m., 12 p.m., and 4 p.m. at the boardwalk site on March 21 – 22, 2000, for band 624.20 nm.
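Numerically, the quantity plotted in Figure 13 is just the ratio of the radiance reflected toward the sensor to the irradiance arriving from the source, evaluated for one illumination/view geometry at a time. The sketch below uses invented radiance and reflectance values purely for illustration.

```python
# Sketch: the bidirectional reflectance distribution function (BRDF) is the
# reflected radiance toward the sensor divided by the irradiance from the
# source direction, for one (illumination, view) angle pair.  Values invented.
import math

def brdf_sr(reflected_radiance, incident_irradiance):
    """BRDF value (sr^-1) for a single source/sensor geometry."""
    return reflected_radiance / incident_irradiance

# A perfectly diffuse (Lambertian) surface of reflectance rho has the same
# BRDF, rho / pi, at every viewing angle; real targets such as smooth
# cordgrass depart from this, which is what the goniometer documents.
rho = 0.3
print(round(rho / math.pi, 4))  # 0.0955
```

Plotting `brdf_sr` over many sensor zenith and azimuth angles, as the goniometer's 76 angles allow, produces surfaces like those in Figure 13c.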
Current and Proposed Satellite Remote Sensing Systems

Remote sensing systems onboard satellites provide high-quality, relatively inexpensive data per km2. For example, the European Remote Sensing satellites (ERS-1 and 2) collect 26 × 28 m spatial resolution C-band active microwave (RADAR) imagery of much of Earth, even through clouds. Similarly, the Canadian Space Agency RADARSAT obtains C-band active microwave imagery. The United States has progressed from multispectral scanning systems (Landsat MSS, launched in 1972) to more advanced scanning systems (Landsat 7 Enhanced Thematic Mapper Plus, launched in 1999). The Land Remote Sensing Policy Act of 1992 specified the future of satellite land remote sensing programs in the United States (Asker, 1992; Jensen, 1992). Unfortunately, Landsat 6 with its Enhanced Thematic Mapper did not achieve orbit when launched on October 5, 1993. Landsat 7 was launched on April 15, 1999, to relieve the United States' land remote sensing data gap. Unfortunately, it now has serious scan-line corrector problems. Meanwhile, the French have pioneered the development of linear array remote sensing technology with the launch of SPOT satellites 1 through 5 in 1986, 1990, 1993, 1998, and 2002.

The International Geosphere–Biosphere Program (IGBP) and the United States Global Change Research Program (USGCRP) call for scientific research to describe and understand the interactive physical, chemical, and biological processes that regulate the total Earth system. Space-based remote sensing is an integral part of these research programs because it provides the only means of observing global ecosystems consistently and synoptically. The National Aeronautics and Space Administration (NASA) Earth Science Enterprise (ESE) is the name given to the coordinated plan to provide the necessary satellite platforms and instruments, an Earth Observing System Data and Information System (EOSDIS), and related scientific research for IGBP. The Earth Observing System (EOS) is a series of Earth-orbiting satellites that will provide global observations for 15 years or more. The first satellites were launched in the late 1990s. EOS is complemented by missions and instruments from international partners. For example, the Tropical Rainfall Measuring Mission (TRMM) is a joint NASA/Japanese mission.

EOS Science Plan: Asrar and Dozier (1994) conceptualized the remote sensing science conducted as part of NASA's ESE. They suggested that the Earth consists of two subsystems, 1) the physical climate and 2) biogeochemical cycles, linked by the global hydrologic cycle, as shown in Figure 14.

The physical climate subsystem is sensitive to fluctuations in the Earth's radiation balance. Human activities have caused changes to the planet's radiative heating mechanism that rival or exceed natural change. Increases in greenhouse gases between 1765 and 1990 have caused a radiative forcing of 2.5 W m-2. If this rate is sustained, it could result in global mean temperatures increasing about 0.2 to 0.5 °C per decade during the next century. Volcanic eruptions and the ocean's ability to absorb heat may affect the projections. Nevertheless, the following questions are being addressed using remote sensing (Asrar and Dozier, 1994):

• How do clouds, water vapor, and aerosols in the Earth's radiation and heat budgets change with increased atmospheric greenhouse-gas concentrations?

• How do the oceans interact with the atmosphere in the transport and uptake of heat?

• How do land-surface properties such as snow and ice cover, evapotranspiration, urban/suburban land use, and vegetation influence circulation?

The Earth's biogeochemical cycles have also been changed by humans. Atmospheric carbon dioxide has increased by 30 percent since 1859, methane by more than 100 percent, and ozone concentrations in the stratosphere have decreased, causing increased levels of ultraviolet radiation to reach the Earth's surface. Global change research is addressing the following questions:

• What role do the oceanic and terrestrial components of the biosphere play in the changing global carbon budget?

• What are the effects on natural and managed ecosystems of increased carbon dioxide and acid deposition, shifting precipitation patterns, and changes in soil erosion, river chemistry, and atmospheric ozone concentrations?

The hydrologic cycle links the physical climate and biogeochemical cycles. The phase change of water among its gaseous, liquid, and solid states involves storage and release of latent heat, so it influences atmospheric circulation and globally redistributes both water and heat (Asrar and Dozier, 1994). The hydrologic cycle is the integrating process for the fluxes of water, energy, and chemical elements among components of the Earth system. Important questions to be addressed include these three:

• How will atmospheric variability, human activities, and climate change affect patterns of humidity, precipitation, evapotranspiration, and soil moisture?
[Figure 14 diagram: external forcing functions (Sun, volcanoes); ocean dynamics and marine biogeochemistry; terrestrial energy and moisture; terrestrial ecosystems; soil and water chemistry; and human activities.]
Figure 14 The Earth system can be subdivided into two subsystems—the physical climate system and biogeochemical cycles—that are linked by the global hydrologic cycle. Significant changes in the external forcing functions and human activities have an impact on the physical climate system, biogeochemical cycles, and the global hydrologic cycle. Examination of these subsystems and their linkages defines the critical questions that the NASA Earth Observing System (EOS) is attempting to answer (adapted from Asrar and Dozier, 1994).
• How does soil moisture vary in time and space?

• Can we predict changes in the global hydrologic cycle using present and future observation systems and models?

These and other research questions are articulated in NASA's current Earth System Science focus areas (Asrar, 2004). The models that address these research questions require sophisticated remote sensing measurements. To this end, the EOS Terra satellite was launched on December 18, 1999. It contains five remote sensing instruments (MODIS,
ASTER, MISR, CERES, and MOPITT) designed to address many of the research topics (King, 2003). The EOS Aqua satellite was launched in May 2002. The Moderate Resolution Imaging Spectrometer (MODIS) has 36 bands from 0.405 to 14.385 μm that collect data at 250 × 250 m, 500 × 500 m, and 1 × 1 km spatial resolutions. MODIS views the entire surface of the Earth every one to two days, making observations in 36 spectral bands of land- and ocean-surface temperature, primary productivity, land-surface cover, clouds, aerosols, water vapor, temperature profiles, and fires.

The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) has five bands in the thermal infrared region between 8 and 12 μm with 90-m pixels. It also has three broad bands between 0.5 and 0.9 μm with 15-m pixels and stereo capability, and six bands in the shortwave infrared region (1.6 – 2.5 μm) with 30-m spatial resolution. ASTER is the highest spatial resolution sensor system on the EOS Terra platform and provides information on surface temperature that can be used to model evapotranspiration.

The Multiangle Imaging SpectroRadiometer (MISR) has nine separate charge-coupled-device (CCD) pushbroom cameras to observe the Earth in four spectral bands and at nine view angles. It provides data on clouds, atmospheric aerosols, and multiple-angle views of the Earth's deserts, vegetation, and ice cover. The Clouds and the Earth's Radiant Energy System (CERES) consists of two scanning radiometers that measure the Earth's radiation balance and provide cloud property estimates to assess their role in radiative fluxes from the surface of the Earth to the top of the atmosphere. Finally, the Measurements of Pollution in the Troposphere (MOPITT) scanning radiometer provides information on the distribution, transport, sources, and sinks of carbon monoxide and methane in the troposphere.

The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) will extend key EOS measurements in support of long-term monitoring of climate trends and global biological productivity until NPOESS can be launched sometime in the future. NPP will contain MODIS-like instruments such as the Visible Infrared Imaging Radiometer Suite (VIIRS). With a five-year design life, NPP will provide data past the designed lifetimes of the EOS Terra and Aqua satellites through the launch of NPOESS (NOAA NPOESS, 2006).

Commercial Vendors: Space Imaging, Inc., launched IKONOS-2 on September 24, 1999. The IKONOS-2 sensor system has a 1 × 1 m panchromatic band and four 4 × 4 m multispectral bands (Table 3). DigitalGlobe, Inc., launched QuickBird on October 18, 2001, with a 61 × 61 cm panchromatic band and four 2.44 × 2.44 m multispectral bands. Orbimage, Inc., launched OrbView-3 on June 26, 2003, with 1 × 1 m panchromatic and 4 × 4 m multispectral bands.

Remote Sensing Data Analysis

Remote sensor data are analyzed using a variety of image processing techniques (Figures 5 and 15), including:

• analog (visual) image processing, and

• digital image processing.

Analog and digital analysis of remotely sensed data seek to detect and identify important phenomena in the scene. Once identified, the phenomena are usually measured, and the information is used in solving problems (Estes et al., 1983; Haack et al., 1997). Thus, both manual and digital analysis have the same general goals. However, the attainment of these goals may follow different paths.

Human beings are adept at visually interpreting images produced by certain types of remote sensing devices, especially cameras. One could ask, "Why try to mimic or improve on this capability?" First, there are certain thresholds beyond which the human interpreter cannot detect "just noticeable differences" in the imagery. For example, it is commonly known that an analyst can discriminate only about nine shades of gray when interpreting continuous-tone, black-and-white photography. If the data were originally recorded with 256 shades of gray, there might be more subtle information present in the image than the interpreter can extract visually. Furthermore, the interpreter brings to the task all the pressures of the day, making the interpretation subjective and generally unrepeatable. Conversely, the results obtained by computer are repeatable (even when wrong!). Also, when it comes to keeping track of a great amount of detailed quantitative information, such as the spectral characteristics of a vegetated field throughout a growing season for crop identification purposes, the computer is very adept at storing and manipulating such tedious information and possibly making a more definitive conclusion as to what crop is being grown. This is not to say that digital image processing is superior to visual image analysis. Rather, there may be times when a digital approach is better suited to the problem at hand. Optimum results are often achieved using a synergistic combination of both visual and digital image processing.
Figure 15 Analog (visual) and digital image processing of remotely sensed data use the elements of image interpretation.
Analog (Visual) Image Processing

Human beings use the fundamental elements of image interpretation summarized in Figure 15, including grayscale tone, color, height (depth), size, shape, shadow, texture, site, association, and arrangement. The human mind is amazingly good at recognizing and associating these complex elements in an image or photograph because we constantly process (a) profile views of Earth features every day and (b) images seen in books, magazines, television, and the Internet. Furthermore, we are adept at bringing to bear all the knowledge in our personal background and collateral information. We then converge all this evidence to identify phenomena in images and judge their significance. Precise measurement of objects (length, area, perimeter, volume, etc.) may be performed using photogrammetric techniques applied to either (a) monoscopic (single-photo) or stereoscopic (overlapping) images. Numerous books have been written on how to perform visual image interpretation and photogrammetric measurement.

There is a resurgence in the art and science of visual image interpretation as digital remote sensor systems provide increasingly higher spatial resolution imagery. Many people are displaying IKONOS 1 × 1 m and QuickBird 61 × 61 cm imagery on the computer screen and then visually interpreting the data. The data are also often used as a base map in GIS projects (Clarke, 2001).

Digital Image Processing

Scientists have made significant advances in digital image processing of remotely sensed data for scientific visualization and hypothesis testing (e.g., Estes and Jensen, 1998; Townshend and Justice, 2002; Kraak, 2003). The methods are summarized in Donnay et al. (2001), Bossler et al. (2002), Jensen (2005), and others. Digital image processing now makes use of many elements of image interpretation using the techniques summarized in Figure 15. The major types of digital image processing include image preprocessing (radiometric and geometric correction), image enhancement, pattern recognition using inferential statistics, photogrammetric image processing of stereoscopic imagery, expert system (decision-tree) and neural network image analysis, hyperspectral data analysis, and change detection (Figure 5).

Radiometric Correction of Remote Sensor Data: Analog and digital remotely sensed imagery may contain noise or error that was introduced by the sensor system (e.g., electronic noise) or the environment (e.g., atmospheric scattering of light into the sensor's field of view). Advances have been made in our ability to remove these deleterious effects through simple image normalization techniques and more advanced absolute radiometric calibration of the data to scaled surface reflectance (for optical data). Calibrated remote sensor data allow imagery and derivative products obtained on different dates to be compared (e.g., to measure the change in leaf area index between two dates). Funda-

Image Enhancement: Images can be digitally enhanced to identify subtle information in the analog or digital imagery that might otherwise be missed. Significant improvements have been made in our ability to contrast stretch and filter data to enhance low- and high-frequency components, edges, and texture in the imagery (e.g., Emerson et al., 1999). In addition, the remote sensor data can be linearly and nonlinearly transformed into information that is more highly correlated with real-world phenomena through principal components analysis and various vegetation indices (Townshend and Justice, 2002).

Photogrammetry: Significant advances have been made in the analysis of stereoscopic remote sensor data obtained from airborne or satellite platforms using computer workstations and digital image processing photogrammetric algorithms (e.g., Adams and Chandler, 2002). Soft-copy photogrammetric workstations can be used to extract accurate digital elevation models (DEMs) and differentially corrected orthophotography from the triangulated aerial photography or imagery (Light and Jensen, 2002; Linder, 2003). The technology is revolutionizing the way DEMs and orthophotos are produced for rural and urban–suburban applications.

Parametric Information Extraction: Scientists attempting to extract land-cover information from remotely sensed data now routinely specify if the classification is to be:

• hard, with discrete mutually exclusive classes, or fuzzy, where the proportions of materials within pixels are extracted (Seong and Usery, 2001);

• based on individual pixels (referred to as a per-pixel classification) or if it will use object-oriented image segmentation algorithms that take into account not only the spectral characteristics of a pixel, but also the spectral characteristics of contextual surrounding pixels. Thus, the algorithms take into account spectral and spatial information (Herold et al., 2003; Hodgson et al., 2003a; Tullis and Jensen, 2003).
mental digital image processing principles are discussed in
Jensen (2005). Once these issues are addressed, it is a matter of determining
whether to use parametric, nonparametric, and/or nonmetric
Geometric Correction of Remote Sensor Data: Most ana- classification techniques. The maximum likelihood classifi-
log and digital remote sensor data are now processed so that cation algorithm continues to be the most widely used para-
individual picture elements are in their proper planimetric metric classification algorithm. Unfortunately, the algorithm
positions in a standard map projection. This facilitates the requires normally distributed training data in n bands (rarely
use of the imagery and derivative products in GIS or spatial the case) for computing the class variance and covariance
decision support systems. matrices. It is difficult to incorporate nonimage categorical
data into a maximum likelihood classification. Fortunately,
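The contrast-stretch and vegetation-index enhancements mentioned above under Image Enhancement are simple enough to sketch directly. The following is a minimal, hypothetical NumPy sketch; the function names and the 2%/98% percentile choice are illustrative assumptions, not the text's:

```python
import numpy as np

def linear_stretch(band, low_pct=2, high_pct=98):
    """Percentile-based linear contrast stretch to the 0-255 display range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    scaled = (band.astype(float) - lo) / (hi - lo)
    return np.clip(scaled, 0.0, 1.0) * 255.0

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero
```

Because healthy vegetation reflects strongly in the near-infrared and absorbs red light, NDVI approaches +1 over dense canopy and hovers near zero (or goes negative) over bare soil and water — which is why such ratios correlate well with biophysical variables.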
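The maximum likelihood decision rule just described can be sketched as follows: estimate each class's mean vector and covariance matrix from training pixels, then label each pixel with the class whose multivariate Gaussian log-likelihood is highest. This is an illustrative NumPy sketch (the function names are hypothetical, equal priors are assumed, and operational classifiers add many refinements):

```python
import numpy as np

def train_ml(training):
    """Per-class mean, inverse covariance, and log-determinant from training pixels.
    training: dict mapping class name -> (n_pixels, n_bands) array."""
    stats = {}
    for name, X in training.items():
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)  # assumes normally distributed samples
        stats[name] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify_ml(pixels, stats):
    """Assign each pixel (row vector) to the class with the highest Gaussian log-likelihood."""
    names, scores = list(stats), []
    for mu, inv_cov, logdet in stats.values():
        d = pixels - mu
        maha = np.einsum('ij,jk,ik->i', d, inv_cov, d)  # squared Mahalanobis distance
        scores.append(-0.5 * (logdet + maha))
    return [names[i] for i in np.argmax(scores, axis=0)]
```

The sketch makes the text's caveat concrete: the covariance matrix must be estimable (and invertible) from the training data, which is exactly where non-normal or categorical inputs cause trouble.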
Fortunately, fuzzy maximum likelihood classification algorithms are now available (e.g., Foody, 1996).

Nonparametric Information Extraction: Nonparametric clustering algorithms, such as ISODATA, continue to be used extensively in digital image processing research. Unfortunately, such algorithms depend on how the seed training data are extracted, and it is often difficult to label the clusters to turn them into information classes. For these reasons there has been a significant increase in the development and use of artificial neural networks (ANN) for remote sensing applications (e.g., Qiu and Jensen, 2005). The ANN does not require normally distributed training data, and it may incorporate virtually any type of spatially distributed data in the classification. The only drawback is that sometimes it is difficult to determine exactly how the ANN came to a certain conclusion, because the information is locked within the weights in the hidden layer(s). Scientists are working on ways to extract this hidden information so that the rules used can be more formally stated. The ability of an ANN to learn should not be underestimated.

Nonmetric Information Extraction: It is difficult to make a computer understand and use the heuristic rules of thumb and knowledge that a human expert uses when interpreting an image. Nevertheless, there has been progress in the use of artificial intelligence (AI) to try to make computers do things that, at the moment, people do better. One area of AI that has great potential for image analysis is the use of expert systems that place all the information contained within an image in its proper context with ancillary data and extract valuable information. Duda et al. (2001) describe various types of expert system decision-tree classifiers as nonmetric.

Parametric digital image classification techniques are based primarily on summary statistics such as the mean, variance, and covariance matrices. Decision-tree or rule-based classifiers are not based on inferential statistics, but instead "let the data speak for itself" (Gahegan, 2003). In other words, the data retain their precision and are not dumbed down by summarizing them through means, etc. Decision-tree classifiers can process virtually any type of spatially distributed data and can incorporate prior probabilities (McIver and Friedl, 2002). There are three approaches to rule creation: 1) explicitly extracting knowledge and creating rules from experts, 2) implicitly extracting variables and rules using cognitive methods (Lloyd et al., 2002), and 3) empirically generating rules from observed data and automatic induction methods (Tullis and Jensen, 2003). The development of a decision tree using human-specified rules is time-consuming and difficult. However, it rewards the user with detailed information about how individual classification decisions were made (Zhang and Wang, 2003).

Ideally, computers can derive the rules from training data without human intervention. This is referred to as machine learning (Huang and Jensen, 1997; Jensen, 2005). The analyst identifies representative training areas. The machine learns the patterns from these training data, creates the rules, and uses them to classify the remotely sensed data. The rules are available to document how decisions were made.

Hyperspectral: Special software is required to process hyperspectral data obtained by imaging spectroradiometers such as AVIRIS and MODIS. Kruse et al. (1992), Landgrebe and Biehl (2006), Digital Research Systems (2006), and others have pioneered the development of hyperspectral image analysis software. The software reduces the dimensionality of the data (the number of bands) to a manageable degree while retaining the essence of the data. Under certain conditions the software can be used to compare remotely sensed spectral reflectance curves with a library of spectral reflectance curves. Analysts are also able to identify the type and proportion of different materials within an individual picture element (referred to as end-member spectral mixture analysis) (Lu and Weng, 2004; Platt and Goetz, 2004).

Modeling Remote Sensing Data Using a GIS Approach: Remotely sensed data should not be analyzed in a vacuum without the benefit of collateral information such as soil maps, hydrology, and topography (Ramsey et al., 1995). For example, land-cover mapping using remotely sensed data has been significantly improved by incorporating topographic information from digital terrain models and other GIS data (e.g., Stow et al., 2003). GIS studies require the timely, accurate updating of the spatially distributed variables in the database that remote sensing can provide (Clarke, 2001). Remote sensing can in turn benefit from access to accurate ancillary information to improve classification accuracy and other types of modeling. Such synergy is critical if successful expert system and neural network analyses are to be performed (Tullis and Jensen, 2003). A framework for modeling the uncertainty between remote sensing and geographic information systems was developed by Gahegan and Ehlers (2000).
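The unsupervised clustering idea discussed earlier under Nonparametric Information Extraction can be illustrated with plain k-means; full ISODATA additionally splits and merges clusters between iterations, which this hypothetical sketch omits:

```python
import numpy as np

def kmeans(pixels, k, iterations=20, seed=0):
    """Minimal k-means: repeatedly assign pixels to the nearest cluster mean,
    then recompute each mean from its members."""
    rng = np.random.default_rng(seed)
    # seed the cluster means with k randomly chosen pixels
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    for _ in range(iterations):
        # squared Euclidean distance of every pixel to every center
        dist = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers
```

Note that the output is only spectral clusters: as the text emphasizes, an analyst must still label each cluster to turn it into an information class, and the result depends on how the seed means were chosen.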
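The end-member spectral mixture analysis mentioned under Hyperspectral treats each pixel's spectrum as a linear combination of pure end-member spectra, so the sub-pixel fractions can be estimated by least squares. A simplified sketch — the three-band spectra below are invented for illustration, and operational SMA adds sum-to-one and non-negativity constraints:

```python
import numpy as np

def unmix(pixel_spectrum, endmembers):
    """Unconstrained least-squares estimate of end-member fractions.
    endmembers: (n_bands, n_endmembers) matrix whose columns are pure spectra."""
    fractions, residual, rank, sv = np.linalg.lstsq(endmembers, pixel_spectrum,
                                                    rcond=None)
    return fractions

# Hypothetical 3-band reflectance spectra for two end-members (soil, vegetation)
E = np.array([[0.30, 0.05],
              [0.35, 0.08],
              [0.40, 0.50]])
mixed = 0.4 * E[:, 0] + 0.6 * E[:, 1]   # a pixel that is 40% soil, 60% vegetation
print(unmix(mixed, E))                  # recovers fractions of about 0.4 and 0.6
```

With more bands than end-members — the normal situation for imaging spectrometers — the system is overdetermined, which is one reason dimensionality reduction and unmixing pair naturally in hyperspectral software.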
Scene Modeling: Strahler et al. (1986) describe a framework for modeling in remote sensing. Basically, a remote sensing model has three components: 1) a scene model, which specifies the form and nature of the energy and matter within the scene and their spatial and temporal order; 2) an atmospheric model, which describes the interaction between the atmosphere and the energy entering and being emitted from the scene; and 3) a sensor model, which describes the behavior of the sensor in responding to the energy fluxes incident on it and in producing the measurements that constitute the image. They suggest that the problem of scene inference then becomes a problem of model inversion, in which the order in the scene is reconstructed from the image and the remote sensing model. For example, Woodcock et al. (1997) inverted the Li-Strahler Canopy Reflectance Model for mapping forest structure.

Basically, successful remote sensing modeling predicts how much radiant flux in certain wavelengths should exit a particular object (e.g., a conifer canopy) even without actually sensing the object. When the model's prediction is the same as the sensor's measurement, the relationship has been modeled correctly. The scientist then has a greater appreciation for energy–matter interactions in the scene and may be able to extend the logic to other regions or applications with confidence. The remote sensor data can then be used more effectively in physical deterministic models (e.g., watershed runoff, net primary productivity, and evapotranspiration models) that are so important for large ecosystem modeling. Recent work allows one to model the utility of sensors with different spatial resolutions for particular applications, such as urban analysis (Collins and Woodcock, 1999).

Change Detection: Remotely sensed data obtained on multiple dates can be used to identify the type and spatial distribution of changes taking place in the landscape (Friedl et al., 2002; Zhan et al., 2002). The change information provides valuable insight into the processes at work (Alberti et al., 2004; Auch et al., 2004). Change detection algorithms can be used on per-pixel and object-oriented (polygon) classifications. Unfortunately, there is still no universally accepted method of detecting change or of assessing the accuracy of change detection map products. Digital image processing change detection principles are discussed in Jensen (2005).

Information Presentation

…cartographic theory or database topology design often produce poor output products that do not communicate effectively.

Image maps offer scientists an alternative to line maps for many cartographic applications. Thousands of satellite image maps have been produced from Landsat MSS (1:250,000 and 1:500,000 scale), TM (1:100,000 scale), AVHRR, and MODIS data. Image maps at scales >1:24,000 are possible using imagery with a spatial resolution of <1 × 1 m (Light and Jensen, 2002). Because image map products can be produced for a fraction of the cost of conventional line maps, they provide the basis for a national map series oriented toward the exploration and economic development of the less-developed areas of the world, most of which have not been mapped at scales of 1:100,000 or larger.

Remote sensor data that have been geometrically rectified to a standard map projection are becoming indispensable in most sophisticated GIS databases. This is especially true of orthophotomaps, which have the metric qualities of a line map and the information content of an aerial photograph or other type of image.

Unfortunately, error is introduced in the remote sensing process and must be identified and reported. Innovations in error reduction include: 1) recording the lineage of the operations applied to the original remote sensor data, 2) documenting the geometric (spatial) error and thematic (attribute) error of the source materials, 3) improving legend design, especially for change detection map products derived from remote sensing, and 4) improved accuracy assessment. The remote sensing and GIS community should incorporate technologies that track all error in final map and image products. This will result in more accurate information being used in the decision-making process.
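The "improved accuracy assessment" mentioned above is usually built on an error (confusion) matrix that cross-tabulates reference labels against mapped labels; overall accuracy and the kappa coefficient of agreement follow directly. A minimal sketch with hypothetical function names:

```python
import numpy as np

def error_matrix(reference, mapped, n_classes):
    """Rows = reference (ground truth) classes, columns = mapped classes."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, c in zip(reference, mapped):
        m[r, c] += 1
    return m

def overall_accuracy(m):
    """Fraction of checked pixels whose mapped label matches the reference."""
    return np.trace(m) / m.sum()

def kappa(m):
    """Cohen's kappa: agreement beyond what chance alone would produce."""
    n = m.sum()
    po = np.trace(m) / n                                # observed agreement
    pe = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)
```

Off-diagonal cells of the matrix also yield the per-class omission and commission errors, which is why the error matrix, rather than a single percentage, is the standard way to report thematic map accuracy.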
Earth Observation Economics

[Figure 16 diagram: information delivery systems arrayed between "easy to use" and "difficult to understand," with the knowledge gap ranging from low to high.]

Figure 16 Remote sensing Earth observation economics. The goal is to minimize the knowledge gap between the information delivery system, remote sensing experts, and the information consumer (user). The remote sensing–derived economic, social, strategic, environmental, and/or political information must be cost-effective and easy to use to achieve equilibrium (adapted from Miller et al., 2003).
…has been around since the 1960s. There is an increasing number of experts that can use analog and/or digital image processing techniques to extract information from the imagery. Finally, there is the information consumer (user) of the remote sensing–derived information. The user generally needs information of economic, social, strategic, environmental, and/or political value (Liverman et al., 1998).

In order for the revenues generated by the information delivery system to be sufficient to support the capital and operating costs of the system, there must be a balance (equilibrium) between the value of the information, as perceived by the user (consumer), and the revenue necessary to support the system (Miller et al., 2001, 2003). This equilibrium has been achieved for airborne photogrammetric and LIDAR mapping applications for several decades. Time will tell if the balance between perceived value and cost can be maintained in the spaceborne case. Mergers are occurring: on January 12, 2006, ORBIMAGE acquired Space Imaging's assets and now functions as GeoEye, Inc., providing IKONOS, OrbView-2, and OrbView-3 image products. GeoEye plans to launch a new sensor in 2007 with a spatial resolution of 0.41 × 0.41 m (GeoEye, 2006).

The equilibrium can also be affected by remote sensing technology experts who do not have a good understanding of the user information requirements. In fact, some remote sensing experts are often baffled as to why consumers don't embrace the remote sensing–derived information. What they fail to consider is that the consumers generally have no motivation to switch to remote sensing–derived information on economic, social, environmental, strategic, or political attributes simply because it is based on new technology. Furthermore, the consumers on the right side of the diagram often have little knowledge of remote sensing technology or of how it is used to derive information.

Miller et al. (2001; 2003) suggest that this situation creates a knowledge gap between the remote sensing experts and the information consumers (users) (Figure 16).
Bridging the gap is mandatory if we are to use remote sensing to solve earth resource management problems. It is unlikely that the user community can devote the time to learn the physics of remote sensing and the methods of analog and/or digital image processing and GIS modeling necessary to produce useful information. Conversely, there is considerable interest on the technology side of the problem to build a communication bridge. Therefore, one way to decrease the size of the knowledge gap is for the remote sensing technologists to work more closely with the user communities to understand their requirements. This will lead to more useful remote sensing–derived information of value to the user communities.

Advances in remote sensing image delivery systems by commercial firms such as Google, Inc., and their Google Earth application are having a tremendous impact on the public's use and appreciation of remote sensor data (Fallows, 2006).

Earth Resource Analysis Perspective

Remote sensing is used for numerous applications such as medical image analysis (e.g., x-raying a broken arm), nondestructive evaluation of products on an assembly line, and the analysis of Earth resources. Earth resource information is defined as any information concerning terrestrial vegetation, soils, minerals, rocks, water, and urban infrastructure, as well as certain atmospheric characteristics. Such information may be useful for modeling the global carbon cycle, the biology and biochemistry of ecosystems, aspects of the global water and energy cycle, climate variability and prediction, atmospheric chemistry, characteristics of the solid Earth, population estimation, and monitoring land-use change and natural hazards (Johannsen et al., 2003).
References
Adams, J. C. and J. H. Chandler, 2002, "Evaluation of Lidar and Medium Scale Photogrammetry for Detecting Soft-cliff Coastal Change," Photogrammetric Record, 17(99):405–418.

Alberti, M., Weeks, R. and S. Coe, 2004, "Urban Land-Cover Change Analysis in Central Puget Sound," Photogrammetric Engineering & Remote Sensing, 70(9):1043–1052.

American Society for Photogrammetry and Remote Sensing, 1952, 1966, Manual of Photogrammetry, Bethesda: ASP&RS.

Asker, J. R., 1992, "Congress Considers Landsat 'Decommercialization' Move," Aviation Week & Space Technology, May 11, 18–19.

Asrar, G., 2004, Earth Science Applications Plan, Washington: NASA, 89 p.

Asrar, G. and J. Dozier, 1994, EOS: Science Strategy for the Earth Observing System, Woodbury, MA: American Institute of Physics, 342 p.

Colwell, R. N. (Ed.), 1983, Manual of Remote Sensing, 2nd Ed., Falls Church: ASP&RS.

Colwell, R. N., 1984, "From Photographic Interpretation to Remote Sensing," Photogrammetric Engineering and Remote Sensing, 50(9):1305.

Colwell, R. N., 1997, "History and Place of Photographic Interpretation," in Manual of Photographic Interpretation, 2nd Ed., W. R. Phillipson (Ed.), Bethesda: ASPRS, 33–48.

Cracknell, A. P. and L. W. B. Hayes, 1993, Introduction to Remote Sensing, London: Taylor & Francis, 293 p.

Curran, P. J., 1987, "Remote Sensing Methodologies and Geography," Intl. Journal of Remote Sensing, 8:1255–1275.

Curran, P. J., Milton, E. J., Atkinson, P. M. and G. M. Foody, 1998, "Remote Sensing: From Data to Understanding," in P. E. Longley, et al. (Eds.), Geocomputation: A Primer, NY: John Wiley, 33–59.
Dahlberg, R. W. and J. R. Jensen, 1986, "Education for Cartography and Remote Sensing in the Service of an Information Society: The United States Case," American Cartographer, 13(1):51–71.

Dartnell, P. and J. V. Gardner, 2004, "Predicting Seafloor Facies from Multibeam Bathymetry and Backscatter Data," Photogrammetric Engineering & Remote Sensing, 70(9):1081–1091.

Davis, B. A., 1999, "Overview of NASA's Commercial Remote Sensing Program," Earth Observation, 8(3):58–60.

Digital Research, 2006, Environment for Visualizing Images: ENVI, www.digitalresearch.com.

Dehqanzada, Y. A. and A. M. Florini, 2000, Secrets for Sale: How Commercial Satellite Imagery Will Change the World, Washington: Carnegie Endowment for Intl. Peace, 45 p.

Dobson, J. E., 1993, "Commentary: A Conceptual Framework for Integrating Remote Sensing, Geographic Information Systems, and Geography," Photogrammetric Engineering & Remote Sensing, 59(10):1491–1496.

Donnay, J., Barnsley, M. J. and P. A. Longley, 2001, Remote Sensing and Urban Analysis, NY: Taylor & Francis, 268 p.

Duda, R. O., Hart, P. E. and D. G. Stork, 2001, Pattern Classification, NY: John Wiley, 394–452.

Emerson, C. W., Lam, N. and D. A. Quattrochi, 1999, "Multiscale Fractal Analysis of Image Texture and Pattern," Photogrammetric Engineering & Remote Sensing, 65(1):51–61.

Engman, E. T., 2000, "Soil Moisture," in Schultz, G. A. and E. T. Engman (Eds.), Remote Sensing in Hydrology and Water Management, Berlin: Springer, 197–216.

Estes, J. E. and J. R. Jensen, 1998, "Development of Remote Sensing Digital Image Processing Systems and Raster GIS," History of Geographic Information Systems, T. Foresman (Ed.), NY: Longman, 163–180.

Estes, J. E., Hajic, E. J. and L. R. Tinney, 1983, "Fundamentals of Image Analysis: Visible and Thermal Infrared Data," Manual of Remote Sensing, R. N. Colwell (Ed.), Bethesda: ASPRS, 987–1125.

Fallows, J., 2006, "Spy's-Eye View," Atlantic Monthly, (March):140–144.

Fisher, P. F. and R. E. Lindenberg, 1989, "On Distinctions among Cartography, Remote Sensing, and Geographic Information Systems," Photogrammetric Engineering & Remote Sensing, 55(10):1431–1434.

Foody, G. M., 1996, "Approaches for the Production and Evaluation of Fuzzy Land Cover Classifications from Remotely Sensed Data," Intl. Journal of Remote Sensing, 17(7):1317–1340.

Friedl, M. A., McIver, D. K., Hodges, J. C. F., Zhang, X. Y., Muchoney, D., Strahler, A. H., Woodcock, C. E., Gopal, S., Schneider, A., Cooper, A., Baccini, A., Gao, F. and C. Schaaf, 2002, "Global Land Cover Mapping from MODIS: Algorithms and Early Results," Remote Sensing of Environment, 83:287–302.

Fussell, J., Rundquist, D. and J. A. Harrington, 1986, "On Defining Remote Sensing," Photogrammetric Engineering & Remote Sensing, 52(9):1507–1511.

Gahegan, M., 2003, "Is Inductive Machine Learning Just Another Wild Goose (or Might It Lay the Golden Egg)?" Intl. Journal of GIScience, 17(1):69–92.

Gahegan, M. and M. Ehlers, 2000, "A Framework for the Modelling of Uncertainty Between Remote Sensing and Geographic Information Systems," ISPRS Journal of Photogrammetry & Remote Sensing, 55:176–188.

GeoEye, 2006, ORBIMAGE Acquires Space Imaging and Changes Brand Name to GeoEye, Dulles: GeoEye, Inc., http://www.orbimage-acquisition.com/news/geoEye.htm.

Goetz, S. J., 2002, "Recent Advances in Remote Sensing of Biophysical Variables: An Overview of the Special Issue," Remote Sensing of Environment, 79:145–146.

Haack, B., Guptill, S. C., Holz, R. K., Jampoler, S. M., Jensen, J. R. and R. A. Welch, 1997, "Urban Analysis and Planning," Manual of Photographic Interpretation, Bethesda: ASP&RS, 517–553.

Herold, M., Guenther, S. and K. Clarke, 2003, "Mapping Urban Areas in the Santa Barbara South Coast Using IKONOS Data and eCognition," eCognition Application Note, Munich: Definiens Imaging GmbH, 4(1):3.
Hodgson, M. E., Jensen, J. R., Tullis, J. A., Riordan, K. D. and C. M. Archer, 2003a, "Synergistic Use of LIDAR and Color Aerial Photography for Mapping Urban Parcel Imperviousness," Photogrammetric Engineering & Remote Sensing, 69(9):973–980.

Hodgson, M. E., Jensen, J. R., Schmidt, L., Schill, S. and B. A. Davis, 2003b, "An Evaluation of LIDAR- and IFSAR-derived Digital Elevation Models in Leaf-on Conditions with USGS Level 1 and Level 2 DEMs," Remote Sensing of Environment, 84(2003):295–308.

Hodgson, M. E., Jensen, J. R., Raber, G., Tullis, J., Davis, B., Thompson, G. and K. Schuckman, 2005, "An Evaluation of LIDAR-derived Elevation and Terrain Slope in Leaf-off Conditions," Photogrammetric Engineering & Remote Sensing, 71(7):817–823.

Huang, X. and J. R. Jensen, 1997, "A Machine Learning Approach to Automated Construction of Knowledge Bases for Image Analysis Expert Systems That Incorporate Geographic Information System Data," Photogrammetric Engineering & Remote Sensing, 63(10):1185–1194.

Jensen, J. R., 1992, "Testimony on S. 2297, The Land Remote Sensing Policy Act of 1992," Senate Committee on Commerce, Science, and Transportation, Congressional Record, (May 6):55–69.

Jensen, J. R., 2005, Introductory Digital Image Processing: A Remote Sensing Perspective, Upper Saddle River: Prentice-Hall, 525 p.

Jensen, J. R. and D. C. Cowen, 1999, "Remote Sensing of Urban/Suburban Infrastructure and Socioeconomic Attributes," Photogrammetric Engineering & Remote Sensing, 65(5):611–622.

Jensen, J. R. and R. E. Dahlberg, 1983, "Status and Content of Remote Sensing Education in the United States," Intl. Journal of Remote Sensing, 4(2):235–245.

Jensen, J. R. and S. Schill, 2000, "Bi-directional Reflectance Distribution Function (BRDF) of Smooth Cordgrass (Spartina alterniflora)," Geocarto International, 15(2):21–28.

Jensen, J. R., Botchway, K., Brennan-Galvin, E., Johannsen, C., Juma, C., Mabogunje, A., Miller, R., Price, K., Reining, P., Skole, D., Stancioff, A. and D. R. F. Taylor, 2002, Down to Earth: Geographic Information for Sustainable Development in Africa, Washington: National Academy Press, 155 p.

Jensen, J. R., Qiu, F. and K. Patterson, 2001, "A Neural Network Image Interpretation System to Extract Rural and Urban Land Use and Land Cover Information from Remote Sensor Data," Geocarto International, 16(1):19–28.

Jensen, J. R., Saalfeld, A., Broome, F., Cowen, D., Price, K., Ramsey, D., Lapine, L. and E. L. Usery, 2005, "Chapter 2: Spatial Data Acquisition and Integration," in R. B. McMaster and E. L. Usery (Eds.), A Research Agenda for Geographic Information Science, Boca Raton: CRC Press, 17–60.

Johannsen, C. J., Petersen, G. W., Carter, P. G. and M. T. Morgan, 2003, "Remote Sensing: Changing Natural Resource Management," Journal of Soil & Water Conservation, 58(2):42–45.

Joseph, G., 2000, "How Well Do We Understand Earth Observation Electro-optical Sensor Parameters?" ISPRS Journal of Photogrammetry & Remote Sensing, 55:9–12.

Karaska, M. A., Huguenin, R. L., Beacham, J. L., Wang, M., Jensen, J. R. and R. S. Kaufman, 2004, "AVIRIS Measurements of Chlorophyll, Suspended Minerals, Dissolved Organic Carbon, and Turbidity in the Neuse River, N.C.," Photogrammetric Engineering & Remote Sensing, 70(1):125–133.

Kraak, M., 2003, "Geovisualization Illustrated," ISPRS Journal of Photogrammetry & Remote Sensing, (2003):390–399.

Kruse, F. A., Lefkoff, A. B., Boardman, J. W., Heidebrecht, K. B., Shapiro, A. T., Barloon, P. J. and A. F. H. Goetz, 1992, "The Spectral Image Processing System," Proceedings, Intl. Space Year Conference, Pasadena, 10 p.

Landgrebe, D. and L. Biehl, 2006, An Introduction to MULTISPEC, W. Lafayette: Purdue University, 50 p.

Light, D. L. and J. R. Jensen, 2002, "Photogrammetric and Remote Sensing Considerations," in Manual of Geospatial Science & Technology, Bossler, J. D., Jensen, J. R., McMaster, R. B. and C. Rizos (Eds.), London: Taylor & Francis, 233–252.

Linder, W., 2003, Digital Photogrammetry: Theory and Applications, Berlin: Springer-Verlag, 189 p.

Liverman, D., Moran, E. F., Rindfuss, R. R. and P. C. Stern, 1998, People and Pixels: Linking Remote Sensing and Social Science, Washington: NRC, 244 p.

Lloyd, R., Hodgson, M. E. and A. Stokes, 2002, "Visual Categorization with Aerial Photographs," Annals of the Association of American Geographers, 92(2):241–266.
Lu, D. and Q. Weng, 2004, "Spectral Mixture Analysis of the Urban Landscape in Indianapolis with Landsat ETM+ Imagery," Photogrammetric Engineering & Remote Sensing, 70(9):1053–1062.

Maune, D. F. (Ed.), 2001, Digital Elevation Model Technologies and Applications, Bethesda: ASP&RS, 538 p.

McIver, D. K. and M. A. Friedl, 2002, "Using Prior Probabilities in Decision-tree Classification of Remotely Sensed Data," Remote Sensing of Environment, 81:253–261.

Miller, R. B., Abbott, M. R., Harding, L. W., Jensen, J. R., Johannsen, C. J., Macauley, M., MacDonald, J. S. and J. S. Pearlman, 2001, Transforming Remote Sensing Data into Information and Applications, Washington: NRC, 75 p.

Miller, R. B., Abbott, M. R., Harding, L. W., Jensen, J. R., Johannsen, C. J., Macauley, M., MacDonald, J. S. and J. S. Pearlman, 2003, Using Remote Sensing in State and Local Government: Information for Management and Decision Making, Washington: NRC, 97 p.

NASA, 2006, Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) home page, http://aviris.jpl.nasa.gov/.

NOAA NPOESS, 2006, National Polar Orbiting Operational Environmental Satellite System, http://www.ipo.noaa.gov/.

Nemani, R. R., Keeling, C. D., Hashimoto, H., Jolly, W. M., Piper, S. C., Tucker, C. J., Myneni, R. B. and S. W. Running, 2003, "Climate-Driven Increases in Global Terrestrial Net Primary Production from 1982 to 1999," Science, 300(6):1560–1563.

Platt, R. V. and A. F. H. Goetz, 2004, "A Comparison of AVIRIS and Landsat for Land Use Classification at the Urban Fringe," Photogrammetric Engineering & Remote Sensing, 70(7):813–819.

Pruitt, E. L., 1979, "The Office of Naval Research and Geography," Annals, Association of American Geographers, 69(1):106.

Qiu, F. and J. R. Jensen, 2005, "Opening the Neural Network Black Box and Breaking the Knowledge Acquisition Bottleneck of Fuzzy Systems for Remote Sensing Image Classification," Intl. Journal of Remote Sensing, 25(9):1749–1768.

Raber, G. T., Jensen, J. R., Schill, S. R. and K. Schuckman, 2002, "Creation of Digital Terrain Models using an Adaptive LIDAR Vegetation Point Removal Process," Photogrammetric Engineering & Remote Sensing, 68(12):1307–1315.

Ramsey, R. D., Falconer, A. and J. R. Jensen, 1995, "The Relationship Between NOAA-AVHRR Normalized Difference Vegetation Index and Ecoregions in Utah," Remote Sensing of Environment, 53:188–198.

Rizos, C., 2002, "Introducing the Global Positioning System," in Manual of Geospatial Science and Technology, Bossler, J. D., Jensen, J. R., McMaster, R. B. and C. Rizos (Eds.), London: Taylor & Francis, 77–94.

Robbins, J., 1999, "High-Tech Camera Sees What Eyes Cannot," New York Times, Science Section, September 14, D5.

Sandmeier, S. R., 2000, "Acquisition of Bidirectional Reflectance Factor Data with Field Goniometers," Remote Sensing of Environment, 73:257–269.

Schill, S., Jensen, J. R., Raber, G. and D. E. Porter, 2004, "Temporal Modeling of Bidirectional Reflection Distribution Function (BRDF) in Coastal Vegetation," GIScience & Remote Sensing, 41(2):116–135.

Seong, J. C. and E. L. Usery, 2001, "Fuzzy Image Classification for Continental-Scale Multitemporal NDVI Images Using Invariant Pixels and an Image Stratification Method," Photogrammetric Engineering & Remote Sensing, 67(3):287–294.

Shippert, P., 2004, Spotlight on Hyperspectral, Boulder: Research Systems, www.geospatial_online.com/shippert, 5 p.

Skidmore, A. K., 2002, "Chapter 2: Taxonomy of Environmental Models in the Spatial Sciences," in Environmental Modelling with GIS and Remote Sensing, A. K. Skidmore (Ed.), London: Taylor & Francis, 8–25.

Stow, D., Coulter, L., Kaiser, J., Hope, A., Service, D., Schutte, K. and A. Walters, 2003, "Irrigated Vegetation Assessments for Urban Environments," Photogrammetric Engineering & Remote Sensing, 69(4):381–390.

Strahler, A. H., Woodcock, C. E. and J. A. Smith, 1986, "On the Nature of Models in Remote Sensing," Remote Sensing of Environment, 20:121–139.

Teillet, P. M., Gauthier, R. P., Chichagov, A. and G. Fedosejevs, 2002, "Towards Integrated Earth Sensing: Advanced Technologies for in situ Sensing in the Context of Earth Observation," Canadian Journal of Remote Sensing, 28(6):713–718.
Townshend, J. R. G. and C. O. Justice, 2002, "Towards Operational Monitoring of Terrestrial Systems by Moderate-resolution Remote Sensing," Remote Sensing of Environment, 83:351–359.

Tullis, J. A. and J. R. Jensen, 2003, "Expert System House Detection in High Spatial Resolution Imagery Using Size, Shape, and Context," Geocarto International, 18(1):5–15.

Walsh, S. J., Evans, T. P., Welsh, W. F., Entwisle, B. and R. R. Rindfuss, 1999, "Scale-dependent Relationships Between Population and Environment in N.E. Thailand," Photogrammetric Engineering & Remote Sensing, 65(1):97–105.

Wolter, J. A., 1975, The Emerging Discipline of Cartography, Minneapolis: University of Minnesota, Department of Geography, unpublished dissertation.

Woodcock, C. E., Collins, J. B., Jakabhazy, V., Li, X., Macomber, S. and Y. Wu, 1997, "Inversion of the Li-Strahler Canopy Reflectance Model for Mapping Forest Structure," IEEE Transactions Geoscience & Remote Sensing, 35(2):405–414.

Zhan, X., Sohlberg, R. A., Townshend, J. R. G., DiMiceli, C., Carrol, M. L., Eastman, J. C., Hansen, M. C. and R. S. DeFries, 2002, "Detection of Land Cover Changes Using MODIS 250 m Data," Remote Sensing of Environment, 83:336–350.

Zhang, Q. and J. Wang, 2003, "A Rule-based Urban Land Use Inferring Method for Fine-resolution Multispectral Imagery," Canadian Journal of Remote Sensing, 29(1):1–13.
[Color Plate 1 chart: nominal spatial resolution plotted against temporal resolution for selected sensors and applications. Sensors shown include GOES, AVHRR, MODIS, ASTER, IRS 1C, Landsat MSS, Thematic Mapper, SPOT, IKONOS/DG/OV-3, IKONOS/ImageSat, DigitalGlobe, and aerial photography; applications shown include weather prediction, emergency response, and infrastructure.]
Color Plate 1 The nominal spatial and temporal resolution characteristics for selected applications are presented. When conducting a remote sensing project, there is usually a trade-off between spatial and temporal resolution requirements. Generally, as temporal resolution requirements increase, spatial resolution requirements must be lowered so that the amount of remote sensor data collected does not become unmanageable. Fortunately, many applications that require very detailed spatial information (e.g., land-use mapping) do not normally require high temporal resolution data (i.e., data collected once every five to ten years may be sufficient). There are exceptions to these general rules. For example, precision agriculture, crop yield investigations, traffic studies, and emergency response applications sometimes require both very high spatial and very high temporal resolution remote sensor data. The volume of data collected can then create serious data management and analysis problems. There are also trade-offs with the other resolutions (e.g., spectral, radiometric, polarization) that may need to be considered.
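The trade-off described in the caption can be made concrete with rough arithmetic: halving the ground sample distance quadruples the pixels per scene, and shortening the revisit interval multiplies the scene count. A minimal sketch of that calculation follows; all sensor values below are nominal assumptions chosen for illustration, not actual system specifications.

```python
def scenes_per_year(revisit_days: float) -> float:
    """Number of acquisitions per year at a given revisit interval."""
    return 365.0 / revisit_days

def pixels_per_scene(swath_km: float, gsd_m: float) -> float:
    """Pixels in one square scene of the given swath width at the
    given ground sample distance (GSD)."""
    pixels_per_side = swath_km * 1000.0 / gsd_m
    return pixels_per_side ** 2

def annual_pixels(swath_km: float, gsd_m: float, revisit_days: float) -> float:
    """Approximate pixels collected per year for one sensor."""
    return pixels_per_scene(swath_km, gsd_m) * scenes_per_year(revisit_days)

# Assumed, illustrative values: a coarse daily-revisit sensor versus a
# fine-resolution sensor with a 16-day revisit cycle.
coarse = annual_pixels(swath_km=2330, gsd_m=1000, revisit_days=1)
fine = annual_pixels(swath_km=185, gsd_m=30, revisit_days=16)

print(f"coarse sensor: {coarse:.2e} pixels/yr")
print(f"fine sensor:   {fine:.2e} pixels/yr")
```

Even with a 30 m versus 1,000 m pixel size, the two yearly pixel counts end up within an order of magnitude of each other, which is why demanding both high spatial and high temporal resolution quickly creates the data management problems the caption mentions.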
Electromagnetic Radiation Principles
Energy recorded by a remote sensing system undergoes fundamental interactions that should be understood to properly interpret the remotely sensed data. For example, if the energy being remotely sensed comes from the Sun, the energy
• finally reaches the remote sensor, where it interacts with various optics,
filters, film emulsions, or detectors.
From Chapter 2 of Remote Sensing of the Environment: An Earth Resource Perspective, Second Edition. John R. Jensen.
Copyright © 2007 by Pearson Education, Inc. Published by Pearson Prentice Hall. All rights reserved.
[Figure 1 diagram: a) conduction: a pulse of energy passes from a hot burner to a pan in direct contact with it; b) convection: air warmed by contact with the terrain rises, transferring energy upward; c) radiation: an electromagnetic wave of wavelength λ travels from the Sun to the Earth.]
Figure 1 Energy may be transferred in three ways: conduction, convection, and radiation. a) Energy may be conducted directly from one object to another, as when a pan is in direct physical contact with a hot burner. b) The Sun bathes the Earth’s surface with radiant energy, causing the air near the ground to increase in temperature. The less dense air rises, creating convectional currents in the atmosphere. c) Electromagnetic energy in the form of electromagnetic waves may be transmitted through the vacuum of space from the Sun to the Earth.
Electromagnetic Radiation Models

To understand how electromagnetic radiation is created, how it propagates through space, and how it interacts with other matter, it is useful to describe the processes using two different models: the wave model and the particle model (Englert et al., 1994).

Wave Model of Electromagnetic Energy

In the 1860s, James Clerk Maxwell (1831–1879) conceptualized electromagnetic radiation (EMR) as an electromagnetic wave that travels through space at the speed of light. It took many years for scientists like Leon Foucault and Albert A. Michelson to determine the speed of light, c, as 299,792,458 meters per second (i.e., m s⁻¹), or 186,282.397 miles s⁻¹. These values are often generalized to 3 × 10⁸ m s⁻¹, 300,000 km s⁻¹, or 186,000 miles s⁻¹. A useful relation for quick calculations is that light travels about 1 ft per nanosecond (10⁻⁹ s) (Rinker, 1999). The electromagnetic wave consists of two fluctuating fields: one electric and the other magnetic (Figure 2). The two vectors are at right angles (orthogonal) to one another, and both are perpendicular to the direction of travel.

How is an electromagnetic wave created? Electromagnetic radiation is generated whenever an electrical charge is accelerated. The wavelength (λ) of the electromagnetic radiation depends upon the length of time that the charged particle is accelerated. Its frequency (ν) depends on the number of accelerations per second. Wavelength is formally defined as the mean distance between consecutive maximums (or minimums) of a roughly periodic pattern (Figures 2 and 3) and is normally measured in micrometers (μm) or nanometers (nm). Frequency is the number of wavelengths that pass a point per unit time. A wave that sends one crest by every second (completing one cycle) is said to have a frequency of one cycle per second, or one hertz, abbreviated 1 Hz. Frequently used measures of wavelength and frequency are found in Table 1.

The relationship between the wavelength (λ) and frequency (ν) of electromagnetic radiation is based on the following formula, where c is the speed of light (Rott, 2000):
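The relation the text builds toward is the standard formula c = λν. A minimal numeric sketch follows; the speed-of-light constant and the 1-ft-per-nanosecond rule of thumb come from the text, while the sample wavelength of 0.55 μm (green light) is an illustrative choice of my own.

```python
# Wavelength-frequency relation for electromagnetic radiation: c = lambda * nu
C = 299_792_458.0  # speed of light in m/s, as given in the text

def frequency_hz(wavelength_m: float) -> float:
    """Return frequency nu (Hz) for a given wavelength lambda (m)."""
    return C / wavelength_m

def wavelength_m(freq_hz: float) -> float:
    """Return wavelength lambda (m) for a given frequency nu (Hz)."""
    return C / freq_hz

# Illustrative example (not from the text): green light at 0.55 micrometers.
nu_green = frequency_hz(0.55e-6)
print(f"green light: {nu_green:.3e} Hz")  # on the order of 5e14 Hz

# Checking the text's rule of thumb that light travels about 1 ft per ns:
feet_per_nanosecond = C * 1e-9 / 0.3048
print(f"distance per ns: {feet_per_nanosecond:.3f} ft")
```

Note that the two functions are inverses of each other: because c is fixed, specifying either wavelength or frequency fully determines the other.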