REMOTE SENSING: Principles and Interpretation, Third Edition

For information about this book, contact: Waveland Press, Inc., 4180 IL Route 83, Suite 101, Long Grove, IL 60047-9580; (847) 634-0081; info@waveland.com; www.waveland.com

COVER IMAGE: This Landsat thematic mapper (TM) satellite image covers portions of Grand Canyon National Park and the Navajo Indian Reservation in northern Arizona. TM band 2 (green) is shown in blue, band 4 (reflected infrared) in green, and band 7 (reflected infrared) in red. The green area in the northwest portion of the image is the forested Kaibab uplift, with elevations up to 3,000 m. The Colorado River enters at the north-central margin of the image and flows south through Marble Canyon, which is cut into a platform of Kaibab Limestone (Permian age) with a distinctive blue signature. The platform is bordered on the east by Mesozoic strata of the Echo Cliffs and on the west by the Vermillion Cliffs, with yellowish signatures. The river flows southwest along the flank of the Kaibab uplift, where it is joined on the east by the Little Colorado River. The Colorado River turns west and flows through the spectacular gorge of the Grand Canyon, which is incised into a sequence of Paleozoic strata underlain by Precambrian crystalline rocks. The distinctive orange signature at the east end of the Canyon marks outcrops of younger Precambrian strata. South of the Canyon, elevations are lower, which accounts for the sparser vegetation. The dark area in the southeast portion of the image is the San Francisco volcanic field of basaltic rocks (Quaternary to Tertiary age). The southeast margin shows a belt of clouds with dark shadows. Image provided courtesy Remote Sensing Enterprises, Inc.

Copyright © 1997, 1987, 1978 by Floyd F. Sabins. Reissued 2007 by Waveland Press, Inc. 10-digit ISBN 1-57766-507-4; 13-digit ISBN 978-1-57766-507-6. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means without permission in writing from the publisher. Printed in the United States of America.

CONTENTS

PREFACE / x

1 INTRODUCTION TO CONCEPTS AND SYSTEMS 1: Units of Measure / 2; Electromagnetic Energy / 2; Electromagnetic Spectrum / 3; Image Characteristics / 6; Vision / 11; Remote Sensing Systems / 13; Spectral Reflectance Curves / 19; Multispectral Imaging Systems / 20; Hyperspectral Scanning Systems / 24; Sources of Remote Sensing Information / 27; Comments / 29; Questions / 30; References / 30; Additional Reading / 31

2 PHOTOGRAPHS FROM AIRCRAFT AND SATELLITES 33: Interactions between Light and Matter / 33; Film Technology / 35; Characteristics of Aerial Photographs / 37; Photomosaics / 42; Stereo Pairs of Aerial Photographs / 45; Low-Sun-Angle Photographs / 49; Black-and-White Photographs / 49; Color Science / 52; Normal Color Photographs / 54; IR Color Photographs / 55; Normal Color and IR Color Photographs Compared / 56; High-Altitude Aerial Photographs / 59; Sources of Aerial Photographs / 59; New Technology / 59; Photographs from Satellites / 61; Comments / 66; Questions / 66; References / 66; Additional Reading / 66

3 LANDSAT IMAGES 69: Landsats 1, 2, and 3 / 69; Landsats 4 and 5 / 71; Orbit Patterns / 83; Path-and-Row Index Maps / 84; Selecting and Ordering Images / 84; Landsat Mosaics / 85
; Regional Interpretation, Saudi Arabia / 85; Detailed Image Interpretation, Saharan Atlas Mountains, Algeria / 88; Seasonal Influence on Images / 91; Lineaments / 93; Plate-Tectonic Interpretations / 95; Comments / 101; Questions / 102; References / 104; Additional Reading / 104

4 EARTH RESOURCE AND ENVIRONMENTAL SATELLITES 105: SPOT Satellite, France / 105; SPOT and Landsat Images Compared / 115; JERS-1 Satellite, Japan / 118; India Remote Sensing Satellite / 119; Geostationary Environmental Satellites / 121; Polar-Orbiting NOAA Environmental Satellites / 127; Environmental and Earth Resources Images Compared / 132; Future Satellite Systems / 132; Comments / 133; Questions / 134; References / 134; Additional Reading / 134

5 THERMAL INFRARED IMAGES 135: Thermal Processes and Properties / 135; IR Detection and Imaging Technology / 143; Characteristics of IR Images / 145; Conducting Airborne IR Surveys / 148; Land Use and Land Cover, Ann Arbor, Michigan / 151; Heat-Loss Surveys / 152; Indio Hills, California / 154; Imler Road Area, California / 158; Western Transvaal, South Africa / 161; Mauna Loa, Hawaii / 161; Satellite Thermal IR Images / 163; Thermal IR Spectra / 169; Thermal IR Multispectral Scanner / 170; Comments / 172; Questions / 173; References / 174; Additional Reading / 175

6 RADAR TECHNOLOGY AND TERRAIN INTERACTIONS 177: Radar Systems / 177; Characteristics of Radar Images / 184; Radar Return and Image Signatures / 196; Polarization / 207; Interferometry / 209; Comments / 211; Questions / 211; References / 211; Additional Reading / 211

7 SATELLITE RADAR SYSTEMS AND IMAGES 213: Seasat / 213; SIR-A Mission / 214; SIR-A Images of Tropical Terrain, Indonesia / 214; Radar Interaction with Vegetated Terrain / 224; SIR-A Images of Desert Terrain, Eastern Sahara / 226; Radar Interaction with Sand-Covered Terrain / 227; Land Use and Land Cover, Imperial Valley, California / 228; SIR-B Mission / 228; SIR-C Mission / 231; JERS-1 / 233; ERS-1 / 235; Almaz-1 / 236; Radarsat / 236; Magellan Mission to Venus / 240; Comments / 253; Questions / 253; References / 253; Additional Reading / 254

8 DIGITAL IMAGE PROCESSING 255: Structure of Digital Images / 255; Image-Processing Overview / 258; Image Restoration / 259; Image Enhancement / 265; Information Extraction / 280; Hardware and Software for Image Processing / 290; Comments / 291; Questions / 292; References / 292; Additional Reading / 293

9 METEOROLOGIC, OCEANOGRAPHIC, AND ENVIRONMENTAL APPLICATIONS 295: UV Radiation and Ozone Concentration / 295; Climate and Weather / 297; Ocean Productivity / 300; Ocean Currents / 302; Sea Ice / 304; Bathymetry / 314; Environmental Pollution / 322; Comments / 334; Questions / 336; References / 336; Additional Reading / 338

10 OIL EXPLORATION 339: Exploration Programs / 339; Sudan Project / 339; Papua New Guinea Project / 344; Central Arabian Arch Project / 346; Oil Exploration Research / 355; Other Energy Resources / 357; Comments / 357; Questions / 358; References / 358; Additional Reading / 358

11 MINERAL EXPLORATION 359: Regional Lineaments and Ore Deposits of Nevada / 361; Regional Lineaments and Ore Deposits of South Africa and Australia / 364; Local Fractures and Ore Deposits / 365; Mapping Hydrothermally Altered Rocks / 366; Goldfield Mining District, Nevada / 365; Porphyry Copper Deposits / 371; Collahuasi Mining District, Chile / 372
; Borate Minerals, Salar de Uyuni, Bolivia / 377; Additional Mineral Exploration Projects / 378; Hyperspectral Scanners for Mineral Exploration / 378; Strategy for Mineral Exploration / 380; Mineral Exploration in Covered Terrain / 380; Comments / 384; Questions / 384; References / 384; Additional Reading / 385

12 LAND USE AND LAND COVER: GEOGRAPHIC INFORMATION SYSTEMS 387: Need to Classify Land Use and Land Cover / 387; Multilevel Classification System / 387; Multilevel Classification System of the Los Angeles Region / 388; Digital Classification of Land Use, Las Vegas, Nevada / 401; Interpreting Land Use and Changes in Phoenix, Arizona / 403; Vegetation Mapping with AVHRR Images / 404; Geographic Information Systems / 407; Archaeology / 411; Comments / 415; Questions / 415; References / 415; Additional Reading / 416

13 NATURAL HAZARDS 417: Earthquakes / 417; Landslides / 425; Land Subsidence / 427; Volcanoes / 428; Floods / 443; Forest and Range Fires / 445; Comments / 447; Questions / 447; References / 447; Additional Reading / 449

14 COMPARING IMAGE TYPES: SUMMARY 451: Death Valley / 451; Summary / 462; Comments / 462; Questions / 463; References / 463; Additional Reading / 463

APPENDIX: BASIC GEOLOGY FOR REMOTE SENSING 465
GLOSSARY
REFERENCES CITED IN COLOR PLATES 483
INDEX 485

PREFACE

Much has happened since the previous edition of this book appeared in 1986. Major advances in the past decade indicate the dynamic nature of the science of remote sensing and its expanding applications. I have attempted to express this dynamism in the third edition.

This book is designed for an introductory university course in remote sensing. No prior training in remote sensing is required. Courses in introductory physics, physical geography, and physical geology are useful background, but are not essential, for users of this book. This text follows the format of the remote sensing course I teach in the Earth and Space Sciences Department at UCLA, which emphasizes the interpretation of images and their application to a range of disciplines.

ORGANIZATION OF THE BOOK

The first chapter introduces the major remote sensing systems and the interactions between electromagnetic energy and materials that are the basis for remote sensing. The six following chapters describe the major imaging systems: photographs, Landsat, earth resource and environmental satellites, thermal infrared, and radar. For each system the following topics are covered:

1. physical properties of materials and their interactions with electromagnetic energy that determine signatures on images

1 INTRODUCTION TO CONCEPTS AND SYSTEMS

TABLE 1-3 Electromagnetic spectral regions
Gamma-ray region: Incoming radiation is completely absorbed by the upper atmosphere and is not available for remote sensing.
X-ray region: Completely absorbed by the atmosphere. Not employed in remote sensing.
Ultraviolet (UV) region: Incoming wavelengths less than 0.3 μm are completely absorbed by ozone in the upper atmosphere.
Photographic UV band: Transmitted through the atmosphere. Detectable with film and photodetectors, but atmospheric scattering is severe.
Visible region: Imaged with film and photodetectors. Includes the reflected energy peak of the earth at 0.5 μm.
Infrared region: Interaction with matter varies with wavelength. Atmospheric transmission windows are separated by absorption bands.
Reflected IR band: Reflected solar radiation that contains no information about the thermal properties of materials. The interval from 0.7 to 0.9 μm is detectable with film and is called the photographic IR band.
Thermal IR band: Principal atmospheric windows occur in the thermal region. Images at these wavelengths are acquired by optical-mechanical scanners and special vidicon systems, but not by film.
Microwave region: Longer wavelengths that can penetrate clouds, fog, and rain. Images may be acquired in the active or passive mode.
Radar: Active form of microwave remote sensing. Radar images are acquired at various wavelength bands.
Radio region (wavelengths greater than 100 cm): Longest-wavelength portion of the electromagnetic spectrum.
Figure 1-3 Expanded diagrams of the visible and infrared regions (upper) and microwave regions (lower) for transmission through the atmosphere. Gases responsible for atmospheric absorption bands (H2O, CO2, O3) are indicated. Wavelength bands recorded by commonly used remote sensing systems (normal color and IR color photographs, the Landsat multispectral scanner, the Landsat thematic mapper, the SPOT multispectral scanner, and radar bands in the microwave region) are shown (middle).

The electromagnetic energy from the earth during daytime may be recorded as a function of wavelength. The maximum amount of energy is reflected at the 0.5-μm wavelength, which corresponds to the green wavelengths of the visible region and is called the reflected energy peak (Figure 1-2). The earth also radiates energy both day and night, with the maximum energy radiating at the 9.7-μm wavelength. This radiant energy peak occurs in the thermal portion of the IR region (Figure 1-2).
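The positions of these two peaks can be checked against Wien's displacement law for blackbody radiators. The sketch below is a minimal illustration, assuming approximate temperatures of 5800 K for the sun and 300 K for the earth's surface (values not given in the text); it reproduces the 0.5-μm reflected peak and the 9.7-μm radiant peak quoted above.

# Wien's displacement law: peak wavelength (um) = 2898 / T (kelvins).
# The temperatures below are illustrative assumptions, not values from the text.
WIEN_CONSTANT_UM_K = 2898.0

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength of maximum spectral exitance for a blackbody at temperature_k."""
    return WIEN_CONSTANT_UM_K / temperature_k

print(round(peak_wavelength_um(5800.0), 2))  # ~0.50 um, the reflected energy peak
print(round(peak_wavelength_um(300.0), 1))   # ~9.7 um, the radiant energy peak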
The earth's atmosphere absorbs energy in the gamma-ray, X-ray, and most of the ultraviolet (UV) regions; therefore, these regions are not used for remote sensing. Terrestrial remote sensing records energy in the microwave, infrared, and visible regions, as well as the long-wavelength portion of the UV region. Figure 1-3 shows details of these regions. The horizontal axes show wavelength on a logarithmic scale; the vertical axes show the percentage of electromagnetic energy that is transmitted through the earth's atmosphere. Wavelength intervals with high transmission are called atmospheric windows and are used to acquire remote sensing images. The major remote sensing regions (visible, infrared, and microwave) are further subdivided into bands, such as the blue, green, and red bands of the visible region (Figure 1-3). Horizontal lines in the center of the diagram show wavelength bands in the UV through thermal IR regions recorded by major imaging systems such as cameras and scanners. For the Landsat systems, the numbers identify specific bands recorded by these systems. The characteristics of the remote sensing regions are summarized in Table 1-3.

Passive remote sensing systems record the energy that naturally radiates or reflects from an object. An active system supplies its own source of energy, directing it at the object in order to measure the returned energy. Flash photography is an example of active remote sensing, in contrast to available-light photography, which is passive. Another common form of active remote sensing is radar (Table 1-3), which provides its own source of electromagnetic energy at microwave wavelengths. Sonar systems transmit pulses of sonic energy.

Atmospheric Effects

Our eyes inform us that the atmosphere is essentially transparent to light, and we tend to assume that this condition exists for all electromagnetic energy. In fact, the gases of the atmosphere absorb electromagnetic energy at specific wavelength intervals called absorption bands. Figure 1-3 shows these absorption bands together with the gases in the atmosphere responsible for the absorption.

Wavelengths shorter than 0.3 μm are completely absorbed by the ozone (O3) layer in the upper atmosphere (Figure 1-3). This absorption is essential to life on earth, because prolonged exposure to the intense energy of these short wavelengths destroys living tissue. For example, sunburn occurs more readily at high mountain elevations than at sea level. Sunburn is caused by UV energy, much of which is absorbed by the atmosphere at sea level. At higher elevations, however, there is less atmosphere to absorb the UV energy.

Clouds consist of aerosol-sized particles of liquid water that absorb and scatter electromagnetic radiation at wavelengths less than about 0.1 cm. Only radiation of microwave and longer wavelengths is capable of penetrating clouds without being scattered, reflected, or absorbed.

IMAGE CHARACTERISTICS

In general usage, an image is any pictorial representation, irrespective of the wavelength or imaging device used to produce it. A photograph is a type of image that records wavelengths from 0.3 to 0.9 μm that have interacted with light-sensitive chemicals in photographic film. Images can be described in terms of certain fundamental properties regardless of the wavelength at which the image is recorded. These properties are scale, brightness, contrast, and resolution. The tone and texture of images are functions of the fundamental properties.

Scale

Scale is the ratio of the distance between two points on an image to the corresponding distance on the ground. A common scale on U.S. Geological Survey topographic maps is 1:24,000, which means that one unit on the map equals 24,000 units on the ground. Thus 1 cm on the map represents 24,000 cm (240 m) on the ground, or 1 in. represents 24,000 in. (2000 ft). The maps and images of this book show scales graphically as bars. The deployment of imaging systems on satellites has changed the concepts of image scale. In this book, scales of images are designated as follows:

Small scale (greater than 1:500,000): 1 cm = 5 km or more (1 in. = 8 mi or more)
Intermediate scale (1:50,000 to 1:500,000): 1 cm = 0.5 to 5 km (1 in. = 0.8 to 8 mi)
Large scale (less than 1:50,000): 1 cm = 0.5 km or less (1 in. = 0.8 mi or less)

These designations differ from the traditional scale concepts of aerial photography. Forty years ago, 1:62,500 was the minimum scale of commercially available photographs and was considered small-scale. Today sensing systems on high-altitude aircraft and satellites can acquire photographs and images of excellent quality at much smaller scales. Optimum image scale is determined by how the images are to be interpreted. With the advent of satellite images, many investigators have been surprised at the amount and types of information that can be interpreted from very small scale images.
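As a minimal sketch of these scale conventions (the function names are our own, not part of the text), the following code converts a map distance to a ground distance for a given scale ratio and classifies a scale using the 1:50,000 and 1:500,000 boundaries defined above.

def ground_distance_cm(map_distance_cm: float, scale_denominator: float) -> float:
    """At a scale of 1:scale_denominator, one map unit represents scale_denominator ground units."""
    return map_distance_cm * scale_denominator

def scale_category(scale_denominator: float) -> str:
    """Classify a scale using the boundaries given in the text."""
    if scale_denominator > 500_000:
        return "small scale"
    if scale_denominator >= 50_000:
        return "intermediate scale"
    return "large scale"

# 1 cm on a 1:24,000 topographic map represents 24,000 cm (240 m) on the ground.
print(ground_distance_cm(1.0, 24_000) / 100.0, "m")   # 240.0 m
print(scale_category(24_000))                         # large scale
print(scale_category(1_000_000))                      # small scale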
Brightness and Tone

Remote sensing systems detect the intensity of electromagnetic radiation that an object reflects, emits, or scatters at particular wavelength bands. Variations in intensity of electromagnetic radiation from the terrain are displayed as variations in brightness on images. On positive images, such as those in this book, the brightness of objects is directly proportional to the intensity of electromagnetic radiation that is detected from that object.

Brightness is the magnitude of the response produced in the eye by light; it is a subjective sensation that can be determined only approximately. Luminance is a quantitative measure of the intensity of light from a source and is measured with a device called a photometer, or light meter. People who interpret images rarely, if ever, make quantitative measurements of brightness variations on an image. Variations in brightness may be calibrated with a gray scale such as the one in Figure 1-4. Each distinguishable shade from black to white is a separate tone. In practice, most interpreters do not use an actual gray scale the way one would use a centimeter scale; they characterize areas on an image as light, intermediate, or dark in tone using their own mental concept of a gray scale.

Figure 1-4 Gray scale.

On aerial photographs the tone of an object is primarily determined by the ability of the object to reflect incident sunlight, although atmospheric effects and the spectral sensitivity of the film are also factors. On images acquired at other wavelength regions, tone is determined by other physical properties of objects. On thermal IR images the tone of an object is proportional to the heat radiating from the object. On radar images the tone of an object is determined by the intensity at which the transmitted beam of radar energy is scattered back to the receiving antenna.

Contrast Ratio

Contrast ratio (CR) is the ratio between the brightest and darkest parts of the image and is defined as

CR = Bmax / Bmin    (1-3)

where Bmax is the maximum brightness of the scene and Bmin is the minimum brightness. Figure 1-5 shows images of high, medium, and low contrast, together with profiles of brightness variation across each image. On a brightness scale of 0 to 10, these images have the following contrast ratios:

A. High contrast: CR = 4.5
B. Medium contrast: CR = 2.5
C. Low contrast: CR = 1.5

Note that when Bmin = 0, CR is infinity; when Bmin = Bmax, CR is unity. This discussion is summarized from the extensive review by Slater (1983), which describes other terms for contrast. In addition to describing an entire scene, contrast ratio is also used to describe the ratio between the brightness of an object on an image and the brightness of the adjacent background. Contrast ratio is a vital factor in determining the ability to resolve and detect objects.

Figure 1-5 Images of different contrast ratios (upper) with corresponding brightness profiles (lower): A, high contrast ratio; B, medium contrast ratio; C, low contrast ratio.
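A minimal sketch of equation 1-3: the function below returns the contrast ratio for a pair of brightness values and reproduces the three example ratios. The individual brightness values passed to it are illustrative, since the text quotes only the resulting ratios.

def contrast_ratio(b_max: float, b_min: float) -> float:
    """Equation 1-3: CR = Bmax / Bmin. CR is infinite when Bmin = 0 and unity when Bmin = Bmax."""
    if b_min == 0:
        return float("inf")
    return b_max / b_min

# Brightness values are illustrative, chosen on the 0-to-10 scale used in Figure 1-5.
print(contrast_ratio(9, 2))   # 4.5  high contrast
print(contrast_ratio(5, 2))   # 2.5  medium contrast
print(contrast_ratio(3, 2))   # 1.5  low contrast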
Images with a low contrast ratio are commonly referred to as "washed out," with monotonous, nearly uniform tones of gray. Low contrast may result from the following causes:

1. The objects and background of the scene may have a nearly uniform electromagnetic response at the particular wavelength band that the remote sensing system recorded. In other words, the scene has an inherently low contrast ratio.
2. Scattering of electromagnetic energy by the atmosphere can reduce the contrast of a scene. This effect is most pronounced in the shorter wavelength portions of the photographic remote sensing band, as described in Chapter 2.
3. The remote sensing system may lack sufficient sensitivity to detect and record the contrast of the terrain. Incorrect recording techniques can also result in low-contrast images even though the scene has a high contrast ratio when recorded by other means.

A low contrast ratio, regardless of the cause, can be improved by digital enhancement methods, as described in Chapter 8.

Spatial Resolution and Resolving Power

This book defines spatial resolution as the ability to distinguish between two closely spaced objects on an image. More specifically, it is the minimum distance between two objects at which the images of the objects appear distinct and separate. Objects spaced more closely than the resolution limit will appear as a single object on the image. Forshaw and others (1983) discuss alternative definitions of spatial resolution.

Resolving power and spatial resolution are two closely related concepts. The term resolving power applies to an imaging system or a component of the system, whereas spatial resolution applies to the image produced by the system. For example, the lens and film of a camera system each have a characteristic resolving power that, together with other factors, determines the resolution of the photographs.

Spatial resolution of a photographic system is customarily determined by photographing a standard resolution target, such as the one shown in Figure 1-6A, under specified conditions of illumination and magnification. The resolution targets, or bar charts, consist of alternating black and white bars of equal width called line-pairs. Spacing of resolution targets is expressed in line-pairs. For the target with 5 line-pairs · cm⁻¹, each black bar is 0.1 cm wide and separated by a white bar of the same width. The photograph is viewed under magnification, and the observer determines the most closely spaced set of line-pairs for which the bars and spaces are discernible. Spatial resolution of the photographic system is stated as the number of line-pairs per millimeter of the resolved target. Human judgment and visual characteristics are critical components in this analysis, which therefore is not completely objective and reproducible. Spatial resolution is different for objects of different shape, size, arrangement, and contrast ratio. An alternative method of describing resolution is the modulation transfer function (MTF), which employs a bar chart with progressively closer spacing of the bars (McKinney, 1980).

Figure 1-6 Resolution and detection targets with high contrast ratio: A, resolution targets of 3 to 7 line-pairs · cm⁻¹; B, detection targets 0.2 to 1.0 mm wide. View this chart from a distance of 5 m (16.5 ft). For A, determine the most closely spaced set of bars you can resolve. For B, determine the narrowest line you can detect.

An alternative method of measuring spatial resolution is angular resolving power, which is defined as the angle subtended by imaginary lines passing from the imaging system to two targets spaced at the minimum resolvable distance. Angular resolving power is commonly measured in radians. As shown in Figure 1-7, a radian (rad) is the angle subtended by an arc BC of a circle having a length equal to the radius AB of the circle. Because the circumference of a circle has a length equal to 2π times the radius, there are 2π, or 6.28, rad in a circle. A radian corresponds to 57.3° or 3438 min, and one milliradian (mrad) is 10⁻³ rad. In the radian system of angular measurement,

angle = L / r    (1-4)

where L is the length of the subtended arc and r is the radius of the circle. A convenient relationship is that at a distance r of 1000 units, 1 mrad subtends an arc of 1 unit.

Figure 1-7 Radian system of angular measurement. Arc BC equals radius AB, so angle BAC = 1 radian.

Figure 1-8 illustrates the angular resolving power of a remote sensing system (the eye) that can resolve the center bar chart of Figure 1-6 at a distance of 5 m. This chart has 5 line-pairs · cm⁻¹, and the bars are separated by 1 mm. For these targets with a high contrast ratio, the angular resolving power is 0.2 mrad.

Figure 1-8 Angular resolving power (in milliradians) for a remote sensing system that can resolve 5 line-pairs · cm⁻¹ at a distance of 5 m.

Resolving power and spatial resolution will be discussed for each remote sensing system described in this book, but you should remember the following points:

1. Theoretical resolving power of a system is rarely achieved in actual operation.
2. Resolution alone does not adequately determine whether an image is suitable for a particular application.

Resolution is the minimum separation between two objects for which the images appear distinct and separate; it is not the size of the smallest object that can be seen. By knowing the resolution and scale of an image, however, one can estimate the size of the smallest detectable object.
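The radian relationships above can be verified with a few lines of code; this sketch applies equation 1-4 and checks the rule that 1 mrad subtends an arc of 1 unit at a distance of 1000 units.

import math

def angle_rad(arc_length: float, radius: float) -> float:
    """Equation 1-4: angle (radians) = L / r, for an arc of length L at radius r."""
    return arc_length / radius

print(angle_rad(1.0, 1000.0) * 1000.0, "mrad")   # 1.0 mrad subtends 1 unit at 1000 units
print(round(math.degrees(1.0), 1))               # 57.3 degrees in one radian
print(round(2 * math.pi, 2))                     # 6.28 rad in a full circle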
Other Characteristics of Images

Detectability is the ability of an imaging system to record the presence or absence of an object, although the identity of the object may be unknown. An object may be detected even though it is smaller than the theoretical resolving power of the imaging system.

Recognizability is the ability to identify an object on an image. Objects may be detected and resolved and yet not be recognizable. For example, roads on an image appear as narrow lines that could also be railroads or canals. Unlike resolution, there are no quantitative measures for recognizability and detectability. It is important for the interpreter to understand the significance and correct use of these terms. Rosenberg (1971) summarizes the distinctions between them.

A signature is the expression of an object on an image that enables the object to be recognized. Characteristics of an object that control its interaction with electromagnetic energy determine its signature. For example, the spectral signature of an object is its brightness measured at a specific wavelength of energy.

Texture is the frequency of change and arrangement of tones on an image. Fine, medium, and coarse are qualitative terms used to describe texture.

An interpretation key is a characteristic or combination of characteristics that enables an object to be identified on an image. Typical keys are size, shape, tone, and color. The associations of different characteristics are valuable keys. On images of cities, one may recognize single-family residential areas by the association of a dense street network, lawns, and small buildings. The associations of certain landforms and vegetation species are keys for identifying different types of rocks.

VISION

Of our five senses, two (touch and vision) detect electromagnetic radiation. Some of the nerve endings in our skin detect thermal IR radiation as heat but do not form images. Vision is the most important sense and accounts for most of the information input to our brain. Vision is not only an important remote sensing system in its own right, but it is also the means by which we interpret the images produced by other remote sensing systems. The following section analyzes the human eye as a remote sensing system. Much of the information is summarized from Gregory (1966).
Structure of the Eye

For such a complex structure, the human eye (Figure 1-9) appears deceptively simple. Light enters through the clear cornea, which is separated from the lens by fluid called the aqueous humor. The iris is the pigmented part of the eye that controls the variable aperture called the pupil. It is commonly thought that variations in pupil size allow the eye to function over a wide range of light intensities. However, the pupil varies in area over a ratio of only 16:1 (that is, the maximum area is 16 times the minimum area), whereas the eye functions over a brightness range of about 100,000:1. The pupil contracts to limit the light rays to the central and optically best part of the lens, except when the full opening is needed in dim light. The pupil also contracts for near vision, increasing the depth of field for near objects.

Figure 1-9 Structure of the human eye.

A common misconception is that the lens refracts (bends) the incoming rays of light to form the image. The amount that light bends when passing through two adjacent media is determined by the difference in the refractive indices (n) of the two media; the greater the difference, the greater the bending. For the eye, the maximum difference is between air (n = 1.0) and the cornea (n = 1.3), and this interface is where the maximum light refraction occurs. Although the lens is relatively unimportant for forming the image, it is important in accommodating, or focusing, for near and far vision. In cameras, this accommodation is done by changing the position of the lens relative to the film. In the human eye, the shape rather than the position of the lens is changed by muscles that vary the tension on the lens. For near-vision tension, the muscles release, allowing the lens to become thicker in the center and assume a more convex cross section. With age, the cells of the lens harden and the lens becomes too rigid to accommodate for different distances; this is the time in life when bifocal glasses may become necessary to provide for near and far vision.

An inverted image is focused on the retina, a thin sheet of interconnected nerve cells that includes the light receptor cells called rods and cones, which convert light into electrical impulses. The rods and cones receive their names from their longitudinal shapes when viewed microscopically. The cones function in daylight conditions to give color, or photopic, vision. The rods function under low illumination and give vision only in tones of gray, called scotopic vision. Rods and cones are not uniformly distributed throughout the retinal surface. The maximum concentration and organization of receptor cells is in the fovea (Figure 1-9), a small region at the center of the retina that provides maximum visual acuity. You can demonstrate the existence and importance of the fovea by concentrating on a single letter on this page. The rest of the page and even the nearby words and letters will appear indistinct because they are outside the field of view of the fovea. The eye is in continual motion to bring the fovea to bear on all parts of the page or scene. Close to the fovea is the blind spot, where the optic nerve joins the eye and there are no receptor cells. The electrical impulses from the receptor cells are transmitted to the brain, which interprets them as visual perception.
Resolving Power of the Eye

The diameter of the largest receptor cells in the fovea (3 μm) determines the resolving power of the eye. Multiplying this diameter by the refractive index of the vitreous humor (n = 1.3) determines an effective diameter (4 μm) for the receptor cells. The image distance, or distance from the retina to the lens, is about 20 mm, or 20,000 μm. The effective width of the receptors is therefore 4/20,000 (1/5000) of the image distance. Image distance is proportional to object distance, which is the distance from the eye to the object. An object forms an image that fills the width of a receptor if the object width is 1/5000 the object distance. Therefore, adjacent objects must be separated by 1/5000 the object distance for their images to fall on alternate receptors and be resolved by the eye.

You can estimate the resolving power of your eyes in the following manner. View the resolution targets of Figure 1-6 at a distance of 5 m (16.4 ft), and determine the most closely spaced set of line-pairs that you can resolve. Also determine the narrowest of the bars in Figure 1-6B that you can detect. Make these determinations now, before reading further, because the following text may influence your perception of the targets.

For the high-contrast resolution targets of Figure 1-6A at a distance of 5 m, the normal eye should be able to resolve the middle set that has 5 line-pairs · cm⁻¹; the black and white bars are 1 mm wide. The instantaneous field of view (IFOV) of any detector is the solid angle through which a detector is sensitive to radiation. Equation 1-4 is used to calculate the IFOV of the eye, where the radius (r) is 5000 mm and the length of the subtended arc (L) is 1 mm:

IFOV = L/r = 1 mm / 5000 mm = 0.2 × 10⁻³ rad = 0.2 mrad

Figure 1-8 shows the relationships of the resolution targets to the IFOV of the eye. The 0.2-mrad IFOV of the eye means that at a distance of 1000 units, the eye can resolve high-contrast targets that are spaced no closer than 0.2 units.
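A short sketch of the arithmetic in this subsection, under its stated assumptions (receptor spacing of 1/5000 of the object distance, targets viewed at 5 m); the function names are illustrative.

def eye_min_separation_mm(object_distance_mm: float) -> float:
    """Adjacent objects must be separated by about 1/5000 of the object distance to be resolved."""
    return object_distance_mm / 5000.0

def ifov_mrad(arc_length_mm: float, radius_mm: float) -> float:
    """Equation 1-4 expressed in milliradians."""
    return (arc_length_mm / radius_mm) * 1000.0

distance_mm = 5000.0                                  # targets viewed at 5 m
separation = eye_min_separation_mm(distance_mm)       # 1.0 mm, matching the 5 line-pairs/cm target
print(separation, "mm")
print(ifov_mrad(separation, distance_mm), "mrad")     # 0.2 mrad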
Detection Capability of the Eye

When the detection targets of Figure 1-6B are viewed from a distance of 5 m, most readers can detect the narrowest bar, which is 0.2 mm wide. Recall, however, that at this distance the minimum separation at which bar targets can be resolved is 1.0 mm. This test illustrates the difference between resolution and detection. Detection is influenced not only by the size of objects but also by their shape and orientation. For example, if dots are used in place of lines in Figure 1-6B, the diameter of the smallest detectable dot would be considerably larger than 0.2 mm.

Effect of Contrast Ratio on Resolution and Detection

The resolution and detection targets in Figure 1-10 have the same spacing as those in Figure 1-6, but the contrast ratio has been reduced by the addition of a gray background. To evaluate the effect of the lower contrast ratio, view Figure 1-10 from a distance of 5 m, and determine which targets can be resolved and detected. Using this figure, most readers can resolve only 3 line-pairs · cm⁻¹, and the smallest detectable target is the 0.6-mm-wide line. These dimensions are larger than the 5 line-pairs · cm⁻¹ and the 0.2-mm line of the high-contrast target and demonstrate the effect of a lower contrast ratio on resolution and detection.

Figure 1-10 Resolution and detection targets with medium contrast ratio. View this chart from a distance of 5 m (16.5 ft). For A, determine the most closely spaced set of bars you can resolve. For B, determine the narrowest line you can detect. Compare these values with those determined for Figure 1-6.

REMOTE SENSING SYSTEMS

The eye is a familiar example of a remote sensing system. The inorganic remote sensing systems described in this book belong to two major categories: framing systems and scanning systems.

Framing Systems

Framing systems instantaneously acquire an image of an area, or frame, on the terrain. Cameras and vidicons are common examples of such systems (Figure 1-11). A camera employs a lens to form an image of the scene at the focal plane, which is the plane at which the image is sharply defined. A shutter opens at selected intervals to allow light to enter the camera, where the image is recorded on photographic film. A vidicon is a type of television camera that records the image on a photosensitive electronically charged surface. An electron beam then sweeps the surface to detect the pattern of charge differences that constitute the image. The electron beam produces a signal that may be transmitted and recorded on magnetic tape for eventual display on film.

Figure 1-11 Framing system for acquiring remote sensing images. Cameras and vidicons are framing systems.

Successive frames of camera and vidicon images may be acquired with forward overlap (Figure 1-11). The overlapping portions of the two frames may be viewed with a stereoscope to produce a three-dimensional view, as described in Chapter 2. Film is sensitive only to portions of the UV, visible, and reflected IR regions (0.3 to 0.9 μm). Special vidicons are sensitive into the thermal band of the IR region. A framing system can instantaneously image a large area because the system has a dense array of detectors located at the focal plane: the retina of the eye has a network of rods and cones, the emulsion of camera film contains tiny grains of silver halide, and a vidicon surface is coated with sensitive phosphors.

Scanning Systems

A scanning system employs a single detector with a narrow field of view that is swept across the terrain to produce an image. When photons of electromagnetic energy radiated or reflected from the terrain encounter the detector, an electrical signal is produced that varies in proportion to the number of photons. The electrical signal is amplified, recorded on magnetic tape, and played back later to produce an image. All scanning systems sweep the detector's field of view across the terrain in a series of parallel scan lines. Figure 1-12 shows the four common scanning modes: cross-track scanning, circular scanning, along-track scanning, and side scanning.
Cross-Track Scanners

The widely used cross-track scanners employ a faceted mirror that is rotated by an electric motor, with a horizontal axis of rotation aligned parallel with the flight direction (Figure 1-12A). The mirror sweeps across the terrain in a pattern of parallel scan lines oriented normal (perpendicular) to the flight direction. Energy radiated or reflected from the ground is focused onto the detector by secondary mirrors (not shown). Images recorded by cross-track scanners, and other scanner systems, are described by two characteristics: spectral resolution and spatial resolution.

Spectral resolution refers to the wavelength interval that is recorded by a detector. In Figure 1-13 the vertical scale shows the response, or signal strength, of a detector as a function of wavelength, shown on the horizontal scale. As the wavelength increases, the detector response increases to a maximum and then decreases. Spectral resolution, or bandwidth, is defined as the wavelength interval recorded at 50 percent of the peak response of a detector. In Figure 1-13 the 50 percent limits occur at 0.50 and 0.60 μm, corresponding to a bandwidth of 0.10 μm. The section on Reflectance Spectra from Hyperspectral Data describes the effects of different bandwidths on image data.

Figure 1-13 Spectral resolution, or bandwidth, of a detector. Bandwidth of this detector is 0.10 μm.

Spatial resolution was defined earlier, using the eye as an example. The physical dimensions of a detector determine its spatial resolution, which is expressed as angular resolving power, measured in milliradians (mrad). Angular resolving power determines the IFOV (defined earlier). As shown in Figure 1-12, the IFOV subtends an area on the terrain called a ground resolution cell. Dimensions of a ground resolution cell are determined by the detector IFOV and the altitude of the scanning system. A detector with an IFOV of 1 mrad at an altitude of 10 km subtends a ground resolution cell of 10 by 10 m.

Figure 1-14 shows the important relationship between size of ground resolution cells and spatial resolution of images. The different cell sizes for Figure 1-14B-D were produced by computer processing of the original 10-m data of Figure 1-14A. The images cover a portion of the town of Victorville, California, and the adjacent desert terrain (Figure 1-15). Resolving fine spatial detail of the urban area requires 10-by-10-m cells. Larger linear features, such as the fault and the Mojave River, are detectable with 30-by-30-m cells. The images in Figure 1-14 are shown at a large scale (1:24,000), where the 10-m version is obviously optimum. At smaller scales (1:100,000), however, larger cells, such as 30 m (Figure 1-14C), produce readily interpretable images.

Figure 1-14 Images displayed with different ground resolution cells (10 by 10 m to 60 by 60 m). Spatial resolution is determined by the size of the cell. The area is a portion of Victorville in southern California.

Figure 1-15 Location map of Victorville, California, showing land cover (native vegetation, urban areas, bare soil, and bedrock).
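The two resolution measures used above can each be expressed in a few lines. The sketch below computes a ground resolution cell from a detector IFOV and platform altitude (1 mrad at 10 km gives a 10-by-10-m cell) and estimates a bandwidth as the wavelength interval at 50 percent of peak response; the detector response curve is a hypothetical stand-in for Figure 1-13.

import numpy as np

def ground_cell_m(ifov_mrad: float, altitude_km: float) -> float:
    """Ground resolution cell dimension = IFOV (in radians) * altitude."""
    return (ifov_mrad * 1e-3) * (altitude_km * 1e3)

print(ground_cell_m(1.0, 10.0))   # 10.0 m, the example in the text

def bandwidth_um(wavelengths_um: np.ndarray, response: np.ndarray) -> float:
    """Wavelength interval recorded at 50 percent of the detector's peak response."""
    half_peak = response.max() / 2.0
    within = wavelengths_um[response >= half_peak]
    return float(within.max() - within.min())

# Hypothetical response curve peaking near 0.55 um, roughly like Figure 1-13.
wl = np.linspace(0.40, 0.70, 301)
resp = np.exp(-0.5 * ((wl - 0.55) / 0.042) ** 2)
print(round(bandwidth_um(wl, resp), 2))   # ~0.10 um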
The angular field of view (Figure 1-12A) is that portion of the mirror sweep, measured in degrees, that is recorded as a scan line. The angular field of view and the altitude of the system determine the ground swath, which is the width of the terrain strip represented by the image. Ground swath is calculated as

ground swath = 2 × altitude × tan(angular field of view / 2)

The distance between the scanner and terrain is greater at the margins of the ground swath than at the center of the swath. As a result, ground resolution cells are larger toward the margins than at the center, which results in a geometric distortion that is characteristic of cross-track scanner images. This distortion is corrected by digital processing, as shown in Chapter 8. At the high altitude of satellites, a narrow angular field of view is sufficient to cover a broad swath of terrain. For this reason the rotating mirror is replaced by a flat mirror that oscillates back and forth through an angle of approximately 15°. An example is the scanner of Landsat, described in Chapter 3.

The strength of the signal generated by a detector is a function of the following factors:

Energy flux The amount of energy reflected or radiated from terrain is the energy flux. For visible detectors, this flux is lower on a dark day than on a sunny day.

Altitude For a given ground resolution cell, the amount of energy reaching the detector is inversely proportional to the square of the distance. At greater altitudes the signal strength is weaker.

Spectral bandwidth of the detector The signal is stronger for detectors that respond to a broader bandwidth of energy. For example, a detector that is sensitive to the entire visible range will receive more energy than a detector that is sensitive to a narrow band, such as visible red.

Instantaneous field of view Both the physical size of the sensitive element of the detector and the effective focal length of the scanner optics determine the IFOV. A small IFOV is required for high spatial resolution but also restricts the signal strength (amount of energy received by the detector).

Dwell time The time required for the detector IFOV to sweep across a ground resolution cell is the dwell time. A longer dwell time allows more energy to impinge on the detector, which creates a stronger signal.

For a cross-track scanner, the dwell time is determined by the detector IFOV and by the velocity at which the scan mirror sweeps the IFOV across the terrain. As shown in Figure 1-16A, a typical airborne scanner with a detector IFOV of 1 mrad, a 90° angular field of view, and operating at 2 × 10⁻² sec per scan line at an altitude of 10 km has a dwell time of 1 × 10⁻⁵ sec per ground resolution cell. It is instructive to compare the dwell time with the ground speed of the aircraft. At a typical ground speed of 720 km · h⁻¹, or 200 m · sec⁻¹, the aircraft crosses the 10 m of a ground resolution cell in 5 × 10⁻² sec. The cross-track scanner dwell time of 1 × 10⁻⁵ sec is 5 × 10³ times faster than the ground velocity of the aircraft. The high scanner speed relative to ground speed is required to prevent gaps between adjacent scan lines.

The short dwell time of cross-track scanners imposes constraints on the other factors that determine signal strength. For example, the IFOV and spectral bandwidth must be large enough to produce a signal of sufficient strength to overcome the inherent electronic noise of the system. The signal-to-noise ratio must be sufficiently high for the signal to be recognizable.

Figure 1-16 Dwell time calculated for cross-track (A, with 2000 ground resolution cells per scan line) and along-track (B) scanners.
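A sketch of the two calculations above, under the nadir-looking geometry assumed there: the ground swath from the angular field of view and altitude, and the cross-track dwell time obtained by dividing the scan-line time by the number of ground resolution cells per line (2 × 10⁻² sec and 2000 cells in the Figure 1-16A example).

import math

def ground_swath_km(angular_field_deg: float, altitude_km: float) -> float:
    """Swath width = 2 * altitude * tan(angular field of view / 2)."""
    return 2.0 * altitude_km * math.tan(math.radians(angular_field_deg) / 2.0)

def cross_track_dwell_s(scan_line_time_s: float, cells_per_line: int) -> float:
    """Dwell time per ground resolution cell = scan-line time / cells per line."""
    return scan_line_time_s / cells_per_line

print(round(ground_swath_km(90.0, 10.0), 1))   # 20.0 km for a 90-degree field at 10 km altitude
print(cross_track_dwell_s(2e-2, 2000))         # 1e-05 sec per cell, as in the text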
Circular Scanners

In a circular scanner the scan motor and mirror are mounted with a vertical axis of rotation that sweeps a circular path on the terrain (Figure 1-12B). Only the forward portion of the sweep is recorded to produce images. An advantage of this system is that the distance between scanner and terrain is constant and all the ground resolution cells have the same dimensions. The major disadvantage is that most image-processing and display systems are designed for linear scan data; therefore the circular scan data must be extensively reformatted prior to processing.

Circular scanners are used for reconnaissance purposes in aircraft. The axis of rotation is tilted forward to acquire images of the terrain well in advance of the aircraft position. The images are displayed in real time on a screen in the cockpit to guide the pilot. Airborne circular scanners with IR detectors are called FLIR (forward-looking IR) systems.

Along-Track Scanners

For scanner systems to achieve finer spatial and spectral resolution, the dwell time for each ground resolution cell must be increased. One method is to eliminate the scanning mirror and provide an individual detector for each ground resolution cell across the ground swath (Figure 1-12C).
Cross-track —_Along-track scanner scanner Angular field of view Wider Narrower Mechanical system Complex Simple Optical systerm Simple Complex Spectral range Wider range Narrower range, of detectors but expanding Dwell ume Shorer Longer The selection of a scanner system involves a number of cholees, or trade-offs, Cross-track scanners are generally pre- ferred for reconnaissance surveys because the wider angular field of view records images with a wide swath width. Along- track scanners are preferred for recording detailed spectral and spatial information. The longer dwell time can accommodate detectors with a narrow bandwidth ora small IFO. SPECTRAL REFLECTANCE CURVES ‘The various framing and scanning systems record. images. Another important aspect of remote sensing is the acquisition of nonimaging data. Spectral reflectance curves, or reflectance spectra, record the percentage of incident energy, typically sunlight, that is reflected by a material as a function of wave~ length of the energy. Figure 1-17 shows reflectance spectra of vegetation and typical rocks, The horizontal axis shows the wavelength of incident energy, which ranges from the visible into reflected IR spectral regions. The vertical axis shows the percentage of incident energy reflected at the different wav lengths. The curves are offset vertically to prevent confusing overlaps. Downward excursions of a curve are called absorp- ‘ion features because they represent absorption of incident en- ergy. Upward excursions are called reflectance peaks. These speciral features are valuable clues for recognizing matcrils, such as the rocks in Figure 1-17. on remote sensing images. For this reason reflectance spectra of many materials have Matticpsctea HHH HA Daedalus Mutispectral Scanner (10 bands) HHH GER Hyperspectral Scanner (63 bards} 2abands bands icon (k bande) s2bands AVIRIS Hyperspectral Scanner (224 bends) or Aumosphors R&B \_abserpiida Bonds Reflectance, % 0.50 1.00 1.50 2.00 2.50 Wavelength, um Figure 1-17 Reflectance spectra of rocks and vegetation. Special ‘bands of typical multispectral ané hyperspectal systems are shown. ‘been recorded and published. Huni (1980) published a number of mineral spectra and explained the interactions beiween en- ergy and matter that cause the spectral features at different ‘wavelengths. Hunt (1980) also provided references to his ex- tensive publications of spectra of rocks and minerals. Clark 4 others (1990) described laboratory spectroscopy and the ccauses of absorption features in mineral spectra. They also pro: vide an extensive collection of mineral spectra. Grove and oth ers (1992) published laboratory spectra of 150 minerals, Price (1995) describes 11 published collections of reflectance spec {a In the visible and reflected IR regions (0.4 to 2.5 jm) that inelude soils. agriculture, grasses. shrubs, rocks. minerals, fab rics, metals, and building materials, Price assembled the 3417 SPEGTRALREFLECTANCE CURVES 19 Figure 1-18 Spectrometer for recording reflectance spectra inthe field or laboratory. Courtesy Analytical Spectral Devices, Ine. Roulder. CO. specira into a standardized digital format on a personal com- puter diskette, Copies of the diskette are available for research purposes from J.C. Price Beltsville Agricultural Research Center USDA Agricultural Research Service Beltsville, MD 20705 ‘These spectra should be useful for comparing and identifying speeira recorded from hyperspectral scanners, which are de- scribed in a following section. 
Refleciance spectra are recorded by instruments called re- flectance spectrometers. Figure 1-18 is a typical spectrometer that may be used in the field. In the laboratory it uses a light source that matches the characteristics of sunlight, Modern spectrometers record spectra in a digital format that may be displayed directly in real time on the screen of a laptop com- puter, Some spectrometer systems provide « digital reference library of spectra of known materials, such as rocks, soils, and vegetation. Optional software can compare the spectrum of an unknown material with the library and provide « possible iden- tification, This. spectral matching technique is subject to ‘misidentifications, however, as analyzed by Price (1994). Some manufacturers of spectrometers suitable for remote sensing are listed below; readers may contact them for infor- ‘mation on specifications, optional equipment, and cost. Analytical Spectral Devices (ASD) 4760 Walnut Sueet, Suite 105 Boulder, CO 80301 Phone: 303-444-6522 20 INTRODUCTION TO CONCEPTS AND SYSTEMS Geophysical & Environmental Research Corp. (GER) ‘One Bennett Common, Millbrook, NY 12545 Phone 914-677-6100 Integrated Spectronics Pry. Ld. P.O. Box 437 Baulkhau Fill NSW, 2153, Australia Phone: (2-887-8760 ‘The ground resolution cell of a handheld spectrometer cov- ers only a few square centimeters. The ground resolution cell ‘ofa scanner in an aircraft or satellite covers tens to hundreds of ‘square meters, Because of this difference in coverage one must be cautious when using spectra to evaluate scanner images. Longshaw (1976) analyzed and described the problems of us ing laboratory spectra to interpret images of rock outcrops. MULTISPECTRAL IMAGING SYSTEMS A mutispectral image is an array of simultaneously acquired images that record separate wavelength intervals, or bands. Multispectral images differ from images such as conventional photographs, which record a single image that spans a broad spectral range. Much of this book deals with multispectral images that are acquired in all the remote sensing. spectral regions. Slater (1985) provides a survey of multispectral systems, Mutkispectral systems differ in the following characteristics: + Imaging technology —framing or scanning method ‘+ Total spectral range recorded + Number of spectral bands recorded + Range of wavelengths recorded by each spectral band (bandwidth) ‘The upper portion of Figure 1-17 shows the wavelength bands recorded by representative aircraft multispectral systems, Which are listed in Table 1-4 and described in the following sections. Multispectral systems are also widely used in satel lites, as described in subsequent chapters. Muhispectral im- ages are acquired by two methods: framing systems or scan- ning systems, Multispectral Framing Systems ‘The simplest multispectral framing systems consist of several ‘cameras or vidicons mounted together and aligned to acquire Simultaneous multipe images of an area, Today the preponder- ance of multispectral imagery is acquired by scanners, but vvidicon framing systems serve a useful purpose. Multispectral Cameras Multispectral cameras employ 4 range of film and filter combinations to acquire black-and- Table 1-4 Reprosentaive aircraft mulispectral imaging systems System Technology Vidicon Framing (vidicon) Daedalus scanner Scanning (cross-track) GER scanner Scanning (crosy-track) AVIRIS scanner Scanning (cross-track) SFSI scanner ‘Scanning (along-track) white photographs that record narrow spectral bands. 
The shu ters are linked together and triggered simultaneously. The systems are also called multiband cameras. ‘These were the original multispeciral systems and are mentioned for historical purposes because they have essentially been replaced by vidi- ‘con systems. Lowman (1969) and the National Aeronautics and Space Administration (NASA, 1977) show examples of mulispeetral photographs. Muttispectral Vidicons Multispectral vidicons operate in two modes: (1) two or more individual systems record images a different wavelength bands, or 2) a single sysxem records multiple bands. Several multispectral vidicon systems have been configured; these range from two to four black-and-white vidicons that record narrow bands in the visible and reflecied Scan SQ)_Mirror ‘Angular Field of View A Crossirack Spectral range, Bandwidth, wa Bends um 0.40 100.90 4 0.10 0.38 10 1.10 0 0.03 10 0.20 0.50 102.50 a 0.025 w 0.175 040 102.50 208 0.010 ns 0.010 IR regions. Marsh and others (1991) used a two-vidicon sys- tem that records red and reflected IR. bands to analyze a haz- andous waste sie in Arizona, The system was also used 10 as- sess land cover in the Mato Grosso, Brazil (Marsh, Walsh, and ‘Sobrevila, 1994). Neale and Crowiher (1994) assembled a sys tem from three commercial vidicons that eecords green, red, and reflected IR bands. ‘Table 1-4 lisis the characteristics of a commercial aircraft vidicon system that acquires four spectral bands (blue, green, red, and reflected IR) that can be composited into color im- ages. Monday and others (1994) used such images to map the city of Irving, Texas (70 mit), in 8 months, rather than the 3 years estimated for conventional mapping. King (1995) de scribes the technology and applications of multispectral vidi ‘can systems and provides references to related publications. B.Along:track Figure 1-19 Multispectral scanner systems, MUCTSPECTRALIMAGING SYSTEMS 21 INTRODUCTION TO CONCEPTS AND SYSTEMS 2 ‘Multispectral Scanning Systems Mutispeceral scanner systems are widely used to acquire im- ages from aircraft and satellites. Both cross-irack and along- track sysiems are used, Cross-Track Multispectral Scanner Images Cross-rack seamers employ aspecirometer to disperse the incoming energy ito a spectrum (Figure 1-19A). Detectors are positioned to record specific wavelength bunds of energy (denoted 2y + «in the figure). Figure 120 shows images of San Pablo Bay, California, acquired by a cross-track multispectral scanner ‘manufactured by the Daedalus Corporation. Figure 1-21 shows the categories of land use and land cover in the San Pablo Bay area, which is the northern extension of San Francisco Bay. Table 1-5 lists the characteristics of the scanner and the 10 ‘multispectral images. Plate 1A is color composite image that ‘was prepared by projecting bands 2 (blue), 4 (green). and 7 (red) in blue, green, and red, respectively. Many other color combinations can easily be ereated The black-and-white images of the individual spectral bands (Figore 1-20) demonstrate the relationships among wave- Fength, atmospheric scattering. contrast ratio, and spatial reso- lution, Band | in the UV and blue region records the shortest wavelengths of all the bands and has the maximum atmos- phere scattering, resulting in alow contrast ratio and poor spa~ tial resolution. 
The network of sireets in the city of Vallejo is a useful resolution target; as the wavelength of the images in- creases, the ability to discem the streets improves and reaches ‘ maximum in the reflected IR region (bands 8, 9, nd 10). Figure 1-21. Land-se and land-cover types ofthe San Pablo Bay area, California. interpreted from airraft multispectral scanner images, TABLE 1-5 Daedalus aircraft multispectral scanner and images Aireralt altitude 19.5 km ‘Scanner /FOV 1.25 mrad Ground resolution cell 24 by 24m ‘Scan angle ay Image swath width 147 km Wavelength, Band* wm Spectral band 1 0.38 to 0.42 UV and blue 2 0.4 t0 0.45 Blue a 0.45 t0 0.50 Blue 4 0.50 t0 0.55 Green 5 055 to 0.60 Green 6 0.60 10 0.65 Red 1 0.65 t0 0.70 Red 8 0.70 to. 4.80 Reflected IR 9 0.80 t0 0.90 Reflected IR 10 0.900 1.10 Reflected IR 4, and Tin blue gren, and red produss a nema color "Combining tand ‘image Water, vegetation, and urban areas are the major types of land cover and land use in the San Pablo Bay area. In Figure 1-17 the bandwidth for each multispectral image can be com ported tothe spectral reflectance curve for vegetation. Vegeta tion has a higher reflectance inthe green band than in the blue and red bands where chlorophyll absorbs energy. The re flectance of vegetation inereases abruptly in the reflected IR region. These spectral characteristics of vegetation are also seen in the brightness signatures of vegetated hills in the im ages at corespondirg wavelengths. The signature of water is also different in the various spectral bunds. In San Pablo Bay. patterns of suspended silt are obvious in the visible bands; in the IR bands, however, water has s uniform dark signature be cause these wavelengths are completely absorbed. Some of the salt evaporating ponds, identified in Figure 1-21, have red and pink signatures in Plate 1A because of red microorganisms. Along-Track Multispectral Scanner Images Along-track multispectral scanners employ multiple linear arrays of detec: tors with each array recording 4 separate band of energy (Figure 1-19B). Because of the extended dwell time, she detec tor bandwidth may be narrow and produce an adequate signal. Along track scannerimages acquited by SPOT and other satel: lite systems are illusiated in Chapier 4, Side-Scanning Multispectral Images Aircraft and satel- Tite radar systemts have been developed that record two or more wavelengths of microwave energy. Typical images are shown inChapter 7 MULTISPECTRALIMAGNG SYSTEMS — 23, TUTTI FOTO OT TTT TIT TT: INTRODUCTION TO CONCEPTS AND SYSTEMS 24 HYPERSPECTRAL SCANNING SYSTEMS From the beginning of remote sensing, imaging technology has. advanced in two major ways: 1. Improvement in the spatial resolution of images accom: plished primarily by decreasing the [FOV of detectors, 2. Improvement in the spectral resolution of images—accom- plished by increasing the number of spectral bands and de creasing the bandwidth of each band. Conventional muhispectral scanners record up t0 10, oF so, speciral bands with bandwidths on the order of 0.10 yum. Hyperspectral scanners are & special type of multispectral scanner that records many tens of bands with bandwidths on the order of (LOL um, Today these systems are used only on aircraft, but eventually they will be carried on satellites. Table 1 lists characteristics of some current hyperspecttal scanners. GER Hyperspectral Scanner Figure 1-22 shows the 63 hyperspectral bancs recorded by & system developed by Geophysical & Environmental Research, Incorporated (GER). 
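Before turning to hyperspectral systems, the scanner geometry summarized in Table 1-5 can be checked with the usual small-angle relations for a cross-track scanner: ground resolution cell is approximately altitude times IFOV, and swath width is approximately 2 times altitude times the tangent of half the scan angle. These relations are standard scanner geometry rather than formulas quoted from this chapter; the numbers below come from the table.

```python
import math

# Check of the Daedalus scanner geometry in Table 1-5 using standard
# cross-track relations (an assumption noted in the text above).
altitude_m = 19_500      # aircraft altitude, 19.5 km
ifov_rad = 1.25e-3       # instantaneous field of view, 1.25 mrad
swath_m = 14_700         # image swath width, 14.7 km

ground_cell_m = altitude_m * ifov_rad
print(f"ground resolution cell ~ {ground_cell_m:.0f} m")   # ~24 m, as tabulated

# Solving swath = 2 * H * tan(theta / 2) for theta gives the total scan
# angle implied by the tabulated swath width.
theta_deg = 2 * math.degrees(math.atan(swath_m / (2 * altitude_m)))
print(f"implied total scan angle ~ {theta_deg:.0f} degrees")
```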
The area is the Cuprite mining district in west-central Nevada (Figure 1-23). The band number and wavelength are shown for every fifth image. In Figure 1-17 the tick marks show the spectral distribution of the GER image bands. Only seven bands are located in the interval of 1.00 to 2.00 μm, which is dominated by absorption bands caused by water vapor; 24 bands are in the region of 0.50 to 1.00 μm; 32 bands are in the region of 2.00 to 2.50 μm. The poor quality of bands 24 to 28 (Figure 1-22) is due to the absorption of energy by atmospheric water vapor. Hyperspectral images are recorded in digital format, which results in a large volume of data that can be analyzed using the image-processing techniques described in Chapter 8. Daedalus Enterprises, Inc., of Ann Arbor, Michigan, also manufactures an aircraft hyperspectral scanner that records 102 bands of imagery in the visible, reflected IR, and thermal IR regions.

AVIRIS Hyperspectral Scanner

Jet Propulsion Laboratory (JPL) developed a hyperspectral scanner system called the airborne visible/infrared imaging spectrometer (AVIRIS), which acquires 224 images, each with a spectral bandwidth of 10 nm, in the region of 0.4 to 2.5 μm (Figure 1-17). AVIRIS is carried in a NASA U-2 aircraft at an altitude of 20 km. The images have a swath width of 10.5 km and a spatial resolution of 20 m (Vane and others, 1993). Plate 1B is a color image of the Cuprite mining district and was prepared from the following AVIRIS bands: the band at 2.21 μm is shown in blue, 2.138 μm in green, and 2.088 μm in red. In Figure 1-17 the spectral positions of these bands are indicated by the letters R, G, and B along the AVIRIS range. In the color image of Plate 1B the blue-gray tones are volcanic rocks that have been replaced by silica (Figure 1-23). The surrounding orange and red tones are volcanic rocks that have been replaced to various degrees by clays and other minerals. The blue tones are younger volcanic rocks that have not been replaced by other minerals. The ability to recognize replacement minerals on AVIRIS images is important for mineral exploration, as described in Chapter 11.

Figure 1-23 Location map of Cuprite, Nevada, showing alluvium, playa deposits, Thirsty Canyon Tuff, silicified rocks, and Stonewall Playa. Scale bar: 1.0 mi / 1.0 km.

AVIRIS is currently an experimental system, but similar systems will become commercially available in the future. Specialized computer systems are required to process the large volumes of data in the 224 bands of imagery. JPL has convened a series of workshops where investigators have reported results of AVIRIS projects. Vane (1988) and Green (1990, 1991) have prepared proceedings volumes for these workshops. The journal Remote Sensing of Environment devoted one issue (vol. 44, nos. 2/3, May/June 1993) to a series of papers that describe a wide range of applications of AVIRIS images.

Reflectance Spectra from Hyperspectral Data

Hyperspectral scanners are also called imaging spectrometers. The narrow spectral bands of hyperspectral images may be converted into reflectance spectra (Van der Meer, 1994).

Figure 1-24 Spectra of the minerals kaolinite, alunite, and buddingtonite derived from AVIRIS data (solid lines) and measured by laboratory spectrometer (dotted lines); reflectance is plotted against wavelength from 2.00 to 2.40 μm. From Van der Meer (1994, Figure 3).

Figure 1-24 shows spectra that were calculated from AVIRIS data of Cuprite, Nevada.
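As described next, each AVIRIS spectrum in Figure 1-24 is an average over a 5-by-5 block of ground resolution cells. A minimal sketch of that averaging step is given below, using a synthetic data cube in place of a real AVIRIS scene; the array shape and values are assumptions made only for illustration.

```python
import numpy as np

# Synthetic hyperspectral cube: 224 bands over a small scene. A real AVIRIS
# scene would supply a (224, rows, cols) reflectance array instead.
bands, rows, cols = 224, 50, 50
rng = np.random.default_rng(2)
cube = rng.random((bands, rows, cols))

# Average over a 5-by-5 window of 20-m cells, i.e. 100 by 100 m on the ground.
row0, col0 = 20, 30                       # upper-left corner of the window
window = cube[:, row0:row0 + 5, col0:col0 + 5]
mean_spectrum = window.mean(axis=(1, 2))  # one reflectance value per band
print(mean_spectrum.shape)                # (224,)
```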
Each solid spectral curve represents an array of 5-by-5 ground resolution cells, which is an area of 100 by 100 m on the ground. The three curves represent areas where three different minerals (kaolinite, alunite, and buddingtonite) occur. Kaolinite is a clay mineral, alunite is aluminum sulfate, and buddingtonite is an ammonium feldspar. For each area the percentage of reflectance for each AVIRIS band is plotted as a function of wavelength. The values are connected to produce the solid curves in Figure 1-24. The dotted lines are laboratory spectra for the three minerals, which are similar to the AVIRIS spectra. The differences between the AVIRIS spectra and the spectrometer spectra are explained as follows: the ground resolution cells of AVIRIS include a variety of materials in addition to the predominant mineral, and these contaminate an AVIRIS spectrum, whereas a laboratory spectrum represents a pure sample of each mineral.

Figure 1-25 Spectra of the clay mineral kaolinite recorded by the Geoscan, GER, and AVIRIS hyperspectral scanners and by a laboratory spectrometer (continuous spectrum). Bandwidths of the scanners are shown. Vertical lines indicate absorption features that are keys for recognizing the kaolinite spectrum. Compiled from data in Kruse (1996).

Table 1-4 lists characteristics of a Canadian along-track scanner called the SFSI, which is an acronym for SWIR (short wave IR) Full Spectrum Imager. Neville and others (1995) describe the system and show spectra and images.

Figure 1-25 shows reflectance spectra for kaolinite that were produced by Kruse (1996) from data acquired by the GER (63 bands) and AVIRIS (224 bands) hyperspectral scanners. Also shown is a spectrum from the Geoscan multispectral scanner, which records up to 24 bands that are selected from 46 available bands. Figure 1-25 lists the bandwidths of these systems and shows the bandpass range graphically. For comparison, the bottom kaolinite spectrum was recorded by a laboratory spectrometer and provides the highest spectral resolution. The vertical lines show four key absorption features from the laboratory spectrum that are diagnostic for identifying the kaolinite spectrum. Only AVIRIS, with a 10-nm bandwidth, records all the key features. Similar results were obtained for spectra of buddingtonite and alunite. Despite their lower spectral resolution, the GER and Geoscan systems are useful for projects that do not require the identification of specific minerals (Kruse, 1996).

Table 1-6 Remote sensing journals and societies. Journals: Canadian Journal of Remote Sensing; Earth Observation Magazine; Geo Abstracts, G: Remote Sensing, Photogrammetry, and Cartography; Geocarto International; IEEE Transactions on Geoscience and Remote Sensing; International Journal of Remote Sensing.

In summary, the spectral resolution (bandwidth) of a remote sensing system (Figure 1-13) determines the ability to distinguish and identify materials based on their spectral characteristics.
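Identifying a material from its spectrum, as in the kaolinite example above, is often automated by comparing an image-derived spectrum against a library of known spectra, the approach whose pitfalls Price (1994) analyzed. The sketch below uses the angle between two spectra, treated as vectors, as the similarity measure; the measure, the tiny two-entry "library," and all numbers are illustrative assumptions rather than the procedure used in the studies cited here.

```python
import numpy as np

def spectral_angle(s1, s2):
    """Angle (radians) between two reflectance spectra treated as vectors;
    smaller angles indicate more similar spectral shapes."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical 6-band reflectance values (percent) in the 2.0 to 2.4 um region.
library = {
    "mineral_A": [42, 38, 22, 30, 35, 33],
    "mineral_B": [40, 39, 37, 25, 28, 34],
}
unknown = [41, 37, 24, 29, 34, 32]   # spectrum averaged over a pixel window

best = min(library, key=lambda name: spectral_angle(unknown, library[name]))
print("Closest library match:", best)
```

As the text notes, such matching is subject to misidentification, particularly when the image spectrum mixes several materials within one ground resolution cell.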
Spatial resolution (Figure 1-14) determines the ability todistinguish objects based on their geometric characteristics SOURCES OF REMOTE SENSING INFORMATION ‘Table 1-6 lists scientific journals devoted to remote sensing that are published by technical societies or commercial publishers, Publisher Canadian Aeronautics and Space Institute Saxe Building 60-75 Sparks Street Ottawa, Canada KIP SAS OM, Ine. 13741 E. Rice Place, Suite 125 Aurora, CO 80015 Geo Abstracts Regency House 34 Duke Street Norwich NR3 3AP- United Kingdom Geocarto Intemational Centre G.PO. Box 4122 Hong Kong, IEEE Remote Sensing and Geoscience Seciety Institute of Blectrical and Electronics Engineers 445 Hoes Lane Piscataway, NJ 08854 emote Sensing Society clo Taylor and Francis, Lid Rankine Road Basingstoke, Hants, RG24 OPR United Kingdom (concluded on the following page) SOURCES OF REMOTE SENSINGINFORMATION 27 28 Table 1-6 feoneluded) Journal Photogrammetric Engineering and Remote Sensing Reflections Remote Sensing in Canada Remute Sensing Newsletter Remote Sensing of Environment Remote Sensing Reviews Washingion Remote Sensing Lener Table 1-7 Remote sensing organizations and conferences. Organization Address Publisher American Society for Photogrammetry and Remote Sensing 210 Little Falls Steet Falls Church, VA. 22086 Radarsat International 3851 Shell Road, Suite 200 Richmond, BC Vox 2W2 Canada Canadian Remote Sensing Society 130 Slater Stet, Suite 818 Ottawa, Ontario KIP 6E2 Canada Geological Remote Sersing Group efo Dr. Suart Marsh British Geological Survey Keyworth, Nowingham NGI2 56G United Kingdom Elsevier Science Publishing Company 655 Avenue of the Americas New York, NY 10010-5107 Gordon and Breach Science Publishers PO, Box 786, Cooper Station New York, NY 10276 M. Felscer, Publisher P.O. Box 2075 Washington, DC 20013, Conference Environmental Research Inaitute PO, Box 134001 of Michigan ‘Ann Arbor, MI 48113 ROS Data Center Sioux Falls, SD 57108 of U.S. Geological Survey Jet Propulsion Laboratory 4800 Oak Grove Drive Pasadena, CA 91103 345 East 47ih Street New York, NY 10017 IERE Geoscience and Remote ‘Sensing Society INTRODUCTION TO CONCEPTS AND SYSTEMS “Remote Sensing of Environment” “Thematic Conferences on Remote Sensing” “Pecora Symposium on Remote Sensing” (annual conference) Publishes reports: conducts conferences and workshops “IEEE Intemational Geoscience and Remote ‘Sensing Symposium” (annual symposium) ‘Table 1-8 Remote sensing images and information on Internet an the Workwide Web Facility Data Canada Center for Landsat 5 and SPOT Remote Sensing FOSAT Landsat, IRS, JERS, ERS Goddard Space Flight Center. AVHRR. Japan NASDA JERS-1 Johrson Space Center Images from manned satellites Jet Propulsion Laboratory Educational Outreach Center Jet Propulsion Laboratory Imaging radar Jet Propulsion Laboratory Public Information Office NASA Starting point NASA Scientific and tech. info. National Oceanic end Atmospheric Administration National Oceanic and Atmospheric Administration Image catalog Weather satellite data Radarsat Canada Radarsat Syracuse University SIR-C teachers guide U.S. Geological Survey Global Land Information System USS. spy satellite images Weather images U.S. Geological Survey United Kingdom Membership in these societies is open to all investigators. 
Cracknell (1992) has published a directory of remote sensing journals and societies that sre based in Europe, In addition to these joumals, many articles on remote sensing are published in journals devoted to other disciplines such as geology, goog- raphy, and oceanography. Table -7 lists remote sensing con- ferences conducted by various organizations. ‘Under the editorship of RN. Colwell (1983), The American Society of Photogrammetry and Remoic Sensing (ASPRS) published the second edition of the Manual of Remote Sensing. which is a useful reference. ASPRS is currently preparing third edition of the manual. Table I-8 is a partial listing of remote sensing images and information that are accessible through Intemet and the Worldwide Web. Both Internet and the Worldwide Web are dy- namic environmenis in which new sites are being opened and existing sites are being modified or discontinued. Therefore Table 1-8 is a sample of sites that were available in early 1996. COMMENTS Remote sensing is defined as the science of acquiring, process- ing, and interpreting images and related data obtained from Address hutp:/www.ccrs.emr.caledqy.Atml hutp:iwww-ecsat.com hup/xtreme.gsfe.nasagov/ hitp:/fndsn.eoe.nasa.g0.ip/ hutp:fimages.jse.nasa.gov/himlfhome htm http:!vww.jpl.nasa,gov/education him! hitp:/southpost,jpLnasa. gov) huap:/ovwjplanasa. gow hutp:/hypatiagsfe.nasa gov/NASA.homepage.html hutp:/worw.sti.nasa.g0¥ hutp:/www.esdim.noaa gov/NOAA.Catalog/NOAA.Catalog.himl utp: Awww .nede.noaa.gov http/radarsat.sou.ge.ca hupferiit.syredu/NASA/nasa.htm hup:fecewww.cr.usps.gov/glis/is.him hupfedewww.cr.usgs.gov/delass.delass,biml hitp:/Avebnexor.co.uk/iservipo/weather/weather him! aircraft and spacecraft that record the interaction between mat- ter and electromagnetic energy, ‘The electromagnetic spectrum is divided into wavelength re gions. The regions employed in remote sensing range from short-wavelength UV energy to the long-wavelength mi crowave and radio energy. The electromagnetic regions are fur- ther subdivided inio narrow wavelength bands. Electro- magnetic energy imeracts with matter by being scatiered, reflected, transmitted, absorbed, or emitted, Subsequent chap {ets ofthis book describe these interactions for radiation of the different wavelength bands. together with the technology em- ployed in sensing the radiation, The interpretation of an image depends upon its scale, tone, texture, contrast ratio, and spatial resolution. In this text, spa- tial resolution refers to the minimum distance between two ob- jects at which they can be distinguished. Remote sensing sys: tems operate in the framing mode or the scanning mode, Multispectral images consist of two of more simultaneously recorded spectral bands of imagery. Any three bands ean be combined, one each in blue, green, and red, to produce color images. comments 29 QUESTIONS 1. Use Equation 1-1 1 calculate the wavelength in centime- ters of radar energy ata frequency of 10 Gz, What isthe frequency in gigabertz of radar energy at a wavelength of 25 em? 2. What is the temperature of boiling water st sea level in de~ grees Kelvin? 3. Distinguish betwoen the earth's radiant energy peak and ihe reflected energy peak. 4, The atmosphere is essential for life on earth, but it causes, problems for remote sensing. Describe these problems. '5. Use Equation 1-3 to calculate the contrast ratio between a target with a brightness of 17 and a background with a brightness of & 6. 
On images acquired from a satellite (at a 910-km altitude) targets on the ground separated by 80 m can be resolved. Use Equation I-4 to calculate the angular resolving power (in milliradians) ofthe scanning system. 7. Assume that your eyes have the normal resolving power (0.2 mrad) and that you are an airline passenger at an alti ‘ude of 9 km. For targe's on the ground with high contrast ratio, what i the minimum separation (in meters) at which you can resolve these targets? 8. An airbome cross-track scanner has the following charac~ veristics: IFOV = (.5 mnrad; angular field of view = 45°; sean mirror rotates at 4000 rpm (revolutions per minute. The aircraft alitude is 10 km. Calculate the following: Size of ground resolution cell = by __m Width of ground swat km Dwell time for a ground resolution cell see, 9. ‘sn along track scanner has detectors with a mrad IFOV. The scanter is carted in an aircraft at an altitude of 1S km and « ground speed of 600 km «I, Calculate the follow- ings Ground resolution cell =__by_m Dwell time for a ground resolution cell = __see 10, Refer to the aireraft multispectral scanner images and map of San Pablo Bay (Figures 1-20, 1-21), Select the three mages that show maximum differences in brightness (e- flectance) fr the following terran categories: ‘Vegetation in nomheast comer of the scene: bands Silty water in San Pablo Bay adjacent to Mare Island: bands Untan areas of Vallejo: bands __, Salt pons in nore prion af he image wee bands REFERENCES Clark, R.N, ad others, 1990, High spectral resolution reflectance spectroscopy of minerals: Journal of Geophysical Research, v.95, p.12,653-12.680. 80 INTRODUCTION TOCONGEPTS AND SSTEMS. Colwell. R.N.. ef. 1983, Manual of remote sensing. seeand edition American Society of Photogrammetry, Falls Church, VA. Cracknell, AP, 1992, Leamed societies, leamed joumals and other publications: Inernstional Journal of Remote Sensing, v. 13, p. 1217-1228, Fiicher, W. A. and others, 1975, History of romete seasing in Reeves, RG, ed, Manual of remote sensing: ch. 2, p. 27-50, American Sociely of Photogrammetry, Fills Church, VA Forshaw. M. R.A Haskell, P. Miller, DJ. Stanley, and J. 8. G “Towrshent, 1983, Spatial resolution of remotely sensed imagery — 4 rosiew paper Intnafonal Joumal Remete Sensing, v4, p. 497-520. Gren, R. 0., 1999, Proceedings of the sesond Airbome VisitlefInared Imsping Spectrometor (AVIRIS) workshop: Jet Propulsion Laboratory Publication 90-54, Pasadena, CA, Green, 0, 1991, Proceedings of the third Aittome VisitieIntared Imaging Spectrometer (AVIRIS) workshop: Jet Propulsion Laboratory Publication 91-28, Pasadena, CA, ‘Gregory R. Ly 1966, Bye and brain, the psychology of secing: Worké University Library, McGraw-Hill Book Co., New York, NY. Grove, C. LS. J. Hook, E.D. Payler, 1992, Laboratory reflectance spectra of 150 minerals. 0.4 49 25 misrometers: Jet Propulsion Laboratory Putlication 92-2, Pasadena, CA, Hunt, G. L., 1980, Eiectremagnetic radiation—the communication link in remote sensing, ix Siegal, B. S., and Gillespie, A. R., eds. Remote sensing in geology: Joan Wiley & Sons, New York, NY. King, D. L. 1995, Airborne multispecteal di sensors—a critical review of sysiem designs and applications: ‘Canadian Journal of Remote Sensing, v.21, p.24-273. 
Kruse, F,A., 1996, Cuprite Nevada—supplemertal field tip informa tion, in Sauls, L., ed., Remote sensing field trip of Red Rock ‘Canyon, Death Valley, Goldfield, and Cuprite: Eleventh ‘Thematic Conference on Applied Geologic Remo Sexsing, Environmental Research Institute of Michigan, Ann Arbor, Ml. Longshaw, T.G., 1976, Application of an analytical approach to Field spectroscepy in geological remote sensing: Modem geology, v. 5 p.93-107, ‘Lowman, P.D., 1969, Apollo 9 multispectral photography-—geologie analysis: NASA Goddard Space Fight Center, Report X-644-60. 423, Greenbelt, MD. Marsh, S. EJ. L. Walsh, C. T. Lee, and LA. Graham, 1991, “Multtemporal analysis of hazardous waste sites through the use of 44 new bi-spectal video remote sensing system and standard color. IR photography: Photogrammetric Engineering and Remote Sensing, 57,p. 1221-1226, Marsh, S. E, J. L, Walsh, and C: Sobrevils, 1994, Evaluation of air- omic video daa for land.cover classification accuracy assesiment in an isolated Brazilian forest: Remote sensing of environment, v 48, p. 61-59. McKinney, R. G., 1980, Photographic materials and processing i Slama, C.C. ,ed., Manual of photogrammetry fourth edition: ch. 6, p. 305-366, American Society of Photograramety, Falls Church, vA Monday, H. M.,J. 8. Urban, D, Mulawa, and C. A. Benkelman, 1994, City of frving auizes high resolution multispectral imagery for N. P.D.E. S. compliance: Photogrammetric Engineering and Remote Sensing, v.60, p. 411-416, ul camera and video NASA, 1877, Skylab explores the earth: NASA SP.250, Washington. De. Neale, C. M. U. and B. G, Crowther, 1994, An alborne multspectal video/radiometer remote sensing system-—development avd cali- bration: Remote Sensing of Environment, v.49, p. 187-194. fvile, RA, N, Rowlands, R. Maco, and | Powell, 1995, SS Canada’s fint sirbome WIR. imaging spectrometer: Canadian Journal of Remote Sensing v.21, p. 328-336, Price, J. C., 1994, How unigue ate spectral signatures?: Remote Sensing of Eavicoamert, v.49, p. 181-186. Price, J.C, 1005, Esamples of high ecolution visite to near-infrared reflectance spectra and a stardardized collection for remote sensing udies: Intemational Jouraal of Remote Sersing, % 16, p, 93-1000. Roseaberg, P, 1971, Resolution, detectability, and recognizabily Photogrimmetric Engineering, v.37. 1244-1258 Slate, P.N., 1983, Photographic systems for remote sessing in Colwell, K.N., ed, Manual of remote sensing, second edhion ch, 6,p. 231-291, American Society Photogrammetry, Falls Church, VA. Slater PN. 1985, Survey of multispecteal imaging systems for ean ‘observation, in R. N. Colwell, ed, Manual of remote sensing, sec- ‘ond edition: ch. 6, p. 231-291, American Society for Phato- ‘grammetry and Remote Sensing, Falls Church, VA Suits, G. H,, 1983, The nature’ of electromagretic radiation in Colwell, R.N.,e€), Manual of remote sensing, second edition: 2, p. 37-60, American Society of Photogrammetry, Falls Church, Va. Van der Meer, F, 1994, Extraction of mineral absorption features, from high-spectral resolution data using nor-parametric geostatsti- cal techniques: Inernational Joumal of Remote Sensing,» 15. . 2193-2214, Vane, G., ed., 1988, Proceedings of the Airborne: Visble/nfrared Imaging Spectrometer (AVIRIS) performance evaluation work- shop: Jet Propulsion Laboratory Publication 88-38, Pasadena, CA, Vane, G., RO. Green, T.G. Chien, H.T. Enmark. E.G. Hansen, ‘and W. M. Portes, 1993, The Airbome Visiblefafared. 
Imaging Specttometer(AVIRIS): Remote Seasing of Environment, ¥ 4, 1-143, ADDITIONAL READING Avery, TE-and GL, Berlin, 1992, Fundamentals of remote sensing and airphoto interpretation, fifth edition: Macmillan, New York. NY. eaumont, E, A. and Foster, N. H., eds, 1992, Remote sensing: American Aswciation of Petroleum Geologiss, Treatise of Petrcieum Geology Reprint Series, n0. 18, Tusa, OK. Bx-Der, E. and FA. Keuse, 1995, Surface mineral napping of Makhtesh Ramon Negev, Isrel using GER 63 channel scanner ata: Intemational Journal of Remote Sensing, v. 16, p. 3520-3853 Colwell, R.N, and others, 1963, Basic matter and energy relation ‘hips linolved in remote “reconnaissance: Photogrammatric Engineering and Remote Sensing, v.29, p. 761-798 Elachi,., 1987, Introduction to the physics and techniques of remote sensing! John Wiley & Sons, Now York, NY. Everitt, J. H, and others, 1995, A thre-camera multispectral digital video imaging system: Remote Sensing of Environment, ¥. 54 p.333-39, Hook, $.J.,C, D.Elvidge, M, Rast, and H. Watanabe, 1991, An eval: uation af dhortvavelength-infrared (SWIR) data fm the AVIRIS and GEOSCAN insruments for mineralogic mapping at Cuprite. Neva: Geophysics, v.56. p. [432-1440, Hat, E.. 1988, Keyguide to information sources in remote sensing Mansell, Londen, England. Lilesand, T. M- and RW. Kiefer, 1994, Remose sensing and image integpretation, thitd edition: Jon Wiley & Sons, New York, NY. Ross, W. G,, 1996, Physica principles of remote Sensing: Cambridge University Press, Cambridge, England ‘Southworth, C, 5. 1985, Characteristics and avaitaibty of data from ‘arth imaging sitellies: US. Geological Survey. Bulletin 1631 Vane, G., ed, 1990, Inaging spectroscopy of the terrestrial environ ment SPIE—The Irgerational Society for Optical Engineering, v 1298, Vane, G, and A. FH, Goetz, 1993, Terrestrial imaging spectrome try —eurent status, future tres: Remote Sensing of Environment, v4, p. 17-126. ADDITIONALREADNG 31 CHAPTER PHOTOGRAPHS FROM AIRCRAFT AND SATELLITES Photographs acquired from aircraft (aerial photographs) were the first form of remote sensing imagery, and they remain the most widely used images today. Knowing the techniques for interpreting aerial photographs is essential background for un- derstanding other remote sensing images. Indeed, serial photo graphs are used throughout this text to aid in explaining ther mal IR, radar, and other kinds of images. Inthe enthusiasm for satelite images and new forms of airhorne remote sensing, ene should not overlook the advantages of aerial photograpis Topographic maps are made from aera photographs, and many engineering projects use aerial photographs. Soil conservation studies, agricultural erop inventories, and city planning all em- ploy aerial photograpts. Geologic mapping and exploration commonly begin vith an aralysis of photographs. Inthe early 1974s, interpretation of aerial photographs led to the discovery of several valuable ol fields in Irian Jaya, Indonesia, Philipson (1996) edited a manual of photozraphic interpretation ‘Many photographs have been acquired from earth-orbiting satelite and are described in this chapter INTERACTIONS BETWEEN LIGHT AND MATTER {As with other forms of electromagnetic energy, light may be reflected, absorbed, or transmitted by matter. Aerial photo raps record the light reflected by a surface, which is deter- ‘mined by the property called albedo. 
Aledo is the ratio of the nergy reflected from a surface to the energy incident on the surface, Dark surfaces have a low albedo, and bright surfaces havea high albedo. Light that is not reflected is transmitted or absorbed by the materal. During its transmission throygh the atmosphere, light interacts with the gases and particulate mat- ter in a process called scattering, which has a strong effect on aerial photographs ‘Atmospheric Scattering Aimospheric scattering results from multiple interactions be- tween li as shown in Figure 2-1, The two major processes, selective scattering and nonselective scattering, are related tothe size of particles in the atmosphere. In selective scattering the shorter wavelengths of UV energy and blue light are scattered more severely than the longer wavelengths of red light and 1R en- cegy. Selective scattering is caused by fumes and by gases such as nitrogen. oxygen, and carbon dioxide. The selective scatte ing of blue light causes the blue color of the sky. The red skies, at sunrise and sunset are due to sunlight passing horizontally through the atmosphere, which scatters blue and green wave- lengths so only red light reaches the viewer, trays and the gases and particles of the atmosphere, TiwT SoURGE NONSELECTIVE SCATTERING Eg 2 oe 0s 08 ays OSD Am WavELenGTA Figure 22 Aunonphcre xatring as «fursion of wanctengt, The shaded region shows the range of scatering caused by typical atmes- fpbetes, From Slater (1983, Figure 6-15). In nonselective scattering all wavelengths of light are equally scattered. Nonselective scatiering is caused by dust, clouds, and fog in which the panicles are much larger than the wavelengths of light, Clouds and fog are aerosols of very water droplets; they are white because the droplets scatter all wavelengths equally. The curves in Figure 2 scattering as a function of wavelength, Nonselective scattering fs shown by the horizontal line, Scattering in the atmosphere results from a combination of selective and nonselective processes, The range of atmospheric scattering is shown by the shaded area in Figure 2-2. The lower curve of the shaded area represents a cleur atmosphere and the Uupper curve a hazy atmosphere, ‘Typical atmospheres have scattering characteristics that are intermediate between these extremes The important point for aerial photography is that the earth's atmosphere scatters UV and blue wavelengths at least twice as strongly as red light Light scattered by the atmosphere illuminates. shadows, which are never completely dark but are bluish in color. This scattered illumination is referred to as skylight to distinguish it from direct sunlight. striking characteristic of photographs ‘aken by Apollo astronauts on the surface of the moon is the black appearance of the shadows, The lack of atmosphere on the moon precludes any scattering of light into the shadowed areas, 34 PHOTOGRAPHS FROM ARORAFT AND SATELLITES. Effects of Scattering on Aerial Photographs Scattered light that enters the camera isa source of illumina- tion but contains no information about the terrain, This extra il lumination reduces the cortrast ratio (CR) of the scene, thereby reducing the spatial resolution and detectability of the photograph. 
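The arithmetic behind this loss of contrast is simple, and the sketch below anticipates the example worked next in Figure 2-3: a uniform scattered-light brightness is added to both the bright and the dark parts of the scene before the contrast ratio (brightest divided by darkest brightness, Equation 1-3) is formed.

```python
# Contrast ratio before and after adding a uniform scattered-light brightness.
# Contrast ratio is taken as brightest / darkest brightness (Equation 1-3).
def contrast_ratio(bright, dark, scattered=0.0):
    """Contrast ratio of a two-level scene after adding scattered light."""
    return (bright + scattered) / (dark + scattered)

print(contrast_ratio(5, 2))               # clear atmosphere: 2.5
print(round(contrast_ratio(5, 2, 5), 1))  # 5 units of haze: about 1.4
```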
Figure 2-3 diagrams the effect of scattered light on the contrast ratio of a scene in which a dark area (brightness = 2) is surrounded by a brighter background (brightness = 5). For the original scene with no scattered light (Figure 2-3A,B), the contrast ratio is determined from Equation 1-3 as follows:

CR = 5 / 2 = 2.5

Figure 2-3C shows the appearance of the scene in conditions of heavy haze, where the atmosphere contributes 5 brightness units of scattered light. As the brightness profile of Figure 2-3D shows, scattered light adds uniformly to all parts of the scene and results in a contrast ratio of

CR = (5 + 5) / (2 + 5) = 1.4

Thus atmospheric scattering has reduced the contrast ratio of the scene from 2.5 to 1.4, which lowers the spatial resolution on a photograph of that scene. Chapter 1 demonstrated this relationship between contrast ratio and resolving power. The effect of atmospheric scattering on aerial images is illustrated later in this chapter.

Figure 2-3 Effect of scattered light on the contrast ratio of an image: A. Original scene. B. Brightness profile of image with no scattered light. C. Image with 5 brightness units added by scattered light. D. Brightness profile and contrast ratio of image with scattered light.

Figure 2-4 Atmospheric scattering diagram and transmission curves of filters used in aerial photography. Shorter wavelengths, to the left of each filter curve, are absorbed. From Slater (1983, Figure 6-37).

Filtering out the selectively scattered shorter wavelengths before they reach the film reduces the effects of atmospheric scattering. Superposed on the scattering curve of Figure 2-4 are the spectral transmittance curves of typical filters used in aerial photography, showing the wavelengths absorbed and transmitted. There is a trade-off with filters: although they reduce haze, they also remove the spectral information contained in the wavelengths that are absorbed.

FILM TECHNOLOGY

Photographic film consists of a flexible transparent base coated with a layer of light-sensitive emulsion approximately 100 μm in thickness (Figure 2-5A). The emulsion is initially a suspension, in solidified gelatin, of grains of silver halide (a salt) a few micrometers or less in diameter. The grains have been precipitated from solution rather rapidly to make them irregular, with numerous points of imperfection on the surface. After the emulsion is deposited on the film base, further processing of the grains increases their sensitivity to light.

Figure 2-5 Film technology: A. Cross section of film. B. Exposure of silver halide grains. C. Developed film negative.

Photographic exposure is the photochemical reaction between photons of light incident upon silver halide grains in the emulsion. When a photon strikes one of the grains, an electron in the silver halide crystal is given enough energy to move freely about and may be trapped at an imperfection in the grain. By combining with a silver ion lacking one electron, the electron may then convert the ion into a silver atom (Jones, 1968). This atom cannot remain an atom for long by itself, but if two electrons in the grain are liberated within about a second, a stable combination of silver atoms will form at the imperfection (Figure 2-5B). The success of the photographic method depends upon the requirement that a silver halide grain receive more than one photon within a short time.
If only one photon were needed, random photons caused by normal ionic vibration would scon convert all the grains to silver. The stable combinations of silver atoms are large enough to trigger the conversion of the entire silver halide grain to metallic silver when the film is chemically developed. Before chemical development, however, exposed silver halide grains have the same appearance as unexposed rains, soat this stage the film is said to contain a latent image. Developing is the chemical process of changing the latent image into a real image by converting the exposed silver halide rains into opaque grains of silver. The film is immersed in the developer, which is a water solution containing a reducing agent that does not interact with unexposed silver halide grains. For exposed grains with silver atoms at a point of im- perfection, however, the agent starts reducing the silver halide to silver. As Figure 2-5C shows, the entire grain converts to metallic silver once the reduction process begins. Fixing is the next process, which removes unexposed grains, leaving clear areas in the emulsion. The resulting film is called 1 negative film because bright targets form dark images on the film. When the film is priated onto photographie paper, the dark negative images are reversed and bright targets appear bright on the print 36 PHOTOGRAPHS FROM ARCRAFT AND SATELLITES. One advantage of phoiogrephie remote sensing is the enor mous amplification that occurs in the development process. A few photons absorbed in a grain of silver halide with a velume ‘of | Um? will produce more than 10! atoms of developed sik ver, which is an amplification of more than 1 billion times Other advantages of photographic remote sensing are high re solving power, low cost, versatility, and ease of operation. Another advantage is the capacity of film 10 store large amounis of information. On the developed film, each grain, ‘whether exposed oF unexposed, records information about the scene, There are more than 150 million (1.5% 10%) such grains ‘on a 65-em? (Lin) piece of film, James (1966) is a standard reference on the phoiogrephie process ‘The three major disadvantages of photographic remote sens- ing are the followin, 1. tr isrestricted to te spectral region of 0.3 100.9 pm. 2. Ih is restricted by weather, lighting conditions and atmos pherie eects 3. Information is recorded in a nondigital format. tn ower wo be computer processed, photographs must be converted into dyecolor Special eC I TO TE YoY Yellow Blue uM uM ‘Magenta Green e c Cyan ed Z ZZ pian aroes a ‘A. Negative fim (Koéacoln clearemusion TT Yyetlow Biue Mw m Magenta Green Cyan Red Fie EL Blant areas are 8B, Positive fm (Kodachrome), clearemusion Figure 2.26 Crows sections of negative and positive color film, showing how images are formed on the three emulsion layers as its complementary color on the negative. The image on a negative color film is projected onto photographic paper, coated with sensitive emulsions, that is developed to produce & color print. Positive Color Film Figure 2-25B shows a cross section of positive color film, which records a scene in its rue colors. A red subject forms & clear image on the red-sensitive emulsion, which becomes cyan where it is aot exposed by red light. The red subject forms @ magenta image on the green-sensitive layer and a yellow image on the blue-sensitive layer. When viewed with transmitted white light, the yellow and magenta images ab- sorb blue and green, respectively, and allow a red image to be projected. 
A white subject forms clear images on all three layers. The original positive film transparency may be viewed on a light table, which provides maximum resolution. However, the film rolls require special handling and viewing equipment and are not suitable for use in the field. Black-and-white prints, color prints, and color transparencies can be made from negative color film. Paper prints, despite their slightly lower resolution, are more versatile and easily used in the field.

IR COLOR PHOTOGRAPHS

In IR color film the spectral sensitivities of the emulsion layers are changed to record energy of other wavelengths, including photographic IR (0.7 to 0.9 μm). This film is sold as Kodak Aerochrome Infrared film, type 2443, which is available only as positive film. Plate 1D is an IR color photograph of the area covered by the normal color photograph in Plate 1C. IR color film was originally designed for military reconnaissance and was called camouflage detection film. The name false-color film is occasionally used, but IR color film is the preferred name.

Figure 2-26 Spectral sensitivity of normal and IR color film, together with an atmospheric scattering diagram and vegetation reflectance spectra: A. Normal color film. B. IR color film. C. Atmospheric scattering. D. Spectral reflectance curves of vegetation. From Sabins (1973B, Figure 2).

IR color film is best described by comparing it with normal color positive film. Figure 2-26A,B shows the spectral sensitivity of the three emulsion layers that produce blue, green, and red images. In normal color film (Figure 2-26A) each layer is exposed by the corresponding wavelength band of light. The blue-imaging layer is exposed by blue light, the green-imaging layer by green light, and the red-imaging layer by red light. In IR color film (Figure 2-26B) the photochemistry of each layer is changed, and the layers are sensitive to different wavelengths of light: the blue-imaging layer is exposed by green light; the green-imaging layer is exposed by red light; and the red-imaging layer is exposed by reflected IR energy. All three layers are also sensitive to blue light, which is eliminated by placing a yellow (minus-blue) filter over the camera lens. The shading in Figure 2-26B highlights the curve showing blue wavelengths removed by the Wratten 12 yellow filter. Removing these strongly scattered wavelengths (Figure 2-26C) improves the contrast ratio and spatial resolution of IR color film.

Because the term infrared suggests heat, some users mistakenly assume that the red tones on IR color film record variations in temperature. A few moments' thought will show that this is not the case. If the IR-sensitive layer were sensitive to ambient heat, it would be exposed by the warmth of the camera body itself. As pointed out in Chapter 1, thermal radiation occurs at wavelengths longer than 3 μm, which is beyond the sensitivity range of IR film (0.7 to 0.9 μm). To repeat, the red-imaging layer of IR color film is exposed by reflected IR energy, not by thermal IR energy.

In addition to large sizes for aerial cameras, IR color film is available in 35-mm size for use in ordinary cameras. The cost of the 20-exposure cassettes and processing is comparable to that of normal color films. A user can evaluate IR color film at minimal expense with this format.
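The layer assignments just described can be tried numerically. The sketch below takes rough reflectance values for healthy foliage (strong chlorophyll absorption in blue and red, roughly 20 percent green reflectance, high reflected-IR reflectance; the exact numbers are illustrative assumptions) and shows why foliage displays green on normal color film but red on IR color film.

```python
# Rough reflectance values for healthy green vegetation; illustrative only.
vegetation = {"blue": 0.05, "green": 0.20, "red": 0.05, "nir": 0.50}

# Normal color film: blue, green, and red bands drive the blue, green, and
# red image colors, so the green reflectance dominates the displayed color.
normal_color_rgb = (vegetation["red"], vegetation["green"], vegetation["blue"])

# IR color film: green exposes the blue-imaging layer, red the green-imaging
# layer, and reflected IR the red-imaging layer, so the strong reflected-IR
# return dominates and foliage displays red.
ir_color_rgb = (vegetation["nir"], vegetation["red"], vegetation["green"])

print("normal color (R, G, B):", normal_color_rgb)
print("IR color     (R, G, B):", ir_color_rgb)
```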
Its useful 10 acquire nor- mal color photographs with a second camera to compare with IR color photographs. (R color film may deteriorate with time and excessive hea if keeping the film for more than a Few weeks, store it in a freezer. Allow the frozen film to reach room temperature before opening the sealed container to prevent moisture from condensing en the wold film.) A yellow (minus-blue) filter, such as the Kodak Wratten 12, is used with IR color film, This film-ane-filter combina- sion has an approximate spced of ASA 100, Some experimen tation will be necessary to determine the optimum exposure because conventional light meters do not measure the same spectral rogion to which the film is sensitive. Some cameras have an IR setting on the focusing ring that is intended for IR black-and-white film. Do not use this seting for IR color film because two of the three emulsion layers are sensitive to Visi bie wavelengths NORMAL COLOR AND IR COLOR PHOTOGRAPHS COMPARED Plate 1C.D shows a normal color photograph and an IR color photograph that were simultancously acquired of the UCLA. campus in the westem part of Los Angeles. Figure 2-27 is a lo- cation map. Comparing these photographs is a useful way to understand their different characteristics. Table 2-3 compares, the color signatures of common subjects on the two types of photographs. 56 PHOTOGRAPHS FROM ARORAFT AND SATELLITES. Santa Monica Mountains Bel Air Lake Drake Stadium ucla 0 Pauley (J Geology Pavillion ra Cemetery ° Ami ° km Figure 2-27 Location map of the UCLA sea, Los Angeles, Califor, ‘Signatures of Vegetation “The most striking difference between the photographs in Plate ICD is the red color of healthy vegetation in the IR color pho- tograph, which is explained by the spectral reflectance curves of vegetation shown in Figure 2-20D. Spectral reflectance curves show the percentage of incident energy reflected by ‘material asa function of wavelength. Blue and red light are ab- sorbed by foliage. Up to 20 percent of the incident green light is reflected, causing the familiar green color of leaves on nor: ‘mal color photographs. The spectral reflectance of vegetation Increases abruptly in the photographic IR region, whict in cludes the wavelengths that expose the red-imaging layer in IR color film. Table 2.3. Terrain signatures on normal cole fi and IR color fl Subject Normal cotor film IK color film Healthy vegetation: Broadleat type Green ed tomagenta Needle-leaf type Green Reddish brown to purple ‘Stressed vegetation: Previsual suige Green Pink to blue Visual stage Yellowish green Cyan ‘Autumn leaves Redito yellow Yellow to white Clear water Blue-green Dark blue t black Silty water Light green Light blue Damp ground Slightly darker than dry soil Distinctly darker than dry soil Shadows Blue with details visible Black with few details visible Water penetration Goo Contacts between land and water Red bed outcrops Red Figure 2-28 isa diagrammatic cross section of a leaf that ex- plains these spectral signatures of vegetation, The transparent epidermis allows incident sunlight to penetrate into the meso~ phyll, which consists of two layers: (1) the palisade paren- chyma of closely spaced cylindrical cells, and (2) the spongy parenchyma of irregular cells with abundant interstices filled with air. Both types of mesophyll cells contain chlorophyll, which reflects part of the incident green wavelengths and ab- sorbs all of the blue and red energy for photosynthesis. 
The longer wavelengths of photographic IR energy penetrate into the spongy parenchyma, where the energy is strongly scattered and reflected by the boundaries between cell walls and air spaces. The high IR reflectance of leaves is caused not by chlorophyll but by the intemal cell structure. Gausman (1985) tives details of optical properties of plant leaves in the visible and reflected IR regions. Buschmann and Nagel (1993) de- scribe the roles of chlorophyll and cell structure in spectral e- fectance of leaves. Detection of Stressed Vegetation Vegetation may be stressed because of drought, disease, insect Infestation, or other factors that deprive the leaves of water. Figure 2-29 compares the internal structure of nonstressed and stressed leaves. The nonstressed leaf (Figure 2-29A) has a cell structure and reflectance characteristics comperable t0 those in Figure 2-28. In the stressed leaf (Figure 2-29B), the shortage of water causes the mesophyll ces to collapse, which sirongly Poor to fair discrimination Moderate to poor Excellent discrimination Yellow Fetlected Paten ndeaes roe Figure 2-28 Diagrammatic cross section of a leaf, showing intr- action with incident energy. Incident blue and red wavelengths re absorbed by chlorophyll in the process of photosynthesis. Incident ‘green wavelengths are partially reflected by chlorophyll. Incident IR ‘energy isstongly scatered and reflected by cell walls inthe meso phyil, Modified from Buschmann and Nagel (1993, Figure). NORMAL COLOR AND IR COLOR PHOTOGRAPHS COMPARED 5 8. Stessea, Figure 2:29 Phoomicrograpis of cross sections of nonstressed and stressed lever. Collapse of cells in the mesopiyll layer stongly re duces reflectance of incident IR energy. From Evert and Nixon (1980, Figure 1). Courtesy J. H, Event, U.S. Department of Agriculture reduces IR reflectance from the sporgy parenchyma, This de- creased reflectance diminishes the red signature in IR color photographs. Chlorophyll is still proven, and the foliage may have a green signature in normal color photographs for some time after the onset of stress. In IR color photographs, how- ever, stressed foliage has a distinctive blue signature. The loss of IR reflectance isa previsual symptom of plant stress because it often occurs days or even weeks before the visible green color begins to change. The previsual effect may be used for early detection of disease and insect damage in crops and forests. Evidence of plant tess is een inthe intramural play- ing field east of Drake Stadium (Figure 2-27 and Plate 1C.D), which is watered by a sprinkler system. In the normal color photograph the fied is entirely green, but inthe IR color pho ‘ograph the red signature is interrupted by blue strips that ii- cate inadequately watered turf Autumn Senescence of Veget In the autumn, leaves of deciduous trees undergo senescence and turn red, yellow, and brown. Figure 2-30 compares spectra of green and senescent foliage. The green chlorophyll has de- 58 PHOTOGRAPHS FROM ARCRAFT AND SATELLITES 50 » Reflectance, % 8 Seale change 07 08 08 1.0 1.1 1 Wavelength, um Figure 2-30 Reflectance spectra of green and senescent foliage. In the autumn, chlorophyll deteriorates, which reduces the absorption of inciders red energy. The development of anhhoeyanin and tannin causes ihe yellow-red fll colors. From Schwaller and Tkach (1985, Figure?) ccayed, and red wavelengths are no longer absorbed. The or- ganic compounds anthocyanin and tannin are formed, causing the familiar autumn colors (Boyer and others, 1988). 
The spec- trum for senescent foliage (Figure 2-30) shows nearly equal re- flectance values in the gieen, red, and photographic IR toands, which results ina white signature in IR color photographs. Boyer and others also describe the changes in leaf physiology and spectral reflectance during senescence. Signatures of Other Terrain Features ‘The small lake north of UCLA has a dark green signature in the normal color photograph that blends with the vegetation. In the IR color photograph the lake has a dark blue signature that ‘contrasts with the red signature of vegetation, This ability to ‘enhance the difference between vegetation and water is espe cially valuable for mapping drainage patios in heavily forested terrain, Silly water has a light blue signature in IR color photographs, One can recognize damp ground on IR color photographs by its relatively darker signature, caused by absorption of IR energy. Shadows are darker in IR color pho- tographs than in normal color photographs because the yellow filler eliminates blue igh. ‘The IR color photograph in Plate 1D has. better contrast ra- tio than the normal color photograph, for two reasons: 1. The yellow filter eliminates blue light, which is preferen- tially scattered by the atmosphere, as shown by the curve in Figure 2-26C. Eliminating much of the scattering improves, the contrast ratio. 2. For vegetation, soils, and rocks, reflectance differences are ‘commonly greater in the photographic IR region than in the visible region ‘The higher contrast ratio of the IR color photograph results in improved spatial resolation, whichis evident when one com- pares finer details of te two photographs. Onthe slopes of the ‘Santa Monica Mountains (upper left comer), for example, closely spaced shribs may be separated more readily inthe IR color phoograph. In the urban areas, individual buildings are ‘more distincly separate in the IR color example. HIGH-ALTITUDE AERIAL PHOTOGRAPHS Aerial photographs have traditionally been acquired at alti- tudes of approximately 6000 m or less, resulting in scales of 1:40,000 or larger (Table 2-1). In the 1970s, improvements in cameras and film enabled acquisition of photographs at hhighor altitudes, which provide adequate resolution for many applications. The advantage of high-altitude, smaller-scale photographs is that fewer photographs are required to cover NASA High-Altitude Photographs For a number of years, NASA has been acquiring photographs of the United States from U-2 and RB-57 reconnaissance ir- craft at altitudes of 18 km above terrain with standard aerial cameras (152-mm focal length) on film with a 23-by-23-em format, The resulting photographs cover 839 km? at a scale of 1:120,000. Black-and-white, normal color, oF IR. eolor film is, lased: many missions employ two cameras to acquire pho- ‘ographs with two different film types. Coverage of NASA photagraphs is concentrated over nut- ‘merous large regional test sites for which repeated coverage ver several years may be available, Many areas lack this coverage. National High Altitude Photography Program The National High Altitud> Photography (NHAP) program, coonfinated by the U.S. Geological Survey, began in 1978 10 acquire coverage of the United States with a uniform scale and format. From aircraft at an altitude of 12 km, two cameras (23- by-23-cm format) acquire black-and-white photographs and IR color photographs. 
The black-and-white phorographs are ac- quired using a camera with a 152-mm focal length to produce Photographs at a scale of 1:80,000, which cover 338 km? Figure 2.31 is an NHAP photograph of Washingion, D.C. A stereo pair of these photographs (not illustrated) covers the area of a stanard U.S. Geological Survey topographic quad- rangle (1:24,000 scale) The stereo pair can be used to produce new maps or update existing maps. “The IR color photographs are acquired using a camera with a 210-cm focal length to produce photographs at a scale of 1:58,000, which cover 178 km? ‘SOURCES OF AERIAL PHOTOGRAPHS: ‘The distribution center for aerial photographs aquired by the US. Geological Survey and NASA is US. Geological Survey EROS Data Center Sioux Falls, SD57198 ‘An inguiry to the EROS Data Center (EDC) should specify the latitude and longitude boundaries of the desired area and the type of photography required. The major categories are as follows: 1. Aerial mapping photographs: typically at 1:40.000 scale or larger 2. National High Altitude Photographs: black-and-white and 1:58,000 color IR 3. NASA aircraft photographs: 1:120,000-scale black-and- ‘white, normal color, and IR color 1:80,000-seale The EDC will provide computer listings of available pho- tographs, price lists, and instrictions for selecting and oréering the desired coverage, ‘The Agricultural Stabilization and Conservation Service (ASCS) has also photographed much of the United States. One can obtain a set of state index maps and ordering instructions from Westem Aerial Photography Laboratory ASCS-USDA PO. Box 30010 Salt Lake City, UT 84130 Black-and-white photographs of U.S. national forests are avail- able from regional offices of the U.S. Forestry Service. Many local aerial photography contractors have negatives and can fur- nish prints of areas over which they have flown. If necessary, an serial contractor can be hired to acquire needed photographs. NEW TECHNOLOGY For several decades new technology for aerial photography has been incremental in nature. New cameras have been devel. oped. Films with higher spatial resolution and faster speeds have been introduced. Two recent developments have the po- tential to change the science, Aerial photographs are being digitized and distributed on CD-ROMs that are compatible with desktop computers and image-processing software (see Chapter 8). Many photographs are stored on a single CD-ROM and are readily available for viewing, processing, and reproduction. This technology may replace film and photographic prints as media for storing and distributing photographs. NEWTECHNOUGY 59 Figure 2:31 National High Alnuce Photography program photograph of Washington, D.C, acquired from an altitude of 12 kn, 60 PHOTOGRAPHS FROM ARCRAFT AND SATELLITES. ‘The second trend has the long-term potertial to eliminate film as the medium for acquiring photographs. In digital cam- ceras film is replaced with an array of charge-coupled devices (CCDs), which are tiny light-sensitive detectors. The image is recorded elecironically as an array of digital numbers suitable For computer processing. Norinal color and IR color pho- tographs are recorded by means of internal filters. A number of digital photographs are stored in the camera and are transferred to other storage media, Digital cameras are presently available ‘only in small formats comparable 10 35-mm film cameras. Within a few years this technology may be available for aerial photography. 
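The scales and ground coverages quoted earlier for the NHAP cameras follow from the standard vertical-photograph relation, scale equals focal length divided by flying height, together with the 23-cm film format. The quick check below treats the IR color camera's focal length as 210 mm; the relation itself is standard photogrammetry rather than a formula quoted from this chapter, and the computed values only approximately reproduce the nominal figures.

```python
# Approximate check of the NHAP photograph scales and ground coverage,
# assuming scale = focal length / flying height for a vertical photograph.
film_side_m = 0.23           # 23-by-23-cm format
altitude_m = 12_000          # 12-km flying height

for name, focal_length_m, nominal_scale in [
    ("black-and-white camera", 0.152, 80_000),
    ("IR color camera", 0.210, 58_000),   # focal length assumed to be 210 mm
]:
    scale_denominator = altitude_m / focal_length_m
    ground_side_km = film_side_m * scale_denominator / 1000
    print(f"{name}: scale ~1:{scale_denominator:,.0f} "
          f"(nominal 1:{nominal_scale:,}), covers ~{ground_side_km**2:,.0f} km2")
```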
King (1995) reviews the characteristics and ap: plications of digital cameras and photographs. PHOTOGRAPHS FROM SATELLITES Extensive collections of photographs have been acquired from manned and unmanned earth-orbiting film, and filters are modified from those used in aircraft. Beginning in 1962 the United States conducted a series of ‘manned satellite programs in preparation for the Apollo mis- sions that landed humans on the moon, These programs also acquired photographs of the earth that were summarized in Sabins (1986, Chapter 3). The current Space Shuttle program, has acquired many satellite photographs of the earth atelites. The cameras, Space Shuttle ‘The Space Shuttle program, or Space Transportation System (STS), began in 1981. The vehicle is similar in size to a ‘medium commercial jet airliner and accommodates a crew of up to seven on missions that last up to 9 days. Figure 2-32 shows the profile of a Shuttle mission, When launched from Cape Canaveral, Florida, the Shuttle is attached to two soli propellant rockets, plus a large liguid-fuel tank that feeds the three engines. Shortly after launch the solig-propellant rockets are expended and return on parachutes to the ocean, where they are retrieved for future use. Later the extemal liquid-fuel tank is jettisoned and disintegrates. on reentering the aimos- phore. Once the Shuttle is in exbit, it maneuvers with two small rocket engines. Doors to the cargo bay are opened, and the Shuttle inverts to aim remote sensing systems at the earth ‘When a mission is completed, the orbital maneuvering system (OMS) engines fire in retrograde fashion to cause reentry and an unpowered landing. ‘Two Shuttle missions have camied the modular aptoelee. tronic multispectral scanner (MOMS) of the German Aerospace Research Establishment (DFVLR). The MOMS is a along-trick scanner that records two spectral hands (0.575 to 0.625 jum and 0.825 10 0.975 jm) with ground resolution cells of 20 by 20 m. The MOMS has acquired relatively few images. which are available only to investigators selected by the DFVLR. Figure 2-32 Profile ofa Space Shuttle mission PHOTOGRAPHS FROMSATELLTES 61 Many photographs have been nequired from the Shuttle by hhandheld cameras and by the large format camers (to be de: scribed shorty). Handheld-Camera Photographs The Shuttle has several windows for acquiring photographs with handheld cameras Mos. photographs are recorded on normal color film at formats of 35, 70, and 140 mm, using a variety of cameras and lenses. ‘Afler each mission NASA prepares a catalog listing the loca tion, features, and time for cach photograph. For example, the catalog for STS Mission 37 in April 1991 lists 4283. pho- tographs of the eamth, Photographs are available from USS. Geological Survey EROS Data Center Sioux Falls, SD 57198 Telephone: 605-394-6151 ‘Technology Applications Center University of New Mexico Albuquerque, NM 87131 Telephone: 505-277-3622 Media Services Branch Still Photography Library NASA Lyndon B. Johnson Space Center P.O, Box 58425, Mail Code AP3 Houston, TX 77258 Telephone: 713-483-4231 Large-Format-Camera Photographs ‘The large format camera (LEC) was fabricated specifically for the Space Sule. “Large format" refers to the film size of 23 by 46 en with the longer dimension oriented in the orbit direction. The LEC was carried on the October 1984 Shuttle mission and ac- quired black-and-white, normal color, and IR. color pho- tographs. The focal length of the lens is 30.5 cm, and the spa- tial resolution is 80 line-pairs « mm-!. 
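An idealized sense of what 80 line-pairs per millimetre means on the ground can be had by scaling one resolvable line pair on the film by the ratio of altitude to focal length. The altitudes below are the LFC range given in Table 2-4, and the result ignores contrast, image motion, and atmospheric effects, so it is an upper bound on performance rather than a figure taken from the text.

```python
# Idealized ground size of one resolvable line pair for the large format
# camera, assuming the film resolution simply scales by altitude / focal length.
focal_length_mm = 305.0           # 30.5-cm lens
film_resolution_lp_per_mm = 80.0  # line-pairs per millimetre

for altitude_km in (239, 370):    # orbital altitude range from Table 2-4
    scale_denominator = altitude_km * 1e6 / focal_length_mm
    line_pair_on_film_mm = 1.0 / film_resolution_lp_per_mm
    ground_line_pair_m = line_pair_on_film_mm * scale_denominator / 1000
    print(f"{altitude_km} km: one line pair ~ {ground_line_pair_m:.0f} m on the ground")
```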
Figure 2-33 is an enlarged portion of an LFC photograph of Boston, Massachusetts, that shows the high spatial resolution of these photographs and their potential for interpreting patterns of land use and land cover. A number of LFC photographs were acquired with forward overlap for stereo viewing. Figure 2-34 is a portion of a stereo pair with 4× vertical exaggeration of an area in the Mojave Desert of southeastern California and adjacent Nevada. The mountains are fault blocks surrounded by valleys filled with detritus eroded from the mountains. Geologic features are particularly well expressed in this stereo model. The Spring Mountains in the east portion (upper part) of the photographs consist of generally west-dipping sedimentary rocks. The light-toned rocks that form prominent cliffs along the eastern front of the Spring Mountains are sandstone of Jurassic age. The sandstone is overlain on the west by dark-toned carbonate rocks of Paleozoic age that have been thrust eastward for many kilometers over the sandstone. The Death Valley and Garlock fault systems are clearly seen at the north flank of the Avawatz Mountains. Reproductions of LFC photographs are no longer available, but the system demonstrated the utility of photographs acquired from orbital altitudes.

Declassified Photographs from Intelligence Satellites  For many years the United States has employed satellites to acquire photographs of strategic areas, mainly the Sino-Soviet bloc, for intelligence purposes. These photographs have been highly classified and unavailable. In 1995 the United States announced the declassification of intelligence photographs acquired from 1960 to 1972 by the CORONA camera, which is now obsolete. The CORONA missions acquired over 800,000 photographs. Each photograph covers approximately 16 by 185 km at spatial resolutions ranging from 2 to 8 m. The transparencies will be duplicated, indexed, and transferred to the EDC for unrestricted sale to the public by late 1996. Contact the EDC for information on prices and availability. McDonald (1995a, 1995b) has reviewed the history and specifications of the classified U.S. satellite photography programs and has published a number of photographs.

Table 2-4  Camera systems on satellites

                             LFC              KVR-1000        TK-350
  Characteristic             (United States)  (Russia)        (Russia)
  Satellite altitude, km     239 to 370       220             220
  Terrain coverage, km       Variable         34 by 57        175 by 175
  Spatial resolution, m      Variable         2 to 3          10
  Stereo overlap, %          20 to 80         Minimal         60 to 80
  Spectral range, µm         Normal color,    0.51 to 0.76,   0.51 to 0.76,
                             IR color         panchromatic    panchromatic

Figure 2-33 Portion of large-format-camera photograph of Boston, Massachusetts.

Figure 2-34 Stereo pair of large-format-camera photographs in the Mojave Desert of southeastern California and Nevada. A. Stereo pair with 4× vertical exaggeration. B. Location map.

Figure 2-35 Enlarged portion of Russian KVR-1000 satellite photograph of Washington, D.C. Courtesy EOSAT Co.

For at least two decades Russia has acquired photographs from unmanned earth-orbiting satellites. Many of these photographs are available for general distribution. Table 2-4 lists characteristics of photographs acquired by the KVR-1000 and TK-350 cameras. These cameras are carried on Kosmos satellites.
Each mission lasts approximately 43 days and can photograph nearly 50 percent of the earth's land area, depending upon weather conditions. Upon completion of a mission the capsule containing the camera and exposed film is returned to earth.

Figure 2-35 is an enlarged portion of a KVR-1000 photograph of Washington, D.C. The Pentagon is located at the extreme southwest corner. The white circle in the southeast corner is the RFK stadium. It is instructive to compare this KVR photograph (acquired from a height of 220 km) with the high-altitude aircraft photograph in Figure 2-31 (12-km height). The KVR subscene covers the northeast portion of the area in the aircraft photograph. The scale of the KVR photograph is approximately twice as large as that of the aircraft photograph. This comparison demonstrates the potential of satellite photographs for detailed interpretation. The KVR subscene shows only 36 percent of the original photograph, which covers 2000 km².

TK-350 photographs (not shown) are usually acquired in tandem with the KVR-1000 photographs. The overlapping TK-350 photographs can be analyzed in stereo to obtain elevation data for topographic mapping. Digitized versions of KVR-1000 and TK-350 photographs are available from

EOSAT Company
Customer Service Department
4300 Forbes Boulevard
Lanham, MD 20706
Telephone: 800-344-0033

EOSAT can provide information on prices and available coverage.

COMMENTS

Photographs are a versatile and useful form of remote sensing for the following reasons:

1. The film provides excellent spatial resolution and has a high information content.
2. Photographs cost relatively little.
3. Different films provide a sensitivity range from the UV spectral region through the visible and into the reflected IR region.
4. Low-sun-angle photographs enhance subtle topographic features that are suitably oriented with respect to the sun's azimuth.
5. Stereo photographs are valuable aids for many types of interpretation.

The principal drawbacks of aerial photographs are these:

1. Daylight and good weather are necessary to acquire them.
2. In the shorter wavelength regions, atmospheric scattering reduces their contrast ratio and resolving power (see the sketch following this section).
3. Information is recorded in the analog mode. Film must be digitized in order to digitally process the data (Chapter 8), although digital cameras are being developed.
4. The longest wavelength recorded is 0.9 µm, which omits the valuable spectral information at longer wavelengths.

The advantages often outweigh the disadvantages, and one should evaluate aerial photographs as a possible data source for any remote sensing investigation.
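The second drawback can be quantified. The contrast ratio of a scene is the ratio of its brightest to its darkest brightness value; light scattered by the atmosphere adds a roughly uniform brightness to both, which always lowers the ratio. The sketch below uses made-up brightness values and a deliberately simple additive model of scattering.

```python
def contrast_ratio(b_max: float, b_min: float, scattered: float = 0.0) -> float:
    """Contrast ratio of a scene, optionally with a uniform additive
    brightness contributed by atmospheric scattering (illustrative model)."""
    return (b_max + scattered) / (b_min + scattered)

# Hypothetical brightness values, not taken from the text.
print(contrast_ratio(8, 2))               # 4.0 -- clear atmosphere
print(contrast_ratio(8, 2, scattered=4))  # 2.0 -- scattering halves the contrast
```

Because scattering is strongest at short wavelengths, the loss of contrast is most severe in the blue region and least severe in the reflected IR region.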
QUESTIONS

1. Normal color photographs taken of subjects in shaded areas have a bluish cast. Explain why.
2. Calculate the contrast ratio for a scene in which the brightest and darkest areas have brightness values of 6 and 2, respectively.
3. Suppose the scene in question 2 is covered by an atmosphere that contributes 4 brightness values of scattered light. What is the resulting contrast ratio? Panchromatic aerial photographs will be acquired of this scene. How can their contrast ratio be improved?
4. What is the ground resolution for aerial photographs acquired at a height of 5000 m with a camera having a system resolution of 30 line-pairs · mm⁻¹ and a focal length of 304 mm?
5. What is the minimum ground separation in the photographs of question 4?
6. What is the scale of the photographs of question 4?
7. For Figure 2-8, calculate the height of the highest portion of the building in the extreme lower left corner.
8. The air base for two overlapping photographs is 1500 m. The photographs were acquired from a height of 3000 m. What is the base-height ratio of this stereo pair? What is the vertical exaggeration of the stereo model?
9. Photographs can be acquired from satellites, using the same films and filters as aerial photographs. Describe the advantages and disadvantages of satellite photographs relative to aerial photographs.

REFERENCES

Boyer, M., J. Miller, M. Belanger, and E. Hare, 1988, Senescence and spectral reflectance in leaves of northern pin oak (Quercus palustris Muenchh.): Remote Sensing of Environment, v. 25, p. 71-81.

Buschmann, C. and E. Nagel, 1993, In vivo spectroscopy and internal optics of leaves as basis for remote sensing of vegetation: International Journal of Remote Sensing, v. 14, p. 711-722.

Everitt, J. H. and P. R. Nixon, 1986, Canopy reflectance of two drought-stressed shrubs: Photogrammetric Engineering and Remote Sensing, v. 52, p. 1189-1192.

Gausman, H. W., 1985, Plant leaf optical properties in visible and near-infrared light: Texas Tech University Graduate Studies, No. 29, Lubbock, TX.

James, T. H., 1966, The theory of the photographic process, third edition: Macmillan Co., New York, NY.

Jones, R. C., 1968, How images are detected: Scientific American, v. 219.

Kienko, Y. P., 1991, Resource-F subsystem: Geodesy and Cartography, v. 7, Moscow, Russia.

King, D. J., 1995, Airborne multispectral digital camera and video sensors—a critical review of system designs and applications: Canadian Journal of Remote Sensing, v. 21, p. 245-273.

La Prade, G. L., 1972, Stereoscopy—a more general theory: Photogrammetric Engineering and Remote Sensing, v. 38, p. 1177-1187.

La Prade, G. L., 1973, Stereoscopy—will data or dogma prevail?: Photogrammetric Engineering and Remote Sensing, v. 39, p. 1271-1275.

McDonald, R. A., 1995a, Opening the cold war sky to the public—declassifying satellite reconnaissance imagery: Photogrammetric Engineering and Remote Sensing, v. 61, p. 385-390.

McDonald, R. A., 1995b, CORONA: Photogrammetric Engineering and Remote Sensing, v. 61, p. 689-720.

Miller, C. V., 1961, Photogeology: McGraw-Hill Book Co., New York, NY.

Philipson, W. R., ed., 1996, The manual of photographic interpretation: American Society for Photogrammetry and Remote Sensing, Falls Church, VA.

Rosenblum, L., 1968, Image quality in aerial photography: Optical Spectra, v. 2, p. 71-73.

Sabins, F. F., 1973a, Aerial camera mount for 70-mm stereo: Photogrammetric Engineering and Remote Sensing, v. 39, p. 579-582.

Sabins, F. F., 1973b, Engineering geology applications of remote sensing in Moran, D. E., ed., Geology, seismicity, and environmental impact: Association of Engineering Geologists, Special Publication, p. 141-155, Los Angeles, CA.

Sabins, F. F., 1986, Remote sensing—principles and interpretation, second edition: W. H. Freeman and Co., New York, NY.

Schwaller, M. R. and S. J. Tkach, 1985, Premature leaf senescence—remote sensing detection and utility for geobotanical prospecting: Economic Geology, v. 80, p. 250-255.

Slater, P. N., 1983, Photographic systems for remote sensing in Colwell, R. N., ed., Manual of remote sensing, second edition: ch. 6, p. 281-291, American Society for Photogrammetry and Remote Sensing, Falls Church, VA.

Thurrell, R. F., 1953, Vertical exaggeration in stereoscopic models: Photogrammetric Engineering and Remote Sensing, v. 19, p. 579-588.
Vizy, K. N., 1974, Detecting and monitoring oil slicks with aerial photos: Photogrammetric Engineering and Remote Sensing, v. 40, p. 697-708.

Walker, P. M. and D. T. Trexler, 1977, Low sun-angle photography: Photogrammetric Engineering and Remote Sensing, v. 43, p. 493-505.

Wolf, P. R., 1974, Elements of photogrammetry: McGraw-Hill Book Co., New York, NY.

ADDITIONAL READING

Avery, T. E. and G. L. Berlin, 1992, Fundamentals of remote sensing and airphoto interpretation, fifth edition: Macmillan Publishing Co., New York, NY.

Cravat, H. R. and R. Glaser, 1971, Color aerial stereograms of selected coastal areas in the United States: U.S. Department of Commerce, National Oceanic and Atmospheric Administration, Washington, DC.

DeMarsh, L. E. and E. J. Giorgianni, 1989, Color science for imaging systems: Physics Today, v. 42, p. 44-52.

Falkner, E., 1995, Aerial mapping—methods and applications: Lewis Publishers, Boca Raton, FL.

Foster, N. H. and E. A. Beaumont, eds., 1992, Photogeology and photogeomorphology: American Association of Petroleum Geologists, Treatise of Petroleum Geology Reprint Series, No. 18, Tulsa, OK.

Light, D. L., 1996, Film cameras or digital sensors—the challenge ahead for aerial imaging: Photogrammetric Engineering and Remote Sensing, v. 62, p. 285-291.

Lynch, D. K. and W. Livingston, 1995, Color and light in nature: Cambridge University Press, Cambridge.

Mollard, J. D. and J. R. Janes, 1984, Airphoto interpretation and the Canadian landscape: Canadian Government Publishing Centre, No. M52-60/1984E, Hull, Canada.

Philipson, W. R., ed., 1996, The manual of photographic interpretation, second edition: American Society for Photogrammetry and Remote Sensing, Falls Church, VA.

Rasher, M. E. and W. Weaver, 1990, Basic photo interpretation: Soil Conservation Service, U.S. Department of Agriculture, Washington, DC.

Ray, R. G., 1960, Aerial photographs in geologic mapping and interpretation: U.S. Geological Survey Professional Paper 373.

Smith, J. T. and A. Anson, eds., 1968, Manual of color aerial photography: American Society for Photogrammetry and Remote Sensing, Falls Church, VA.

CHAPTER 3

LANDSAT IMAGES

Landsat is an unmanned system that prior to 1974 was called ERTS (Earth Resources Technology Satellite). Initially NASA operated Landsat, but in 1985 responsibility for operating the system transferred to the EOSAT Company, a private corporation. Landsat operates in the international public domain, which means that

1. under an "open skies" policy, images are acquired of the entire earth without obtaining permission from any government;
2. users anywhere in the world may purchase all images at uniform prices and priorities.

The Landsat program has been a major contributor to the growth and acceptance of remote sensing as a scientific discipline. Landsat provided the first repetitive worldwide database with adequate spatial and spectral resolution for many applications. Landsat data are available in digital format, which has promoted the science of digital image processing. Present and future generations of remote sensing specialists are indebted to the late William T. Pecora and William Fischer of the U.S. Geological Survey, who did so much to make Landsat a reality.
Landsat satellites have been placed in orbit using Delta rockets launched from Vandenberg Air Force Base on the California coast between Los Angeles and San Francisco. The five Landsats belong to two generations of technology with different satellites, orbital characteristics, and imaging systems. Freden and Gordon (1983) give details of the Landsat program.

LANDSATS 1, 2, AND 3

Table 3-1 lists orbital characteristics of the three satellites of the first generation, which were launched in 1972, 1975, and 1978; all have ceased operation, but they produced hundreds of thousands of valuable images. The multispectral scanner (MSS) was the primary imaging system in this first generation of Landsat. A return-beam vidicon system was also carried (Sabins, 1987, Chapter 4), but those images were of limited value.

Table 3-1  Orbit patterns and imaging systems of first and second generations of Landsat

  Generation                             Landsats 1, 2, and 3    Landsats 4 and 5
  Altitude                               918 km                  705 km
  Orbits per day                         14                      14.5
  Number of orbits (paths)               251                     233
  Repeat cycle                           18 days                 16 days
  Image sidelap at equator               14.0 percent            7.6 percent
  Crosses 40°N latitude at
    (local sun time, approx.)            9:30 a.m.               10:20 a.m.
  Operational                            1972 to 1984            1982 to future
  On-board data storage                  Yes                     No
  Imaging systems
    Multispectral scanner                Yes                     Yes
    Thematic mapper                      No                      Yes

Table 3-2  Characteristics of Landsat imaging systems

                                  Multispectral scanner (MSS)   Thematic mapper (TM)
  Spectral region
    Visible and reflected IR      0.50 to 1.10 µm               0.45 to 2.35 µm
    Thermal IR                    —                             10.5 to 12.5 µm
  Spectral bands                  4                             7
  Terrain coverage
    East-west direction           185 km                        185 km
    North-south direction         185 km                        170 km
  Instantaneous field of view
    Visible and reflected IR      0.087 mrad                    0.043 mrad
    Thermal IR                    —                             0.17 mrad
  Ground resolution cell
    Visible and reflected IR      79 by 79 m                    30 by 30 m
    Thermal IR                    —                             120 by 120 m

Table 3-2 lists characteristics of the MSS, a cross-track scanning system that records four spectral bands of imagery with a ground resolution cell of 79 by 79 m. Figure 3-1 shows spectral ranges of the MSS bands together with reflectance spectra of vegetation and sedimentary rocks. Table 3-3 lists the wavelengths recorded by the MSS bands. In this book, MSS bands are designated 1, 2, 3, 4 in accordance with current terminology.

Figure 3-1 Reflectance spectra of vegetation and sedimentary rocks, showing spectral ranges of Landsat MSS and TM bands.

Figure 3-2 shows the four spectral bands for an MSS scene of the Los Angeles region. Beginning at the north margin of the image, successive scan lines are offset to the west to compensate for the earth's rotation during the approximately 25 sec required to scan the terrain. This offset accounts for the slanted parallelogram outline of the images. An IR color image can be prepared by assigning MSS band 1 (green) to the color blue, band 2 (red) to green, and band 4 (reflected IR) to red.
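A minimal sketch of that band-to-color assignment is given below. It assumes the three MSS bands have already been read into NumPy arrays of equal size (the file-reading step depends on the distribution format and is omitted) and applies a simple linear stretch for display; it is an illustration, not a production processing routine.

```python
import numpy as np

def ir_color_composite(band1, band2, band4):
    """Combine MSS bands into an IR color image: band 1 (green) is shown
    as blue, band 2 (red) as green, and band 4 (reflected IR) as red."""
    def stretch(band):
        # simple linear contrast stretch to the 0-255 display range
        lo, hi = band.min(), band.max()
        return ((band - lo) / (hi - lo) * 255).astype(np.uint8)
    return np.dstack([stretch(band4), stretch(band2), stretch(band1)])  # R, G, B planes

# toy 2 x 2 arrays with made-up digital numbers
b1 = np.array([[10, 20], [30, 40]], dtype=float)
b2 = np.array([[15, 25], [35, 45]], dtype=float)
b4 = np.array([[60, 70], [80, 90]], dtype=float)
print(ir_color_composite(b1, b2, b4).shape)  # (2, 2, 3)
```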
A major advantage of Landsat images is the 185-by-185-km (34,000 km²) coverage that facilitates regional interpretations. The Los Angeles scene covers parts of four major physiographic provinces, as shown in the location map (Figure 3-3). The Central Valley, in the northwest portion of the scene, is a major agricultural area, as shown by the rectangular field patterns. Rugged mountains of the Sierra Nevada separate the Central Valley from the Antelope Valley, which is the westernmost extension of the Mojave Desert. The Antelope Valley is bounded on the north by the Garlock fault and on the south by the San Andreas fault. Both of these active strike-slip faults are clearly expressed as linear valleys.

Figure 3-2 Spectral bands of MSS images of the Los Angeles region: A. band 1 (0.5 to 0.6 µm); B. band 2 (0.6 to 0.7 µm); C. band 3 (0.7 to 0.8 µm); D. band 4 (0.8 to 1.1 µm). Images cover 185 by 185 km.

Figure 3-3 Location map of the Los Angeles MSS scene.

The desert terrain of the Antelope Valley is bright on all the MSS bands. Along the south margin are alluvial fans of gravel eroded from bedrock of the Transverse Ranges. The light- to dark-gray signatures of the fans are determined by characteristics of the parent bedrock. The Sheep Canyon Fan, identified in Figure 3-3, is dark in all MSS bands because it consists of gravel eroded from the Pelona Schist, which is very dark.

Irrigated fields in the Antelope Valley and Central Valley are dark in band 2 (red) and bright in bands 3 and 4 (reflected IR). These signatures are explained by the spectral reflectance curve for vegetation in Figure 3-1. Vegetation has low reflectance in band 2 because red wavelengths are absorbed by chlorophyll; reflectance is high in bands 3 and 4 because the internal structure of leaves strongly reflects these IR wavelengths (Figure 2-28).

Mountains of the Transverse Ranges trend westward and form the northern border of the Los Angeles and Ventura Basins, which are part of the Peninsular Range Province. The mountains are cut by numerous faults, such as the San Gabriel fault. A conspicuous dark patch north of Ventura was burned in a brushfire. Bright patches in the extreme eastern portion of the range are clouds lodged against the high mountain ridges.

Table 3-3  Landsat multispectral scanner (MSS) spectral bands

  MSS band*   Wavelength, µm   Color          Projection color for IR color composite image
  1 (4)       0.5 to 0.6       Green          Blue
  2 (5)       0.6 to 0.7       Red            Green
  3 (6)       0.7 to 0.8       Reflected IR   —
  4 (7)       0.8 to 1.1       Reflected IR   Red

  *Numbers in parentheses were used for images acquired by Landsats 1, 2, and 3. For Landsats 4 and 5 the MSS bands are designated 1, 2, 3, and 4.

The Los Angeles and Ventura Basins are lowlands underlain by deep depressions filled with sedimentary rocks that generated vast reserves of oil. The giant oil fields of the region are largely depleted and are now being developed into real estate ventures. The Ventura Basin is still largely agricultural, which causes the bright signatures in bands 3 and 4. The Los Angeles Basin is completely urbanized. The central city has a dark signature on bands 3 and 4 because vegetation is absent. The surrounding suburbs have gray signatures due to landscaping mingled with buildings and pavement. Scattered bright patches are parks, golf courses, and cemeteries.

Water in the Pacific Ocean is uniformly dark on all bands. The MSS data were digitally processed to emphasize land features. Other processing methods could enhance spectral variations caused by turbidity and shallow bathymetric features.
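Table 3-2 already suggests how much more detailed the second-generation imagery described below is. The rough calculation in the sketch that follows treats both scenes as 185 km on a side (ignoring the TM's slightly smaller north-south coverage) and is only an illustration of relative data volume, not an official specification.

```python
# Back-of-the-envelope comparison of the two Landsat imaging systems, using
# the scene size and ground resolution cells from Table 3-2.
scene_km = 185
mss_cell_m, tm_cell_m = 79, 30

mss_pixels = (scene_km * 1000 / mss_cell_m) ** 2
tm_pixels = (scene_km * 1000 / tm_cell_m) ** 2

print(f"MSS pixels per band ~ {mss_pixels / 1e6:.1f} million")  # ~5.5 million
print(f"TM pixels per band  ~ {tm_pixels / 1e6:.1f} million")   # ~38.0 million
print(f"ratio               ~ {tm_pixels / mss_pixels:.1f}x")   # ~6.9x more data per band
```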
LANDSATS 4 AND 5

The second generation of Landsat consists of two satellites launched July 16, 1982, and March 1, 1984. Landsat 4 has ceased functioning. Landsat 5 was still functioning as of March 1996, but its ultimate lifetime is unpredictable. Landsat 6 was launched in September 1993 but failed to reach orbit, which was a major loss to the remote sensing community. When Landsat 5 fails there will be a gap in the formerly continuous coverage of Landsat images, dating back to 1972.

Satellites

Figure 3-4 shows the second generation of Landsat satellites, which carry an improved imaging system called the thematic mapper (TM) as well as the MSS. The solar array generates electrical power to operate the satellite. The microwave antenna receives instructions and transmits image data to the ground receiving stations, shown in Figure 3-5. When a satellite is within the receiving range of a station, TM images are scanned and transmitted simultaneously. Images of areas beyond receiving ranges are transmitted to Tracking and Data Relay Satellites (TDRS), which are placed in geostationary orbits. The TDRS system relays the image data to a receiving station at Norman, Oklahoma, which then relays the data via a communication satellite to the EOSAT facility in Maryland. At EOSAT the data are archived on high-density tapes (HDT). Data in the HDT format are converted into computer-compatible tapes (CCT) that are used for digital image processing and to generate master film transparencies. All MSS data are transmitted directly to ground receiving stations.

Figure 3-4 Landsats 4 and 5. The human figure (2 m high) is added for scale.

Figure 3-5 Landsat receiving stations and their receiving ranges.

Thematic Mapper (TM) Imaging System

The TM is a cross-track scanner with an oscillating scan mirror and arrays of 16 detectors for each of the visible and reflected IR bands. Data are recorded on both eastbound and westbound sweeps of the mirror, which allows a slower scan rate, longer dwell time, and higher signal-to-noise ratio than with MSS images. At the satellite altitude of 705 km the 14.9° angular field of view covers a swath 185 km wide (Figure 3-5).
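The swath width and the ground resolution cell quoted for the TM follow from simple viewing geometry. The sketch below checks them with a flat-earth approximation; the small discrepancy from 185 km reflects rounding and the curvature ignored here.

```python
import math

# Swath width from the total angular field of view and altitude.
altitude_km = 705
fov_deg = 14.9
swath_km = 2 * altitude_km * math.tan(math.radians(fov_deg / 2))
print(f"swath width ~ {swath_km:.0f} km")  # ~184 km, consistent with the stated 185 km

# Ground resolution cell from the instantaneous field of view in Table 3-2.
ifov_mrad = 0.043
cell_m = altitude_km * 1000 * ifov_mrad * 1e-3
print(f"ground resolution cell ~ {cell_m:.0f} m")  # ~30 m
```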
Spectral ranges of the six visible and reflected IR bands are shown in Figure 3-1. Band 6 (10.8 to 12.5 µm) records thermal IR energy, which is beyond the range covered by Figure 3-1. The TM was originally designed to include bands 1 through 5 (visible and reflected IR) and band 6 (thermal IR). Users pointed out that information in the spectral band from 2.1 to 2.4 µm had great value for geologic mapping and mineral exploration. Band 7 was added to acquire these data, which are widely used, as shown in Chapter 11. The original system for numbering TM bands remained the same, however, which explains why band 7 is out of sequence on a spectral basis. Table 3-4 lists the characteristics of the TM bands.

Table 3-4  Landsat thematic mapper (TM) spectral bands

  Band 1 (0.45 to 0.52 µm)    Blue-green. Maximum penetration of water, which is useful for bathymetric mapping in shallow water. Useful for distinguishing soil from vegetation and deciduous from coniferous plants.
  Band 2 (0.52 to 0.60 µm)    Green. Matches the green reflectance peak of vegetation, which is useful for assessing plant vigor.
  Band 3 (0.63 to 0.69 µm)    Red. Matches a chlorophyll absorption band that is important for discriminating vegetation types.
  Band 4 (0.76 to 0.90 µm)    Reflected IR. Useful for determining biomass content and for mapping shorelines.
  Band 5 (1.55 to 1.75 µm)    Reflected IR. Indicates moisture content of soil and vegetation. Penetrates thin clouds. Provides good contrast between vegetation types.
  Band 6 (10.40 to 12.50 µm)  Thermal IR. Nighttime images are useful for thermal mapping and for estimating soil moisture.
  Band 7 (2.08 to 2.35 µm)    Reflected IR. Coincides with an absorption band caused by hydroxyl ions in minerals. Ratios of bands 5 and 7 are used to map hydrothermally altered rocks associated with mineral deposits.

Preparing TM Color Images

TM bands are generally made into color images for interpretation (Table 3-5). Band 6 is rarely used because of its coarse spatial resolution (120 m), but it is employed in thermal mapping (Chapter 5). Any three of the six visible and reflected IR bands may be combined in blue, green, and red to produce a color image. There are 120 possible color combinations, which is an excessive number for practical use. Theory and experience, however, show that a small number of color combinations are suitable for most applications.

Table 3-5  Evaluation of TM color combinations (TM bands are listed in the sequence of projection colors: blue, green, red)

  1-2-3   Advantages: Normal color image. Optimum for mapping shallow bathymetric features.
          Disadvantages: Lower spatial resolution due to band 1. Limited spectral diversity because no reflected IR bands are used.
  2-3-4   Advantages: IR color image. Moderate spatial resolution. Optimum for humid regions.
          Disadvantages: Limited spectral diversity.
  4-5-7   Advantages: Maximum spatial resolution.
          Disadvantages: Limited spectral diversity because no visible bands are used.
  2-4-7   Advantages: Maximum spectral diversity. Optimum for temperate to arid regions.
          Disadvantages: Unfamiliar color display, but interpreters quickly adapt.
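The figure of 120 possible combinations quoted above is simply the number of ways to pick three of the six reflective bands and assign them, in order, to blue, green, and red; a short check:

```python
from itertools import permutations

# Choose any three of the six visible and reflected IR bands (1-5 and 7)
# and assign them, in order, to blue, green, and red.
reflective_bands = [1, 2, 3, 4, 5, 7]   # band 6 (thermal) excluded
combos = list(permutations(reflective_bands, 3))

print(len(combos))   # 120
print(combos[:3])    # (1, 2, 3), (1, 2, 4), (1, 2, 5)
```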
For several years after TM data became available, they were routinely produced as IR color images (bands 2, 3, and 4 combined in blue, green, and red, respectively) because that familiar combination was the standard for MSS images and matched the signatures of IR color aerial photographs. In recent years, however, we have found that other TM color combinations are more useful. The optimum band combination is determined by the terrain, climate, and nature of the interpretation project. The following examples illustrate band selection for contrasting terrains in Wyoming and Indonesia.

Images of Semiarid Terrain, Wyoming

Figure 3-7 shows the seven TM bands for the Thermopolis subscene in central Wyoming. The subscene is located in the south flank of the Bighorn Basin and includes a stretch of the Wind River and the town of Thermopolis (Figure 3-7H). Most of the area is used for ranching. Some irrigated crops are grown in the stream valleys. The Gebo and Little Sand Draw oil fields occur in the northern part of the area.

TM Color Combinations  Plate 2 shows four color combinations of TM images for the Thermopolis subscene; Table 3-5 lists and compares these combinations. Plates 2A and 2B are the spectral equivalents of normal color aerial photographs and IR color photographs. Vegetation has a red signature in Plate 2B because band 4, which covers the strong vegetation response in the reflected IR region, is shown in red. Red beds of the Chugwater Formation (shown by the stippled pattern in the map of Figure 3-7H) have an orange signature in the normal color image (Plate 2A). In the IR color image (Plate 2B), Chugwater outcrops have a distinctive yellow signature that is typical for red rocks throughout the world on this color combination. The IR color image has a better contrast ratio and spatial resolution than the normal color image because the blue band was not used in Plate 2B. Plate 2C uses only the reflected IR bands 4, 5, and 7 as blue, green, and red and has the best spatial resolution of all the combinations. There is little color contrast, however, between the different rock outcrops, which are monotonous shades of pale blue. Even the red beds of the Chugwater Formation are light blue and are indistinguishable from the other outcrops.

In Plate 2D the visible green band 2 is shown in blue and the IR bands 7 and 4 are combined as red and green. This combination provides the maximum range of color signatures for the rock outcrops and is optimum for interpreting geology in this semiarid area. Extensive experience in other arid and semiarid regions throughout the world confirms that the 2-4-7 color combination is optimum. Vegetation is green in this image because band 4 is shown in green. Some investigators prefer a 1-4-7 version of this combination, but I find 2-4-7 to be optimum because band 2 has less atmospheric scattering than band 1.

Image Interpretation  Figure 3-8 is a geologic interpretation map of the 2-4-7 image (Plate 2D); Table 3-6 lists the formations that crop out in the Thermopolis subscene together with their ages, lithology, and signature in the 2-4-7 TM image. The image shows that the regional dip of the beds is northward. Local reversals of dip toward the south form four major anticlines. The Red Rose and Cedar Mountain anticlines are located in the southern portion of the subscene. The Gebo and Little Sand Draw anticlines in the north are oil fields that produce from the Phosphoria Formation. The cross section in Figure 3-9 shows these structural relationships. The oil fields were discovered as a result of surface mapping prior to the launch of Landsat. By studying these known oil fields we learn to recognize similar, but undrilled, structures in less well explored regions of the world.

Dip faults, which strike parallel with regional dip, are readily recognized because they offset formation contacts. Strike faults, which trend parallel with regional strike, eliminate or repeat beds and are more difficult to recognize. A strike fault along the south flank of the Red Rose and Cedar Mountain anticlines is recognized because the outcrop of the Chugwater Formation is much narrower on the south flank than on the north flank of the folds (Figures 3-8 and 3-9). The normal strike fault has cut out a portion of the Chugwater beds. The steeper dip on the south flank also contributes to the narrow outcrop.

Figure 3-7 Landsat TM bands for the Thermopolis, Wyoming, subscene. H is the interpretation map; stippled areas are outcrops of Chugwater red beds.

Figure 3-8 Interpretation map for the Thermopolis subscene.

Figure 3-9 Cross section of the Thermopolis subscene. Location and formation symbols are shown in Figure 3-8.

Images of Tropical Terrain, Indonesia

Figure 3-10 shows the three visible and three reflected IR bands for the Mapia subscene in the south-central portion of Irian Jaya in Indonesia. This tropical rain-forest terrain provides a contrast with the semiarid rangeland of the Thermopolis image.
Table 3-6  Formations in the Thermopolis TM 2-4-7 subscene

  Alluvial deposits (Quaternary). Soil in floodplains of major streams; flat valley floors with irrigated fields. Image signature: bright green.
  Fort Union Formation (Early Tertiary). Resistant sandstone with minor shale beds; prominent, eroded dipslopes. Image signature: dark pink.
  Meeteetse and Lance Formations (Late Cretaceous). Nonresistant shale and sandstone; broad valley with minor ridges. Image signature: medium pink.
  Mesaverde Formation (Late Cretaceous). Resistant sandstone with shale and coal beds; alternating ridges and valleys. Image signature: medium pink.
  Cody Shale (Late Cretaceous). Nonresistant shale; broad valley with minor ridges. Image signature: light pink.
  Frontier Formation (Late Cretaceous). Alternating sandstone and shale; narrow ridges and valleys. Image signature: dark pink.
  Cloverly, Mowry, and Thermopolis Formations (Early Cretaceous). Resistant and nonresistant shale, mapped as a single unit; narrow ridges and valleys. Image signature: light blue and dark pink.
  Undifferentiated formations (Jurassic). Alternating sandstone and shale; narrow ridges and valleys. Image signature: dark pink and light blue.
  Chugwater Formation (Triassic). Red sandstone and siltstone; alternating ridges and valleys. Image signature: yellow and orange.
  Phosphoria Formation (Permian). Resistant carbonate rocks that crop out in the cores of the Red Rose and Cedar Mountain anticlines. Image signature: very light blue.

Figure 3-10 Landsat TM bands for the Mapia subscene, Irian Jaya, Indonesia. G is the geologic interpretation map and H the explanation of formations.

Table 3-7  Formations in the Mapia TM 4-5-7 subscene

  Younger alluvium (Recent). Soil and gravel. Image signature: valleys along major drainages.
  Older alluvium (Recent). Gravel deposits. Image signature: eroded terraces and alluvial fans.
  Buru Formation, upper member (Early Tertiary). Sandstone and minor shale beds. Image signature: ridges and dipslopes.
  Buru Formation, lower member (Early Tertiary). Shale. Image signature: strike valleys.
  New Guinea Limestone (Early Tertiary). Limestone. Image signature: karst topography.

TM Color Combinations  The three visible bands (1, 2, 3) in Figure 3-10 have low contrast and poor spatial resolution because the high moisture content of the atmosphere strongly scatters these short wavelengths. The three reflected IR bands (4, 5, 7) have much better contrast and resolution because these longer wavelengths are less susceptible to atmospheric scattering.
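The wavelength dependence can be made semi-quantitative. In the idealized Rayleigh case, scattering varies as the inverse fourth power of wavelength; real haze over a rain forest scatters less steeply, so the values in the sketch below (computed at approximate band centers taken from Table 3-4) should be read only as an upper bound on the advantage of the longer wavelengths.

```python
# Relative Rayleigh scattering at approximate TM band centers, normalized to
# band 1. Haze scattering in a real tropical atmosphere is less wavelength-
# dependent than this idealized case.
band_centers_um = {"TM band 1": 0.485, "TM band 3": 0.66,
                   "TM band 4": 0.83, "TM band 7": 2.2}
reference = band_centers_um["TM band 1"]

for name, wavelength in band_centers_um.items():
    relative = (reference / wavelength) ** 4
    print(f"{name}: relative Rayleigh scattering ~ {relative:.3f}")
# band 1 -> 1.000, band 3 -> ~0.291, band 4 -> ~0.117, band 7 -> ~0.002
```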
Plate 3 shows four color combinations for the Mapia subscene. Thanks to computer enhancement, the normal color image (Plate 3A) has better color contrast than one would anticipate, based on the appearance of the black-and-white bands. Clouds obscure much of the image, and geologic features are difficult to discern. The 2-3-4 IR color image (Plate 3B) has more contrast and detail because band 4 (reflected IR) replaces the low-contrast band 1 image (visible blue). The clouds are greatly diminished. The red signature indicates the extensive forest cover of the region. Plate 3C is compiled from reflected IR bands 4, 5, and 7 shown in blue, green, and red. For tropical regions this combination provides optimum resolution, color contrast, reduction of clouds, and expression of geologic features.

In Plate 3C vegetation has a range of color signatures, which aids interpretation, whereas in Plates 3A and 3B vegetation is saturated dark green or bright red. Vegetation patterns are more distinct in Plate 3C because bands 4, 5, and 7 coincide with major variations in the reflectance spectrum of vegetation, shown in Figure 3-1. Plate 3D consists of bands 2, 4, and 7 in blue, green, and red and is the second best of the four images. This 2-4-7 combination is optimum for the Thermopolis area, but in Irian Jaya the vegetation cover and humid conditions reduce its effectiveness.

Image Interpretation  Figure 3-10G is the geologic interpretation map for the 4-5-7 color image of the Mapia subscene. Table 3-7 lists the formations together with their ages, lithology, and signatures on the 4-5-7 image. Despite the vegetation cover, the different formations are mappable on the image because each unit erodes to a distinctive topographic pattern. The resistant sandstones of the upper member of the Buru Formation erode to form the rugged ledge-and-slope topography in the south portion of the image (Plate 3C). The nonresistant shale of the lower member of the Buru Formation forms broad, featureless strike valleys. The New Guinea Limestone crops out in the crests of the Mapia and Makamo anticlines (Figure 3-10G), where it forms broad arches. In this humid environment, solution and collapse of the limestone produce a distinctive terrain of closely spaced pits and pinnacles called karst topography. Karst topography is well developed on the New Guinea Limestone on the crests of the Mapia and Makamo anticlines. At the small scale of Plate 3C, however, the karst pattern is somewhat difficult to recognize.

Geologic structure is interpreted for the Mapia subscene in the same manner as for the Thermopolis image; however, there are significant differences in solar illumination. The sun azimuth is from the ENE for the Thermopolis area and from the SE for the Mapia area, and the sun elevation is considerably higher for the Mapia area. In the Mapia area, shadows and highlights are therefore subdued because the sun has a high elevation and an azimuth nearly parallel with the east-west regional strike. Despite these disadvantages, geologic structures can be interpreted. In the south portion of the subscene, resistant beds of the upper member of the Buru Formation erode to form dipslopes and antidip scarps that define regional south dips. In the central part of the subscene, the Mapia and Makamo anticlines form broad arches of the New Guinea Limestone, surrounded by lowlands of the nonresistant lower member of the Buru Formation. In the cloudy northern portion of the image the New Guinea Limestone directly overlies the lower member of the Buru Formation. This relationship is interpreted as a southward-directed thrust fault. These structural relationships are shown in the north-south cross section of Figure 3-11.

Figure 3-11 Cross section for the Mapia subscene. Location and formation symbols are shown in Figure 3-10G, H.

ORBIT PATTERNS

Table 3-1 lists the orbital characteristics of the two generations of Landsat. In order to obtain images of the entire earth, both generations of Landsat are placed in sun-synchronous orbits. Figure 3-12 shows the fixed circular orbit of the second-generation Landsats 4 and 5 (solid line) and the daylight hemisphere of the earth. Every 24 hours, 14.5 image swaths, shown as patterned strips in Figure 3-12, are generated. Figure 3-13 shows the southbound, daylight portion of the image swaths (185 km wide) for a 24-hour period.
The northbound segment of each orbit covers the dark hemisphere. Polar areas at latitudes greater than 81° are the only regions not covered. Every 24 hours the earth's rotation shifts the image swaths westward. After 16 days, the earth has been covered by 233 adjacent, sidelapping image swaths and the cycle begins again. This 16-day interval is called the repeat cycle. The sun-synchronous orbit pattern causes the corresponding orbits in each repeat cycle to occur at the same time. For example, every 16 days a southbound second-generation Landsat crosses Los Angeles at approximately 10 a.m. local sun time.

Figure 3-12 Landsat TM sun-synchronous orbit (solid circle) and image swaths (patterned lines) generated in one day. Diagram plotted with Satellite Tool Kit of Analytical Graphics Inc., King of Prussia, Pennsylvania. Courtesy D. N. Boosalis, AGI.

Figure 3-13 Map showing the 14.5 southbound, daytime image swaths (patterned lines) during a single day of Landsats 4 and 5. Each day the earth's rotation shifts the pattern westward; after 16 days the earth is covered and the cycle is repeated. For comparison, the solid curves are equatorial orbits of a typical Space Shuttle mission. Diagram plotted with Satellite Tool Kit of Analytical Graphics Inc., King of Prussia, Pennsylvania. Courtesy D. N. Boosalis, AGI.
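The orbit figures in Table 3-1 and above are mutually consistent, as a rough circular-orbit calculation shows. The gravitational parameter and earth radius used below are standard values, not taken from the text, and the spherical-earth, circular-orbit assumptions make this only an approximation.

```python
import math

GM = 3.986e14        # m^3/s^2, earth's gravitational parameter (standard value)
R_EARTH = 6.371e6    # m, mean earth radius (standard value)
altitude = 705e3     # m, Landsats 4 and 5

a = R_EARTH + altitude                            # orbit radius
period_s = 2 * math.pi * math.sqrt(a**3 / GM)     # Kepler's third law, circular orbit
orbits_per_day = 86400 / period_s

print(f"orbital period  ~ {period_s / 60:.1f} min")    # ~98.7 min
print(f"orbits per day  ~ {orbits_per_day:.2f}")       # ~14.6
print(f"paths per cycle ~ {orbits_per_day * 16:.0f}")  # ~233, matching the 16-day repeat cycle
```

The same formula applied to the 918-km first-generation orbit gives a period of roughly 103 minutes, about 14 orbits per day, and about 251 paths in the 18-day repeat cycle, consistent with Table 3-1.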
