
Remote Sensing Systems

Sanjeevan Shrestha
Shr.sanjeevan@gmail.com
Remote Sensing Systems
• The details of sensor construction and materials vary with the
wavelengths of interest and with the dimensions of the optical system
and detector.
• The capability of a remote sensing system is generally characterized by
its resolution, defined as "the ability of an imaging system to
record fine details in a distinguishable manner".
• More generally, resolution is the ability of the entire remote
sensing system, including lens, antenna, display, exposure, processing,
and other factors, to render a sharply defined image.
Spatial Characteristics
• The spatial characteristics of a remote sensing system are generally termed
its spatial resolution.
• Spatial resolution describes the ability of a sensor to identify the smallest
details of a pattern in an image.
• It is also defined as the smallest unit of distance (usually the side length of
a square area in an image) that can be determined by a sensor
measurement of the target.
• It is directly related to pixel size, which refers to the size of the smallest
feature that can be detected.
• The same area imaged at higher spatial resolution contains more pixels than
at lower resolution.
Spatial Characteristics
• High spatial resolution: 0.3 - 4 m
• Medium spatial resolution: 4 - 30 m
• Low spatial resolution: 30 - >1000 m
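As a rough illustration of the pixel-count point above, the sketch below counts the pixels needed to cover the same square ground area at three pixel sizes. The area and pixel sizes are assumed example values, not taken from the text:

```python
def pixel_count(area_side_m: float, pixel_size_m: float) -> int:
    """Number of pixels covering a square area of the given side length."""
    pixels_per_side = area_side_m / pixel_size_m
    return int(pixels_per_side) ** 2

# A 3 km x 3 km area at three assumed pixel sizes:
high = pixel_count(3000, 1)     # high resolution (1 m pixels)
medium = pixel_count(3000, 10)  # medium resolution (10 m pixels)
low = pixel_count(3000, 30)     # low resolution (30 m pixels, Landsat-class)

print(high, medium, low)  # 9000000 90000 10000
```

The same area holds 900 times as many 1 m pixels as 30 m pixels, which is why higher spatial resolution implies much larger data volumes.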
Spatial Characteristics
• Every pixel represents an average in each of three dimensions:
space, wavelength, and time.
• The average over time is usually very small (on the order of
microseconds for a whiskbroom scanner such as TM and
milliseconds for a pushbroom scanner such as SPOT) and is
inconsequential in most applications.
• The averages over space and wavelength, however, define
the characteristics of the data.
Spatial Characteristics
• If we imagine a three-dimensional continuous parameter space
(x, y, λ), defined over the spatial coordinates (x, y) and spectral
wavelength (λ), we can visualize each pixel of a given image as
representing an integration over a relatively small volume element
in that continuous space.
• In practice, the (x, y, λ) space is not quite so neatly divided, but for
now we will assume this convenient subdivision.
Spatial Characteristics
• The grid of pixels that constitutes a digital image is achieved by a
combination of scanning in the cross-track direction (orthogonal to the
motion of the sensor platform) and platform motion along the in-track
direction.
• A pixel is created whenever the sensor system electronically samples
the continuous data stream provided by the scanning.
Mode of Scanning
• Three modes of scanning:
• Line Scanner
• Whiskbroom Scanner
• Pushbroom Scanner
Line Scanner
• A line scanner uses a single detector element to scan the entire scene.
Whiskbroom Scanner
• Uses several detector elements, aligned in-track, to achieve parallel
scanning during each cycle of the scan mirror.
• This design is used in the Landsat TM.
Whiskbroom Scanner
• A related type of scanner is the paddlebroom, exemplified by AVHRR
and MODIS, with a two-sided mirror that rotates 360 degrees,
scanning continuously cross-track.
• A significant difference between paddlebroom and whiskbroom
scanners is that the paddlebroom always scans in the same direction,
while the whiskbroom reverses direction on each scan.
Pushbroom Scanner
• Pushbroom scanners, such as SPOT, have a linear array of thousands
of detector elements, aligned cross-track, which scan the full width of
the collected data in parallel as the platform moves.
Scanner
• For all types of scanners, the full cross-track angular coverage is called
the field of view (FOV), and the corresponding ground coverage is called
the ground-projected field of view (GFOV).
• The GFOV is also called the swath width or the footprint of the sensor.
• The spacing between pixels on the ground is the ground-projected
sample interval (GSI).
• The cross-track and in-track GSIs are determined by the cross-track
and in-track sampling rates, respectively, and by the in-track platform
velocity.
• It is common practice to design the sample rates so that the GSI
equals the ground-projected instantaneous field of view (GIFOV).
• The GIFOV is the geometric projection of a single detector width, w,
onto the earth's surface.
• Thus, the GIFOVs of neighboring pixels abut, both in-track and
cross-track.
• The in-track GSI is determined by the combination of platform
velocity and sample rate (pushbroom) or scan velocity (line and
whiskbroom) needed to match the in-track GIFOV at nadir.
• The GSI is determined by the altitude of the sensor system H, the
sensor's focal length f, and the inter-detector spacing (or spatial
sampling rate or detector width).
• If the spatial sampling rate is one pixel per inter-detector spacing,
the relation for the GSI at nadir is simply:

GSI = inter-detector spacing × (H / f) = inter-detector spacing / m

where m = f / H is the geometric magnification from the ground to the
sensor focal plane.
• The GIFOV depends in a similar fashion on H, f, and w. System design
engineers prefer to use the instantaneous field of view (IFOV), which is
defined as the angle subtended by a single detector element on the
axis of the optical system:

IFOV = 2 arctan(w / (2f)) ≈ w / f

• For the GIFOV, we have:

GIFOV = 2H tan(IFOV / 2) = w × (H / f) = w / m
• The GSI and GIFOV are thus found by scaling the inter-detector spacing
and the detector width, respectively, by the geometric magnification, m.
• Users of satellite and aerial remote sensing data generally prefer to
use the GIFOV rather than the IFOV in their analyses.
• Sensor engineers, on the other hand, often prefer the angular
parameters FOV and IFOV because they have the same values in
image and object space.
• The field of view (FOV) of a sensor is the angular extent of data
acquisition cross-track.
• The corresponding cross-track ground distance is given by:

GFOV = 2H tan(FOV / 2)

• The GFOV is also called the swath width.
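The geometric relations above can be collected in a short sketch. The numerical values below (altitude, focal length, detector width, FOV) are assumed, roughly Landsat-TM-like figures for illustration only:

```python
import math

# Assumed illustrative values (roughly Landsat-TM-like; not from the text):
H = 705e3                  # platform altitude (m)
f = 2.438                  # sensor focal length (m)
w = 1.04e-4                # single detector width (m)
spacing = w                # inter-detector spacing, set equal to detector width
FOV = math.radians(15.4)   # full cross-track field of view (rad)

m = f / H                            # geometric magnification, ground -> focal plane
GSI = spacing / m                    # ground-projected sample interval
IFOV = 2 * math.atan(w / (2 * f))    # instantaneous field of view (rad), ~ w/f
GIFOV = 2 * H * math.tan(IFOV / 2)   # ground-projected IFOV
GFOV = 2 * H * math.tan(FOV / 2)     # swath width

print(f"GSI   = {GSI:.1f} m")        # ~30 m with these numbers
print(f"GIFOV = {GIFOV:.1f} m")      # equals the GSI, so neighboring GIFOVs abut
print(f"GFOV  = {GFOV / 1e3:.0f} km")
```

Because the GSI equals the GIFOV here (one pixel per inter-detector spacing), the ground projections of adjacent detectors abut, exactly as described above.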
Radiometric Characteristics
• The sensor collects some of the electromagnetic radiation (radiance)
that propagates upward from the earth and forms an image of the
earth's surface on its focal plane.
• Each detector integrates the energy that strikes its surface to form
the measurement at each pixel.
• The integrated energy at each pixel is converted into an electrical
signal and quantized as an integer value, the Digital Number (DN).
• As with all digital data, a finite number of bits, Q, is used to code the
continuous data measurements as binary numbers.
Radiometric Characteristics
• The number of discrete DNs is given by:

N_DN = 2^Q

and the DN can be any integer in the range:

DN_range = [0, 2^Q − 1]

• The larger the value of Q, the more closely the quantized data
approximate the original continuous signal generated by the detectors,
and the higher the radiometric resolution of the sensor.
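A minimal sketch of the N_DN = 2^Q relation, evaluated for the 8-, 10-, and 12-bit depths discussed in the text:

```python
def dn_levels(q_bits: int) -> tuple:
    """Return (number of DN levels, min DN, max DN) for Q-bit quantization."""
    n = 2 ** q_bits        # N_DN = 2^Q
    return n, 0, n - 1     # DN range is [0, 2^Q - 1]

# 8 bits (SPOT, TM), 10 bits (AVHRR), 12 bits (MODIS, hyperspectral):
for q in (8, 10, 12):
    n, lo, hi = dn_levels(q)
    print(f"Q={q}: {n} levels, DN range {lo}..{hi}")
```

Each extra bit doubles the number of quantization levels, so a 12-bit sensor distinguishes 16 times as many radiance levels as an 8-bit one.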
Radiometric Characteristics
• So, radiometric resolution refers to the number of digital levels used
to express the data collected by the sensor.
• It is determined by the number of discrete levels into which the signal
may be divided.
• The radiometric resolution of an imaging system describes its ability
to discriminate very slight differences in energy.
• The finer the radiometric resolution of a sensor, the more sensitive it
is to small differences in reflected or emitted energy.
Radiometric Characteristics
• SPOT and TM have 8 bits per pixel, while AVHRR has 10 bits per pixel.
• To achieve high radiometric precision in a number of demanding
applications, the EOS MODIS is designed with 12 bits per pixel, and
most hyperspectral sensors also use 12 bits per pixel.
Spectral Properties
• Discrete multispectral channels are typically created in an optical sensor by
splitting the optical beam into multiple paths and inserting different
spectral filters in each path or directly on the detectors.
• In this way, the image can be captured in different bands.
• This is directly associated with the spectral resolution of the image.
• Spectral resolution describes the ability of the sensor to define fine
wavelength intervals.
• The finer the spectral resolution, the narrower the wavelength range for a
particular channel or band; low spectral resolution means the sensor
records the energy across a wide band of wavelengths as a single measurement.
Spectral Properties
• A B/W (panchromatic) sensor, which covers a wide spectral range such as
the visible portion of the EM spectrum, has coarse (low) spectral resolution
because it records the entire visible portion rather than individual bands.
• In a color image, by contrast, the spectral resolution is finer because the
sensor records reflected energy separately at the blue, green, and red
wavelengths of the spectrum.
Spectral Characteristics of sensor
• The spectral location of sensor bands is constrained by atmospheric
absorption bands and is also determined by the reflectance features to
be measured.
• If a sensor is intended for land or ocean applications, atmospheric
absorption bands are avoided in sensor band placement.
• On the other hand, if a sensor is intended for atmospheric
applications, it may well be desirable to place spectral bands within
the absorption features.
Temporal Characteristics
• One of the most valuable aspects of unmanned, near-polar-orbiting
satellite remote sensing systems is their inherent repeat coverage
of the same area on the earth.
• This is particularly important for monitoring certain earth-based
phenomena, e.g., monitoring agricultural crops.
• Many remote sensing satellites, including Landsat, AVHRR, and SPOT,
are in sun-synchronous orbits, which means each always passes
over the same spot on the ground at the same local time.
Temporal Characteristics
• The interval between revisits, in turn referred to as the temporal
resolution of the imagery, depends solely on the particulars of the
satellite orbit for sensors that have a fixed view direction, such as
Landsat TM.
• If more than one system is in orbit at the same time, it is possible to
increase the revisit frequency.
Platform
• The platform is the place where the sensor resides, remote from the
target or surface being observed.
• In order for a sensor to collect and record energy reflected or emitted
from a target or surface, it must reside on a stable platform removed
from the target or surface being observed. Very often the sensor is
mounted on a moving vehicle, which we call the platform.
• Platforms for remote sensors may be situated on the ground, on an
aircraft or balloon (or some other platform within the Earth's
atmosphere), or on a spacecraft or satellite outside of the Earth's
atmosphere.
Ground Based Remote Sensing
• Ground-based sensors are often used to record detailed information
about the surface, which is compared with information collected from
aircraft or satellite sensors.
• In some cases, this can be used to better characterize the target
being imaged by these other sensors, making it possible to better
understand the information in the imagery.
Airborne Remote Sensing
• If the sensor is carried on an aircraft such as a helicopter or airplane,
the system is called airborne remote sensing (e.g., photogrammetry).
• It is carried out using sensors mounted on the aircraft.
• Since the sensors are placed at a lower altitude, less spatial coverage is
achieved.
• Example: AVIRIS (Airborne Visible/Infrared Imaging Spectrometer)
Spaceborne Remote Sensing
• If the sensor is placed on a satellite, the system is called spaceborne
remote sensing.
• It is carried out using sensors mounted on satellites and
space stations.
• Since the sensors are placed at a higher altitude, more spatial coverage is
achieved.
• Example: EOS (Earth Observing System) satellites
Spaceborne Remote Sensing
• For remote sensing purposes, the following orbit characteristics are
relevant:
❖ Orbital altitude: distance (in km) from the surface of the earth
❖ Orbital inclination angle: angle (in degrees) between the orbital plane and
the equatorial plane
❖ Orbital period: time (in minutes) required to complete one full orbit
❖ Repeat cycle: time (in days) between two successive identical orbits
❖ Revisit time: time between two subsequent images of the same area
Satellite Orbit
• Determined by Kepler's laws
❖ Perturbed by gravitational irregularities, friction, etc.
❖ Ranging and repositioning are required for satellites to maintain their orbit.
❖ The orbital period is the time it takes for the satellite to circle the earth.
❖ It is easy to compute if we assume the Earth is a sphere, but the Earth is an
oblate ellipsoid.
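Under the spherical-Earth simplification just noted, Kepler's third law gives the orbital period directly. A sketch; the gravitational parameter and Earth radius are standard values, while the altitudes are assumed examples:

```python
import math

MU = 398600.4418    # Earth's gravitational parameter (km^3/s^2), standard value
R_EARTH = 6378.137  # Earth's equatorial radius (km), standard value

def orbital_period_minutes(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km                    # semi-major axis of circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU) / 60

# Assumed example altitudes:
print(f"{orbital_period_minutes(700):.1f} min")    # ~99 min (sun-synchronous range)
print(f"{orbital_period_minutes(35786):.0f} min")  # ~1436 min, i.e. one sidereal day
```

The second result explains the geostationary altitude discussed below: at roughly 36,000 km the period matches the Earth's rotation.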
Satellite Orbit
• Sun-synchronous
❖ Remains on the illuminated side of the Earth as the Earth rotates underneath.
❖ Orbital inclination is measured against the equatorial plane.
❖ Satellites cross the equator at the same local time every day.
❖ They cover each area of the world at a constant local time of day, called local
sun time.
❖ To achieve this, the inclination angle must be carefully chosen, between 98
and 99 degrees.
Satellite Orbit
❖ Most satellites in this orbit cross the equator in mid-morning, at around 10:30
am.
❖ This allows images to be recorded at two fixed times each day: once in daytime
and once at night (for thermal or radar sensors).
❖ Typically placed at 600 to 1000 km altitude.
Satellite Orbit
• Geostationary
❖ Orbital speed is matched to the rotation of the Earth.
❖ Location is static above a geographic location.
❖ Altitude is about 36,000 km above the equator.
❖ The inclination of a geostationary satellite is 0 degrees with respect to the
equator.
❖ Since satellites in this orbit revolve at a speed that matches the rotation of the
earth, they appear stationary relative to the earth's surface.
❖ This allows the satellite to observe and collect information continuously over
a specific area.
Satellite Orbit
❖ Due to their high altitude, some of these satellites can monitor weather and
cloud patterns covering an entire hemisphere of the earth.
❖ Weather and communication satellites, such as GMS, GOES, METEOSAT, and
INSAT, commonly use this type of orbit.
Satellite Orbit
• Uses of sun-synchronous orbits
❖ Equatorial crossing time depends on the nature of the application
❖ Polar orbiting
❖ Earth monitoring (global coverage)
❖ Orbital altitude typically between 600 and 1000 km
❖ Good spatial resolution
Satellite Orbit
• Uses of geostationary orbits
❖ Weather satellites (GOES, METEOSAT)
❖ Telephone and television relay satellites
❖ Constant contact with ground stations
❖ Limited spatial coverage
❖ Each satellite can cover only about 25-30% of the earth's surface
Image Data Characteristics
• Image size: The number of rows and columns in a scene.
• Number of bands: The number of wavelength bands stored.
• Quantization: The data format used to store the energy
measurements. The discrete values used are known as Digital
Numbers (DN).
• Ground pixel size: The area coverage of a pixel on the ground.
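These characteristics together determine the raw size of a scene: rows × columns × bands × bytes per sample, where the bytes per sample follow from the quantization Q. A minimal sizing sketch, with assumed illustrative values (not taken from the text):

```python
import math

def scene_size_mb(rows: int, cols: int, bands: int, q_bits: int) -> float:
    """Raw (uncompressed) scene size in MB for the given image dimensions."""
    bytes_per_sample = math.ceil(q_bits / 8)  # e.g. 8 bits -> 1 byte, 12 -> 2
    return rows * cols * bands * bytes_per_sample / 2**20

# A Landsat-TM-like scene (dimensions assumed for illustration):
print(f"{scene_size_mb(6000, 6600, 7, 8):.0f} MB")
```

Note that a 12-bit sensor still consumes 2 bytes per sample in storage, so moving from 8-bit to 12-bit quantization doubles the raw data volume.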
Image Data Characteristics
• Image characteristics:
❖ Spatial: area measured
❖ Spectral: wavelengths the sensor is sensitive to
❖ Radiometric: energy levels measured
❖ Temporal: time of acquisition
• Each of these is specified by:
❖ Coverage: range between minimum and maximum
❖ Resolution: smallest units distinguished
Image Data Characteristics
• The higher the spatial resolution, the smaller the area covered per image.
• The amount of reflected energy per unit area is limited:
• If the energy is divided into more bands, the pixel size needs to increase
to pass the threshold for recording.
• If the pixel size needs to be small, fewer, broader bands must be used
(compare the MS and pan bands of the same sensor).
Image Format
• Remote sensing images are stored on disk or tape in one of three
formats:
• Band-Interleaved-by-Sample (BIS)
• Band SeQuential (BSQ)
• Band-Interleaved-by-Line (BIL)
Band Interleaved by Sample
• The BIS format is also called
Band-Interleaved-by-Pixel (BIP).
• The data for each pixel are
written band by band.
Band SeQuential (BSQ)
• This format stores the information
for the image one band at a time.
• In other words, band 1 is stored
first, then the data for all pixels of
band 2, and so on.
Band Interleaved by Line (BIL)
• Band-interleaved-by-line data
stores pixel information band by
band for each line, or row, of the
image.
• For example, given a three-band
image, all three bands of data are
written for row 1, then all three
bands for row 2, and so on until
the total number of rows in the
image is reached.
Image Format
• These formats are determined by different orderings of the three data
dimensions.
• From a data-access-time viewpoint, the BSQ format is preferred if one
is mainly interested in working with individual spectral bands,
• while BIS is preferred if one is working with all spectral bands from a
relatively small image area.
• The BIL format is a commonly used compromise.
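The three orderings can be illustrated on a tiny 2-row × 2-column, two-band image. A sketch using nested Python lists (the pixel values are arbitrary placeholders):

```python
# image[band][row][col]; values chosen so band and position are readable:
# 1x = band 1, 2x = band 2
image = [
    [[11, 12],   # band 1, rows 1-2
     [13, 14]],
    [[21, 22],   # band 2, rows 1-2
     [23, 24]],
]
bands, rows, cols = 2, 2, 2

# BSQ: all of band 1, then all of band 2
bsq = [image[b][r][c] for b in range(bands) for r in range(rows) for c in range(cols)]

# BIL: for each row, band 1's line then band 2's line
bil = [image[b][r][c] for r in range(rows) for b in range(bands) for c in range(cols)]

# BIS/BIP: for each pixel, all bands together
bip = [image[b][r][c] for r in range(rows) for c in range(cols) for b in range(bands)]

print(bsq)  # [11, 12, 13, 14, 21, 22, 23, 24]
print(bil)  # [11, 12, 21, 22, 13, 14, 23, 24]
print(bip)  # [11, 21, 12, 22, 13, 23, 14, 24]
```

Reading one full band from BSQ is a single contiguous read, while reading all bands of one pixel from BIP is contiguous; this is exactly the access-time trade-off described above.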
Data Systems
• The aim of preprocessing is to create a consistent and reliable image
database by:
• Calibrating the image radiometry
• Correcting geometric distortions
• Removing some types of sensor noise
• Formatting to a standard prescription
• A taxonomy of data products has evolved for major remote sensing
systems along these lines.
Data Systems
• A generic hierarchy that encompasses these examples is given by four
major processing levels:
• Reformatted raw data
• Sensor-corrected data
• Geometric corrections
• Radiometric corrections
• Scene-corrected data
• Geometric corrections
• Radiometric corrections
• Geophysical data
