

Geog 228 Final and Midterm

Introduction to Remote Sensing (University of Victoria)




GEOG 228- MIDTERM AND FINAL DOC


Winter 2022


Table of Contents
Intro to remote sensing
    Scale
    Types of sensors
        Aerial photos - passive
Interpretation of aerial photos
The role of scale on texture
    Photo Interpretation key
    Scale
Mission planning
    Let's plan a mission
Sensors
    Radiometric resolution
Image characteristics
    Product
    According to the orbit
Active sensors
    Backscatter
Electromagnetic spectrum and light attenuation by the atmosphere
    Scattering
Radiometric/atmospheric correction
    Scene illumination
    When do we need it?
    Types of correction
        1. Relative
        2. Absolute
Geometric Correction
    Sources of distortion
    Two approaches to geometric correction
    Geometric Correction Steps
        What is a "Ground Control Point" (GCP)?
Enhancement Techniques
    Key concepts
    PLOTS WILL BE ON FINAL
        1. Contrast manipulation
        2. Spatial Filtering - know this
        3. Band Math
Image classification
Final Review


Intro to remote sensing


1. Intro
2. Course structure/content
3. What is remote sensing
4. Aerial photos

What is your goal?


To learn about...
• Aerial photos
• Imagery structure
• Remote sensing sensors
• Imagery radiometric corrections
• Imagery geometric corrections
• Imagery enhancement
• Imagery classification
• Accuracy assessment

Why Earth Observation?


• Some of the specific applications of Earth observations include:
o Ocean dynamics
o Forecasting weather
o Tracking biodiversity and wildlife trends
o Measuring land-use change (such as deforestation, urban planning, heat islands)
o Monitoring and responding to natural disasters, including fires, floods, earthquakes, and tsunamis
o Managing natural resources such as energy, freshwater and agriculture
o Addressing emerging diseases and other health risks
o Predicting, adapting to and mitigating climate change

Kelp reflects light in the infrared spectrum


• biophysical properties: ocean productivity
o Which zones are most productive?
• Biophysical properties: ice conditions
o Ice thickness

What is remote sensing


• Remote sensing is the science of obtaining information about an object,
area, or phenomenon through the analysis of data acquired by a device
that is not in contact with the object, area, or phenomenon under
investigation...in the context of this course, relates to Earth
Observation.
• Data acquisition:
o Platform depends on application
▪ What information do we want?
▪ How much detail?
▪ What type of detail?
o Spacecraft (satellite)
o Aircraft (aerial)
o UAV
o Ground (surface)


Scale
• Moving from high spatial resolution to low spatial resolution, you lose information.
• High spatial resolution = more detail
• Low spatial resolution = low detail

Types of sensors:
• Passive: energy (radiation) naturally emitted/reflected by an object, area, or phenomenon is detected by a
sensor and used to infer information about the object, area, or phenomenon from which energy originated.
o sunlight
• Active: energy is directed at an object, area, or phenomenon, and that portion which returns to a sensor is
used to infer information about the object, area, or phenomenon from which the energy returned.
o Interacts with objects in the landscape
o Platform producing the energy
o LiDAR

1. Reflected solar radiation: passive - sun as direct source of energy


- Multispectral and hyperspectral
1. Airplane (aerial photos)
2. Satellite
3. UAV

2. Emitted (thermal) radiation: passive - sun as original source of energy

3. Backscattered radiation: active - own source of illumination


a. Platform has own illumination source

Aerial photos - passive


1. History
o 1858 - first aerial photograph, from a balloon - Tournachon, France
o 1860 - J.W. Black - Boston from a balloon
o 1903 - Pigeons

• A lot of things can be done with these series of air photos


o Conditions, changes over time, climate impact assessment


• GeoBC air photo archive; UBC archive

2. Films

A. Reflected light
• Type of film, and how
the surface reflects light
in different ways
o Sensitive to one
part of the light
spectrum
o Different parts
of the colour
spectrum shown
▪ Vegetation - reflects infrared colour
• Sensitivity of the film
• How the different surfaces reflect the light
• Infrared light absorbed by water
• Depends on application for picking type of aerial photo
• Why are these photos different?
o Electromagnetic spectrum - sun
▪ Sun's photosphere
▪ 400-700 nanometres - human vision

B. Surface reflectance
• Percentage of light reflected from a specific surface
• Green dominates reflection - 60% (our eyes only pick up 20%, not infrared)
• Asphalt - very low reflectance
• Water absorbs light - film won’t be able to detect it

C. Film properties
• Doesn’t separate light into BGR wavelengths
Homogeneous (uniform) sensitivity up to 0.7 micrometres
• Panchromatic - not very many differences, an average of the
visible light reflected by the surface
• B&W infrared - easily differentiable
• Strong signal
• Water has very low reflection in the infrared
spectrum

• Natural colour - reflects own individual colour


• False-colour - reflects green red and infrared light
• Green sensitivity - associated with blue colour;
• Red sensitivity - associated with green colour;
because of the 60% vegetation reflection, we
only see 20%
• Infrared sensitivity - associated with red colour
• Less than 1% reflected by water


• Why are these photos different?


• Panchromatic - doesn’t separate the different colour
spectrums
• Infrared - difference within forested areas (vegetation
reflects a lot of light)
• Dark areas likely not as healthy as light areas in
forest

• Natural
• Infrared colour
• Natural vegetation in red
• Turf inside stadium not natural
so does not reflect red

3. Photo properties
• Vertical: optical axis of camera at vertical (perpendicular to
the surface) 90° (±3°)
o Field of view dictates the width of air photo
o The scale is essentially constant
o Directions can be measured
o Can be used as a map (with supplementary data)
o Stereo
o Info: planimetric (x, y) topographic (z - DEM)

• Oblique: tilt > 20°; views distorted - distant figures smaller


o High oblique (~60°): sky, horizon, surface

o Low oblique: only surface


o Larger area covered than on a single vertical photo
o Views are distorted
o No scale
o Distant figures smaller
o Used as supplement to vertical air photos


General characteristics

• Annotations: vertical photos main points


o Fiducial marks -- index marks at the middle of each side of the photographic frame
o Principal point -- the centre of the photo, where the lines joining opposite fiducial marks intersect; principal points of successive photos lie along the line of flight
• 23x23cm is constant

4. Interpretation
A. Stereoscopy:
• One eye: allow monocular vision (limited depth perception)
• Two eyes: depth perception = stereoscopy --> each eye focuses on the same object from a
different position and sends a slightly different image to the brain, where the two images are
focused in 3D.

• 6.4 cm = eye base of the average human


• This distance limits the depth perception of humans to
about 1000m
• Will only happen when both eyes are in the same plane

• Solution: stereo pairs


• Consecutive exposure positions serve as a substitute for eye positions


• Key acquisition: overlapping photographs - stereo pairs


• "the same object will appear on two or more of the photos but from slightly
different viewing angel" --> stereoscopic parallax
• How does it work?
• Each eye views one of the exposures (photos), so the overlap is seen in 3D

Interpretation of aerial photos


• Interpreter uses own knowledge along with the ability to recognize
scene elements
• Interpreter also uses photo-interpretive keys
o A set of guidelines to assist interpreters in rapidly identifying
photographic features by using the elements of
photointerpretation.

1. Shape
• Shape: external form or configuration of an object
o Regular, uniform shapes often indicate human involvement
o Ex. Linear, curvilinear, circular, elliptical, radial, square etc.
2. Size
• Size: a measure of the object's surface dimensions
o Absolute: length, width, perimeter, area and volume. It
depends on the scale of the aerial photo
o Relative: small, medium, large
▪ Ex. Single lane vs. Multi-lane highways


3. Tone/colour
• Tone/colour: related to the reflective characteristics of
objects within the photographic spectrum
• Tone/colour = f (surface spectral signature, film, filter,
sunlight)
• Representation
o Shades of grey (tone) - light (bright), intermediate
(gray), dark (black)
o colour
• Sand under the water -- signal not penetrating water
• Near infrared - mangrove most visible due to foliage
reflecting -- associated with red colour
• Water = low signal

4. Texture
• Texture: the visual impression of coarseness or smoothness caused by the variability or uniformity (repetition) of image tone or colour
o Types: smooth (uniform, homogeneous), intermediate, rough (coarse, heterogeneous)
o Ex. Calm water has a smooth texture; a forest canopy has rough
texture
o Scale dictates texture too
▪ Changes the visible texture depending on the different types of scale

The role of scale on texture


o Air photos should have the same scale

5. Pattern: the overall spatial form of related features


• (spatial arrangement of objects (e.g. row crops vs.
pasture)
• Types: random, systematic, circular, oval, curvilinear,
linear, radiating, dendritic, rectangular, pentagonal, etc
o Man made vs non-man made

6. Shadow
• Shadow: a shadow provides information about the object's height, shape,
and orientation

7. Association/Site
• Association/Site: associating the presence of one object with another, or
relating it to its environment
o Ex: industrial buildings often have access to railway sidings;
nuclear power plants are often located beside large bodies of
water
• Can also relate to elevation or vertical location (e.g. low-lying, or high
elevation)


8. Time
• Time: temporal characteristics of a series of photographs can be helpful in
determining the historical change of an area
o Ex: looking at a series of photos of a city taken in different years can
help determine the growth of suburban neighbourhoods.

Photo Interpretation key


• a set of guidelines to assist interpreters in rapidly identifying photographic
features
o Dichotomous Key (from general to specific)

Example Dichotomous Key


• Texture – smooth ……..Cropland
o Colour – Light yellow or grey……..Bare soil
o Colour – Red ………………..Corn or Soybean
• Texture –Rough ………Forest
o Colour – Red or magenta …..Hardwoods
o Colour – Very dark red ………Conifer
▪ Site – Upland………………..Jack Pine
▪ Site – Lowlands …………….Black Spruce

• OPEN WATER
o Texture: Smooth, glossy surface
o Pattern: White and blue-green
o Color: Dark to light blue; flat blue
• MARSH
o Texture: Marbled; Rough
o Pattern: White with speckles of blue
o Color: Predominately white; White with blue; Light blue; Some pink
• FLATS and BEACHES
o Texture: Smooth to Rough appearance
o Pattern: Solid white elongated feature
o Color: Bright white; White with blue; Light gray-white
• URBAN
o Texture: Smooth; Flat
o Pattern: Regular patterns
o Color: White; Some red mixed in; White with gray mottled in regular patterns


• FORESTED WETLAND
o Texture: Rough, uneven surface; Bumpy
o Pattern: Generally, non-regular
o Color: Red; Dark red, even black; Brilliant red; Gray blue with some red spots
• WETLAND SCRUB-SHRUB
o Texture: Rough, uneven surface
o Pattern: Pin-point red dots
o Color: Light pink; Some red
• UPLAND FORESTED
o Texture: Dense red dots; Uneven surface
o Pattern: Dark red to black
o Color: Dark red; Some pink with red for scrub-shrub
• AGRICULTURE
o Texture: Smooth; Flat
o Pattern: Regular patterns; Possibly visible field rows
o Color: White; Pink; Red (dependent on season); Some gray mixed in with other colors
**Note: this guide refers to colour-IR photographs

Classes
1. Trees (woody vegetation, forested areas, landscaped trees)
2. Fields (grass, other herbaceous vegetation, gravel, sand, dirt)
3. Impervious surfaces (concrete, buildings, etc.)

Enhancement techniques for better interpretation: digital


• Digital enhancement helps a lot to separate details previously hidden


Scale
• Scale is the ratio between the distance on the photo and the corresponding distance on the ground
• Expressed as a representative fraction (RF)
o Written as 1:100000 etc.
o LARGE SCALE: large-scale photos (ex. 1:25,000) cover smaller areas in greater detail. A large-scale photo simply means the ground features appear at a larger, more detailed size
o SMALL SCALE: small-scale photos (ex. 1:50,000) cover larger areas in less detail. A small-scale photo simply means that ground features appear at a smaller, less detailed size.

• Scale determination based on:


o Focal length and altitude
o Photo-map distance
o Photo-ground distance

1. Scale determination from focal length and altitude


• Scale of vertical photography is a function of the focal length of the camera (f) and the height above the terrain from which the exposure is made (H)
RF = 1 / (H / f) = f / H
Focal length
o Focal length (f): linear distance from the centre of the camera lens to the focal plane (i.e. the film plane)
▪ Increasing the focal length increases "zoom"


Altitude
o Linear distance from the center of lens to the ground

Example
o You have a photograph taken with a camera of focal length 210 mm over terrain with an average ground elevation of 400 m, at a flight altitude of 2500 m above mean sea level
RF = 1 / ((H − h) / f)
▪ Flight altitude: 2500 m
▪ Average ground elevation: 400 m
▪ f = 210 mm = 0.21 m
• RF = 1 / ((2500 − 400) / 0.21)
• RF = 1/10,000
1:10,000
• This is the average or nominal scale of the photograph
• ONLY GUARANTEED FOR PP (principal point, centre of the aerial photo)
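A minimal Python sketch of this calculation (the function and variable names are my own, not from the course):

```python
# Scale from focal length and altitude: RF = 1 / ((H - h) / f).
# Values are taken from the worked example above.

def photo_scale_denominator(flight_alt_m, ground_elev_m, focal_len_m):
    """Return the representative-fraction denominator (H - h) / f."""
    height_above_terrain = flight_alt_m - ground_elev_m
    return height_above_terrain / focal_len_m

denom = photo_scale_denominator(2500, 400, 0.210)
print(f"RF = 1:{denom:,.0f}")  # RF = 1:10,000
```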


2. Scale determination from photo-map distance


RF = 1 / ((MD × MS) / PD)
• MD = map distance between two points
• MS = map scale representative fraction
• PD = photo-distance between the same two points

• PD: where to define the two points


• Same height
• Equidistant from the principal point

• Example
• Distance between two roads on a map = 6.0cm. Map scale =
1:50,000 (RF = 50,000). Distance between the same two roads
on the photo 3.2cm
RF = 1 / ((MD × MS) / PD)
RF = 1 / ((6 × 50,000) / 3.2)
RF = 1:93,750

3. Scale determination from photo-ground distance


• RF = 1 / (GD / PD)
GD = ground distance between two points
PD = photographic distance between the same two points
• Example
• Distance between two buildings on the airphoto = 5 cm --> 0.05m
• GD = 1584 m
RF = 1 / (1584 m / 0.05 m)
• RF = 1:31,680
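The other two determinations follow the same pattern. A minimal sketch using the numbers from the examples above (function names are my own):

```python
# Scale from photo-map distance: RF = 1 / ((MD * MS) / PD),
# and from photo-ground distance: RF = 1 / (GD / PD).

def scale_from_map(map_dist, map_scale_denom, photo_dist):
    """RF denominator from a map: (MD * MS) / PD (MD, PD in same units)."""
    return (map_dist * map_scale_denom) / photo_dist

def scale_from_ground(ground_dist_m, photo_dist_m):
    """RF denominator from the ground: GD / PD (both in metres)."""
    return ground_dist_m / photo_dist_m

print(f"1:{scale_from_map(6.0, 50_000, 3.2):,.0f}")   # 1:93,750
print(f"1:{scale_from_ground(1584, 0.05):,.0f}")      # 1:31,680
```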

Mission planning
• Each flight costs money
• The best coverage requires a trade-off between scale and coverage
• Scale is based upon what you are studying and what the final product will be
• Hopefully you have enough funding to get great coverage at great resolution
• 120km2 = ~$18,000
• Higher altitude = scale may not be sufficient

Considerations for mission planning


1. Study area
2. Required weather and timing
3. Film-filter combination
4. Film titling
5. Flight lines
6. Overlap and side-lap
7. Photo alignment
8. Tilt
9. Flight-plan map

1. Study area
• Basic info
1. Delineation of the area to be photographed
2. Flight altitude: important considerations
• Maximum limit 13,700m for jet aircraft
• Lower limit 300m for fixed wing
• Unpiloted aerial vehicle (UAV) (special flight operations certificate)
3. Standard 23X23 cm photo or digital


2. Required weather and timing conditions


i. Wind: avoid strong winds; the plane drifts off course; waves on the water surface
• Scale issues
ii. Clouds: < 10% (ex. Clouds and shadows
problem, and cloud reflection off water)
• Specify cloud cover for jobs,
obstructions
iii. Season: project objectives must consider the
nature of the features to be identified
• Topographic mapping, soils mapping,
terrain features
• (fall) = leaf-off, no snow

B. Vegetation: mid spring to summer means leaf-on


C. Submersed features: environmental conditions (water
turbidity; tide; sun glint; cloud cover)

iv. Time of day


o Sun elevation angle = f (latitude; season; time of the day)
A. Shadow is not wanted: 30-50 degrees --> (± 2 hours of solar noon) -- optimal zone
• Why? Below 30° there is not enough reflectance and the data may show very long shadows
• Zenith angle = 90° minus the sun elevation angle

B. Shadowing is wanted: sun elevation angle below ~30°
o Dissected alluvial and bedrock terrain in an arid environment


C. Sunspots/sun glint must be avoided: occur at high sun elevation angles (θs > 50°)
• Sun glint masks information

3. Film-filter combination
i. Films
• Black and white
• Panchromatic -- not sensitive to blue,
green, or red; everything approximated in
grey level
• Infrared
• Colour
• Normal (natural)
• Colour infrared (false-colour) -- takes into consideration the infrared part of the
spectrum
• Cost: black and white < natural colour < colour infrared
ii. Filter: subtraction of light
• Yellow filter: subtracts blue (minus
blue) -- most commonly used
• Magenta filter: removes green
• Cyan filter: remove red
• Others: polarized, neutral density

4. Film titling
• Date; project code; exposure numbers; scale; time; photo
number etc.

• Digital: image metadata


• EXIF: exchangeable image file format -- specifies standard format
for tags used by digital cameras
5. Flight lines
• Recommendations
▪ Parallel to the longest axis of the study area
• Ex. North-south; east-west
▪ Extra flight line added to each side of the study
area


6. Overlap and sidelap


• Overlap = 60% along the flight line (Stereoscopic
overlap)

• Sidelap: 20-30% between adjacent flight lines


UAV planning mission

Overlap: allows for structure from motion


• Define 3D objects
• Tie points between images

7. Photo alignment:
- drift should not affect more than 10% of the print width for any 3 consecutive photographs

8. Tilt
• Not exceed 2° to 3° for a single exposure
• Average < than 1° for the entire project

9. Flight-plan map:
• How do we calculate this?
A. Flight altitude above terrain
B. Flight altitude above sea level
C. Lateral ground distance per photo
D. Alignment of flight lines
E. Number of flight lines
F. Number of photos per flight
G. Total number of photos


Let’s plan a mission


Example:
• A client has contracted your company to take aerial photos over a study area of 20 km east-west by 35 km
north-south.
• The client would like to know the number of air photos that will be required to cover the study area.
• The following information is needed to calculate the total number of air photos required.
• Elevation of study area: 500m asl
• Desired photo scale: 1:25,000
• Film format 23 cm x 23 cm
• Focal length: 152 mm
• Overlap: 60 percent
• Sidelap: 30 percent

Number of air photos that will be required to cover the study area?
We need to calculate all of these
(a) flight altitude above terrain
(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos

Flight Altitude (a and b) for 1:25,000


We know
• Focal length: 152 mm --> 0.152 m
• Desired photo scale: 1:25,000
• Representative fraction = 25,000
• Elevation of study area: 500 m asl

H = (RF)(f)
H = (25000)(0.152)
H = 3,800m above terrain

• Aircraft Altitude = 3,800 + 500


• Aircraft Altitude = 4,300m above sea level

(a) flight altitude above terrain


(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos

(c): Lateral Ground Distance (D) of Single Photograph


We know:
RF: 25,000
Photo distance (d): 23 cm --> 0.23m

D = (RF)(d)
D = (25,000)(0.23)
D = 5,750m or 5.75km
• D and d should be proportionate


(d): Alignment
• Alignment: North-South, alternating
• We know: Area specification=20 km east-west by 35 km north-south
(a) flight altitude above terrain
(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos

(e) Number of flight lines


• We know: 20km east-west

• W = width of study area=20km


• D = lateral ground distance of single photo=5.75km
• Sg = sidelap gained by each successive flight line; the overlap gained adds a bit of extra information as a buffer; do more since we're already in the air
• Sg = 100 − percent sidelap (30) = 70% = 0.7 (decimal fraction)
• 2 = number of extra flight lines added to the sides of the study area

NL = W / (D × Sg) + 2
NL = 20 / (5.75 × 0.7) + 2
NL = 4.97 + 2
NL = 6.97, or 7 flight lines
• ALWAYS ROUND UP

(f)Number of photos per flight line


• We know: 35km north-south

• L = length of flight line=35km


• D = lateral ground distance of single photo=5.75km
• Og = overlap gained by each successive photo --> goes beyond the area just in case we miss something; done for each flight line
• Og = 100 − percent overlap (60) = 40% = 0.4
• 4 = number of photos added to assure complete coverage (two to each end of a flight line)
NP = L / (D × Og) + 4
NP = 35 / (5.75 × 0.4) + 4
NP = 15.2 + 4
NP = 19.2, or 20 photos per flight line
• ALWAYS ROUND UP

(g) Total Number of Photos?


• Total Number of photos in a flight line multiplied by the total number of flight lines
• TNP = NP x NL
• TNP = 20 x 7
• TNP = 140 photographs
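Putting the whole worked example together, a minimal Python sketch (names are my own; the notes round the final sums up, and applying `ceil` to the quotient gives the same counts here):

```python
import math

rf = 25_000                   # scale denominator (1:25,000)
f = 0.152                     # focal length (m)
d = 0.23                      # film format side (m)
width_km, length_km = 20, 35  # study area: east-west x north-south

H = rf * f                    # flight altitude above terrain -> 3800 m
D = rf * d / 1000             # lateral ground coverage per photo -> 5.75 km
sg = 1 - 0.30                 # fraction gained per line (30% sidelap)
og = 1 - 0.60                 # fraction gained per photo (60% overlap)

NL = math.ceil(width_km / (D * sg)) + 2   # flight lines (+2 extra lines)
NP = math.ceil(length_km / (D * og)) + 4  # photos per line (+4 for coverage)
print(H, D, NL, NP, NL * NP)              # 3800.0 5.75 7 20 140
```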


(a) flight altitude above terrain


(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos

What’s with all this arithmetic?


• Each flight costs money.
• You wish to get the best coverage
• This means a trade-off between scale and coverage.
• Scale is based upon what you are studying and what the final product will be.
• Hopefully you have enough funding to get great coverage at great resolution ~ $ 18000 --> 120 km2
• Photogrammetry: $1200-$15000+
• LiDAR: $2000-$12000+
• UAV: $600-$3500

Sensors
IR = Infrared
Sensors are designed considering
• How the earth reflects radiation
• Light at different parts of the sun spectrum
• ABGR-IR
• Grass has high reflectance in the green part of the spectrum;
increases greatly with IR
• Vegetation absorbs blue and red light
o Photosynthesis!!!

Water absorbs radiation in the IR part of the spectrum


• Needed to maintain the structure of the H2O molecule
• Very important for understanding satellite building

Atmospheric windows
• Atmospheric window = portions of the spectrum that transmit radiant energy effectively -- NOT ABSORBED
• Light goes through the atmosphere with relative ease in the light sections of the plot
o Ozone blocks light
▪ Light that is not absorbed is transmitted through the atmosphere
▪ Can't have satellites operating where ozone absorbs (around 0.2 micrometres)
o Visible light = atmospheric window
▪ Specific wavelengths -- cannot build a satellite for these (dark dips in the image)


Then, spectral resolution is defined


• Some sensor "image" many fine portions of the electromagnetic spectrum
simultaneous
• 3 bends -- visible light
• Multi-spectral satellites -- multiple bends
• LandSat -- detecting how vegetation reflects radiation
o A lot of change happening in infrared
o 3 bends in the visible
o +1 extra bend in the infrared
o Bands 5 and 7 detect high and low moisture content
o No atmospheric window in largest gaps between levels
▪ Very little light will reach the satellite

And spatial resolution is defined


• Different sensors may produce images with different spatial resolution
• The minimum area a satellite can separate on the ground

Our space
• High vs low orbit satellites
• Number of debris objects estimated by statistical models to be in orbit
o 36,500 objects greater than 10 cm
o 1,000,000 objects from greater than 1 cm to 10 cm
o 330 million objects from greater than 1 mm to 1 cm
• Garbage collectors in orbit

ClearSpace 1 - 2025
• Escort it down to a lower orbit where the debris will enter the atmosphere and
burn up
o To burn it up -- won't come to earth

Remote sensing imagery can be used for:


1. Mapping
2. Deriving bio-geophysical variables -- ocean productivity

What is the structure of a remotely sensed image?


• Swath -- what a satellite can see
• The amount of electromagnetic radiance, L (W m⁻² sr⁻¹; watts per square metre per steradian), recorded within the IFOV of an optical remote sensing system is a function of the wavelength of light and surface spectral characteristics.
• The satellite measures radiance from the surface and the atmosphere -- together these dictate the amount of radiance that reaches the satellite
• Within the field of view there are smaller angles -- IFOV = instantaneous field of view. The IFOV defines the pixel size in m² (see the sketch after this list)
o The smallest unit at which a satellite can detect
o Radiance (L) = reflectance from ground + atmosphere
• Pixels define the smallest unit which a satellite can view
• Passive sensors measure radiance reflected from the ground surface plus atmosphere --> sun to earth to satellite
o 400-700 nm = visible light spectrum

• Remotely-sensed imagery is stored using a raster representation. Each band is a raster layer.
• When L reaches the satellite through the atmosphere, it is at-satellite L (at-sensor radiance)
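As a hedged aside (a standard small-angle approximation, not stated explicitly in the notes), the ground pixel size is roughly altitude times IFOV in radians; the sensor values below are illustrative only:

```python
# Ground sampling distance ~ altitude * IFOV (IFOV in radians).

def ground_pixel_size_m(altitude_m, ifov_mrad):
    """Approximate ground pixel size for a given IFOV in milliradians."""
    return altitude_m * (ifov_mrad / 1000.0)

# A hypothetical sensor: 0.043 mrad IFOV at 705 km altitude.
print(ground_pixel_size_m(705_000, 0.043))  # ~30 m, a Landsat-like pixel
```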


• Landsat images will always have a 185 km swath (field of view)


• Each band acts as a layer since they're sensitive to
different parts of the light spectrum -- one layer that's
sensitive to blue light, one that's sensitive to red, one
that's sensitive to green; minimum of three layers, stack
together to create the image

Radiometric resolution --
• takes the mean value of the pixel cell (which is not homogeneous) to create the value
• Ex: Red band (= sensitive to red wavelengths)
o The DN for each pixel corresponds to the average radiance coming from the surface within the pixel cell.

• Ex: Blue band (= sensitive to blue wavelength)


o The DN for each pixel corresponds to the average radiance coming from the surface within the pixel
cell.

Each band is a raster layer; each pixel DN is a brightness value

Greyscale is typically used to display a single band...


• while RGB (Red-Green-Blue) images can display 3 bands corresponding to the red, green and blue sensitive cells of the monitor. Computer monitor colours are additive, meaning that "true" red + green + blue = white, and the absence of colour = black.
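A minimal NumPy/Matplotlib sketch of this band-stacking idea; the bands here are random placeholders standing in for real raster layers:

```python
import numpy as np
import matplotlib.pyplot as plt

# Three single-band rasters (DNs 0-255); in practice these would be
# read from the image file, one raster layer per spectral band.
red   = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
green = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
blue  = np.random.randint(0, 256, (100, 100), dtype=np.uint8)

rgb = np.dstack([red, green, blue])  # band-to-channel order sets the colours
plt.imshow(rgb)                      # put an IR band in the red channel
plt.show()                           # to get a false-colour composite
```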


Band 1 = Coastal/Aerosol (blue), Band 2 = Blue, Band 3 = Green, Band 4 = Red, Band 5 = Near IR, Band 6 = Shortwave IR1, Band 7 = Shortwave IR2
o Always have to know which band is associated with the
channel of the computer -- determines dominant output
colour -- infrared band in green channel will result in green
image, but infrared band in red channel will result in red
image

Image characteristics
1. Spatial resolution
o It is the smallest linear separation between two objects that can be resolved by the remote sensing system.
o Corresponds to the cell size used to define the pixel size in the raster
file. Depends on the IFOV (instantaneous field of view)
▪ Large cell size = low spatial resolution
▪ Small cell size = high spatial resolution

2. Spectral resolution
o The number of and size (width) of specific wavelength intervals in the electromagnetic spectra to
which a remote sensing instrument is sensitive
o Visible light: 400 to 700 nanometres (or 0.4 to 0.7 micrometres)
o Thermal infrared -- emitted light

o Low spectral resolution -- ex. the entire visible spectrum lumped together (panchromatic)
o Cannot distinguish among red, green, and blue radiance; the detected signal is an average of the radiance (blue, green and red)
o Panchromatic -- averages all light coming in through the visible spectrum

o High spectral resolution -- a fine portion of the spectrum is captured
o Red, or a portion of the red spectrum, considered
o Some satellites detect many fine portions of the electromagnetic spectrum simultaneously


• Resolution increases every time a new Landsat satellite is launched
o Higher spectral resolution -- narrower bandwidth
• The most current Landsat is Landsat 9

o A Multispectral Image is composed of


several images.
o Each image is of a unique portion of the
electromagnetic spectrum

• Digital number -- light reflected from the ground

• Water absorbs infrared light, which is why we have a lower digital number
o Water is a strong absorber of infrared (and red) light; it reflects mostly blue and green light

• Infrared band -- much wider
• 30 m with all multispectral bands
• Panchromatic band present -- broader band, so higher spatial resolution is possible (down to 15 m)


o Hyperspectral = very high spectral resolution ~ 200 bands or more


o A type of imaging system that detects very narrow and continuous bands
o Very narrow bandwidths -- on the order of 5-10 nanometres
o Might supply too much information, which will then need to be reduced

3. Radiometric resolution
o Sensitivity of a remote sensing detector to differences in signal strength
as it records the energy reflected from the terrain
o The number of bits that your image has -- related to the sensitivity of the detector
▪ Defines the number of discriminable signal levels
▪ 8-bit -- 256 possible values
• The larger the bit depth, the better the quality of the image
• The number of levels is always 2^(number of bits)


• Translatable -- compression happens during transmission from satellite to computer
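A one-line check of the 2^bits rule from the notes:

```python
# Number of discriminable levels = 2 ** bits.
for bits in (1, 6, 8, 11, 16):
    print(f"{bits}-bit: {2 ** bits} levels")  # 8-bit -> 256 levels
```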

4. Temporal resolution
o Some images are collected at regular time intervals; others are only collected upon request

o Depends on:
▪ Orbit:
• Polar -- landsat
• Equatorial
• geostationary -- assimilation of
data plus a model for the weather
forecast, continually collects data
at one site
▪ Swath: narrow vs wide -- a wider swath gathers information more easily
▪ Sensor configuration
• Continuous vs programmed (Landsat is not programmable)
• Nadir-looking (at 90° to the ground) vs side-looking


From detection to information



• 550-750km away from earth surface


o There is an atmosphere between us and the satellite
o Watts per square metre per steradian
o Vicarious calibration -- spectral target, like the amazon
forest, used to find deviations.
o Some satellites use the moon to recalibrate and find
deviations
o Geometric errors -- uncertainties in data
o Radiometric errors --
o Is this data enough of the needs of the project? Are all uncertainties known?
Sensor system: on board analog-to-digital conversion and calibration → direct telemetry to Earth or indirectly through
tracking and data relay satellites (TDRS) → ground: incorporation of ancillary data → data processing (geometric and
radiometric) and visual or digital information by extraction (biophysical and land use/cover) → distribution and Use of
Information.

Product -- image; need to know what level of image you have
• Level 0 -- system correction and
radiometric calibration → raw data
• Level 1 -- At-sensor radiance data
o Level 2a product -- geometric
correction; attitude data, position
data, DEM
o Level 2b product -- atmospheric
corrected data; radiative transfer
model, atmospheric variables,
topographic variables
• Level 2 -- statistical or physical models and
validation → preprocessing
• Level 3 -- thematic variables and
application; mapped on uniform space-
time grid scales

Sensors
• As used for earth observation
o Weather satellites/sensors
o Land observation satellite/sensors
o Marine observation satellites/sensors
• Many different payloads -- multiuse
• New satellites are equipped with fewer payloads to reduce size and weight

According to the source of energy


• Three remote sensing modes
1. Passive: measure reflected solar radiation
• Sun as direct source of energy
• Shortwave radiation


2. Passive: measure emitted (Thermal) radiation


• Sun as original source of energy
• Longwave (thermal) radiation
• Thermal infrared part of the spectrum
• Useful for defining surface temperature
• Only interacts with the top canopy of forests

3. Active: measure backscattered radiation


• Own source of illumination
• Radar satellites and LiDAR -- radar can see through clouds; LiDAR can measure the depth (structure) of forests
• Measured with an antenna
• Measuring time difference between the radiation
touching the earth and bouncing back
• Can define roughness of surface -- unique
• Not measuring colour

• Can combine passive and active radar

According to the orbit:


1. Polar orbit -- move relative to the Earth; North to South
• Low orbit
• Sun synchronous (acquisition at direct sun light)
• Most common

• Geostationary Orbit
• Platform moves at the same rotation as the Earth; appears to be
stationary.
• It is always above the same location
• ~35,000 km – high orbit
• Higher temporal resolution
• GOES -- used for weather forecasting

GOES -- Geostationary Operational Environmental Satellite Imagery


• Can look at half of the earth at the same time


Land Observation satellites:


• LANDSAT: Passive, Sun-synchronous, quasi-polar
• Most are polar orbit -- largest image area
• All passive -- slight improvement in data
quality (not temporal)
• Equatorial -- around the equator
• Quasi-polar -- slight tilt
• No gap in time for when a landsat is not
operational -- no lack of data

• It takes Landsat 16 days to image the whole Earth -- temporal resolution
• 1 orbit = 99 minutes
• 14 orbits a day
• Good to know time and location of new photos; useful for planning

• Improvements in spectral and spatial resolution


• Swath always 185km!!!!!

Landsat 8

• Added Band 8 with 15m panchromatic band -- improved resolution


• Spatial resolution went from 80 m to 30 m by Landsat 8/9

• Grey areas -- high atmospheric transmission; light outside the grey areas is absorbed

• Position of bands for certain satellite


• x-axis: wavelength = spectral resolution in micrometres
• Band width corresponds to the thickness of the line; thinner = low moisture content, thicker = high moisture content
• Want to make sure spectral bands are targeting the correct wavelength
• Bands are positioned where they are in order to detect changes in vegetation
• Little blue light
• A lot of infrared light -- band 4 specifically for this
• 5 and 7 shortwave infrared -- moisture of vegetation
• Detecting differences -- health status in vegetation
• Location of the bands
• Healthy vegetation reflects the most infrared light
• 30m is the smallest resolution we can view in visible, NIR and
SWIR


• Temporal resolution of landsat is 16 days


• Useful for planning
• Defined in orbits
• 14 full orbits per day every day
• 185km wide = swath

SPOT series
• Système Pour l'Observation de la Terre
• Higher-resolution visible imaging (High Resolution Visible, HRV, sensor) than Landsat
• French space programme -- beginning of the 80s
• Second longest time series
• Passive, Sun-synchronous
• Spot 6 and 7:
• Footprint 60 by 60 km
• Bands:

• 1986 -- first launch


• Spatial resolution -- 20m
• Better spatial resolution
• Launched with a panchromatic band with a
resolution of 10m
• Panchromatic band at 2.5m -- 2003
• SPOT 6 and 7 satellites -- pan 1.5 m, multispectral 6 m
• Landsat covers a larger area per scene
• SPOT may have a finer spatial resolution, but you'll need more pixels to cover the same area

Spatial and spectral resolution of selected Landsat, SPOT


SPOT is not free
Spot has higher resolution

SPOT 6
• 2012
• Panchromatic: 1.5m
• Multispectral: 6.0m
• 60km x 60km
• SPOT has the ability to point off-nadir to acquire images from the side of its orbit; this costs more and reflectance values can get compromised; emergency use is most common for this function
• Landsat only acquires images with a nadir observation, 90° to the ground


SPOT: stereoscopic capability


• Allows for two slightly different views of observation
• Allows for the creation of digital elevation models

Sentinel Satellites - ESA


• European Space agency
• Sentinel-1 -- microwave antenna (radar)
• Sentinel-2 -- land observation satellite with better spatial resolution
• Sentinel-3 -- views sea-surface and land-ice topography
• Free!

• Temporal resolution of data increased by having two satellites


• Grey part represents light going through the atmosphere
• Similar positioning
• Main differences
• No thermal infrared on Sentinel-2 (for measuring Earth's temperature) -- it only measures reflected light
• Sentinel-2 has narrower bands -- narrow reflectance regions are more useful
• Health of vegetation: bands 5, 6, 7 on Sentinel-2
• Band 9 -- to detect atmospheric constituents, not light coming from the Earth's surface

High Spatial Resolution satellites


• Comparison between different spatial resolutions
• Same spectral bands but spatial resolution is much higher (Quickbird)

• Ikonos higher spatial resolution than Landsat


• Panchromatic bands -- difficult to separate features on the ground but very high resolution
• Colours of visible bands and resolution of panchromatic

IKONOS
• First to collect publicly available high-resolution
imagery at ~1- and 4-meter resolution
• Launched on Sep 24th, 1999.

• DigitalGlobe -- launched many high spatial resolution satellites


• Worldview 3 current, worldview 4 failed
• High spatial resolution but very high price


WorldView-2 (revisit: 1.1 days, swath: 16.4 km)
• 46 cm panchromatic
• 1.85 m multispectral
• Bands:
o Coastal: 0.40 – 0.45 μm
o Blue: 0.45 – 0.51 μm
o Green: 0.510 – 0.580 μm
o Yellow: 0.585 – 0.625 μm
o Red: 0.630 – 0.690 μm
o Red Edge: 0.705 – 0.745 μm
o Near-IR1: 0.770 – 0.895 μm
o Near-IR2: 0.860 – 1.040 μm

WorldView 3 – 2014
• 31 cm Pan; 1.24 m multispectral

Spectral Resolution
• Changes within generations of high spatial resolution satellites with
positions of the bands

• It's not free
• You buy it per square kilometre
• Minimum 100 km²
• These do off-nadir pointing too


Active sensors
• RADARSAT-1 (1995-2013) and 2 (2007-) – MICROWAVE
• Produce their own energy
• Radar -- very long wavelengths compared with optical light (centimetres to tens of centimetres; RADARSAT's C-band is ~5.6 cm)
• Interacts with the roughness of the surface then bounces back
Active microwave
• Much longer wavelengths
• Bands defined by letters
• Can see through the atmosphere... Mostly
• Even through clouds

Backscatter
• We measure the signal that is backscattered from the earth
• Low to medium to high signals
• Low signal = dark surfaces
• Medium = interacting with the roughness of the surface
• High = the energy hits the surface and, depending on the geometry of the surface it backscatters from, may bounce directly back and give a very bright signal; metal gives a strong signal -- double-bounce backscatter

RADARSAT-2
• Several “modes” including fine resolution
• RCM = RADARSAT Constellation Mission
• Can program images to cater to what areas need mapping
• Very programmable -- very expensive
Several modes
• Antenna detecting the roughness of the surface
• Smooth areas -- backscatter will be away from the
antenna
• Fine/ML -- may be able to see the boats due to their
high bounce back rates

RADARSAT-2 and RADARSAT Constellation Mission (RCM - 3


sat)


Marine safety
• Iceberg detection
• Oil spill detection -- RCM and RADARSAT-2 can detect boats and oil

Sentinel -1 ESA
• C band -- same as RADARSAT and RCM
• Limitation on spatial resolution
• Free data

• Double bounce off metallic surfaces where boats are; the dark surface is where the oil is
• Doesn't matter if it's cloudy

Electromagnetic spectrum and light attenuation by the atmosphere


What happens to energy when it crosses the Earth's atmosphere?

Transmission of energy from the sun to earth


• Energy comes from the sun: transmission of radiation through a vacuum at the speed of light (~8 min from sun to Earth at 3×10⁸ m/s)
• Some wavelengths are scattered and/or absorbed by gases, molecules, particles, etc. in Earth's atmosphere
• Depending on the surface temperature of the Earth, it will re-emit the energy at longer (thermal) wavelengths
• Attenuation -- the loss of radiant energy as light passes through the atmosphere
• Many different wavelengths
• Dominance of visible spectrum, blue to red
o Ultraviolet and near infrared gets heavily absorbed by ozone molecules


Defining electromagnetic radiation


• EMR travels in a harmonic, sinusoidal fashion at c = 3×10⁸ m/s
• EMR is produced whenever a charged particle, such as an electron, changes its velocity
• The length of time a charged particle is accelerated determines the wavelength
• The number of accelerations per second = frequency

Energy with a longer wavelength has a lower frequency

0.5µm = 500nm = the beginning of the green spectrum
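A quick numeric check of the relation c = λν for the 500 nm example above (taking c = 3×10⁸ m/s):

```python
C = 3.0e8  # speed of light (m/s)

wavelength_nm = 500                   # start of the green spectrum
freq_hz = C / (wavelength_nm * 1e-9)  # nu = c / lambda
print(f"{freq_hz:.2e} Hz")            # ~6.00e+14 Hz: longer wavelength,
                                      # lower frequency
```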


• Wave sizes in perspective: microwave is the longest; infrared is longer than visible; visible is longer than ultraviolet

From UV:
400-500 nm = blue
500-600 nm = green
600-700 nm = red
then to infrared

Energy and atmospheric attenuation


• Atmospheric light attenuation is caused mostly by scattering and absorption by particles and gases in the atmosphere

Scattering
• Scattering = the direction of the scattered light is not always predictable
1. Rayleigh
o Scattering which occurs when the size of the particles is smaller than the
wavelength of light
o Mainly impacts short wavelengths in the upper 4.5km of the atmosphere
o If we didn't have this scattering, the sky would not be blue! -- gas molecules are much smaller than visible wavelengths and scatter blue light most strongly
o Needs to be corrected -- a lot of the signal comes from the atmosphere
o Red wavelengths experience the least Rayleigh scattering
2. Mie
o Occurs when there is a sufficient quantity of materials with a diameter approximately the same size as the wavelength
▪ Dust and smoke
o Longer wavelengths impacted in the lower atmosphere (< 4.5km)
o Rayleigh no longer dominant
o Can be seen on Mars


3. Nonselective
o Particles are > 10 times the size of the wavelength
o Lowest portions of the atmosphere
▪ Water droplets
o Equal amounts of all visible wavelengths are scattered, causing a white appearance -- hence "nonselective"
o Not very important for remote sensing
o Difficult to correct

No atmosphere on the moon


• Light not interacting with particles

Absorption
• Process by which radiant energy is absorbed and converted into other forms of energy
• In the atmosphere, efficient absorbers of solar radiation include: water vapour, carbon dioxide, and ozone
• Need to be careful -- radiation gets absorbed by gases in the atmosphere

Atmospheric windows = portions of the spectrum that transmit radiant energy effectively -- least absorbed by gases
• Dark means it is not transmitted through the atmosphere -- it has been absorbed; the absorbing gases are listed in the diagram
• No gases successfully absorb visible light radiation


Clips from Lab instructions:
• Overall residual: error margins should go down with each new correction

--------------------------------------- POST MIDTERM CONTENT ------------------------------------------

Radiometric/atmospheric correction
Important concepts:
Reflectance (Rλ): the ratio of the radiant flux reflected from a surface (Lλ) to the radiant flux incident on it (Eλ), at certain wavelengths
(Eλ = incident irradiance -- wavelength dependent)
(Lλ = radiance from a surface)
• Converting radiance measurements to reflectance provides a means of normalizing illumination differences
Rλ = Lλ / Eλ
• Information comes from the target, the atmosphere, and adjacent pixels
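A minimal sketch of the per-band reflectance ratio; the L and E values below are invented for illustration:

```python
import numpy as np

L = np.array([40.0, 55.0, 30.0])     # radiance reflected from the surface
E = np.array([180.0, 160.0, 120.0])  # incident irradiance, per band
R = L / E                            # dimensionless reflectance (0-1)
print(R.round(3))                    # [0.222 0.344 0.25 ]
```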

The amount of light is greater in July, so it is better for viewing foliage; radiation travels a shorter path through the atmosphere (higher sun elevation)
Scene illumination
• Solar elevation controls scene illumination, which has a big impact on radiance
measurements
• Radiance measurements may need to be normalized to facilitate comparisons
between different dates
• Low irradiance during winter (low sun elevation) -- corrections normalize for this

• E -- irradiance coming in
o Some light never reaches the surface but is scattered toward the sensor as if it came from the pixel -- path radiance (Lp) = interference
• Digital number without any atmospheric correction = path radiance + target radiance + adjacency radiance


When do we need it?


1. Multi-temporal data: time series
2. Biophysical parameters will be estimated
(e.g. chlorophyll concentration, vegetation biomass, simple ratio SR = IR/Red)

DN value = radiance (radiant flux from a surface)


Types of correction
1. Relative: use information derived from the image itself or other images
2. Absolute: statistical methods - ELC
• sensor and atmosphere characteristics - models

1i. Single-image Normalization using Histogram Adjustment:

Assume clear sky acquisition


Landsat 7 bands
Band 1: 0.450 – 0.515
Band 2: 0.525 – 0.605
Band 3: 0.630 – 0.690
Band 4: 0.750 – 0.900
Band 5: 1.55 – 1.75
Band 7: 2.08 – 2.35

Rayleigh doesn't affect NIR or IR -- we are only addressing scattering in the visible wavelengths

Bias - Rayleigh scattering in the DN

DNadjusted = DNoriginal - bias


Bias = minimum value in a band
Histogram shift process --> not addressing irradiance, can
only be done to Rayleigh
Minimizing the impact of Rayleigh scattering

Basically, we are trying to minimize the effects of Rayleigh scattering
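A minimal sketch of the histogram (bias) shift, assuming `band` holds the DNs of one visible band from a clear-sky image:

```python
import numpy as np

band = np.array([[12, 40, 97],
                 [15, 60, 88]])  # toy DNs for one visible band

bias = band.min()        # minimum DN, assumed Rayleigh contribution
adjusted = band - bias   # DN_adjusted = DN_original - bias
print(adjusted)          # the darkest pixel is now 0
```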


Types of corrections:
1. Relative

1ii. Multiple-date image normalization using regression


1. Define pseudo-invariant features (PIF) in both images -->
features that do not change the DN in time
• PIF= DN do not vary in time (spectrally stable)
o Ex: deep non-turbid waters, bare soil, large rooftop
• PIF should be in a location that represents the average
elevation of the image
• PIFs: dark and bright

2. Define reference image: usually the one with the best


quality and atmospheric correction

- assumes a linear relationship in pixel values of same cover between dates


• PIFs need to be plotted and fit with a linear equation
o Best-fit line
• A high r² value is needed to use this method
• Insert the digital number into the linear equation as x
• y = (slope)(DN) + (intercept)
• The x value comes from the uncorrected image
• Normalized to the reference image
• Cannot mix images from winter and summer -- difficult to get a high r²
Now, the normalized SPOT Band 2 DNs are what they would
have been if the image had been acquired under the same
conditions as the master 8/10/91 image.
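A minimal NumPy sketch of the regression step; the PIF DN pairs below are invented, standing in for samples from dark and bright invariant targets:

```python
import numpy as np

pif_subject   = np.array([20, 45, 90, 130, 200])  # image to normalize
pif_reference = np.array([15, 42, 85, 126, 196])  # master/reference image

# Fit y = slope * x + offset over the PIFs (check r^2 before trusting it).
slope, offset = np.polyfit(pif_subject, pif_reference, 1)

subject_band = np.array([20, 60, 110, 180])  # uncorrected DNs (x)
normalized = slope * subject_band + offset   # DNs as if on the reference date
print(normalized.round(1))
```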


iii. Dark Object Subtraction (DOS)


• Assumes that very dark objects (lowest DN -- shadows, deep clear water) in the image should have a very low reflectance (thus very low radiance)
• Water tends to be a very good dark object
• If so, the measured radiance (DN) is actually mostly composed of path radiance
• By subtracting that radiance (DN) from all pixels, you can somewhat remove the effect of path radiance from the pixels (Rayleigh and Mie)
• Does not account for atmospheric absorption

Before dark pixel subtraction


• Takes the inverse of the uncorrected band value --> relative correction
• Contributions from the atmosphere
• Value does not come from the water, comes from the atmosphere
o Value X is contribution from the atmosphere
o Inverse is coming from the DOS
• Has to be done for every individual band
• The dark ocean value eliminated Rayleigh and a little bit of Mie scattering
• Dark object subtraction normalized the image
• The goal is to correct the visible bands

After dark pixel subtraction

• Minimizing Rayleigh and Mie scattering


• Path radiance

Types of correction
2. Absolute
   i. Statistical methods – ELC (Empirical Line Calibration)
      • forces remote sensing image data to match in situ spectral reflectance measurements (ideally obtained at roughly the same time)
   ii. Models: physical models that account for sensor and atmosphere characteristics at the time of image acquisition


Geometric Correction
Imagery needs to be spatially anchored to the
world in order to allow images and data from
the same areas to be used together, or to
compare among images.
• It is therefore usually necessary to
remove geometric distortion so that
individual pixels are in their proper
planimetric (x,y) map locations

Why is Geometry Important?

• Mutual comparison of images
  o multi-sensor
  o multi-temporal
• Comparison with and use of existing/new data (maps, GIS layers) for:
  o interpretation: e.g., change over time
  o modelling
• Ensures individual pixels are in their proper planimetric (x,y) locations
• Allows imagery to feed a GIS database (e.g., for urban planning)

Sources of distortion
• Remotely sensed imagery typically exhibits internal and external geometric
error.

1. Internal sources of error

• Sources of error are often predictable (systematic)
• Can be identified and then corrected using prelaunch or in-flight ephemeris information
  (ephemeris = information about the location of the sensor at the time of acquisition; recorded aboard the satellite)

Example:
• Scan skew (Earth rotation)
• Panoramic distortion (distortions away from
nadir)

Scan skew – Earth rotation


• Earth rotates as the sensor scans the terrain.
This results in a shift of the ground swath
being scanned
• Easily predicted and correctable


2. External sources of error

• Non-systematic type of error
• Examples:
  A. Altitude variance: if the sensor platform departs from its normal altitude or the terrain increases in elevation, this produces changes in scale or pixel size.
  B. Platform attitude: one sensor system axis is usually maintained normal to Earth's surface and the other parallel to the spacecraft's direction of travel. If the sensor departs from this attitude, geometric distortion results (platform roll (x axis), pitch (y axis) and yaw (z axis)).

Scale change: pixel size -- compression and expansion of pixels

The external errors (non-systematic) will result in spatial distortions


• How to correct some of these distortions? How to assign a reference system to an image?
• This process is called geometric correction


Two approaches to geometric correction:


1) image to image
2) image to map

1. Image to Image
• Reference one image relative to another:
o alignment process by which two images of
like geometry and the same area are
positioned coincident with respect to each
other.
o The source image is considered geometrically
correct
o The uncorrected image is tied to the source
image
o GCP = ground control point

2. Image to Map
• The image is “tied” to a map, which is based on a
specific projection.
• The referenced map is the source: hard-copy
planimetric maps, digital format, digital
orthophotos, GPS data

Geometric Correction Steps


Step 1: Define Ground Control Points (GCPs): link the uncorrected image to a source image/map.
Step 2: Spatial interpolation: geometric relationship between the input pixel coordinates and the associated
map/image coordinates of this same point. Calculate location of the new pixel.
Step 3: Intensity interpolation: mechanism for determining the digital number value (DN ) to be assigned to the output
rectified pixel.

What is a “Ground Control Point” (GCP)?


• A location on the surface of the Earth or on a “reference” image (e.g. a road intersection) that can be
identified on the imagery AND located accurately on a map (or reference image).
• Two distinct sets of coordinates are needed:
(1) image coordinates
(2) map or reference image coordinates
GCP pair:
• Image coordinates – pixel location specified in rows (i) and columns (j) (uncorrected image)
• Map/reference image coordinates – measured in degrees of latitude and longitude, meters in UTM, etc.
(Source --> corrected image, map, GPS coordinates)
• These paired coordinates are used to create a mathematical model used to geometrically correct the remote
sensing data


Geometric Correction (Step 1)


• The reference image should be at least the
same resolution as the uncorrected image.

• The diagram below compares the mismatch in pixel size between Landsat and GeoEye
  o With a coarse reference you may not be able to pinpoint the edges
  o The source reference should be close to the spatial resolution of the uncorrected image to get good GCPs

• Link the uncorrected image (A) to a reference image/map (B) by collecting Ground Control Points (GCPs)
  o Points need to be spread out (including the edges of the image) to avoid distortion

Recommendations for good GCPs

1. Location:
• The objects chosen should be permanent. Lake shores, rivers, and coastlines should not be used, as their location can vary over time.
• The points should be as distinctive as possible (on both image and map). Use objects such as roads, fence lines, etc.
2. Spatial distribution of the GCPs is most critical:
• GCP locations should uniformly cover the entire area of the uncorrected image
• GCPs should cover the entire image; fewer are needed at the image centre
• Use more where topography is complex (i.e., in mountainous regions use more GCPs than in flat areas)

Consequences of poor distribution of GCPs: distortion in the corrected image.

Summary: an ideal set of GCPs must meet the following criteria:

• They must be clearly identifiable on both the uncorrected image and the source
• They must be permanent objects, such as road intersections, fence lines, buildings, etc.
• They must be distributed across the entire uncorrected image, with more where topographic variation is present
• They must be of adequate number for the mathematical model chosen for spatial interpolation (more on this later)
• They must result in an RMS error of < 0.5 (more on this later)
STEP 2: Spatial interpolation
- geometric relationship between the input pixel coordinates and the associated map/image coordinates of this
same point. Calculate location of the new pixel.
You need…
1. A good set of GCPs
2. A mathematical model: models based on polynomial equation are commonly used
a. First order polynomial
b. Higher order polynomial transformations
3. Accuracy evaluation


Statistical Models
• Polynomials: this math model produces the 'best' fit, mathematically, to a set of GCPs.

• Linear – first order, to predict the NEW coordinates for the output image (an analogous equation predicts x'):
  y' = a0 + a1x + a2y
• Quadratic – second order:
  y' = c0 + c1x + c2y + c3xy + c4x² + c5y²
• Cubic – third order
• a and c are parameters defined from the GCPs you selected (uncorrected and reference images)
• y' is the predicted new position in the output (corrected) image; x and y are the coordinates in the original uncorrected image.

• Ortho model for topography (DTM)

1st order polynomial transformation

• Parallel lines are preserved -- the geometry stays rigid

• Only three pairs of GCPs are required (to define the equation), but more can be used to better define the geometric properties of the image.

Minimum number of GCPs required for each polynomial order: (t+1)(t+2)/2 for order t, i.e., 3 for 1st order, 6 for 2nd, 10 for 3rd (a fitting sketch follows).
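A sketch of fitting a first-order (affine) transformation from GCP pairs by least squares; the GCP coordinates below are hypothetical:

import numpy as np

# (x, y) in the uncorrected image and (x', y') on the reference, per GCP
src = np.array([[10, 12], [200, 30], [50, 180], [220, 210]], dtype=float)
dst = np.array([[11, 14], [205, 28], [48, 185], [224, 214]], dtype=float)

A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])  # columns: 1, x, y
coef_x, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)  # a0, a1, a2 for x'
coef_y, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)  # b0, b1, b2 for y'

predicted = np.column_stack([A @ coef_x, A @ coef_y])   # model (x', y') per GCP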

Gedownload door Faissal Bozi (faissalbozi@gmail.com)


lOMoARcPSD|34709425

General rule:
• For moderate distortions, a first-order transformation is sufficient to rectify the imagery to a geographic frame of reference -- in practice, do not go beyond 2nd order.
• This type of transformation can model six kinds of distortion in the remote sensor data, including:
o translation in x and y
o scale changes in x and y
o skew, and rotation.

High order polynomial transformations

• These transformations do not preserve parallel lines or shape

• Require 6 or more GCP pairs, depending on the degree of the polynomial. Higher orders require more GCPs
• The greater the image displacement, the higher the order of the polynomial needed to correct it.
• Increasing the degree of the polynomial results in unstable image edges, which must be controlled by increasing the number of GCPs.

3. Evaluation -- accuracy
• Root Mean Square (RMS) error of each GCP: indicates the level of distortion in the output (new) image after the transformation (after applying the polynomial equation)
• An acceptable RMS error is equal to half the size of one cell in the image (RMSE < 0.5). But it depends on the application…
• How do we calculate RMS error? For each GCP:

  RMSerror = sqrt( (x' - xorig)² + (y' - yorig)² )

• The result is square-rooted to remove negatives

Where:
• xorig and yorig are the original row and column coordinates of the GCP in the uncorrected image (the location you select); and
• x′ and y′ are the model-predicted coordinates (the location the model estimates) in the output image when we apply the polynomial equation.
• The difference between the "location the model estimated" and the "location you selected" is the residual.

• By computing the RMS error for all GCPs, it is possible to (1) see which GCPs contribute the greatest error, and (2) calculate the global RMS error (Global RMS) -- see the sketch below.
• What to do when the RMS error is high:
  1. Reject the worst GCP
  2. Collect more GCPs
  3. Compute the new RMS error
  4. You can also move to a higher polynomial transformation; however, keep in mind the implications of this in terms of preservation of parallel lines and image warping
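A small sketch of the RMS bookkeeping, assuming `predicted` holds the model (x', y') for each GCP (e.g., from the affine fit above) and `reference` the coordinates you selected:

import numpy as np

def rms_errors(predicted, reference):
    # Per-GCP RMS error: sqrt(dx^2 + dy^2), in pixels.
    residuals = predicted - reference
    per_gcp = np.sqrt((residuals ** 2).sum(axis=1))
    global_rms = np.sqrt((per_gcp ** 2).mean())   # pooled over all GCPs
    return per_gcp, global_rms

# GCPs with per_gcp > 0.5 are candidates for rejection.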

STEP 3: Intensity interpolation


- mechanism for determining the digital number value (DN) to be assigned to the output geometrically corrected
pixel.
• Resampling (intensity interpolation)
• Intensity = the value of the pixel --> digital number (DN)
• Extraction of the DN from an x,y location in the original (uncorrected) input image and its relocation to the
appropriate x’,y’ location in the new corrected output image.
• Three resampling strategies:
  A. Nearest neighbour -- simplest
  B. Bilinear interpolation
  C. Cubic convolution

Nearest neighbour resampling

• The DN from the nearest pixel in the uncorrected image is transferred to the new pixel location in the corrected image.
• Defines what value each pixel will take after the interpolation has been run
  o For a missing value (e.g., the yellow box), take the value of the nearest pixel in the uncorrected image -- the one closest to the pixel position -- and assign it to the corrected image
  o Does not create new values (no averages); only takes values from neighbours
  o Preferred when BIOPHYSICAL PARAMETERS will be estimated

• The simplest method
• Original values are not altered, but some pixel values may be duplicated while others are lost
• Largely used for biophysical property estimates
• This method tends to result in a blocky image appearance
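As a minimal sketch, nearest-neighbour resampling is just rounding to the closest original pixel; the function name and arguments are illustrative:

import numpy as np

def nearest_neighbour(image, x, y):
    # (x, y) is the non-integer location in the uncorrected image that
    # maps to the new pixel; take the closest original DN, unchanged.
    i = int(round(y))   # nearest row
    j = int(round(x))   # nearest column
    return image[i, j]  # no averaging, so original DNs are preserved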
Bilinear Interpolation resampling
• Takes an average of four DN pixels in the original
image nearest to the new pixel location
o New values will be produced in output image
• The averaging process alters the original pixel DN
values and creates new digital values in the output
image
• The advantage is smoothing of the image
• The disadvantage is that averaging may be
undesirable if further processing and analysis is to be
done
• Requires ~3 to 4 times the processing time of nearest
neighbour
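A minimal bilinear sketch: the output DN is a distance-weighted average of the four surrounding pixels (bounds checking omitted for brevity; names are illustrative):

def bilinear(image, x, y):
    j0, i0 = int(x), int(y)        # top-left of the 2x2 neighbourhood
    dx, dy = x - j0, y - i0        # fractional offsets
    top = (1 - dx) * image[i0, j0] + dx * image[i0, j0 + 1]
    bottom = (1 - dx) * image[i0 + 1, j0] + dx * image[i0 + 1, j0 + 1]
    return (1 - dy) * top + dy * bottom   # new (averaged) DN -> smoothing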


Cubic Convolution Resampling


• Goes further than bilinear resampling to calculate an average
of a block of 16 DN pixels from the unrectified image which
surround the new pixel location in the corrected image
• The advantage of this technique is that the image appears to
be much less blurred than that obtained with the bilinear
interpolation
• Best for smoothing out noise
• Disadvantages are problems of averaging data
• Cubic convolution requires up to 12 times the time of
nearest neighbour resampling

Recap of the geometric correction workflow:
1. A good set of GCPs
2. Spatial interpolation
3. Intensity interpolation
• Focus on nearest neighbour for both spatial and intensity interpolation
• You can try any method; each has different applications depending on the image clarity required
• Next: enhancement techniques

Enhancement Techniques
Atmospheric corrections --> geometric correction --> enhancement --> classification

Introduction
• Applied to remote sensing data to aid human visual and digital image analysis/interpretation, and feature
identification
• Once applied, enhancements can greatly aid the classification process

Raw vs. Enhanced image


Key concepts
• DN, histogram
• Contrast manipulation
• Spatial filtering
• Band math
• Edge detection (not covered here)

Digital numbers
• Recall: Digital numbers (DN) or brightness value/gray level, are the values that are
stored in the cells of an image file

Spectral profile
• SPOT 20x20m
• Multispectral data of Marco Island Florida
• SPOT colour composite
o B3 - NIR lambda
o B2 - Red lambda
o B1 - green lambda
How to plot a spectral profile
• Do not connect the line to the y axis; doing so implies the neighbouring band has the same value as the previous band

Gedownload door Faissal Bozi (faissalbozi@gmail.com)


lOMoARcPSD|34709425

PLOTS WILL BE ON FINAL

Histogram (frequency distribution)


• How often a certain digital number appears in an image
• Ordered by frequency (how often it occurs in each band)
• Very important for enhancement

• Gaussian distribution
• Very dark with no enhancements
• All 8-bit
• Dynamic range - min and max, range of digital numbers
• Enhancements stretch range, and make values use the entire
dynamic range
o Linear enhancement
• Linear enhancement stretches the values of the digital numbers
of the original image
• Still using 8bit image

1. Contrast manipulation
• Procedures applied to image data in order to more effectively visualise or
digitally analyse features
• In lab when you apply simple enhancements you improve visualization
(change DN on the display) but do not change the original DN values of
the image (display only)
• However new images can be derived from the original image and saved
as new image files with new enhanced DN values (...write new file)
How does it work?
• The DNs of a single image usually have a range of less than 256 (8 bit)
• You can take a narrow distribution of DNs and stretch it out over the full 256-value range, enhancing the image

Methods of contrast manipulation


Linear stretching -- know this
• Increases the contrast while preserving the original
relationships of the DNs to each other
• Applying linear curve that exploits the entire
dynamic range
• Contrast image will now produce a new histogram


Calculation: Linear Stretching: Min-max

DNOut(i,j) = ((DNIn(i,j) - MIN) / (MAX - MIN)) × DNMAXRANGE

where:
DNOut(i,j) = output digital number at row i, column j,
DNIn(i,j) = input digital number at row i, column j,
MIN = minimum value found in the data distribution,
MAX = maximum value found in the data distribution,
DNMAXRANGE = maximum range in data value possible (usually 255).

Example:
• What would be the new min and max of the new enhanced
image?

• All original DN values between 4 and 105 are now linearly


distributed between 0 and 255.
o 4 becomes 0
o 105 becomes 255
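The same stretch as a sketch in Python; `band` is assumed to be an 8-bit array with max > min, tying the 4 -> 0 and 105 -> 255 example to the formula:

import numpy as np

def minmax_stretch(band, dn_max_range=255):
    lo, hi = band.min(), band.max()       # e.g., 4 and 105 (assumes hi > lo)
    out = (band.astype(float) - lo) / (hi - lo) * dn_max_range
    return np.round(out).astype(np.uint8)  # 4 -> 0, 105 -> 255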

• Variation: instead of the min and max, use ± a number of standard deviations from the mean (linear stretching: +/- standard deviation)


Nonlinear Stretching -- know this


• Increases the contrast without preserving the
original relationships of the DNs to each other
• Different portions of the images may be stretched
differently than others
• Portions of the histogram are enhanced while
characteristics of other portions are retained or
compressed
• Example: histogram-equalized stretching
  o DNs are assigned to new DNs based on their frequency (how often they appear in the image)
  o Frequently occurring values are stretched more than others
  o The area with the highest distribution will be the main area of enhancement
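A compact histogram-equalization sketch, assuming an 8-bit integer band; frequent DNs end up spread across more output levels:

import numpy as np

def histogram_equalize(band, levels=256):
    hist, _ = np.histogram(band, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / band.size               # cumulative frequency per DN
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[band]                              # map every pixel via the LUT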

2. Spatial Filtering - know this

• Spatial filtering uses the values of, and relationships with, neighbouring pixels to enhance the image or remove noise and improve visualization
  o Noise due to variability gives a salt-and-pepper look
  o Remove noise or enhance features
• A filter for which the DN value of a pixel at a particular location in the filtered image is a function of the DN values of its neighbouring pixels in the original image

1. Locate a pixel
2. Define a neighbourhood (mask or kernel)
3. Based on the neighbourhood, compute the new value


Example: Averaging Filter

• A simple spatial filter where the output pixel is the average of all pixel values in the kernel.
• All pixels are weighted equally, kernel = 3x3

• Output Pixel = (10*1 + 2*1 + 19*1 + 18*1 + 10*1 + 8*1 + 15*1 + 20*1 + 9*1) / 9 = 12.33 ≈ 12
  o Each value is multiplied by the weight of the mask, then averaged

• If the output is an 8-bit image, a value such as 5.67 rounds to 6; in a 32-bit image it would be left as 5.67
  o This has to do with the number of 0s and 1s allowed (storage capacity): a larger file defines the precision of the number (the exam will specify)

How do you calculate the DN for the next pixel?


• Ex: averaging filter


• The filter is always a moving window
  o At the end of a row it goes back to the first column and starts over on the next row
  o The window always has to be fully inside the image (edges are truncated)
• The Averaging Filter is an example of a low-pass filter because it de-emphasizes areas that have a high spatial frequency (it smooths edges)
• What is Spatial Frequency?
  o A measure of the frequency of variation in DN that appears in the image (think about texture)
  o Few changes in DN (smooth) = low frequency
  o Many changes in DN (rough) = high frequency
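A sketch of the 3x3 moving-window averaging (low-pass) filter; the window is kept fully inside the image, so the one-pixel border is left untouched:

import numpy as np

def averaging_filter(image):
    rows, cols = image.shape
    out = image.astype(float).copy()
    for i in range(1, rows - 1):          # moving window, row by row
        for j in range(1, cols - 1):
            out[i, j] = image[i-1:i+2, j-1:j+2].mean()  # mean of 9 neighbours
    return out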

3. Band Math
• Amplifies a specific signal by dividing, adding, or subtracting the DN values in one spectral band by/from the corresponding values in another spectral band
• Reduces the number of data sets that we input into the classification process
• Useful for simple change detection or for aiding interpretation

Image subtraction

Band ratio

• Illumination differences minimized, spectral relationships highlighted (emphasis on geologic units)


Vegetation Indices
• Common ratios have been developed for
interpretation of vegetation:
o Simple Ratio
o Normalized Difference Vegetation Index
(NDVI)

Vegetation Indices: Simple Ratio


• The near-infrared (NIR) to red simple ratio (SR)
SR = NIR/red
• It takes advantage of the inverse relationship
between chlorophyll absorption of red radiant
energy and increased reflectance of near-infrared
energy for healthy plant canopies
• Minimizes shadow effects

Vegetation indices - NDVI -- the most famous index

• NDVI = Normalized Difference Vegetation Index
  o Widely used
  o Varies from -1 to +1
  o Positive = higher biomass, soil
  o Negative: snow, clouds, water

NDVI = (NIR - Red) / (NIR + Red) -- computed from reflectance; the index itself is dimensionless

• Healthy vegetation absorbs most of the visible light that hits it, and reflects a large portion of the near-infrared light.
  o Close to 1 = healthy vegetation
• Unhealthy or sparse vegetation reflects more visible light and less near-infrared light (a sketch follows).
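A minimal NDVI sketch; inputs are assumed to be reflectance bands of the same scene, and the small constant only guards against division by zero:

import numpy as np

def ndvi(nir, red):
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)   # ranges from -1 to +1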


Image classification
• Classification scheme
• Unsupervised Classification
• Supervised Classification
• Accuracy Assessment

Introduction
• A key goal of many remote sensing project is to produce
a thematic interpretation of the image(s)

Classification: to organize reflectance or DN values into classes

Classification Process
1. State the nature of the classification problem: what do you want to achieve?
   o Classes of interest? (classification scheme) (Step 1)
2. Acquire appropriate data (Step 2)
o Remote sensing data (consider spatial, spectral,
temporal resolutions)
o Environmental considerations (atmospheric conditions, phenological cycle, etc.)
o Ground reference data? Ancillary data? (DEM, GIS, maps etc)
3. Data Pre-Processing (Step 3)
o Radiometric correction, Geometric, Enhancements
4. Classification (Step 4)
5. Accuracy (Step 5)

Step 1:
• Classification scheme: Define the Classes of interest
o Should be achievable, given the available data and methods
o Ex: different kinds of crops, different tree
species, or different geologic units or rock
types, different water types
o Depends on: spatial and spectral
resolutions


Spatial scale and remote sensing

Spectral resolution
• Different “classes” have different spectral signatures

Spectral Signatures
• Spectral signatures tend to be variable
• A broad information class (e.g. vegetation) may contain a
number of spectral sub-classes with unique spectral variations
o Ex: different vegetation forest, grass, shrub, etc

• Spectral mixing = pixels are not pure -- the sensor averages everything within the pixel to compute the DN; hard to control but important
• Pixels contain more than just vegetation: tree, shadow, soil
• Compare 30 m vs 2 m spatial resolution

• Pure classes vs mixed classes


Step 4: Classification

• Statistical classifiers: algorithms that use statistical decision rules to separate


pixels into classes

• A scattergram represents the digital numbers, not the spatial extent
  o Takes (x (band 3), y (band 4)) pairs and plots them
  o Individual classes appear as separate groups
  o 2-dimensional
  o Similar values will start to cluster together

Types:
1. Unsupervised classification
- used when there is no a priori knowledge of cover characteristics. The image is automatically grouped into spectral classes
  o Spectral classes (clusters) are grouped solely based on the DN/reflectance in the image, and the clusters are interpreted (assigned classes) afterwards by the analyst

The only inputs from the analyst are:

1) Define the clusters: the number of classes and the DN distance (DN difference)
• These are digital number values -- NOT LOCATION
• R = radius distance in spectral space (DN difference)
• Cmax = maximum number of clusters

• Pixel 1 and pixel 2 should be in the same cluster, since their Euclidean distance is lower than our defined R


• The merge creates a new cluster whose DN values are the average of the two pixel values
• Average of (10,10) and (20,20) = (15,15)

• A new cluster will be generated for pixel 3, since its distance is greater than the R value of 15.
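The same worked example as a sketch: with R = 15, pixels (10,10) and (20,20) merge, and their mean becomes the new cluster:

import numpy as np

p1, p2 = np.array([10.0, 10.0]), np.array([20.0, 20.0])  # DNs in two bands
R = 15.0                                                  # user-defined radius

d = np.linalg.norm(p1 - p2)        # Euclidean distance = sqrt(10² + 10²) ≈ 14.1
if d < R:
    cluster_mean = (p1 + p2) / 2   # merged cluster mean = (15, 15)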

1.2) Define the classification algorithm

A. Minimum distance to mean
   ▪ Assignment of pixels to one of the clusters
   • The algorithm decides what cluster each pixel goes into
   • No room for an unclassified pixel
   • Relies on the shortest distance (Euclidean distance) from the cluster means to the unclassified pixel
   • All of the pixels in the image are assigned to one of the specified clusters.
   • Pixel 2 would be allocated to cluster S by the software, but the better fit would be cluster U -- unsupervised classification results in the S cluster because it only considers the mean

B. K-means/ISODATA -- statistically more robust; considers the SD around the mean

o Summary: Unsupervised Classification
  ▪ Good: no a priori information required; opportunity for human error is minimized; unique classes are recognized; computationally fast
  ▪ Bad: spectral classes may not correspond to needed classes; the analyst has limited control over classes; the model decides everything


2. Supervised classification
- used when there is information available for determining classes and for identifying
training areas; analyst defines the class
a. Classification scheme
b. Select training areas
c. Analyse spectral separability
d. Classification

a. Classification scheme

B. Training areas
▪ Training area = a sample of the Earth's surface with known
cover
▪ The statistics of the DN values within the training area are
used to determine decision boundaries in the classification

1. Each training area is usually composed of many pixels, and many training areas are generally defined for each class. The general rule is that if training data are being extracted from n bands, then > 10n pixels of training data should be collected for each class. This is sufficient to compute the requirements of some classification algorithms
   • This is an issue for hyperspectral images
   • Training areas should be as homogeneous as possible
2. The DNs of the pixels comprising the training area are used to "train" the algorithm to recognize spectrally similar (similar DN) pixels in each band
   • Must be representative
3. To ensure effective classification, training data must be complete and representative
   • Otherwise you may have pixels that are unclassified -- the training didn't fully cover the spectral variability of the image
   • e.g., turbid water DNs will be much higher than clear water -- you may want to use two classes (clear vs turbid water) and then join them after classification


How should training areas be selected?


1. The selection of training areas is based on the analyst's familiarity with the
geographical area, field data, secondary information (maps, more detailed imagery)
2. Select areas that are representative and homogenous
3. Avoid delineating training area boundaries too close to edges
(In the example figure: red outlines = not good; yellow = good)

4. Training areas are often refined after initial delineation


• May refine training areas because classes need to be
generalized
• e.g., Spectral signatures for shrub and grass classes are too
similar to distinguish and a general vegetation class is required
• May refine training areas because spectral signatures are
highly variable and subclasses are required.
• e.g., A single agricultural class is too general and subclasses
are required: corn, wheat, canola…

▪ May refine training areas because of a ‘strange’ signature


associated with one training area
• e.g., A training area in a shadowy part of the image has a different response (sun-lit vegetation vs. shadowed vegetation)

▪ Once we have "good" training areas (which is the most time-consuming part of the classification), statistics for each training class can be generated
▪ Training class = all training areas of a particular type considered together
▪ Training areas are organized into training classes

C. Analyzing spectral separability of training classes:


▪ Goals
• allows you a means to reconsider the classification
scheme you have set-up and the training areas you
have selected (are the training areas really good?)
• allows you to determine which subset of image
bands provides the greatest degree of separability
between classes; before a classification algorithm is
selected
• allows you to evaluate how likely the classification algorithm will perform (poor separability
means classes will not be discriminated)
▪ Methods
  • Table with statistics -- won't be done by hand
    • Pay attention to the variability (SD) of each class for each band.
    • e.g., class 5 and class 6 are not separable using band 1
  • Scatterplot analysis
    • Look at water and forest: poor separability in bands 1 and 2 but good separability in bands 3 and 4.
    • The coloured circles represent the DN distribution (spread) of each training class


• Quantitative statistical separability

• Ex: Bhattacharyya distance: used for selecting which bands/products provide the greatest degree of separability between any two classes
• Assumes a Gaussian (normal) distribution -- a parametric approach
• Value ranges:
  < 1.0    Very poor
  1.0–1.9  Poor
  > 1.9    Good
• Vc is the covariance matrix of class c
• Mc is the mean (vector) of class c
• Same for class d
• T: transpose of the matrix

• Ex: class c = water, class d = vegetation (a sketch follows)

D. Classification
• Classification algorithms:
1) Minimum Distance to Mean
   ▪ Distance is calculated from each class mean (defined by the training data) to the candidate pixel

Euclidean distance pros and cons

▪ Mathematically simple
▪ Computationally efficient
▪ Works well for spectrally distinct classes
▪ Assumes a normal distribution
▪ Insensitive to class variance (does not consider the standard deviation/variability of the DNs of the class)
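A minimal minimum-distance-to-mean classifier sketch; note that every pixel is assigned somewhere, exactly as described above (array shapes are illustrative):

import numpy as np

def min_distance_classify(pixels, class_means):
    # pixels: (n, bands); class_means: (k, bands).
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)   # index of the nearest mean; nothing unclassified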

2) Maximum Likelihood
▪ Maximum likelihood decision rule based on probability
▪ Probability of a pixel belonging to each of a predefined set of training classes is calculated, and the pixel is then
assigned to the class for which the probability is the highest
▪ How do we obtain the probability information we will need from training data we have collected?
• Consider the hypothetical histogram (data
frequency distribution) of forest training data
obtained in band k. We could choose to store
the values contained in this histogram in the
computer, but a more elegant solution is to
approximate the distribution by a normal
probability density function (curve), as shown
superimposed on the histogram


• When there is no overlap between classes:
• Any pixel that falls within a class's probability function will be grouped accordingly
• You may end up with pixels that are not classified under maximum likelihood
• With minimum distance to mean, everything will be classified

▪ How are the pixels assigned to the classes?

• A pixel is assigned to the class where its DN has the highest probability of belonging
• If classes are homogeneous, the probabilities are still calculated
• Overlaps result in different probabilities -- things start to get a little fuzzy
• (At the red arrows) if Pforest > Pagriculture, then pixel X belongs to the class forest
• The probability density function is the optimal approach (a sketch follows)
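A hedged sketch of the maximum-likelihood rule using SciPy's Gaussian density; the optional threshold is an illustrative way to leave low-probability pixels unclassified, not a fixed part of the algorithm:

import numpy as np
from scipy.stats import multivariate_normal

def max_likelihood_classify(pixel, class_stats, threshold=None):
    # class_stats: list of (mean vector, covariance matrix) per training class.
    probs = [multivariate_normal(mean=m, cov=c).pdf(pixel) for m, c in class_stats]
    best = int(np.argmax(probs))
    if threshold is not None and probs[best] < threshold:
        return None           # below threshold: pixel stays unclassified
    return best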

• Accuracy assessment is the last step
  ▪ Reference information can come from your own knowledge
  ▪ You need testing samples to test the accuracy of the classification

• (Example figure: classified through NDVI)

3) Parallelepiped
4) Decision Tree Classification

Accuracy assessment
• Testing samples
• Accuracy

Accuracy assessment
• Once an image has been classified it is essential to report the accuracy of the classification
• What do we need?
1. Classified map
2. Testing data
3. Accuracy assessment

1. Classified map

• Define the testing areas


2. Testing data
• Usually generated the same way as training data: field data, analyst knowledge, additional imagery, maps
• Optimally, testing data are collected in the field
i. How many testing samples?
• The optimal sample size (N) is based on a probability model:

  N = (Z² × p × q) / E²

• where p is the expected percent accuracy of the entire map
• q = 100 – p
• E is the allowable error
• Z = 2 (constant)
• This is used as an approximation for N considering two classes
• However…
• Classification maps generally contain multiple (more than two) classes; e.g. agriculture, barren, soil, water, forest, etc.
• So a general rule of thumb of 50 samples per class has been successfully adopted (Congalton, 1991).
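A quick worked example with hypothetical numbers: if the expected map accuracy is p = 85%, then q = 100 − 85 = 15, and with an allowable error E = 5% and Z = 2, N = (2² × 85 × 15) / 5² = 5100 / 25 = 204 testing samples for a two-class map.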
ii. Where to sample?
• Common sampling designs used to collect testing data:
1. random sampling
• a random number generator is used to identify random x,y coordinates
within the study area.
• Issues:
  (i) it may undersample important classes
  (ii) samples may be located in areas that cannot be accessed (for field data).

2. systematic sampling
• sample through the area in a consistent and orderly manner.
• Issues:
(i) may sample similar class over and over (ex: forest; water)
(ii) may over or under estimate representation of a class.

3. stratified random sampling


• samples are randomly selected from each class after the
thematic map has been created. All classes, no matter how small
their proportion in the map, will be represented.
• Issues:
  (i) samples must be allocated after the map is created.
• This is a BIASED approach (sampling is guided by the map classes)

3. Accuracy assessment
• A common way of expressing accuracy is a classification error (confusion) matrix
• Compares, on a class-by-class basis, the relationships between known reference data (testing data) and the classification results
  ▪ Example: not enough testing samples -- should be 250 (5 classes × 50) for a proper accuracy assessment


• How to calculate the overall accuracy?
  ▪ Overall accuracy = number of correctly classified testing areas (the matrix diagonal) / total number of testing areas

• Other forms of evaluation that inform about the errors:

  ▪ User's accuracy
    • From the perspective of the user of the classified map: how accurate is the map?
    • For a given class, how many of the pixels (or testing areas) on the map are actually what they say they are?
    • Calculated as:
      number of correctly classified testing areas in a class
      ÷ number of testing areas that were classified into that class (row total)


▪ Producer's accuracy
  • From the perspective of the maker of the classified map: how well was a certain class classified?
  • For a given class, how many testing areas are correctly classified on the map?
  • Calculated as:
    number of correctly classified testing areas in a given class
    ÷ number of testing areas of that class as derived from the reference data (column total)

• For practice… calculate the User's Accuracy and Producer's Accuracy for the other classes (a sketch follows)
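A small sketch of all three measures from a confusion matrix; the matrix values are hypothetical, and the rows-are-classified / columns-are-reference layout matches the row/column notes above:

import numpy as np

cm = np.array([[45,  3,  2],     # rows: classified map
               [ 4, 40,  6],     # columns: reference (testing) data
               [ 1,  7, 42]])

overall   = np.trace(cm) / cm.sum()        # diagonal / all testing areas
users     = np.diag(cm) / cm.sum(axis=1)   # per class: correct / row total
producers = np.diag(cm) / cm.sum(axis=0)   # per class: correct / column total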


Final Review
35%
Format
• Short answer
• Multiple choice
• Lectures 4 to 11
o Satellite images
o Spectral resolution
o Atmospheric correction
▪ Aerial photography is not part of the final
Main topics
• Properties of digital remote sensing image
• Sensors
• Light attenuation
• Atmospheric correction
• Geometric correction
• Image enhancement
• Classification
• Accuracy assessment

Course review strategy


1. What is the problem you are trying to solve/question you are trying to answer?
2. What remote sensing data? Field data?
3. Image processing -- radiometric correction? Geometric correction? Enhancements/band math etc.?
4. Classification -- supervised/unsupervised?
5. Accuracy assessment
6. Delivery of final product (does your final product solve the problem?)
