Geog 228 Final and Midterm
Table of Contents
• Intro to remote sensing
o Scale
o Types of sensors
▪ Aerial photos - passive
• Interpretation of aerial photos
o The role of scale on texture
o Photo Interpretation key
o Scale
• Mission planning
o Let's plan a mission
• Sensors
o Radiometric resolution
• Image characteristics
o Product
o According to the orbit
• Active sensors
o Backscatter
• Electromagnetic spectrum and light attenuation by the atmosphere
o Scattering
• Radiometric/atmospheric correction
o Scene illumination
o When do we need it?
o Types of correction
▪ 1. Relative
▪ 2. Absolute
• Geometric Correction
o Sources of distortion
o Two approaches to geometric correction
o Geometric Correction Steps
o What is a "Ground Control Point" (GCP)?
• Enhancement Techniques
o Key concepts
o PLOTS WILL BE ON FINAL
o 1. Contrast manipulation
o 2. Spatial Filtering - know this
o 3. Band Math
• Image classification
• Final Review
Scale
• Going from high spatial resolution to low spatial resolution, you lose
information.
• High spatial resolution = more detail
• Low spatial resolution = low detail
Types of sensors:
• Passive: energy (radiation) naturally emitted/reflected by an object, area, or phenomenon is detected by a
sensor and used to infer information about the object, area, or phenomenon from which energy originated.
o sunlight
• Active: energy is directed at an object, area, or phenomenon, and that portion which returns to a sensor is
used to infer information about the object, area, or phenomenon from which the energy returned.
o Interacts with objects in the landscape
o Platform producing the energy
o LiDAR
2. Films
A. Reflected light
• Type of film, and how the surface reflects light in different ways
o Sensitive to one part of the light spectrum
o Different parts of the colour spectrum shown
▪ Vegetation - reflects infrared colour
• Sensitivity of the film
• How the different surfaces reflect the light
• Infrared light absorbed by water
• Depends on application for picking type of aerial photo
• Why are these photos different?
o Electromagnetic spectrum - the sun
▪ Sun's photosphere
▪ 400-700 nanometres - human vision
B. Surface reflectance
• Percentage of light reflected from a specific surface
• Green dominates reflection - 60% (our eyes only pick up 20%, not infrared)
• Asphalt - very low reflectance
• Water absorbs light - film won’t be able to detect it
C. Film properties
• Doesn’t separate light into BGR wavelengths
• Homogenous to 0.7 micrometers
• Panchromatic - not very many differences, an average of the
visible light reflected by the surface
• B&W infrared - easily differentiable
• Strong signal
• Water has very low reflection in the infrared
spectrum
• Natural
• Infrared colour
• Natural vegetation in red
• Turf inside the stadium is not natural, so it does not reflect red
3. Photo properties
• Vertical: optical axis of the camera is vertical (perpendicular to the surface), 90° (±3°)
o Field of view dictates the width of air photo
o The scale is essentially constant
o Directions can be measured
o Can be used as a map (with supplementary data)
o Stereo
o Info: planimetric (x, y) topographic (z - DEM)
General characteristics
• 23x23cm is constant
4. Interpretation
A. Stereoscopy:
• One eye: allows monocular vision (limited depth perception)
• Two eyes: depth perception = stereoscopy --> each eye focuses on the same object from a
different position and sends a slightly different image to the brain, where the two images are
focused in 3D.
1. Shape
• Shape: external form or configuration of an object
o Regular uniform shapes often indicate a human
involvement
o Ex. Linear, curvilinear, circular, elliptical, radial, square etc.
2. Size
• Size: a measure of the object's surface dimensions
o Absolute: length, width, perimeter, area and volume. It depends on the scale of the aerial photo
o Relative: small, medium, large
▪ Ex. Single lane vs. Multi-lane highways
3. Tone/colour
• Tone/colour: related to the reflective characteristics of
objects within the photographic spectrum
• Tone/colour = f (surface spectral signature, film, filter,
sunlight)
• Representation
o Shades of grey (tone) - light (bright), intermediate
(gray), dark (black)
o colour
• Sand under the water -- signal not penetrating water
• Near infrared - mangrove most visible due to foliage
reflecting -- associated with red colour
• Water = low signal
4. Texture
• Texture: the visual impression of coarseness or smoothness caused by the
variability or uniformity (repetition) of image tone or colour
o Types: smooth (uniform, homogeneous), intermediate, rough (coarse,
heterogeneous)
o Ex. Calm water has a smooth texture; a forest canopy has a rough
texture
o Scale dictates texture too
▪ Changes the visible texture depending on the different types of scale
6. Shadow
• Shadow: a shadow provides information about the object's height, shape,
and orientation
7. Association/Site
• Association/Site: associating the presence of one object with another, or
relating it to its environment
o Ex: industrial buildings often have access to railway sidings;
nuclear power plants are often located beside large bodies of
water
• Can also relate to elevation or vertical location (e.g. low-lying, or high
elevation)
8. Time
• Time: temporal characteristics of a series of photographs can be helpful in
determining the historical change of an area
o Ex: looking at a series of photos of a city taken in different years can
help determine the growth of suburban neighbourhoods.
• OPEN WATER
o Texture: Smooth, glossy surface
o Pattern: White and blue-green
o Color: Dark to light blue; flat blue
• MARSH
o Texture: Marbled; Rough
o Pattern: White with speckles of blue
o Color: Predominately white; White with blue; Light blue; Some pink
• FLATS and BEACHES
o Texture: Smooth to Rough appearance
o Pattern: Solid white elongated feature
o Color: Bright white; White with blue; Light gray-white
• URBAN
o Texture: Smooth; Flat
o Pattern: Regular patterns
o Color: White; Some red mixed in; White with gray mottled in regular patterns
• FORESTED WETLAND
o Texture: Rough, uneven surface; Bumpy
o Pattern: Generally, non-regular
o Color: Red; Dark red, even black; Brilliant red; Gray blue with some red spots
• WETLAND SCRUB-SHRUB
o Texture: Rough, uneven surface
o Pattern: Pin-point red dots
o Color: Light pink; Some red
• UPLAND FORESTED
o Texture: Dense red dots; Uneven surface
o Pattern: Dark red to black
o Color: Dark red; Some pink with red for scrub-shrub
• AGRICULTURE
o Texture: Smooth; Flat
o Pattern: Regular patterns; Possibly visible field rows
o Color: White; Pink; Red (dependent on season); Some gray mixed in with other colors
**Note: this guide refers to colour IR photographs
Classes
1. Trees (woody vegetation, forested areas, landscaped trees)
2. Fields (grass, other herbaceous vegetation, gravel, sand, dirt)
3. Impervious surfaces (concrete, buildings, etc.)
Scale
• Scale is the ratio between the distance on the photo and the corresponding distance on the ground
• Expressed as a representative fraction (RF)
o Written as 1:100000 etc.
o LARGE SCALE: large scale photos (ex. 1:25,000) cover smaller areas in greater detail. A large scale
photo simply means the ground features appear at a larger, more detailed size
o SMALL SCALE: smaller scale photos (ex. 1:50,000) cover larger areas in less detail. A small scale photo
simply means that ground features appear at a smaller, less detailed size.
o Focal length (f): linear distance from the center of lens to the focal
plane
Example
o You have a photograph taken with a camera of focal length 210 mm, an average ground elevation of 400 m, and a flight altitude of 2,500 m above mean sea level
RF = 1 / ((H − h) / f), where H = flight altitude above sea level, h = average ground elevation, f = focal length
▪ Flight altitude H = 2,500 m
▪ Average ground elevation h = 400 m
▪ f = 210 mm = 0.21 m
• RF = 1 / ((2,500 − 400) / 0.21)
• RF = 1/10,000
1:10,000
• This is the average or nominal scale of the photograph
• ONLY GUARANTEED FOR PP (principal point, centre of the aerial photo)
• Example
• Distance between two roads on a map = 6.0 cm. Map scale = 1:50,000 (scale denominator MS = 50,000). Distance between the same two roads on the photo (PD) = 3.2 cm
Photo RF = 1 / ((MD × MS) / PD)
Photo RF = 1 / ((6.0 × 50,000) / 3.2)
Photo RF = 1:93,750
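The two worked examples above can be reproduced with a short script. This is a minimal sketch; the function and variable names are my own, not from the course:

```python
def nominal_photo_scale(focal_length_m, flight_alt_asl_m, ground_elev_m):
    """Nominal photo scale denominator: RF = f / (H - h), guaranteed only at the principal point."""
    return (flight_alt_asl_m - ground_elev_m) / focal_length_m   # scale is 1 : denominator

def photo_scale_from_map(map_dist_cm, map_scale_denom, photo_dist_cm):
    """Photo scale from a map: ground distance = map distance x map scale denominator,
    then photo scale denominator = ground distance / photo distance."""
    ground_dist_cm = map_dist_cm * map_scale_denom
    return ground_dist_cm / photo_dist_cm

print(nominal_photo_scale(0.210, 2500, 400))   # 10000.0 -> 1:10,000
print(photo_scale_from_map(6.0, 50_000, 3.2))  # 93750.0 -> 1:93,750
```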
Mission planning
• Each flight costs money
• The best coverage requires a trade-off between scale and coverage
• Scale is based upon what you are studying and what the final product will be
• Hopefully you have enough funding to get great coverage at great resolution
• 120km2 = ~$18,000
• Higher altitude = scale may not be sufficient
1. Study area
• Basic info
1. Delineation of the area to be photographed
2. Flight altitude: important considerations
• Maximum limit 13,700m for jet aircraft
• Lower limit 300m for fixed wing
• Unpiloted aerial vehicle (UAV) (special flight operations certificate)
3. Standard 23X23 cm photo or digital
3. Film-filter combination
i. Films
• Black and white
• Panchromatic -- does not separate blue,
green, and red; everything is approximated as
grey levels
• Infrared
• Colour
• Normal (natural)
• Colour infrared (false-colour) -- takes into consideration the infrared part of the
spectrum
• Cost: black and white < natural colour < colour infrared
ii. Filter: subtraction of light
• Yellow filter: subtracts blue (minus
blue) -- most commonly used
• Magenta filter: removes green
• Cyan filter: remove red
• Others: polarized, neutral density
4. Film titling
• Date; project code; exposure numbers; scale; time; photo
number etc.
7. Photo alignment:
- drift should not affect more than 10% of the print width for any 3 consecutive photographs
8. Tilt
• Not exceed 2° to 3° for a single exposure
• Average < than 1° for the entire project
9. Flight-plan map:
• How do we calculate this?
A. Flight altitude above terrain
B. Flight altitude above sea level
C. Lateral ground distance per photo
D. Alignment of flight lines
E. Number of flight lines
F. Number of photos per flight
G. Total number of photos
Number of air photos that will be required to cover the study area?
We need to calculate all of these:
(a) flight altitude above terrain
(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos
(a) H = (scale denominator)(f)
H = (25,000)(0.152 m)
H = 3,800 m above terrain
(c) D = (scale denominator)(d)
D = (25,000)(0.23 m)
D = 5,750 m or 5.75 km
• D and d should be proportionate
(d) Alignment
• Alignment: North-South, alternating direction
• We know: Area specification=20 km east-west by 35 km north-south
(a) flight altitude above terrain
(b) flight altitude above sea level
(c) lateral ground distance per photo
(d) alignment of flight lines
(e) number of flight lines
(f) number of photos per flight line
(g) total number of photos
(e) Number of flight lines:
NL = W / ((D)(Sg)) + 2
NL = 20 / ((5.75)(0.7)) + 2
= 4.97 + 2
= 6.97, or 7 flight lines
where W = east-west width of the area (20 km), D = lateral ground coverage per photo (5.75 km), and Sg = 0.7, the fraction of D advanced between adjacent lines (i.e. 30% sidelap)
• ALWAYS ROUND UP
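A sketch of the mission-planning calculation above. It assumes, as in the example, Sg = 0.7 (lateral advance between lines is 70% of the photo ground coverage); function and argument names are illustrative:

```python
import math

def flight_planning(scale_denom, focal_length_m, photo_side_m, area_width_km, sidelap_advance=0.7):
    """Flight altitude above terrain, ground coverage per photo, and number of flight lines."""
    H = scale_denom * focal_length_m           # (a) flight altitude above terrain (m)
    D = scale_denom * photo_side_m / 1000.0    # (c) lateral ground distance per photo (km)
    NL = area_width_km / (D * sidelap_advance) + 2
    return H, D, math.ceil(NL)                 # ALWAYS ROUND UP the number of lines

H, D, NL = flight_planning(25_000, 0.152, 0.23, area_width_km=20)
print(H, D, NL)   # 3800.0 m, 5.75 km, 7 flight lines
```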
Sensors
IR = Infrared
Sensors are designed considering
• How the earth reflects radiation
• Light at different parts of the sun spectrum
• ABGR-IR
• Grass has high reflectance in the green part of the spectrum;
increases greatly with IR
• Vegetation absorbs blue and red light
o Photosynthesis!!!
Atmospheric windows
• Atmospheric window = portions of the spectrum that
transmit radiant energy effectively -- NOT ABSORBED
• Light passes through the atmosphere with relative
ease in the light (high-transmission) sections of the plot
o Where light is not absorbed, it is transmitted
through the atmosphere
o Ozone blocks light around 0.2 micrometres, so
satellites cannot operate sensors at those wavelengths
o Visible light = atmospheric window
▪ Some specific wavelengths (dark dips in the plot) are absorbed -- cannot build a satellite sensor for them
Our space
• High vs low orbit satellites
• Number of debris objects estimated by statistical models to be in orbit
o 36,500 objects greater than 10 cm
o 1,000,000 objects from greater than 1 cm to 10 cm
o 330 million objects from greater than 1 mm to 1 cm
• Garbage collectors in atmosphere
ClearSpace 1 - 2025
• Escort it down to a lower orbit where the debris will enter the atmosphere and
burn up
o To burn it up -- won't come to earth
• Remotely-sensed imagery is stored using a raster representation. Each band is a raster layer.
• When the radiance L reaches the satellite after passing through the atmosphere, it is the at-satellite radiance (at-sat L)
Radiometric resolution
• takes a mean value over the pixel cell (which is not
homogeneous) to create the value
• Ex: Red band (= sensitive to red wavelength)
o The DN for each pixel corresponds to the average radiance coming from the surface within the pixel
cell.
Image characteristics
1. Spatial resolution
o It is the smallest linear separation between two objects that can be
resolved by the remote sensing system.
o Corresponds to the cell size used to define the pixel size in the raster
file. Depends on the IFOV (instantaneous field of view)
▪ Large cell size = low spatial resolution
▪ Small cell size = high spatial resolution
2. Spectral resolution
o The number of and size (width) of specific wavelength intervals in the electromagnetic spectra to
which a remote sensing instrument is sensitive
o Visible light: 400 nanometres to 700 nanometres (or 0.4 micrometres to 0.7 micrometres)
o Thermal infrared -- emitted light
• Water absorbs the infrared light, which is why we have a lower digital number
o Water is a strong absorber of infrared light; it reflects green and red light
3. Radiometric resolution
o Sensitivity of a remote sensing detector to differences in signal strength
as it records the energy reflected from the terrain
o The number of bits that your image has -- related to the sensitivity of the
detector
▪ Defines the number of discriminable signal levels
▪ 8 bit -- 256 possible values
• The larger the bit depth, the better the quality of the
image
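As a quick check of the bit-depth rule above, a trivial sketch:

```python
# Number of discriminable signal levels for a given radiometric resolution (bit depth)
for bits in (6, 8, 11, 16):
    print(f"{bits}-bit image -> {2 ** bits} possible DN values")
# An 8-bit image has 256 possible values, so DNs range from 0 to 255
```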
4. Temporal resolution
o Images are collected on regular time
intervals; other are only collected upon
request
o Depends on:
▪ Orbit:
• Polar -- landsat
• Equatorial
• geostationary -- assimilation of
data plus a model for the weather
forecast, continually collects data
at one site
▪ Swath: narrow vs. wide -- a wider swath more
easily gathers information
▪ Sensor configuration
• Continuous vs. programmed (Landsat is not programmable; it acquires continuously at nadir, 90°)
• Nadir-looking vs. side-looking (clarify)
Sensors
• As used for earth observation
o Weather satellites/sensors
o Land observation satellite/sensors
o Marine observation satellites/sensors
• Many different payloads -- multiuse
• New satellites are equipped with fewer payloads to reduce size and weight
• Geostationary Orbit
• Platform moves at the same rotation as the Earth; appears to be
stationary.
• It is always above the same location
• ~ 35, 000 km – high orbit
• Higher temporal resolution
• GOES -- used for weather forecasting
Landsat 8
• Grey areas in the band chart show high atmospheric transmission; outside them, light is absorbed by the atmosphere
SPOT series
• Système Pour l'Observation de la Terre
• High Resolution Visible (HRV) sensor; higher resolution than Landsat
• French space programme -- beginning of the 1980s
• Second longest time series
• Passive, Sun-synchronous
• Spot 6 and 7:
• Footprint 60 by 60 km
• Bands:
SPOT 6
• 2012
• Panchromatic: 1.5m
• Multispectral: 6.0m
• 60km x 60km
• SPOT has the ability to point off-nadir and acquire images to the side of its
orbit; this costs more, reflectance values can get compromised, and
emergency mapping is the most common use of this function
• Landsat only acquires images with a nadir observation, 90° to the ground
• Panchromatic bands -- difficult to separate features on the ground but very high resolution
• Colours of visible bands and resolution of panchromatic
IKONOS
• First to collect publicly available high-resolution
imagery at ~1- and 4-meter resolution
• Launched on Sep 24th, 1999.
WorldView 3 – 2014
• 31 cm panchromatic; 1.24 m multispectral
Spectral Resolution
• Changes within generations of high spatial resolution satellites with
positions of the bands
Active sensors
• RADARSAT-1 (1995-2013) and 2 (2007-) – MICROWAVE
• Produce their own energy
• Radar -- very long wavelengths -- 35 cm wavelength
• Interacts with the roughness of the surface then
bounces back
Active microwave
• Much longer wavelengths
• Bands defined by letters
• Can see through the atmosphere... Mostly
• Even through clouds
Backscatter
• We measure the signal that is backscattered from the earth
• Low to medium to high signals
• Low signal = dark surfaces
• Medium = interacting with the roughness of the surface
• High = depending on the geometry of the surface the energy hits, the
light may bounce directly back and give a very bright signal; metal gives
a strong signal -- double-bounce backscatter
RADARSAT-2
• Several “modes” including fine resolution
• RCM = RADARSAT Constellation Mission
• Can program image acquisitions to cater to the areas that need mapping
• Very programmable -- very expensive
Several modes
• Antenna detecting the roughness of the surface
• Smooth areas -- backscatter will be away from the
antenna
• Fine/ML -- may be able to see the boats due to their
high bounce back rates
Marine safety
• Iceberg detection
• Oil spill detection -- RCM and RADARSAT 2, can detect boat and oil
Sentinel -1 ESA
• C band -- same as RADARSAT and RCM
• Limitation on spatial resolution
• Free data
• Double bounce off metallic surface where boats are, and dark surface is where oil is.
• Doesn't matter if it's cloudy
• Wave size in perspective
o Microwave has the longest wavelengths; infrared is longer than visible; visible is longer than ultraviolet
• Visible spectrum (from UV to infrared):
o 400-500 nm = blue
o 500-600 nm = green
o 600-700 nm = red
Scattering
• Scattering = the direction of the scattered light is not always predictable
1. Rayleigh
o Scattering which occurs when the size of the particles is smaller than the
wavelength of light
o Mainly impacts short wavelengths in the upper 4.5km of the atmosphere
o If we didn't have this scattering, the sky would not be blue! Atmospheric gas molecules are
smaller than visible wavelengths and scatter blue light most strongly.
o Needs to be corrected -- a lot of the signal comes from the atmosphere
o Rayleigh scattering is weakest in the red wavelengths
2. Mie
o Occurs when there is sufficient quantity of materials with diameter
approximately the same size of the wavelength
▪ Dust and smoke
o Longer wavelengths impacted in the lower atmosphere (< 4.5km)
o Rayleigh no longer dominant
o Can be seen on Mars
3. Nonselective
o Particles are > 10 times the size of the wavelength
o Lowest portions of the atmosphere
▪ Water droplets
o Equal amounts of all visible wavelengths are scattered, causing a
white appearance -- hence "nonselective"
o Creates white as all the visible light is scattered evenly
o Not very important for remote sensing
o Difficult to correct
Absorption
• Process by which radiant energy is absorbed and converted into other forms of energy
• In the atmosphere, efficient absorbers of solar radiation include: water vapour, carbon dioxide, and ozone
• Need to be careful -- radiation gets absorbed by gases in the atmosphere
Atmospheric windows = portions of the spectrum that transmit radiant energy effectively -- least absorbed by gases
• Dark means it is not transmitted through the atmosphere -- it has been absorbed (the absorbing gases are listed in the diagram)
• No gases absorb visible light radiation strongly
Radiometric/atmospheric correction
Important concepts:
Reflectance (Rλ): the ratio of the radiant flux reflected from a surface (Lλ) to the
radiant flux incident to it (Eλ), at a certain wavelength
(Eλ = incident irradiance -- wavelength dependent)
(Lλ = radiance from the surface)
• Converting radiance measurements to reflectance provides a means of
normalizing illumination differences
Rλ = Lλ / Eλ
• Information from the target, atmosphere and from adjacent pixels
The amount of light is greater in July (higher sun elevation, shorter path for the radiation to travel), so it is better for viewing foliage
Scene illumination
• Solar elevation controls scene illumination, which has a big impact on radiance
measurements
• Radiance measurements may need to be normalized to facilitate comparisons
between different dates
• Low irradiance during winter due to low sun elevation -- corrections normalize for this
• E -- irradiance coming in
o Some light never reaches the surface but is scattered back toward the sensor as if it came from the pixel -- path radiance (Lp)
= interference
• Digital number without any atmospheric correction = path radiance + target radiance + adjacency radiance
Rayleigh doesn't affect NIR or IR -- it only adds scattering in the visible wavelengths
Types of correction:
1. Relative
2. Absolute
i. Statistical methods – ELC (Empirical Line Calibration)
• forces remote sensing image data to match in situ spectral reflectance measurements (ideally obtained at
roughly the same time)
ii. Models: Physical models to account for sensor and atmosphere characteristics at the time of image
acquisition
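A minimal sketch of the ELC idea above: fit a linear gain and offset between image DNs and in situ reflectance measured over calibration targets, then apply it to a band. The target values and array names here are made up for illustration:

```python
import numpy as np

# Paired measurements at a dark and a bright calibration target (hypothetical values)
dn_targets = np.array([30.0, 180.0])       # image DN over the targets
refl_targets = np.array([0.05, 0.60])      # field-measured reflectance of the same targets

# Fit DN -> reflectance as a straight line (gain and offset)
gain, offset = np.polyfit(dn_targets, refl_targets, 1)

def dn_to_reflectance(band_dn):
    """Force image DNs to match in situ reflectance (Empirical Line Calibration)."""
    return gain * band_dn + offset

band = np.array([[30, 90], [150, 180]], dtype=float)
print(dn_to_reflectance(band))
```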
Geometric Correction
Imagery needs to be spatially anchored to the
world in order to allow images and data from
the same areas to be used together, or to
compare among images.
• It is therefore usually necessary to
remove geometric distortion so that
individual pixels are in their proper
planimetric (x,y) map locations
Sources of distortion
• Remotely sensed imagery typically exhibits internal and external geometric
error.
Example:
• Scan skew (Earth rotation)
• Panoramic distortion (distortions away from
nadir)
B. Platform attitude: One sensor system axis is usually maintained normal to Earth's
surface and the other parallel to the spacecraft‘s direction of travel. If the sensor
departs from this attitude, geometric distortion results (platform roll (x axis),
pitch (y axis) and yaw (z axis)).
1. Image to Image
• Reference one image relative to another:
o alignment process by which two images of
like geometry and the same area are
positioned coincident with respect to each
other.
o The source image is considered geometrically
correct
o The uncorrected image is tied to the source
image
o GCP = ground control point
2. Image to Map
• The image is “tied” to a map, which is based on a
specific projection.
• The referenced map is the source: hard-copy
planimetric maps, digital format, digital
orthophotos, GPS data
Statistical Models
• Polynomials: This math model produces the 'best' fit, mathematically, to a
set of GCPs.
• Only three pairs of GCPs are required (to define the equation of the line) but
more can be used to better define the geometric properties of the image.
General rule:
• For moderate distortions, a first-order transformation is sufficient to rectify the imagery to a geographic frame
of reference. -- do not go more than 2nd order
• This type of transformation can model six kinds of distortion in the remote sensor data, including:
o translation in x and y
o scale changes in x and y
o skew, and rotation.
• Requires 6 or more pairs of points, depending on the degree of the polynomial. Higher orders require more GCPs
• The greater the image displacement, the higher the order of the polynomial needed to correct it.
• Increasing the degree of the polynomial results in unstable image edges, which must be controlled by
increasing the number of GCPs.
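A sketch of fitting a first-order (affine) transformation to GCP pairs by least squares; it can model the six distortions listed above (translation, scale in x and y, skew, rotation). The coordinate values are illustrative, not from the course:

```python
import numpy as np

# GCPs: (row, col) in the uncorrected image and (x, y) in the reference map/image
src = np.array([[10, 12], [200, 15], [30, 180], [220, 210]], dtype=float)
dst = np.array([[1005, 2010], [1190, 2018], [1028, 2175], [1212, 2208]], dtype=float)

# First-order polynomial: x' = a0 + a1*row + a2*col ;  y' = b0 + b1*row + b2*col
A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
coeff_x, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
coeff_y, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)

def transform(row, col):
    """Predict the map coordinates of an image location with the fitted polynomial."""
    x = coeff_x[0] + coeff_x[1] * row + coeff_x[2] * col
    y = coeff_y[0] + coeff_y[1] * row + coeff_y[2] * col
    return x, y

print(transform(10, 12))   # should land near the first GCP's map coordinates
```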
3. Evaluation -- accuracy
• Root Mean Square (RMS) error of each GCP: indicates the level of distortion in the output image after the
transformation (after applying the polynomial equation)
• An acceptable RMS error is equal to half of the size of one cell on an image (RMSE < 0.5). But, it depends on
the application…..
• How do we calculate RMS error? For each GCP:
RMS error = √((x′ − xorig)² + (y′ − yorig)²)
Where:
• xorig and yorig are the original row and column coordinates of the GCP in the uncorrected image (the location you
select); and
• x′ and y′ are the model-predicted coordinates (the location the model estimates) in the output image when we apply
the polynomial equation.
• The difference between the "location the model estimated" and the "location you selected" is the residual (a worked sketch follows this list).
• By computing RMS error for all GCPs, it is possible to (1) see which GCPs contribute the greatest error, and (2)
calculate the global RMS error (Global RMS).
• What to do when RMS error is high:
1. reject the GCP
2. collect more GCPs
3. compute a new RMS error
4. you can also move to a higher polynomial transformation; however, you must keep in mind the
implications of this in terms of preservation of parallel lines and image warping
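A worked sketch of the RMS calculation described above; the GCP coordinates are illustrative:

```python
import numpy as np

# Original GCP locations you selected vs. the locations the polynomial model predicts
orig = np.array([[10.0, 12.0], [200.0, 15.0], [30.0, 180.0]])    # (x_orig, y_orig)
pred = np.array([[10.4, 11.8], [199.5, 15.6], [30.2, 180.9]])    # (x', y')

# RMS error per GCP: sqrt((x' - x_orig)^2 + (y' - y_orig)^2)
per_gcp_rms = np.sqrt(np.sum((pred - orig) ** 2, axis=1))

# Global RMS error over all GCPs
global_rms = np.sqrt(np.mean(np.sum((pred - orig) ** 2, axis=1)))

print(per_gcp_rms)   # shows which GCPs contribute the greatest error
print(global_rms)    # acceptable if below ~0.5 pixel, depending on the application
```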
Enhancement Techniques
Atmospheric corrections --> geometric correction --> enhancement --> classification
Introduction
• Applied to remote sensing data to aid human visual and digital image analysis/interpretation, and feature
identification
• Once applied, enhancements can greatly aid the classification process
Digital numbers
• Recall: Digital numbers (DN) or brightness value/gray level, are the values that are
stored in the cells of an image file
Spectral profile
• SPOT 20x20m
• Multispectral data of Marco Island Florida
• SPOT colour composite
o B3 - NIR lambda
o B2 - Red lambda
o B1 - green lambda
How to plot a spectral profile
• Do not connect the profile to the y-axis; that would imply the neighbouring value is the same as the previous
band
• Gaussian distribution
• Very dark with no enhancements
• All 8-bit
• Dynamic range - min and max, range of digital numbers
• Enhancements stretch range, and make values use the entire
dynamic range
o Linear enhancement
• Linear enhancement stretches the values of the digital numbers
of the original image
• Still using 8bit image
1. Contrast manipulation
• Procedures applied to image data in order to more effectively visualise or
digitally analyse features
• In lab when you apply simple enhancements you improve visualization
(change DN on the display) but do not change the original DN values of
the image (display only)
• However new images can be derived from the original image and saved
as new image files with new enhanced DN values (...write new file)
How does it work?
• DNs of a single image usually have a range less than 256 (8 bit)
• You can take a narrow distribution of DNs and stretch it out over the full 256 range,
enhancing the image
• DNOut(i,j) = ((DNIn(i,j) − MIN) / (MAX − MIN)) × DNMAXRANGE
where:
DNOut(i,j) = output digital number at row i, column j,
DNIn(i,j) = input digital number at row i, column j,
MIN = minimum value found in the data distribution,
MAX = maximum value found in the data distribution,
DNMAXRANGE = maximum range in data value possible (usually 255).
Example:
• What would be the new min and max of the new enhanced
image?
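A sketch of the min-max linear stretch formula above, applied to a small 8-bit array (the array values are made up):

```python
import numpy as np

def linear_stretch(dn_in, dn_max_range=255):
    """Stretch a narrow DN distribution over the full dynamic range (min-max stretch)."""
    dn_min, dn_max = dn_in.min(), dn_in.max()
    out = (dn_in - dn_min) / (dn_max - dn_min) * dn_max_range
    return out.astype(np.uint8)

band = np.array([[60, 65, 70], [72, 80, 90]], dtype=float)   # narrow range of DNs
print(linear_stretch(band))
# New minimum = 0 and new maximum = 255: the image now uses the whole 8-bit range
```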
2. Spatial Filtering
1. Locate a pixel
2. Define a neighbourhood (mask or kernel)
3. Based on the neighbourhood, compute the new value
• Output pixel = (10×1 + 2×1 + 19×1 + 18×1 + 10×1 + 8×1 + 15×1 + 20×1 + 9×1) / 9 = 12.33 ≈ 12
o Each value is multiplied by the weight of the mask, then the results are averaged
• If the output is an 8-bit image, 5.67 will round to 6; a 32-bit image would leave it as 5.67
o Has to do with the number of 0s and 1s allowed (storage capacity): a larger file gives greater precision of the
number (will specify for exam)
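A sketch of the 3×3 averaging (low-pass) kernel example above, written without any image-processing library; the pixel values match the worked example:

```python
import numpy as np

def mean_filter_at(image, row, col, kernel=np.ones((3, 3))):
    """Compute the filtered value for one pixel: multiply the neighbourhood by the
    kernel weights, then divide by the sum of the weights."""
    window = image[row - 1:row + 2, col - 1:col + 2]
    return np.sum(window * kernel) / kernel.sum()

img = np.array([[10,  2, 19],
                [18, 10,  8],
                [15, 20,  9]], dtype=float)

value = mean_filter_at(img, 1, 1)
print(value)               # 12.33...
print(int(round(value)))   # 12 when stored in an 8-bit (integer) output image
```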
3. Band Math
• Amplify a specific signal by dividing, adding, or subtracting the DN values in one spectral band
with the corresponding values in another spectral band
• Reduce the number of data sets that we input into the
classification process
• Useful for simple change detection or for aiding
interpretation
Image subtraction
Band ratio
Vegetation Indices
• Common ratios have been developed for
interpretation of vegetation:
o Simple Ratio
o Normalized Difference Vegetation Index
(NDVI)
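A sketch of the band-math idea using the NDVI named above, where NDVI = (NIR − Red) / (NIR + Red); the small DN arrays stand in for real bands:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index = (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids division by zero

nir_band = np.array([[200, 180], [40, 35]])   # vegetation is bright in the NIR
red_band = np.array([[ 40,  50], [30, 33]])   # vegetation absorbs red (photosynthesis)
print(ndvi(nir_band, red_band))               # high values = vegetation, low = water/soil
```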
Image classification
• Classification scheme
• Unsupervised Classification
• Supervised Classification
• Accuracy Assessment
Introduction
• A key goal of many remote sensing projects is to produce
a thematic interpretation of the image(s)
Classification Process
1. State the nature of the classification problem what do you
want to achieve
o Classes of interest? (classification scheme) (Step 1)
2. Acquire appropriate data (Step 2)
o Remote sensing data (consider spatial, spectral,
temporal resolutions)
o Environmental considerations (atmospheric, phenological cycle, etc.)
o Ground reference data? Ancillary data? (DEM, GIS, maps etc)
3. Data Pre-Processing (Step 3)
o Radiometric correction, Geometric, Enhancements
4. Classification (Step 4)
5. Accuracy (Step 5)
Step 1:
• Classification scheme: Define the Classes of interest
o Should be achievable, given the available data and methods
o Ex: different kinds of crops, different tree
species, or different geologic units or rock
types, different water types
o Depends on: spatial and spectral
resolutions
Spectral resolution
• Different “classes” have different spectral signatures
Spectral Signatures
• Spectral signatures tend to be variable
• A broad information class (e.g. vegetation) may contain a
number of spectral sub-classes with unique spectral variations
o Ex: different vegetation forest, grass, shrub, etc
• Spectral mixing = pixels are not pure -- the sensor takes the average over the pixel to compute the DN;
hard to control but important
• They contain more than just vegetation: tree, shadow, soil
• 30 m vs. 2 m spatial resolution?
Step 4: Classification
Types:
1. Unsupervised classification
- used when there is no a priori knowledge of cover characteristics.
The image is automatically grouped into spectral classes
o Spectral classes (clusters) are grouped solely based on the
DN/reflectance in the image, and the clusters are
interpreted (assigned classes) afterwards by the analyst
• Pixel 1 and pixel 2 should be in the same cluster since our Euclidean distance is lower than our defined R
• Creates a new cluster with new DN values that are averaged between the two pixel values
• Average of (10,10) and (20,20) = (15,15)
• New cluster will be generated for pixel 3 since it is greater than R value of 15.
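A sketch of the clustering step described above: a pixel joins an existing cluster when its Euclidean distance to the cluster mean is below the radius R, otherwise a new cluster is created. R = 15 and the pixel values follow the example; the function name is my own:

```python
import math

def cluster_pixels(pixels, radius=15.0):
    """Toy one-pass clustering on (band1, band2) DN pairs, as in the lecture example."""
    clusters = []                                   # each cluster is a list of pixels
    for px in pixels:
        best = None
        for c in clusters:
            mean = [sum(v) / len(c) for v in zip(*c)]   # current cluster mean DN
            dist = math.dist(px, mean)                   # Euclidean distance to the mean
            if dist < radius and (best is None or dist < best[0]):
                best = (dist, c)
        if best:
            best[1].append(px)                      # merge into the nearest cluster
        else:
            clusters.append([px])                   # start a new cluster
    return clusters

# Pixels 1 and 2 merge (distance ~14.1 < R); pixel 3 starts a new cluster
print(cluster_pixels([(10, 10), (20, 20), (60, 70)]))
```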
2. Supervised classification
- used when there is information available for determining classes and for identifying
training areas; analyst defines the class
a. Classification scheme
b. Select training areas
c. Analyse spectral separability
d. Classification
a. Classification scheme
B. Training areas
▪ Training area = a sample of the Earth's surface with known
cover
▪ The statistics of the DN values within the training area are
used to determine decision boundaries in the classification
1. Each training area is usually composed of many pixels. And, many training areas are generally defined for each
class. The general rule is that if training data are being extracted from n bands then > 10n pixels of training
data are collected for each class. This is sufficient to compute the requirements of some classification
algorithms
• Issue for hyperspectral images
• As homogenous as possible
2. The DN for the pixels comprising the training area are used to "train" the algorithm to recognize spectrally
similar (similar DN) pixels in each band
• Must be representative
3. To ensure effective classification, training data must be complete and representative
• May have pixels that are unclassified -- the training didn't fully cover the spectral variability of the image
• Will result in unclassified pixels
• DN will be much higher -- may want to add two classes (clear vs. turbid water) and then join them after
classification
2) Maximum Likelihood
▪ Maximum likelihood decision rule based on probability
▪ Probability of a pixel belonging to each of a predefined set of training classes is calculated, and the pixel is then
assigned to the class for which the probability is the highest
▪ How do we obtain the probability information we will need from training data we have collected?
• Consider the hypothetical histogram (data
frequency distribution) of forest training data
obtained in band k. We could choose to store
the values contained in this histogram in the
computer, but a more elegant solution is to
approximate the distribution by a normal
probability density function (curve), as shown
superimposed on the histogram
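A minimal sketch of the maximum likelihood rule described above: model each class's training DNs with a normal probability density function, then assign a pixel to the class with the highest probability. This uses a single band for simplicity, and the training values are hypothetical:

```python
import numpy as np

# Training DN samples per class (hypothetical, one band)
training = {
    "forest": np.array([32, 35, 30, 33, 36, 31], dtype=float),
    "water":  np.array([10, 12,  9, 11, 13, 10], dtype=float),
    "urban":  np.array([70, 75, 68, 72, 71, 74], dtype=float),
}

# Approximate each class histogram with a normal density (mean, standard deviation)
params = {name: (dn.mean(), dn.std(ddof=1)) for name, dn in training.items()}

def gaussian_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

def classify(pixel_dn):
    """Maximum likelihood rule: pick the class with the highest probability."""
    return max(params, key=lambda name: gaussian_pdf(pixel_dn, *params[name]))

print(classify(34))   # forest
print(classify(11))   # water
```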
Accuracy assessment
• Testing samples
• Accuracy
Accuracy assessment
• Once an image has been classified it is essential to report the accuracy of the classification
• What do we need?
1. Classified map
2. Testing data
3. Accuracy assessment
1. Classified map
2. Testing data
• Usually, they are generated the same way as training data: field data, analyst knowledge, additional
imagery, maps
• Optimally, testing data are collected in the field
i. How many testing samples?
• Optimal sample size (N) is based on a probability model:
2. systematic sampling
• sample through the area in a consistent and orderly manner.
• Issues:
(i) may sample similar class over and over (ex: forest; water)
(ii) may over- or under-estimate the representation of a class.
3. Accuracy assessment
• A common way of expressing accuracy is using a classification
error or confusion matrix
• Compare on a class-by-class basis relationships between
known reference data (testing data) and classification results
▪ Example: not enough testing samples -- should be 250
for proper accuracy
▪ Producer's accuracy
• From the perspective of the maker of the classified map: how well a certain class was classified.
• For a given class, how many testing areas are correctly classified on the map?
• Calculated as: (number of correctly classified testing areas of a given class) /
(number of testing areas of that class as derived from the reference data)
• For practice…calculate the User’s Accuracy and Producer’s Accuracy for the other classes
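A sketch of computing user's and producer's accuracy from a confusion (error) matrix; the matrix values below are made up for practice, not taken from the lecture example:

```python
import numpy as np

# Rows = classified map classes, columns = reference (testing) classes
classes = ["water", "forest", "urban"]
matrix = np.array([[45,  3,  2],
                   [ 4, 50,  6],
                   [ 1,  7, 42]], dtype=float)

producers = matrix.diagonal() / matrix.sum(axis=0)   # correct / reference (column) total
users     = matrix.diagonal() / matrix.sum(axis=1)   # correct / classified map (row) total
overall   = matrix.diagonal().sum() / matrix.sum()

for name, p, u in zip(classes, producers, users):
    print(f"{name}: producer's accuracy = {p:.2f}, user's accuracy = {u:.2f}")
print(f"overall accuracy = {overall:.2f}")
```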
Final Review
35%
Format
• Short answer
• Multiple choice
• Lectures 4 to 11
o Satellite images
o Spectral resolution
o Atmospheric correction
▪ Aerial photography is not part of the final
Main topics
• Properties of digital remote sensing image
• Sensors
• Light attenuation
• Atmospheric correction
• Geometric correction
• Image enhancement
• Classification
• Accuracy assessment