
Course: Remote sensing

By
Mulualem A & Endalkachew S
Fundamentals of Remote Sensing
➢ Introduction to Spatial Data Acquisition
➢ Elements/Process of Remote Sensing
➢ Application Areas of Remote Sensing
➢ Electromagnetic Energy and Remote Sensing
➢ Platforms, Orbits, Sensors and the Concept of Resolution
➢ Image Enhancement, Visualization and Classification
➢ Visual Image Interpretation
➢ GPS and Remote Sensing
UNIT ONE
Data Acquisition
and



Spatial Data Acquisition Techniques

Geographical features, i.e. spatial and non-spatial data, are used for different applications and purposes. For example:

a) A geologist may need changes of the earth's surface measured with mm accuracy.
b) An ecologist surveying bird species in the jungle may be satisfied with 100 m positional accuracy.
c) An environmental and/or natural resource manager uses different spatial data to assess various pollutants and to identify resources.
d) Etc.

Ground-based methods and/or remote sensing enable us to acquire such spatial data.


Spatial data and attribute values are acquired by:
1. Ground-based approach
▪ Land surveyors, foresters, natural resource managers, geographers, etc. observe and measure the outline of the object on the ground.
How are the coordinates determined on the ground using this method?
Example:
A) A total station set up over a point with a known position measures distances and angles to unknown points.
From these data the coordinates of the unknown points can be calculated.
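To illustrate, a minimal sketch (assuming a simple "polar method" setup; the station coordinates, azimuth and distance below are hypothetical) of how the coordinates of an unknown point follow from a total station measurement:

```python
import math

def polar_to_coordinates(e0, n0, azimuth_deg, distance):
    """Coordinates of an unknown point observed from a known station.

    e0, n0      -- easting/northing of the known station (m)
    azimuth_deg -- horizontal angle from grid north to the target (degrees)
    distance    -- measured horizontal distance to the target (m)
    """
    az = math.radians(azimuth_deg)
    return e0 + distance * math.sin(az), n0 + distance * math.cos(az)

# Hypothetical observation: station at (1000 m, 2000 m), target 250 m away at azimuth 45 deg
e, n = polar_to_coordinates(1000.0, 2000.0, 45.0, 250.0)
print(f"E = {e:.2f} m, N = {n:.2f} m")  # E = 1176.78 m, N = 2176.78 m
```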
B) Thematic (attribute) data are collected by:
▪ Observation
▪ Questionnaire
▪ Interview
▪ Field tests
▪ Laboratory measurements, etc.



2. Remote Sensing Approach
Spatial and attribute data are collected using:
▪ a) Aerial photographs
▪ The photo interpreter/photogrammetrist determines the outline of the building and measures its coordinates from the photo.
▪ The geometry of an area object is measured from the photo, and the situation is interpreted by our visual system (observation).
▪ b) Earth observation, which covers all remote sensing techniques directed at retrieving information about the earth.


E.g. some remote sensors


▪ What are the differences between ground-based and remote sensing based data acquisition techniques?

▪ Which one would you choose?


What are the differences between ground-based and remote sensing based data acquisition techniques?

1. In the ground-based approach, the land surveyor interprets the real world by visual observation and then determines which points define the outline of the object, whereas in the remote sensing based approach an aerial photograph is used by the photo interpreter to determine the outline of the object.

2. In the ground-based approach, the coordinates of the object are measured by the surveyor using a surveying instrument or GPS, whereas in the remote sensing based approach the coordinates of the object are measured by the photo interpreter from the photo.

3. In the ground-based approach, the characteristics of the object (attribute data) are collected by the surveyor through data collection techniques (observation, questionnaire, etc.), whereas in the remote sensing based approach the type of the object (attribute data) is determined by interpreting the image characteristics on the photograph.


4. In the ground-based approach, observation and measurements are carried out on the ground, whereas in the remote sensing based approach, observation and measurements are carried out on an image taken from the air or from space.

5. It is more difficult to collect data on dangerous or inaccessible areas using the ground-based approach, but remote sensing makes it possible. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas.

6. The ground-based approach is also costlier and slower in data collection, and areas or objects are more likely to be disturbed in the process than with the remote sensing approach.
Definition

"Remote sensing is the science, and to some extent art, of acquiring information about an object, area or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area or phenomenon under investigation."

Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object.

This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information.


History of Remote Sensing
◼ Balloon photography (1858)
◼ Kite photography (1890)
◼ Pigeon cameras (1903)
◼ Aircraft (WWI and WWII)
◼ Space (1947)
Images: Jensen (2000)


Reading this text is itself an act of remote sensing:
➢ The sensor: your eye responding to the light reflected from the screen
➢ The data: impulses acquired by your eye corresponding to the amount of light reflected from the light and dark areas on the screen
➢ The analysis: your mental computer (brain) interpreting letters, words, and sentences to derive information


▪ Basic Process of Remote Sensing
Who can interpret this?


1. Energy Source or Illumination (A) – the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.

2. Radiation and the Atmosphere (B) – as the energy travels from its source to the target, it will come into contact and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.

3. Interaction with the Target (C) – once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.


4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from, the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.

5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).

6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.

7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
▪ To sum up, remote sensing refers to the activities of recording/observing/perceiving (sensing) objects or events at faraway (remote) places.
▪ In remote sensing, the sensors are not in direct contact with the objects or
events being observed.
▪ The information needs a physical carrier to travel from the objects/events to the
sensors through an intervening medium.
▪ The electromagnetic radiation is normally used as an information carrier in
remote sensing.
▪ The output of a remote sensing system is usually an image representing the
scene being observed.
▪ A further step of image analysis and interpretation is required in order to
extract useful information from the image.



Remote sensing cycle

◼ Remote sensing includes:
◼ A) The mission plan and choice of sensors
◼ B) The reception, recording, and processing of the signal data
◼ C) The analysis of the resultant data:
i. Visual interpretation of digital data using computer software (e.g. Erdas) and stereoscopes.
ii. Using image enhancement techniques.
iii. Generation of maps, tables, and computer files for further decision making using GIS.
iv. Presentation to the users.


APPLICATION
OF
REMOTE SENSING



Application of Remote Sensing

Each sensor is designed with a specific purpose. With optical sensors, the design focuses on the spectral bands to be collected.
These and other factors play an important role in defining which applications the sensor is best suited for.
– Each application has its own specific demands for spectral resolution, spatial resolution, and temporal resolution.
• There can be many applications for remote sensing in different fields, as described below.
Urban & Regional Planning
Scope
• Mapping & updating of city/town maps
• Urban sprawl monitoring
• Town planning
• Facility management
• GIS database development

Benefits
• Better decision support, planning & management
• Rapid information updates
• Infrastructure development monitoring
• Spatial information analysis
Image: Lyari Express Way – Section (Essa Nagri)
Agriculture

• Satellite and airborne images are used as mapping tools to classify crops, examine their health and viability, and monitor farming practices, e.g. using the NDVI (Normalized Difference Vegetation Index; see the sketch after this list).
• Agricultural applications of remote sensing include the following:
– crop type classification
– crop condition assessment
– crop yield estimation
– mapping of soil characteristics
– mapping of soil management practices
– compliance monitoring (farming practices)
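As a concrete illustration of the NDVI mentioned above, a minimal sketch of its computation from red and near-infrared bands (the sample reflectance values are hypothetical):

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects strongly in the near-infrared and absorbs
    red light, so NDVI approaches +1 over dense canopy and ~0 over bare soil.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Hypothetical 2x2 red and NIR reflectances
red = np.array([[0.05, 0.40], [0.08, 0.30]])
nir = np.array([[0.60, 0.45], [0.50, 0.32]])
print(ndvi(red, nir).round(2))  # high values indicate vigorous vegetation
```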



Agriculture
Scope
• Crop acreage estimation
• Crop modeling for yield & production forecast / estimation
• Crop & orchard monitoring

Benefits
• Timely availability of crop statistics for decision making & planning
• Crop growth monitoring
• Soil status monitoring
• Regular reports regarding total area under cultivation

Images: FFC Goth Macchi – Dec 16, 2005 (pre-frost), Jan 12, 2006 (damage), Mar 05, 2006 (recovery); Banana plantation – Muhammad Pur (Ghotki)


Flood Damage to Standing Crops

Images: pre-flood (17 July 2006) vs post-flood (09 Aug 2006) scenes near Sukkur; flooded cropland of 10,098 acres and 3,516 acres is marked around villages including Muhro Mari, Darapur, Kot Shahgarch, Godhpur, Phulani, Than Lake, Goth Lataran, Shahpur, Ural, Junno Dhand, Goth Raza Mahar and Goth Azizpur.


Forestry
Scope
• Satellite image based forest resource mapping and updating
• Forest change detection
• Forest resource inventory
• GIS database development

Benefits
• Availability of baseline information
• Planning for afforestation strategies
• Futuristic resource planning
• Sustainability of environment
• Wildlife conservation & development for recreation purposes
Images: Sarhad Reserve Forest (Ghotki); Nausharo Firoz
Geology
• Geology involves the study of landforms, structures, and
the subsurface, to understand physical processes creating
and modifying the earth's crust. It is most commonly
understood as the exploration and exploitation of mineral
and hydrocarbon resources, generally to improve the
conditions and standard of living in society.
• Geological applications of remote sensing include the
following:
– surficial deposit / bedrock mapping
– lithological mapping
– structural mapping
– sand and gravel (aggregate) exploration/ exploitation
– mineral exploration
– hydrocarbon exploration
– environmental geology



Hydrology
• Hydrology is the study of water on the Earth's surface,
whether flowing above ground, frozen in ice or snow, or
retained by soil
• Examples of hydrological applications include:
– wetlands mapping and monitoring,
– soil moisture estimation,
– snow pack monitoring / delineation of extent,
– measuring snow thickness,
– determining snow-water equivalent,
– river and lake ice monitoring,
– flood mapping and monitoring,
– glacier dynamics monitoring
– river /delta change detection
– drainage basin mapping and watershed modeling
Landuse / Landcover Mapping
Scope
• Monitoring dynamic changes
• Urban/Rural infrastructure
• Waterlogging & salinity

Benefits
• Assessment of spatial distribution
of land resources
• Infrastructure monitoring
• Availability of usable land
• Future planning for better land
management for socio-economic
development
Coastal Resource Mapping
Scope
• Mangrove forest monitoring
• Change detection
• Hazard impacts
• Aqua-culture zones

Benefits
• Availability of updated information on mangrove forests
• Planning strategies for afforestation and monitoring of deforestation trends
• Timely intervention in specific areas as and when needed
Images: satellite image and mangrove forest map
➢ Atmospheric modeling: air pollution, climate change
➢ Ocean: topography
What is the difference between GIS and remote sensing?

➢ A geographic information system (GIS) is a computer-based tool for mapping and analyzing features and events on earth. GIS technology integrates common database operations, such as query and statistical analysis, with maps. On the other hand, remote sensing is the science of collecting data regarding an object or a phenomenon without any physical contact with the object. Below are some of the differences between remote sensing and GIS.



Remote Sensing:
1. It can retrieve large amounts of data.
2. It reduces manual field work dramatically.
3. It allows retrieval of data in regions that are difficult or impossible to access.
4. It allows collection of more data in a short period of time.
5. It is mostly used for data collection.
6. It has a more complex user interface.
7. It covers a limited study area at a time.
8. Remote sensing technology is far less robust than a GIS system because of its limited ability to interpret the data, and it is also more susceptible to damage.
9. It is less ideal for communicating information between departments.


GIS:
1. It can cope with larger amounts of data.
2. It can cover large study areas.
3. It can cope with unlimited and frequent data edits.
4. It is more robust and resistant to damage.
5. It is faster and more efficient.
6. It requires fewer people, less time and less money.
7. It is mostly used for data analysis.
8. It has a more simplified user interface.
9. It is an ideal tool for communication between different departments.


ELECTROMAGNETIC ENERGY
and
Remote Sensing



Electromagnetic Radiation
What is radiation?



▪ The sun is the most obvious source of electromagnetic radiation for remote sensing.
▪ However, all matter with a temperature above absolute zero (0 K, where n °C = (n + 273) K) radiates EM energy due to molecular agitation.
▪ Agitation is the movement of the molecules.
▪ The amount of energy radiated by an object depends on its absolute temperature and its emissivity, and is a function of the wavelength.
▪ A blackbody completely absorbs and re-emits all radiation incident on (striking) its surface.
▪ The emitting ability of a real material compared to that of the blackbody is referred to as the material's emissivity.
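As a quantitative aside (the law itself is not stated on the slide), the total energy radiated is commonly described by the Stefan–Boltzmann law, M = εσT⁴; a minimal sketch:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_kelvin, emissivity=1.0):
    """Total power emitted per unit area: M = emissivity * sigma * T^4.

    emissivity = 1.0 corresponds to a perfect blackbody; real materials
    have emissivities below 1.
    """
    return emissivity * SIGMA * temp_kelvin ** 4

print(radiant_exitance(5800))       # ~6.4e7 W/m^2, roughly the Sun's surface
print(radiant_exitance(300, 0.95))  # ~4.4e2 W/m^2, a terrestrial surface
```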



▪ The first requirement for remote sensing is to have an energy source to
illuminate (to observe) the target (unless the sensed energy is being emitted
by the target). This energy is in the form of electromagnetic radiation.
▪ The most important source of electromagnetic energy at the Earth’s surface is
the sun. Many sensors used in remote sensing measure reflected sun light.
Some sensors, however, detect energy emitted by the earth itself or provide
their own energy.
▪ Energy recorded by remote sensing systems undergoes fundamental
interactions.
▪ For example, if the energy being remotely sensed comes from the Sun, the
energy:-
▪ is radiated by atomic particles at the source (the Sun),
▪ propagates through the vacuum of space at the speed of light,
▪ interacts with the Earth's atmosphere,
▪ interacts with the Earth's surface,
▪ interacts with the Earth's atmosphere once again, and
▪ finally reaches the remote sensor where it interacts with
▪ various optical systems, filters, emulsions, or detectors.
▪ Electromagnetic radiation is characterized by two fields, the electric (E) and magnetic (M) fields, which are perpendicular to each other. For this reason, the term electromagnetic energy is used.
▪ The vibration of both fields is perpendicular to the direction of travel of the wave. Both fields propagate through space at the speed of light c, which is approximately 3 × 10⁸ m/s.


ELECTROMAGNETIC RADIATION MODELS
To understand how electromagnetic radiation is created, how it propagates through space, and how it interacts with other matter, it is useful to describe the processes using two characteristics of electromagnetic radiation:
➢ Wavelength and
➢ Frequency.

Wavelength
▪ The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests.
▪ Wavelength is usually represented by the Greek letter lambda (λ). Wavelength is measured in meters (m) or some fraction of meters such as nanometers (nm, 10⁻⁹ m), micrometers (μm, 10⁻⁶ m) or centimeters (cm, 10⁻² m).


Frequency

• Frequency refers to the number of cycles of a wave passing a fixed point per unit of time.
• Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz.
• Since the speed of light is constant, wavelength and frequency are inversely related to each other.


▪ Wavelength and frequency are related by the following formula:
▪ c = λν
▪ Where:
λ = wavelength (m)
▪ ν = frequency (cycles per second, Hz)
▪ c = speed of light (3 × 10⁸ m/s)
▪ Therefore, the two are inversely related to each other. The shorter the wavelength, the higher the frequency.
▪ Conversely, the longer the wavelength, the lower the frequency.
▪ Understanding the characteristics of electromagnetic radiation in terms of wavelength and frequency is crucial for understanding the information to be extracted from remote sensing data.
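A quick numeric check of c = λν with illustrative wavelengths:

```python
C = 3.0e8  # speed of light, m/s

def frequency(wavelength_m):
    """Frequency (Hz) from wavelength (m), using nu = c / lambda."""
    return C / wavelength_m

print(frequency(0.55e-6))  # green light, 0.55 um -> ~5.5e14 Hz
print(frequency(0.05))     # 5 cm microwave      -> 6.0e9 Hz (6 GHz)
```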



All matter with a temperature above absolute zero (0 K) radiates electromagnetic waves of various wavelengths.
The total range of wavelengths is commonly referred to as the electromagnetic spectrum.
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves).


➢ There are several regions of the electromagnetic spectrum which are useful for remote sensing.
➢ Remote sensors are engineered to detect specific wavelength and frequency ranges of the spectrum.
➢ Most sensors operate in the visible, infrared and microwave regions of the spectrum.
▪ For most purposes, the ultraviolet or UV portion of the spectrum has the shortest wavelengths used in practice (approximately 0.3 to 0.4 µm).
▪ Some Earth surface materials, primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV radiation.
▪ In the upper atmosphere, UV light is strongly absorbed by ozone (O3).


Visible Spectrum

➢ The light which our eyes (our "remote sensors") can detect is part of the visible spectrum.
➢ It is important to recognize that the visible portion is very small (narrow) compared to the rest of the spectrum.
➢ There is a lot of radiation around us which is "invisible" to our eyes but can be detected by other remote sensing instruments and used to our advantage.
➢ The visible wavelengths cover a range from approximately 0.4 to 0.7 µm.
➢ The longest visible wavelength is red and the shortest is violet.
➢ It is important to note that this is the only portion of the EM spectrum we human beings can associate with the concept of colors.


➢ Blue, green, and red are the primary colors or wavelengths of the visible spectrum.
➢ They are called primary colors because all other colors can be formed by combining blue, green, and red in various proportions.
➢ Although we see sunlight as a uniform or homogeneous color, it is actually composed of various wavelengths and frequencies.
➢ The visible portion of this radiation can be shown when sunlight is passed through a glass prism.
Infrared (IR) Region
▪ Covers the wavelength range from approximately 0.7 µm to 100 µm - more than 100 times as wide as the visible portion!
▪ The infrared region can be divided into two categories based on their radiation properties: the reflected IR and the emitted or thermal IR.
▪ The reflected IR covers wavelengths from approximately 0.7 μm to 3.0 μm.
▪ Radiation in the reflected IR region is used for remote sensing purposes in a similar manner to radiation in the visible portion.
▪ It is valuable for delineating healthy versus unhealthy or fallow vegetation, and for distinguishing among vegetation, soil and rocks.
▪ The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm.
▪ The thermal IR region is quite different from the visible and reflected IR portions, as this energy is essentially the radiation emitted from the Earth's surface or from an object in the form of heat.


Microwave Region

➢ This portion of the spectrum, covering wavelengths from about 1 mm to 1 m, has become more important to remote sensing in recent years.
➢ It covers the longest wavelengths used for remote sensing.
➢ The shorter wavelengths have properties similar to the thermal infrared region, while the longer wavelengths approach the wavelengths used for radio broadcasts.


Emission of Radiation from Energy Sources

▪ Emitted energy has characteristics similar to those of the array of incoming radiation waves.
▪ A useful reference concept, widely used in the study of optics, is the blackbody.
▪ A "blackbody" is defined as an object that absorbs all of the energy incident upon it and emits the maximum amount of radiation at all wavelengths.
▪ Thus, the emission (emitted energy) of natural surfaces is always compared with that of a blackbody, which serves as a reference for measuring the emitting efficiency of other materials.


ENERGY
INTERACTION



▪ The incoming energy interacts:
i. first with the atmosphere,
ii. after that, with the target or object.
After it interacts with the object, what happens next?

Energy interaction with the atmosphere

▪ Remote sensing requires that electromagnetic radiation travel some distance through the Earth's atmosphere from the source to the sensor.
▪ The incoming energy interacts with the atmosphere twice:
i. before the energy interacts with the target,
ii. after it interacts with (bounces off) the target or object.
▪ Radiation from the sun or an active sensor will initially travel through the atmosphere, strike the ground target, and pass through the atmosphere a second time before it reaches a sensor.
▪ The total distance the radiation travels in the atmosphere is called the path length.
▪ As radiation passes through the atmosphere, it is greatly affected by the atmospheric particles and gases it encounters. As a result, three fundamental interactions in the atmosphere are possible: absorption, transmission and scattering.
Scattering (change of direction)
▪ Scattering occurs when particles or large gas molecules present in the atmosphere interact with the electromagnetic radiation and cause it to be redirected from its original path.
▪ How much scattering takes place depends on several factors, including the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere.
There are three (3) types of scattering which take place.
i) Rayleigh scattering occurs when particles are very small compared to the wavelength of the radiation.
▪ E.g. small specks of dust, or nitrogen (N2) and oxygen (O2) molecules.
▪ Rayleigh scattering is the dominant scattering mechanism in the upper atmosphere.
ii) Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation.
▪ Dust, pollen, smoke and water vapour are common causes of Mie scattering.
▪ Mie scattering occurs mostly in the lower portions of the atmosphere, where larger particles are more abundant, and dominates when cloud conditions are overcast.
iii) Nonselective scattering occurs when the particles are much larger than the wavelength of the radiation.
▪ Water droplets and large dust particles can cause this type of scattering.
▪ In nonselective scattering, all wavelengths are scattered about equally.
▪ This type of scattering causes fog and clouds to appear white to our eyes, because blue, green, and red light are all scattered in approximately equal quantities.
Absorption
• Electromagnetic energy traveling through the atmosphere is partly absorbed by various molecules.
• Ozone (O3), carbon dioxide (CO2), and water vapour (H2O) are the three main atmospheric constituents which absorb radiation.
• Spectral sensors are engineered and designed to collect wavelength data not influenced by absorption.
• The areas of the spectrum that are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows.
Transmission
▪ Transmission is the passing or transfer of radiation (energy) through the atmosphere so that it reaches the earth's surface.


▪ Radiation that is not absorbed or scattered in the atmosphere (i.e. transmitted through the atmosphere) can reach and interact with the Earth's surface.
▪ There are three (3) forms of interaction that can take place when energy strikes, or is incident upon, the surface. These are:
1. absorption;
2. transmission; and
3. reflection.
▪ The proportions of each will depend on the wavelength of the energy and the material and condition of the feature.
▪ Absorption occurs when radiation (energy) is absorbed into the target or object.
▪ Transmission occurs when radiation passes through a target.
▪ Reflection occurs when radiation "bounces" off the target and is redirected.


▪ In remote sensing, we are most interested in measuring the radiation reflected
from targets.
▪ There are two types of reflection, which represent the two extreme ends of the
way in which energy is reflected from a target: specular reflection and
diffuse (Lambertian) reflection.
▪ Specular reflection, or mirror-like reflection, typically occurs when a
surface is smooth and all (or almost all) of the energy is directed away from
the surface in a single direction.
▪ Specular reflection can be caused, for example, by a water surface or a
glasshouse roof. It results in a very bright spot (also called ‘hot spot’) in the
image.
▪ Diffuse reflection occurs when the surface is rough and the energy is reflected almost uniformly in all directions.
▪ Whether a particular target reflects specularly or diffusely depends on the surface roughness of the feature in comparison to the wavelength of the incoming radiation.
▪ If the wavelengths are much smaller than the surface variations or the particle sizes that make up the surface, diffuse reflection will dominate.
▪ Scattering differs from reflection in that the direction associated with scattering is unpredictable,
▪ whereas the direction of reflection is predictable, although the reflection depends on the object's shape, coverage, flatness and other characteristics.

These are the most common factors contributing to the deflection of radiation. The change the radiation experiences is a function of:
i. the atmospheric conditions,
ii. the path length,
iii. the composition of the particle, and
iv. the wavelength relative to the diameter of the particle.


Platforms
▪ A platform is an object that carries a sensor,
- enabling it to collect and record energy and/or data.
• A platform is a vehicle, such as a satellite or aircraft, used for a particular activity or purpose or to carry a specific kind of equipment or instruments.
• Platforms for remote sensors can be situated at heights ranging from just a few centimeters, using field equipment, up to orbits in space as far away as 36,000 km (geostationary orbits) and beyond.
• Platforms for remote sensors may be situated on the ground, on an aircraft or balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or satellite outside of the Earth's atmosphere.


Remote sensing can be classified:
i. Based on the source of energy (type of sensor):
➢ passive remote sensing and
➢ active remote sensing
ii. Based on the method of data acquisition:
➢ ground-based methods and
➢ remote sensing based methods
iii. Based on the platform used in data acquisition:
➢ ground based,
➢ airborne and
➢ spaceborne
iv. Based on the wavelength region:
➢ optical remote sensing (0.3–3 µm)
➢ thermal remote sensing (3–5 µm and 8–16 µm)
➢ microwave remote sensing (1 mm to 1 m)
Sensor
▪ A sensor is a device that gathers, measures and records electromagnetic energy (EME).
▪ The recorded energy is converted into suitable information about an object.

Based on the energy source used, there are two types of remote sensing:
i. passive remote sensing and
ii. active remote sensing
Passive Remote Sensing (Passive Sensors)
▪ Passive remote sensing systems acquire and measure naturally available energy (e.g. from the Sun) and do not have their own energy source.
▪ They depend on an external source of energy, usually the sun, and sometimes the Earth itself.
▪ Because these sensors do not have their own source of energy, they cannot be used at night, except for thermal sensors.


PASSIVE SENSORS
▪ receive solar electromagnetic energy reflected from the surface or energy emitted by the surface itself (the Earth's surface).
▪ cover the EMR range from less than 1 picometer (gamma rays) to over 1 m (micro- and radio waves).
▪ The oldest and most common type of passive sensor is the photographic camera.

❖ Video camera
❖ Aerial camera


ACTIVE SENSORS
▪ Active sensors, on the other hand, provide their own energy source for illumination.
▪ They are functional at all times (24/7): day, night and in any season.
▪ Advantages of active sensors include the ability to obtain measurements anytime, regardless of the time of day or season.
▪ The Earth's surface is illuminated by energy emitted from the sensor's own source; the part reflected by the surface in the direction of the sensor is received to gather the information.


ACTIVE SENSORS
▪ Examples of active sensors are radar (radio detection and ranging), lidar (light detection and ranging), sonar (sound navigation and ranging) and the laser fluorosensor.


There are two main categories of spatial data acquisition methods:
i. Ground-based methods
• Ground-based methods include making field observations, taking in situ measurements and performing land surveying.
• E.g. chain surveying, compass traversing, theodolite measurements.
• The principle of a ground-based method: measurements and observations are performed in the real world.
ii. Remote sensing based methods
• Remote sensing methods are based on the use of image data acquired by a sensor such as an aerial camera, a scanner or a radar.
• The principle of a remote sensing based method: measurement and analysis are performed on image data.


▪ Based on the platform used in data acquisition, there are three types of remote sensing:
i. ground based,
ii. airborne and
iii. spaceborne


• Ground-based sensors are often used to record detailed information about the surface, which is compared with information collected from aircraft or satellite sensors.
• Sensors may be placed on a ladder, tall building, cherry picker, crane, etc.
▪ Mobile hydraulic platforms carried by vehicles
- extendable to a height of 15 m above the surface.
▪ Portable masts
▪ Towers
Airborne remote sensing
▪ Sensors are carried on aircraft with specific modifications to carry them.
▪ The aircraft needs a hole in the floor.
▪ Sometimes ultralight vehicles (ULVs), balloons, airships or kites are used for airborne remote sensing.
▪ Depending on the platform and sensor, airborne observations are possible at altitudes ranging from less than 100 m up to 40 km.
▪ Speed of aircraft: 140–600 km/hour, depending on the mounted sensor system.
▪ Orientation affects the characteristics of the data acquired; orientation is influenced by wind conditions.


▪ Three rotations relative to the reference path are possible:
➢ roll
➢ pitch
➢ yaw

▪ Most aircraft are equipped with satellite navigation technology with an error of less than 30 m.
▪ Differential approaches: up to decimeter accuracy.
▪ In aerial photography the measurements are stored on hardcopy material: the negative film.
▪ The recorded data are available only after the aircraft has returned to its base.
▪ Owning, operating and maintaining survey aircraft, as well as employing a professional flight crew, is expensive.


Spaceborne remote sensing
▪ Satellites and space stations are used to mount (carry) sensors.
▪ Satellites are launched into space with rockets; earth observation satellites are positioned in orbits between 150 and 36,000 km altitude.


Based on wavelength, remote sensing can be classified as:
i. optical remote sensing (0.3–3 µm)
ii. thermal remote sensing (3–5 µm and 8–16 µm)
iii. microwave remote sensing (1 mm to 1 m)


▪ All objects with a temperature above absolute zero (0 K) emit EM energy (in the 3.0–100 µm range).
▪ A human being has a normal temperature of 98.6 °F (37 °C).
▪ Our eyes are only sensitive to visible energy (0.4–0.7 µm); humans sense thermal energy through touch, while detectors (sensors) can be made sensitive across the EM spectrum.
▪ All objects (vegetation, soil, rock, water, concrete, etc.) selectively absorb solar short-wavelength energy and radiate thermal infrared energy.
Thermal remote sensing is used to measure:
▪ land and ocean surface temperature,
▪ atmospheric temperature and humidity,
▪ trace gas concentrations,
▪ radiation balance,
▪ emissivity.


Orbits:
Geostationary and polar-orbiting satellites
▪ An orbit is the circular path followed by a satellite in its revolution around the Earth.
▪ Satellites have several unique characteristics which make them particularly useful for remote sensing of the Earth's surface. These are the orbit (geostationary, polar, or sun-synchronous), the sensors and the swath.
❖ Nadir point: the surface directly below the satellite.
❖ Steerable sensors on satellites can view an area (off nadir) before and after the orbit passes over a target.


Swath
As a satellite revolves around the Earth, the sensor "sees" a certain portion of the Earth's surface. The area imaged on the surface is referred to as the swath.


▪ The following orbit characteristics are relevant for remote sensing:
1. Altitude: distance (in km) from the satellite to the mean surface level of the earth.
i. polar orbit satellites: 600–800 km, and
ii. geostationary orbit: at 36,000 km
The distance influences the extent and detail of the view.

2. Inclination angle
➢ The angle (in degrees) between the orbit and the equator.
➢ Determines which latitudes can be observed.
➢ E.g. if the inclination is 70°, the satellite flies over the earth between latitudes 70° North and 70° South; latitudes beyond this cannot be observed.


3. Period
➢ The time (in minutes) required to complete one full orbit.
E.g. a polar satellite orbiting at 800 km altitude has a period of about 90 minutes; that is roughly 28,000 km/hour, i.e. almost 8 km/s (see the check after this list).
➢ The speed of the platform has implications for the type of image acquired.
4. Repeat cycle
➢ The time (in days) between two successive identical orbits.
➢ The revisit time is determined by the repeat cycle together with the pointing capability of the sensor.
➢ Pointing capability: the possibility of the sensor platform to look sideways.
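A back-of-the-envelope check of the period example above (assuming a mean Earth radius of 6371 km and a circular orbit):

```python
import math

R_EARTH = 6371.0   # mean Earth radius, km
altitude = 800.0   # orbit altitude from the example, km
period_min = 90.0  # orbital period from the example, minutes

circumference = 2 * math.pi * (R_EARTH + altitude)  # orbit length, km
speed_kmh = circumference / (period_min / 60.0)     # km per hour
print(f"{circumference:.0f} km per orbit -> {speed_kmh:.0f} km/h "
      f"= {speed_kmh / 3600:.1f} km/s")
# ~45,000 km per orbit -> ~30,000 km/h = ~8.3 km/s, the same order as the slide's figures
```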



Geostationary orbits

▪ Satellites which view the same portion of the earth's surface at all times.
▪ Placed at an altitude of 36,000 km above the equator (inclination angle of 0°). At this distance, the orbital period of the satellite equals the rotation period of the earth.
▪ The satellite is therefore at a fixed position relative to the earth.
▪ Commonly used for meteorology and communications.
▪ Today geostationary and polar orbiters are used together in modern meteorology, since together they offer a continuous view with high resolution.
Polar orbits
❑ Follow a north-south orbit which, in conjunction with the earth's rotation (west-east), allows them to cover most of the earth's surface over a period of time.
❑ The satellite travels northward on one side of the earth and southward on the second half of its orbit.
❑ Orbits with an inclination angle between 80 and 100 degrees, enabling observation of the whole globe.
❑ Placed in orbit at 600–800 km altitude.


▪ Many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. These are called near-polar orbits.

▪ Many of these satellite orbits are also sun-synchronous, such that they cover each area of the world at a constant local time of day, called local sun time.


Sun-synchronous orbit:
❑ Passes overhead at the same local solar time.
❑ Crosses the equator at mid-morning (around 10:30 hr).
❑ Allows the satellite to record images at two fixed times (day and night) in 24 hours, though optical imaging does not work at night.
E.g. Landsat, SPOT, IRS.


Some known satellites

◼ NOAA-AVHRR (1100 m)
◼ GOES (700 m)
◼ MODIS (250, 500, 1000 m)
◼ Landsat TM and ETM (30 – 60 m)
◼ SPOT (10 – 20 m)
◼ IKONOS (4, 1 m)
◼ Quickbird (0.6 m)
Image: AVHRR (Advanced Very High Resolution Radiometer), NASA


GOES (Geostationary Operational
Environmental Satellites) IR 4



MODIS (250 m)



Landsat TM
(False Color Composite)



SPOT (2.5 m)



QUICKBIRD (0.6 m)



IKONOS (4 m Multispectral)



IKONOS (1 m Panchromatic)



RADAR
(Radio Detection and Ranging)

Image: NASA 2005



LIDAR
(Light Detection and Ranging)

Image: Bainbridge Island, WA, courtesy Puget Sound LIDAR Consortium, 2005



▪ Remote sensors acquire data using scanning systems.
▪ Scanners employ a sensor with a narrow instantaneous field of view (IFOV).
▪ Scanning systems can be mounted on both aircraft and satellite platforms and have essentially the same operating principles.
▪ Scanners are passive sensors that capture the reflected or emitted energy intensity from observed objects into digital picture elements called pixels.
▪ A scanning system is composed of sensors and detectors.


Types of scanner are described as follows:
▪ Thermal scanner: collects data in the longer infrared wavelengths (8–13 μm range), which represent the actual temperature (thermal energy) emitted from the object.

▪ Thematic Mapper (TM) or Multispectral Scanner (MSS): collects data in several selected bandwidths simultaneously, between visible light and thermal bandwidths (0.4–8.0 μm).

There are two methods of scanning employed to acquire multispectral image data:
i. Across-track scanners
ii. Along-track scanners


Across-track scanners
➢ Scan the Earth perpendicular to the direction of motion of the sensor platform (i.e. across the swath),
➢ using a rotating or oscillating mirror.
➢ The IFOV of the sensor and the altitude of the platform determine the ground resolution cell viewed, i.e. the spatial resolution (see the sketch after this list).
➢ The rotation or movement of the mirror creates geometrical distortion in the satellite image.
➢ Because of this sweeping property it is called a whiskbroom scanner.
➢ Examples of across-track scanner systems: NOAA AVHRR, Landsat.
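As referenced in the list above, a minimal sketch of the ground-cell computation using the small-angle approximation D ≈ H × IFOV (all numbers are illustrative; the second call is only loosely Landsat-like):

```python
def ground_cell_size(altitude_m, ifov_mrad):
    """Approximate ground resolution cell D = H * IFOV.

    altitude_m -- platform altitude above ground (m)
    ifov_mrad  -- instantaneous field of view (milliradians)
    """
    return altitude_m * ifov_mrad * 1e-3

print(ground_cell_size(1000, 2.5))       # airborne case: 2.5 m cell
print(ground_cell_size(705_000, 0.043))  # satellite case: ~30 m cell
```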



Along-track scanners
➢ Scan lines parallel to the direction of motion of the satellite; i.e. the scan direction is along the track of the platform.
➢ Each pixel across the swath has its own detector; this is now the common method of data acquisition.
➢ It uses a linear array of detectors, typically charge-coupled devices (CCDs), instead of a scanning mirror for measuring the EMR.
➢ The IFOV and area coverage determine the spatial resolution (the level of detail in the data).
➢ It is called a pushbroom scanner because the detector array keeps moving forward with the direction of the satellite.
➢ Less geometrical distortion is present compared to the across-track scanner, because of less noise and a more stable geometry.
➢ Examples of systems that use along-track scanners: SPOT (HRV), IKONOS.


Image Data Characteristics

Image quality is primarily determined by the sensor-platform system; these sensor and platform characteristics are described by the concept of resolution.

All remote sensing systems have four types of resolution:
▪ spatial
▪ spectral
▪ temporal
▪ radiometric


1. Spatial resolution

▪ Spatial resolution, commonly referred to as "pixel size" in digital images, is a key element of both digital and air photo remote sensing.
▪ Spatial resolution refers to the smallest unit area measured.
▪ It indicates the minimum detail of objects that can be distinguished.
▪ The detail noticeable in an image is dependent on the spatial resolution of the sensor and refers to the size of the smallest possible feature that can be detected.
▪ A large area covered by a pixel means low spatial resolution, and vice versa.
▪ Generally speaking, the finer the resolution, the less total ground area can be seen.
Spatial Resolution

High vs. Low?

Source: Jensen (2000)


2. Spectral resolution

▪ Spectral resolution is related to the widths of the spectral wavelength bands that the sensor is sensitive to.
▪ It is the amount of EMR in the specific spectrum range measured, i.e. the spectral width.
▪ The finer the spectral resolution, the narrower the wavelength range for a particular channel or band.
▪ E.g. panchromatic photos may be captured over the EMS range of 0.3 up to 0.9 µm, whereas an image captured in the visible part covers the range 0.4 up to 0.7 µm.


Spectral Resolution
3. Radiometric resolution

▪ Radiometric resolution refers to the smallest differences in levels of energy that can be distinguished by the sensor.
▪ It is the difference in energy that can be observed and recorded by the sensor.
▪ The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.


Radiometric Resolution
2-bit range: 0–3
6-bit range: 0–63
8-bit range: 0–255
10-bit range: 0–1023
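These ranges follow directly from the number of bits: an n-bit sensor distinguishes 2^n grey levels, coded 0 to 2^n − 1. A minimal sketch:

```python
def dn_range(bits):
    """Grey levels and maximum digital number (DN) for an n-bit sensor."""
    levels = 2 ** bits
    return levels, levels - 1

for bits in (2, 6, 8, 10):
    levels, max_dn = dn_range(bits)
    print(f"{bits:>2}-bit: {levels:>4} levels, DN range 0-{max_dn}")
```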
4. Temporal resolution

▪ Temporal resolution refers to the length of time it takes for a satellite to complete one entire orbit cycle.
▪ It is the frequency at which images are recorded/captured of a specific place on the earth.
▪ The more frequently a place is imaged, the better or finer the temporal resolution is said to be.
▪ For example, a sensor that captures an image of an agricultural area twice a day has better temporal resolution than a sensor that only captures that same image once a week.
▪ The revisit time is the (minimum) time between two successive image acquisitions over the same location on Earth.


Temporal Resolution
Image: two acquisition timelines over the same period – one with a 16-day revisit (July 2, July 18, August 3) and one with an 11-day revisit (July 1, July 12, July 23, August 3).


Data Selection Criteria
◼ Not all data are suitable for every purpose; the data must be selected using different criteria.
i. Spatial and temporal characteristics
The selection of the appropriate data type is determined by:
• the application in question,
• the amount of cloud cover,
• seasonal cycles and vegetation cover.
ii. Availability of image data: the size and accessibility of the imagery.
iii. Cost of the image data. The cost of image data depends on:
➢ the size of the area (coverage),
➢ the photo scale,
➢ the type of film, sensor (scanner or camera), etc.
Chapter 4
Digital Image Processing, Enhancement,
Visualization and Classification



Digital Image Processing
• Digital image processing may involve numerous procedures
including formatting/arrangement and correcting of the data,
digital enhancement to facilitate better visual interpretation.
• In order to process remote sensing imagery digitally, the data must
be recorded and available in a digital form suitable for storage on
a computer tape or disk.
• Several commercially available software systems have been
developed specifically for remote sensing image processing and
analysis.
• The most common image processing functions available in image
analysis systems can be categorized into the following four
categories:
• Preprocessing (Image rectification and restoration)
• Image Enhancement
• Image Transformation
• Image Classification and Analysis
Image Preprocessing
• Preprocessing functions involve those operations that are normally required prior to the main data analysis and extraction of information.
• Preprocessing operations are generally grouped as radiometric or geometric corrections.
• The objective of image preprocessing functions is to improve the appearance of the imagery to assist in visual interpretation and analysis (image enhancement).
Cont.…
A-Geometric Correction
• Geometric corrections include correcting for geometric
distortions due to
✓ sensor-Earth geometry variations, and
✓ conversion of the data to real world coordinates (e.g. latitude and
longitude) on the Earth's surface.
• It is necessary when accurate area, distance and direction
measurements are required to be made from the imagery.
• It is achieved by transforming the data from one grid system into
another grid system using a geometric transformation.



Cont.….
Geometric distortions are classified into two types:
✓ Internal distortion, which results from the geometry of the sensor.
✓ External distortion, which results from the attitude of the sensor or the slope of the object.
The geometric registration process involves two steps:
– identifying the image coordinates (i.e. row, column) of several clearly discernible points, called ground control points (GCPs), and
– resampling.
Geometric distortions can be corrected in two ways:
1. Georeferencing
2. Geocoding
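To illustrate the georeferencing idea, a minimal sketch (with entirely hypothetical GCPs) that fits a first-order (affine) transformation from image (column, row) coordinates to map coordinates by least squares:

```python
import numpy as np

# Hypothetical GCPs: image (col, row) -> map (easting, northing)
img = np.array([[10, 15], [480, 20], [25, 500], [470, 490]], dtype=float)
mapc = np.array([[300100, 900950], [301980, 900930],
                 [300160, 899020], [301940, 899060]], dtype=float)

# Affine model: E = a0 + a1*col + a2*row and N = b0 + b1*col + b2*row
A = np.column_stack([np.ones(len(img)), img])            # design matrix
coef_e, *_ = np.linalg.lstsq(A, mapc[:, 0], rcond=None)  # easting coefficients
coef_n, *_ = np.linalg.lstsq(A, mapc[:, 1], rcond=None)  # northing coefficients

def to_map(col, row):
    """Transform an image coordinate to map coordinates with the fitted model."""
    return coef_e @ [1, col, row], coef_n @ [1, col, row]

print(to_map(240, 250))  # map position of an arbitrary pixel
```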

B. Radiometric Correction Methods
• Radiometric correction is a preprocessing method to correct the
spectral errors and distortions caused by sensors, sun angle,
topography and the atmosphere.
• Radiometric corrections include correcting the data for sensor
irregularities and unwanted sensor or atmospheric noise, and converting
the data so they accurately represent the reflected or emitted radiation
measured by the sensor.



Cont.…..
Radiometric corrections are grouped into two:
1. 'Cosmetic' rectification to compensate for data errors:
i. periodic line dropouts
ii. line striping
iii. random noise or spike noise
2. Atmospheric corrections to compensate for the effect of atmospheric and illumination parameters, such as:
i. haze,
ii. sun angle and
iii. skylight on the image data


Cont.……
Periodic line dropouts occur due to recording problems, when one of the detectors of the sensor in question either gives wrong data or stops functioning.


Cont.……
Line striping often occurs due to non-identical detector response.
Line striping is far more common than line dropouts.



Image Enhancement
• Enhancements are used to make an image easier for visual interpretation and to improve understanding of the imagery.
• The goal of image enhancement is to improve the visual interpretability of an image.
Image enhancement techniques include:
1. contrast enhancement
2. density slicing
3. frequency filtering
4. band ratioing


Histogram/Contrast enhancement

• Contrast generally refers to the difference in luminance or grey level values in an image.
• It can be defined as the ratio of the maximum intensity to the minimum intensity over an image.
• Some examples of histogram enhancement are the following.

• Contrast enhancement has two types:
✓ linear contrast stretch
✓ histogram-equalized stretch


Linear Contrast Stretch
• This is the simplest contrast stretch algorithm.
• This involves identifying lower and upper bounds from the
histogram (usually the minimum and maximum brightness values
in the image) and applying a transformation to stretch this range to
fill the full range.



• In our example, the minimum value (occupied by actual
data) in the histogram is 84 and the maximum value is 153
• These 70 levels occupy less than one-third of the full
256 levels available.
• A linear stretch uniformly expands this small range to cover
the full range of values from 0 to 255.
• This enhances the contrast in the image with light toned
areas appearing lighter and dark areas appearing darker,
making visual interpretation much easier.
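A minimal sketch of the stretch just described, using the slide's 84–153 example range:

```python
import numpy as np

def linear_stretch(band, low=84, high=153, out_max=255):
    """Linearly map the occupied range [low, high] onto the full [0, out_max] range."""
    stretched = (band.astype(np.float64) - low) / (high - low) * out_max
    return np.clip(stretched, 0, out_max).astype(np.uint8)

band = np.array([[84, 100], [130, 153]])
print(linear_stretch(band))  # 84 -> 0 and 153 -> 255
```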



Histogram-equalized stretch
A histogram is a graphical representation of the brightness values that comprise an image. The brightness values (i.e. 0–255) are displayed along the x-axis of the graph, and the frequency of occurrence of each of these values in the image is shown on the y-axis.
A uniform distribution of the input range of values across the full range may not always be an appropriate enhancement, particularly if the input range is not uniformly distributed.
In this case, a histogram-equalized stretch may be better.
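A minimal sketch of a histogram-equalized stretch, mapping DNs through the cumulative histogram (the test image is random):

```python
import numpy as np

def equalize(band, levels=256):
    """Histogram-equalized stretch: map each DN through the cumulative
    histogram so output levels are used with roughly equal frequency."""
    hist = np.bincount(band.ravel(), minlength=levels)
    cdf = hist.cumsum() / band.size                 # cumulative distribution, 0..1
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[band]

band = np.random.randint(84, 154, size=(100, 100))  # narrow-range test image
out = equalize(band)
print(out.min(), out.max())  # output now spans a much wider range
```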



Frequency (spatial) filtering
• Spatial filters are designed to highlight or suppress specific features in an image based on their spatial frequency.
• Spatial filtering is the process of dividing the image into its constituent spatial frequencies and selectively altering certain spatial frequencies to emphasize some image features.
• It is also used for image classification.
• Whereas spectral filters serve to block or pass energy over various spectral ranges, spatial filters emphasize or deemphasize image data of various spatial frequencies.
• This technique increases the analyst's ability to discriminate detail. The four types of spatial filters used in remote sensor data processing are:
✓ low-pass filters,
✓ band-pass filters,
✓ high-pass filters, and
✓ directional, or edge detection, filters


Low-pass filters
• A low-pass filter is designed to emphasize low-frequency features (larger, homogeneous areas) and deemphasize the high-frequency components of an image.
• Thus, low-pass filters generally serve to smooth the appearance of an image.
• Average and median filters, often used for radar imagery, are examples of low-pass filters.
High-pass filters
• High-pass filters do the opposite and serve to sharpen the appearance of fine detail in an image.
• A high-pass filter is designed to emphasize high-frequency features and deemphasize the low-frequency components of an image.
• One implementation of a high-pass filter first applies a low-pass filter to an image and then subtracts the result from the original, leaving behind only the high spatial frequency information (see the sketch below).
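A minimal sketch of that low-pass/high-pass relationship, using a 3x3 moving average as the low-pass filter (random test data; scipy is assumed to be available):

```python
import numpy as np
from scipy.ndimage import uniform_filter

band = np.random.randint(0, 256, size=(64, 64)).astype(np.float64)

low_pass = uniform_filter(band, size=3)  # 3x3 moving average smooths the image
high_pass = band - low_pass              # subtracting the smooth part keeps fine detail

print(high_pass.mean().round(3))  # close to zero: only local detail remains
```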



Image Classification
• Classification is a process by which a set of items is grouped into classes based on common characteristics.
• Classification of satellite image data is based on placing pixels with similar values into groups and identifying the common characteristics of the items represented by the pixels.
• There are two broad classes of image classification:
✓ visual classification by a human analyst, and
✓ digital image classification.
• A human analyst attempting to classify features in an image uses the elements of visual interpretation to identify homogeneous groups of pixels which represent various features or land covers.
• Digital image classification, by contrast, uses the spectral information represented by the digital numbers in one or more spectral bands and attempts to classify each individual pixel based on this spectral information.
• The overall objective of image classification is to automatically categorize all pixels in an image into land cover classes or themes, for further decision making or planning purposes.
What activities should be done before the classification?
Many activities are involved in image classification, but the most common activities that should be done are typically five steps:

1. Selection and preparation of the image data
2. Definition of the clusters in the feature space
3. Selection of the classification algorithm
4. Running the actual classification, and
5. Validation of the result


Cont.….
1. Selection and preparation of the image data.

Depending on the cover types to be classified,


➢ the most appropriate sensor,
➢ the most appropriate date(s) of acquisition and
➢ the most appropriate wavelength bands should be selected.



Cont.….

2. Definition of the clusters in the feature space

There are two approaches, based on the method used:


➢ supervised classification and
➢ unsupervised classification.



Cont.…
3. Selection of classification algorithm
Once the spectral classes have been defined in the feature space,
the operator needs to decide on how the pixels (based on their
DN-values) are assigned to the classes. The assignment can be
based on different criteria.

4. Running the actual classification.

Once the training data have been established and the classifier
algorithm selected, the actual classification can be carried out.
This means that, based on its DN-values, each individual pixel in
the image is assigned to one of the defined classes.



Cont.….
5. Validation of the result.
Once the classified image has been produced, its quality is assessed by comparing it to reference data (ground truth). This requires selection of a sampling technique, generation of an error matrix, and the calculation of error parameters.

What are the different types of classification techniques, and how do they differ?
There are two general approaches to image classification; they differ in how the classification is performed:
1. supervised and
2. unsupervised.


Supervised Classification
• Supervised classification allows the user to define the training data (or signatures) that tell the software what types of pixels to select for a certain land use.
• These samples are referred to as training areas.
• Facts about the area, knowledge of aerial photography, and experience in image interpretation permit the selection of training areas that give a better classification of the image.
• Through experience, supervised classification becomes easier and more accurate.


Supervised Classification

• In a supervised classification, the analyst identifies in the imagery


homogeneous representative samples of the different surface cover
types (information classes) of interest.
• Thus, the analyst is "supervising" the categorization of a set of
specific classes.
• The computer uses a special program or algorithm to determine the
numerical "signatures" for each training class.
• Once the computer has determined the signatures for each class,
each pixel in the image is compared to these signatures and labeled
as the class it most closely "resembles" digitally.
Supervised Classification
• The five basic steps involved in a typical supervised classification
procedure are as follows:
➢ The training stage (collecting GCP points)
➢ Feature selection (determine LULC types based on GCP points &
DN values)
➢ Selection of appropriate classification algorithm
➢ Post classification
➢ Accuracy assessment
Unsupervised Classification
• Unsupervised classifiers do not utilize training data as the basis for
classification.
• In an unsupervised classification a clustering algorithm
automatically finds and defines a number of clusters in the feature
space.
• In unsupervised classification, the signatures are generated
automatically by a clustering algorithm such as ISODATA, which
stands for "Iterative Self-Organizing Data Analysis Technique."
• The resulting classification has less discerning ability than a
supervised classification due to the lack of training data supplied to
the clustering algorithm.
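To make the clustering idea concrete, here is a minimal Python sketch of k-means, a simpler relative of ISODATA (ISODATA additionally splits and merges clusters automatically); the two-band DN-values and the number of clusters are hypothetical:

import numpy as np

def kmeans(pixels, k, n_iter=20, seed=0):
    # Group pixels (an n x bands array of DN-values) into k spectral clusters.
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(n_iter):
        # Assign each pixel to the nearest cluster centre in feature space.
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centre to the mean of the pixels assigned to it.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers

pixels = np.array([[30, 40], [32, 43], [200, 180], [198, 185]], dtype=float)
labels, centers = kmeans(pixels, k=2)
print(labels)  # e.g. [0 0 1 1] -- two spectral clusters found without training data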
Unsupervised Classification
• Spectral classes are grouped first, based solely on the numerical
information in the data.
• Clustering algorithms are used to determine the natural (statistical)
groupings or structures in the data.
• Usually, the analyst specifies how many groups or clusters are to be
looked for in the data.
• Unsupervised classification is not completely without human
intervention; however, it does not start with a pre-determined set of
classes as in a supervised classification.
Classification Algorithm
After the training sample sets have been defined, classification of the
image can be carried out by applying a classification algorithm.
Several classification algorithms exist; the choice depends on the
purpose of the classification and on the characteristics of the image
and training data.

The most frequently used classification algorithms are:
➢ maximum likelihood,
➢ minimum distance, and
➢ parallelepiped (box classifier).
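As an illustration of the simplest of these, the following Python sketch implements a minimum-distance-to-mean classifier; the two-band class signatures and pixel values are hypothetical and not taken from any particular sensor:

import numpy as np

# Hypothetical class signatures: mean DN-values per band from training areas.
signatures = {
    "water":      np.array([20.0, 15.0]),
    "vegetation": np.array([45.0, 120.0]),
    "bare soil":  np.array([90.0, 80.0]),
}

def classify(pixel):
    # Assign a pixel (vector of DN-values) to the class with the nearest mean.
    return min(signatures, key=lambda c: np.linalg.norm(pixel - signatures[c]))

print(classify(np.array([25.0, 18.0])))   # -> water
print(classify(np.array([50.0, 110.0])))  # -> vegetation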
Classification Accuracy Assessment
and Error Matrix
Accuracy Assessment;-
• Accuracy assessment is an important part of any classification project.
• It compares the classified image to another data source that is
considered to be accurate or ground truth data.
• Accuracy assessment is a general term for comparing the classification to
geographical data that are assumed to be true, in order to determine the accuracy
of the classification process.
• Sources of reference data include, among other things, ground truth
collected in the field, higher-resolution satellite images, and maps
derived from aerial photo interpretation.
• Ground truth can be collected in the field; however, this is time
consuming and expensive.
• Ground truth data can also be derived from interpreting high-resolution
imagery, existing classified imagery, or GIS data layers.
Classification Accuracy Assessment
and Error Matrix
• From the accuracy assessment cell array, two kinds of reports can
be derived. These are
i. Error matrix and
ii. Accuracy report.
i. Error matrix
• Error matrix (confusion matrix) – compares ground truth data with
results of classification
• The error matrix simply compares the reference points to the
classified points in a c × c matrix, where c is the number of
classes.
• The error matrix (also known as a confusion matrix) summarizes the
relationship between two data sets, typically a classification map or
model and reference test information or an alternative model.
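The construction of such a matrix is straightforward; the following Python sketch builds a c × c error matrix from hypothetical reference and classified labels:

import numpy as np

classes = ["water", "forest", "urban"]          # hypothetical classes
reference  = ["water", "water", "forest", "forest", "urban", "urban"]
classified = ["water", "forest", "forest", "forest", "urban", "water"]

c = len(classes)
idx = {name: i for i, name in enumerate(classes)}
matrix = np.zeros((c, c), dtype=int)
for ref, cls in zip(reference, classified):
    matrix[idx[ref], idx[cls]] += 1   # rows = reference, columns = classified

print(matrix)
# [[1 1 0]   diagonal = correctly classified samples,
#  [0 2 0]   off-diagonal = confusion between classes
#  [1 0 1]]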
ii. Accuracy report
• The accuracy report calculates statistics of the percentages of
accuracy, based upon the results of the error matrix.
• When interpreting the reports, it is important to observe the
percentage of correctly classified pixels and to examine the nature of
the errors, commonly expressed as producer's accuracy and user's accuracy.
Kappa Coefficient
• The Kappa coefficient expresses the proportionate reduction in
error achieved by a classification process compared with the error
of a completely random classification.
• For example, a Kappa value of 0.82 implies that the classification
process avoids 82 percent of the errors that a completely random
classification would generate.
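From an error matrix, the overall accuracy and the Kappa coefficient can be computed as in this Python sketch, which reuses the hypothetical matrix built above:

import numpy as np

def accuracy_and_kappa(matrix):
    # Overall accuracy and Kappa coefficient from an error matrix.
    n = matrix.sum()
    observed = np.trace(matrix) / n   # proportion of correctly classified samples
    # Chance agreement: products of row and column totals, summed over classes.
    expected = (matrix.sum(axis=1) * matrix.sum(axis=0)).sum() / n**2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

matrix = np.array([[1, 1, 0], [0, 2, 0], [1, 0, 1]])
acc, kappa = accuracy_and_kappa(matrix)
print(f"overall accuracy = {acc:.0%}, kappa = {kappa:.2f}")  # 67%, 0.50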
Chapter 5
Visual image interpretation
Introduction, data acquisition and interpretation
• Interpretation and analysis of remote sensing imagery involves the
identification and/or measurement of various targets in an image in order to
extract useful information about them.
• Image interpretation, or information extraction, is the extraction of
qualitative and quantitative information about the shape, location,
structure, function, quality, condition, and relationships of objects
by using different mechanisms.
• Targets in remote sensing images may be any feature or object which can
be observed in an image, and have the following characteristics:
• Targets may be a point, line, or area feature.
• The target must be distinguishable; it must contrast with other features around it
in the image.
• Much interpretation and identification of targets in remote sensing imagery is
performed manually or visually, i.e. by a human interpreter.
• When remote sensing data are available in digital format, digital processing
and analysis may be performed using a computer.
• Digital processing and analysis can be carried out to identify targets and
extract information automatically, without manual intervention by a human
interpreter.
• Manual interpretation and analysis dates back to the early beginnings
of remote sensing, with air photo interpretation.
• Digital processing and analysis is more recent, following the advent of digital
recording of remote sensing data and the development of computers.
• Both manual and digital techniques for interpreting remote sensing
data have their respective advantages and disadvantages.
• Generally, manual interpretation requires little, if any,
specialized equipment, while digital analysis requires specialized, and
often expensive, equipment.
• Manual interpretation is often limited to analyzing a single image at a time,
whereas the computer environment is more amenable to handling complex images.
• In this sense, digital analysis is useful for simultaneous analysis of many
spectral bands and can process large data sets much faster than a human
interpreter.
• Manual interpretation is a subjective process, meaning that the
results will vary with different interpreters. Digital analysis is based on the
manipulation of digital numbers in a computer and is thus more objective,
generally producing more consistent results.
ELEMENTS OF VISUAL INTERPRETATION
When dealing with image data visualized as pictures, a set of terms is
required to express and define the characteristics present in the picture.
These characteristics are called interpretation elements, and they
provide guidelines on how to recognize certain objects.

Some common interpretation elements are: tone, texture, shape,
site, pattern, size, shadow, color, and association (relationship or
context).
1. Tone (Hue):
• Tone (Hue) refers to the relative brightness or colour of objects in an
image.
• The tonal expression of objects on the image is directly related to
the amount of light (energy) reflected from the surface or the object .
• Generally, tone is the fundamental element for distinguishing
between different targets or features.
• Variations in tone also allow the elements of shape, texture, and
pattern of objects to be distinguished.
• Without tonal differences, the shapes, patterns, and textures of
objects could not be discerned. (recognize/identify)
• Example: distinguishing water from sand on a black-and-white image,
where water typically appears dark and dry sand appears bright.

An example of tone.
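Since tone is simply relative brightness, a crude illustration in Python is a threshold on a single band; the DN-values and the threshold of 60 below are hypothetical, chosen only to separate dark water from bright sand:

import numpy as np

# Made-up DN-values of one black-and-white band; low = dark, high = bright.
band = np.array([[12, 15, 180],
                 [10, 170, 200]])
label = np.where(band < 60, "water", "sand")  # hypothetical threshold
print(label)  # dark pixels labeled water, bright pixels labeled sand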
2. Shape or Form:
• Refers to the general form, structure, or outline of individual
objects.
• Shape can be a very distinctive clue for interpretation.
• Straight edge shapes typically represent urban or agricultural
(field) targets, while natural features, such as forest edges, are
generally more irregular in shape, except where man has created a
road or clear cuts.
• The shape of objects often helps to determine the character of the
object (built-up areas, roads and railroads, agricultural fields, etc.).
3. Size:
• Size of objects in an image is a function of scale.
• Size of object can be considered in relative or absolute sense .
• For example, if an interpreter had to distinguish zones of land
use and had identified an area with a number of buildings in
it, large buildings such as factories or warehouses would
suggest commercial property, whereas small buildings would
indicate residential areas.
4. Pattern:
• Refers to the spatial arrangement of objects or the characteristic
repetition of certain forms.
• Pattern can be described by terms such as concentric, radial, or
checkerboard.
• Typically, an orderly repetition of similar tones and textures will
produce a distinctive and ultimately recognizable pattern.
• Orchards with evenly spaced trees and urban streets with
regularly spaced houses are good examples of pattern.
5. Texture:
• Texture refers to the arrangement and frequency of tonal variation in particular
areas of an image.
• Texture can often be related to terrain roughness.
• Rough textures would consist of a mottled/different tone where the grey levels
change abruptly in a small area, whereas smooth textures would have very little
tonal variation.
• Smooth textures are most often the result of uniform, even surfaces, such as
fields, asphalt, or grasslands.
• A target with a rough surface and irregular structure, such as a
forest canopy, results in a rough textured appearance
• For example, homogeneous grassland exhibits a smooth texture, while
coniferous forests usually show a coarse texture; however, this will
depend on the scale of the photograph or image.
An example of texture.
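Texture can also be quantified digitally: one common, simple measure is the variance of DN-values within a small moving window, where high local variance corresponds to rough texture and low variance to smooth texture. A minimal Python sketch (the window size and DN-values are hypothetical):

import numpy as np

def local_variance(image, size=3):
    # Variance of DN-values in a size-by-size window around each pixel.
    h, w = image.shape
    r = size // 2
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            window = image[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            out[i, j] = window.var()
    return out

smooth = np.full((5, 5), 100.0)                            # uniform grassland patch
rough = np.random.default_rng(0).integers(0, 255, (5, 5))  # forest-canopy-like variation
print(local_variance(smooth).mean(), local_variance(rough).mean())  # ~0 vs. large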
6. Association (Relationship or Context):
• Association takes into account the relationship between other
recognizable objects or features in proximity to the target of
interest.
• The identification of features that one would expect to associate
with other features may provide information to facilitate
identification.
• For example, commercial properties may be associated with
proximity to major transportation routes, whereas residential areas
would be associated with schools and sports fields; similarly, boats
are found on water bodies such as seas, not on land.
Mulualem A & Endalkachew S
7. Color:
• Color is more convenient for the identification of object details.
• For example, vegetation types and species can be more easily
interpreted by less experienced interpreters using color
information.
• Sometimes color infrared photographs or false color images will
give more specific information, depending on the emulsion of
the film or the filter used and the object being imaged.
8. Shadow:
• Shadow may provide an idea of the profile and relative
height of a target or targets, which can make identification
easier.
• However, shadows can also reduce or eliminate interpretation in
their area of influence, since targets within shadows are much
less (or not at all) discernible from their surroundings.
Application of Visual Image Interpretation

There are different application areas of visual image interpretation:
✓ Land Use Land Cover mapping and
✓ Soil Mapping are the dominant ones.
CHAPTER SIX
GPS and REMOTE SENSING
Contents of the Chapter
❖Introduction
❖Satellite based positioning
❖Absolute & Relative positioning
❖ Purpose and working of GPS
❖Application of GPS
GPS (Global Positioning System)
Introduction to GPS
What is GPS? How does it work?
• Global Positioning System (GPS) is a satellite-based radio-
positioning, time-transfer and navigation system, that was developed,
designed, financed, deployed, and operated by the U.S. Department of
Defense.
• Initially, GPS was developed as a military system to fulfill U.S.
military needs. However, it was later made available to civilians, and
is now a dual use system that can be accessed by both military and
civilian users.
• GPS has also demonstrated significant benefits to the civilian
community, which is applying GPS to a rapidly expanding number of
applications.
How does GPS work?

• Almost all GPS receivers work on the principle of triangulation.

• By knowing its distance from three or more satellites, the receiver
can calculate its position by solving a set of equations.

• Information from three satellites is needed to calculate longitude and
latitude at a known elevation; four satellites are needed to include
altitude (3D) as well.
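To make the idea of solving a set of range equations concrete, here is a minimal 2-D Python sketch of trilateration with made-up beacon positions and distances; real GPS solves the same kind of system in 3-D and must additionally estimate the receiver clock bias, which is one reason a fourth satellite is needed:

import numpy as np

sats = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # hypothetical positions
true_pos = np.array([3.0, 4.0])
ranges = np.linalg.norm(sats - true_pos, axis=1)         # measured distances

# Subtracting the first range equation from the others yields a linear system.
A = 2 * (sats[1:] - sats[0])
b = (np.sum(sats[1:]**2, axis=1) - np.sum(sats[0]**2)
     - (ranges[1:]**2 - ranges[0]**2))
pos, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pos)  # -> [3. 4.], the receiver position recovered from the ranges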
Cont.…..

• The overall GPS system includes at least 24 satellites in orbit to
cover the whole earth, orbiting roughly 20,200 kilometers (about
12,550 miles) above the earth and inclined at 55°.

• These satellites continuously broadcast their position, a timing
signal, and other information.

• By combining the measurements from four different satellites,
users with receivers can determine their 3-dimensional position,
currently to within 4–20 meters (13–66 feet) accuracy.
Absolute Positioning (hand-held GPS)
• This mode of positioning relies upon a single receiver station.
• It is also referred to as stand-alone GPS because, unlike
differential positioning, ranging is carried out strictly between the
satellite and the receiver station, without a ground-based reference
station to assist with the computation of error corrections.
Relative Positioning (RTK GPS)
Relative or differential positioning carries the triangulation
principle one step further, with a second receiver at a known
reference point (ground segment).

To further facilitate the determination of a point's position relative
to the known earth-surface point, this configuration requires the
collection of an error-correcting message from the reference
receiver. Differential-mode positioning relies upon established
control points.
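A strongly simplified sketch of this idea in Python: because the reference receiver occupies a point with known coordinates, the difference between its known and its measured position estimates the common error, which can then be applied to the roving receiver. (In a real DGPS/RTK system the corrections are applied per satellite to the range measurements rather than to final coordinates; all numbers here are made up.)

import numpy as np

# Known surveyed position of the reference (base) station vs. what it measured:
base_true     = np.array([500000.0, 900000.0])  # hypothetical easting/northing
base_measured = np.array([500003.2, 899998.5])
correction = base_true - base_measured          # shared error at this moment

rover_measured = np.array([500120.7, 900045.1])
rover_corrected = rover_measured + correction
print(rover_corrected)  # rover position with the common error removed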
RTK GPS
Hand Held GPS
GPS Segments:
➢ Space segment
➢ Control segment
➢ User segment

To determine a position, all three segments must be functional;
together they enable GPS to work on the principle of triangulation.
GPS Segments:
Space Segment:
➢ 4 to 8 satellites are typically visible from any
unobstructed viewing location on earth
➢ GPS satellites fly in circular orbits at an altitude of
about 20,200 km, with a period of about 12 hours.
➢ Powered by solar energy, the satellites continuously
orient themselves to point their solar panels toward
the sun and their antenna toward the earth.
➢ Orbital planes are centered on the Earth.
➢ Each plane is tilted about 55° relative to Earth's
equator in order to cover the polar regions.
Space Segment:
Ground Segment:
➢ The ground segment has:
o one master control station,
o one alternative master control station,
o 12 command and control antennas and
o 16 monitoring sites
All of these are under the control of the USA for GPS, and of
Russia for GLONASS.
Ground Segment:
User Segment:
➢ The user's GPS receiver constitutes the user segment of the GPS system.
➢ GPS receivers are generally composed of:
o an antenna, tuned to the frequencies transmitted by
the satellites,
o receiver-processors, and
o a highly-stable clock, commonly a crystal oscillator.
➢ They can also include a display for showing location
and speed information to the user.
➢ A receiver is often described by its number of
channels, which signifies how many satellites it can
monitor simultaneously.
➢ Recent receivers usually have between twelve
and twenty channels.
User Segment: Applications
➢ Military.
➢ Search and rescue.
➢ Disaster relief.
➢ Surveying.
➢ Marine, aeronautical and terrestrial navigation.
➢ Remote controlled vehicle and robot guidance.
➢ Satellite positioning and tracking.
➢ Shipping.
➢ Geographic Information Systems (GIS).
➢ Recreation.
GPS Segment: Summary
What attracts us to use GPS:

➢ The relatively high positioning accuracies.
➢ The capability of determining velocity and time to an
accuracy corresponding with position.
➢ The signals are available to users anywhere on the globe: in
the air, on the ground, or at sea.
➢ It is a positioning system with no user charges that requires
only relatively low-cost hardware.
➢ It is an all-weather system, available 24 hours a day,
provided the signals are visible.
➢ It provides position information in three dimensions; that is,
vertical as well as horizontal information is provided.
Compared to conventional surveying technology, GPS:
❖ Is faster.
❖ Requires less labor (often a single operator).
❖ Requires less training (simple to use).
❖ Is comparatively more accurate.
Advantages / Disadvantages…
ADVANTAGES
• Mobility
• Global coverage
• Day or Night
• Accuracy
DISADVANTAGES
• Requires clear view of sky
• Ionospheric influences
• Multi-path
- buildings, canyons, trees
- large, wet leaves
- large flat buildings
- chain link fences
Applications
• Military (DoD) – civilian uses now exceed military
• Space Travel (NASA)
• Survey, Mapping & GIS
• Resource and Asset Management
  (Environmental & Forestry; Mining, Oil & Gas; Precision Construction)
• Agriculture
• Utilities & Construction
• Transportation
• Vehicle Security (Fleet Management)
• Public Safety
Emergency Management, Search & Rescue
Crime Prevention
• Timing & Synchronization
(banking, telecommunications)
• LBS - Location Based Services
(cell phones, wireless web)

Solar GPS Cattle Herder (noise or electric shock)
Other Applications…
• On-Board Vehicle Navigation Systems
• Vehicle Tracking Systems (beyond fleet management)
- Rental Car Companies
- GPS-measured Tolls – variable taxation (UK)
- Family/Friends vehicle location
- Crime: Stolen cars; Criminal tracking
- Accident notification systems
• Child/Senior/Pet Safety Tracking Systems
• Parole, Probation Tracking Systems
• Package/Asset Tracking Systems
• Bridge structural monitoring
• Sports and Broadcasting (Skiers, NASCAR, Sailboat races)
• Golf Courses (distance to next hole…)
• Geo-Caching (GPS scavenger/treasure hunts)
• Beer Bottle GPS
• etc., etc.

Parole Anklet
Car Navigation
Other Applications…
Tracking Systems
- Pet Collar ($300 + $20/month)
- Teddy Bears, Backpacks
- Implants…

Examples:
• Xega injectable GPS chip ($4,000 + $350/month), Mexico (requires an
additional wearable accessory)
• GTX Ambulator (Alzheimer's GPS shoes)
• Nano GPS tracker ($200 + $45/month): tracking, geo-fencing and a
panic button, for people or nativity scenes…
• Pet collars with a web or cell connection and a virtual geo-fence,
e.g. the Garmin Astro Pet Tracker (communicates with a base unit)
Other Applications…
Mobile Phones
• E911 (Enhanced 911) was passed into law in 1999…
- Either GPS or network (tower) based, or both
• Over 50% of all GPS receivers ever built have been for Cell Phones
• Most in the last 3 years
• 500,000,000 + phones have GPS
Smart Phones
• Almost 100% GPS enabled
• Cost is now less than $5 per phone
• Many are augmented GPS systems (A-GPS or GPS +)
• Cell towers (closer and faster than satellites)
• Wi-fi
• Inertial movement units (inertial navigation systems)
• GPS + WAAS, EGNOS, MSAS, Galileo, GLONASS…
Other Applications…
Mobile Phones
- iPhone
- gPhone
- Blackberry
- etc.…

Example apps: Here I Am (connect with other users and friends in your
contacts), iWant, Traffic/Routing/Directions, Yellow Pages (Starbucks,
car repair), Friend Finder, geotagged photos, G-Park (car finder),
Astronomy, Golf, Find lost (or stolen) phone, GPS simulators, Transit,
Weather, Virtual Sky.

There are now more GPS units in smart phones than in all other GPS
receivers combined…

Not strictly GPS (iPhone): GPS satellites (if a signal is available),
cell-tower triangulation and/or signal strength, plus a built-in
digital compass.
Other Applications…
Academics
• Study areas, sample sites
• Animal tracking (goats, birds, turtles)
• Survey documentation
• Photo geo-tagging
• Accuracy is improving
- While surveying the world's largest salt flat (Bolivia), readings
were 5 mm lower at the end of the day…
Q&A
?