Photogrammetry Lecture Note 231-1


WHAT IS PHOTOGRAMMETRY?

The term “photogrammetry” is derived from three Greek roots: phos (phot-), meaning light; gramma, meaning something drawn or written; and metron, meaning to measure. Photogrammetry can be defined as the science and art of determining the qualitative and quantitative characteristics of objects from images recorded on photographic emulsions, without coming into physical contact with the objects. Information is obtained by recording patterns of electromagnetic radiant energy, predominantly in the form of photographic images. Objects are identified and qualitatively described by observing photographic image characteristics such as shape, pattern, tone, and texture. Photogrammetry also allows the extraction of three-dimensional features from remotely sensed data.

WHAT IS REMOTE SENSING?


Remote sensing is the art and science of extracting or interpreting information about an object under study by indirect measurement, i.e. without physical contact with the object. The information is carried by electromagnetic radiation (e.g. light or radar) and is recorded on film, on paper or in digital format, from which the information/data is then extracted and interpreted.

Brief Introduction to Photogrammetry and Remote Sensing


Photogrammetry
Photogrammetry, as its name implies, is a three-dimensional coordinate measuring technique that uses photographs as the fundamental medium for metrology (measurement). The fundamental principle used by photogrammetry is triangulation, known in this context as aerial triangulation. By taking photographs from at least two different locations, so-called “lines of sight” can be developed from each camera to points on the object. These lines of sight (sometimes called rays owing to their optical nature) are mathematically intersected to produce the three-dimensional coordinates of the points of interest.
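To make the intersection of lines of sight concrete, the short sketch below triangulates a single 3D point from two camera stations using a least-squares ray intersection. The station coordinates, ray directions and the use of NumPy are illustrative assumptions, not part of the lecture material.

```python
import numpy as np

def intersect_rays(origins, directions):
    """Least-squares intersection of 3D rays (lines of sight).

    origins:    (n, 3) array of exposure-station coordinates
    directions: (n, 3) array of ray direction vectors
    Returns the point minimising the summed squared perpendicular
    distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical exposure stations observing the same ground point
origins = np.array([[0.0, 0.0, 1000.0],
                    [600.0, 0.0, 1000.0]])
ground_point = np.array([300.0, 200.0, 50.0])   # assumed true position
directions = ground_point - origins             # ideal, error-free lines of sight

print(intersect_rays(origins, directions))      # -> approximately [300. 200. 50.]
```

In practice the ray directions come from measured image coordinates and the camera orientation, and many rays from many photographs are adjusted together; that is what aerial triangulation does at scale.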
The expression photogrammetry was first used by the Prussian architect Albrecht Meydenbauer in 1867, who fashioned some of the earliest topographic maps and elevation drawings.
The use of photogrammetry in topographic mapping is well established, but in recent years the technique has also been widely applied in fields such as architecture, industry, engineering, forensics, underwater work, medicine and geology for the production of precise 3D data.

Branches of photogrammetry
There are three broad branches of photogrammetry:

• Analogue Photogrammetry: the branch of photogrammetry that includes all methods and techniques for extracting information from analogue photographs using mechanical and optical instruments, or a combination of the two.
• Analytical Photogrammetry: deals with precise measurements and computations on photographs regarding the size, shape and position of photographic features, and with deriving other information such as the relative locations (coordinates) of features, areas and volumes. These photographs are taken with a metric camera, and the approach is used mostly in engineering fields such as surveying.
• Digital Photogrammetry: deals with the recognition and identification of features on a photograph, such as shape, size, shadow and pattern, in order to add value and intelligence to the information seen on the photograph (annotation).

Remote Sensing

Remote sensing is a technology closely aligned with photogrammetry in that it also collects information from imagery. The term derives from the fact that information about objects and features is collected without coming into contact with them. Where remote sensing differs from photogrammetry is in the type of information collected, which tends to be based on differences in colour (spectral response); land use and land cover mapping is therefore one of the primary outputs of remote sensing processing. Remote sensing was originally conceived to exploit the large number of colour bands in satellite imagery to create 2D data, primarily for GIS. Nowadays remote sensing tools are used with all types of imagery to assist in 2D data collection and in the derivation of products such as slope. Software packages today tend to bundle a much wider range of image technologies, such as image mosaicking, 3D visualisation, GIS, radar processing and softcopy photogrammetry.

Key concepts:
a) Spatial resolution.
b) Radiometric resolution.
c) Spectral resolution.
d) Temporal resolution

a) Spatial resolution describes the ability of a sensor to identify the smallest detail of a pattern in an image; in other words, the minimum distance between patterns or objects in an image at which they can still be separated from each other. It is often expressed in metres.
b) Spectral resolution is the sensitivity of a sensor to a specific frequency (wavelength) range (mostly for satellite and airborne sensors). The ranges covered often include not only visible light but also non-visible electromagnetic radiation. Objects on the ground can be identified by the different wavelengths they reflect (interpreted as different colours), but the sensor must be able to detect these wavelengths in order to see these features.
c) Radiometric resolution is often called contrast. It describes the ability of the sensor to measure the signal strength (reflected energy) or brightness of objects. The more sensitive a sensor is to the reflectance of an object compared with its surroundings, the smaller the object that can be detected and identified.

d) Temporal resolution depends on several factors: how long it takes for a satellite to return to (approximately) the same location, the swath of the sensor (related to its ‘footprint’), and whether or not the sensor can be pointed off-nadir. It is more formally known as the ‘revisit period’.
AERIAL PHOTOGRAPHY
Aerial photography is the process of taking photographs from the air, but there is more to it than simply going up in a light aircraft or helicopter and taking pictures. There are many elements of an aerial survey that must be considered to ensure that the data are useful enough for whatever is being investigated. It is often difficult to see elements of the landscape from the ground: features can easily be missed, and what might seem like an insignificant bump at ground level can become more significant in a wider context. Some landscape types are also difficult to access on foot, so aerial photographs are vital for studying and mapping them.

Aerial photography has been used as a method of landscape study for over a century, especially in archaeology, and through it researchers have learnt much about the world around us. Its applications today are broad and, coupled with the growing technology of GIS (geographic information systems), ensure that the method will not become obsolete any time soon. Aerial photographs are taken in two basic forms, each with different uses and applications: vertical and oblique photographs.

Vertical Aerial Photograph

Vertical aerial photography is an aerial photography technique in which the shots are taken from directly above the subject of the image. The allowable tolerance is usually ±3° between the plumb (vertical) line and the camera axis. This method is also referred to as “overhead aerial photography.” In a vertical aerial photograph the lens axis is perpendicular to the surface of the earth, so we see a flat, map-like image of the rooftops and canopies of the buildings and structures being photographed. There are three common ways in which vertical aerial photography can be conducted:

i. Low Altitude – the resulting images show bigger, closer views of the subject and its surroundings;
ii. Medium Altitude – the resulting images of the subject and its surroundings are smaller than those produced by low-altitude vertical aerial photography;
iii. High Altitude – the images of the subject and its surroundings are far smaller than those produced at low and medium altitudes; nonetheless, they cover a wider section of the land.

Oblique Photography

The word oblique means having a sloping direction or angular position. Photographs taken at an angle are therefore called oblique photographs.

Low Oblique Aerial Photography: A low oblique aerial photograph is taken with the camera inclined about 30° from the vertical. In this type of photograph the horizon is not visible. The ground area covered is a trapezoid, although the photo itself is square or rectangular. No single scale applies to the entire photograph, and distances cannot be measured. Parallel lines on the ground are not parallel on the photograph, so directions (azimuths) cannot be measured. Relief is detectable but distorted.

High Oblique Aerial Photography: A high oblique photograph is taken with the camera inclined about 60° from the vertical. In this type of aerial photograph the horizon is visible and a very large area is covered. The ground area covered is again a trapezoid, while the photograph itself is square or rectangular. Distances and directions are not measured on this photograph for the same reasons that they are not measured on the low oblique. Relief may be quite detectable but is distorted, as in any oblique view.

Basics of Aerial Camera

An aerial camera is a highly specialized camera designed for use in aircraft; it contains a mechanism to expose the film in continuous sequence at a steady rate. Aerial cameras are referred to as "passive sensors" because they detect and capture the natural light reflected from objects.

An aerial camera is a mechanical-optical instrument with automatic and electronic elements, designed for obtaining aerial photographs of the earth’s surface from an aeroplane or other type of aircraft. It differs from an ordinary camera in its specific features: fully automatic operation, a shock-absorbing support frame, a large picture format and rapid frame advance. In addition, an aerial camera must cope with photographing from great distances and with rapid movement and vibration during exposure. The world’s first aerial camera for area photography from an aeroplane was invented by the Russian army engineer V. F. Potte during World War I.

Aerial cameras may have one or more lenses for plan views, perspectives and panoramic surveys. The basic characteristics of an aerial camera are its focal length, negative size and minimum exposure time, which is as short as 1/1000 s in Soviet aerial cameras.

Standard Soviet aerial cameras designed for topographic surveys have an 18 × 18 cm negative size, focal lengths from 50 to 500 mm and corresponding field-of-view angles from 150° to 30°. Aerial cameras of either type are used for black-and-white or colour aerial photographic surveying.

Aerial cameras with various focal lengths, beginning at 88 mm, are also used abroad. With the most popular 23 × 23 cm negative size, this corresponds to field-of-view angles up to 125°. Like other cameras, an aerial camera has the basic components of lens, shutter and diaphragm, which define the focal plane/focal length and control the exposure speed and aperture respectively.

Focal Plane and Focal Length

The focal plane is the flat surface where the film is held. The focal length is the distance from the focal plane to (approximately) the centre of the camera lens. The thin-lens equation is:

1/f = 1/o + 1/i

where f is the focal length of the camera, o is the distance between the object and the lens, and i is the distance between the lens and the image plane.

Most metric aerial cameras have a fixed focal length, such as 152 mm or 305 mm. Military photoreconnaissance operations commonly employ lenses with focal lengths of 3 to 6 feet to obtain detailed photographs from extremely high altitudes.
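As a worked illustration of the thin-lens equation, the sketch below solves for the image distance i given f and o. The numerical values are arbitrary assumptions chosen only to show that, as the object distance grows, i converges to the focal length, which is why aerial cameras are fixed-focused at infinity.

```python
def image_distance(f, o):
    """Thin-lens equation 1/f = 1/o + 1/i, solved for the image distance i.
    f and o must be in the same length unit; i is returned in that unit."""
    return 1.0 / (1.0 / f - 1.0 / o)

f = 0.152                          # 152 mm focal length, expressed in metres (assumed)
for o in (10.0, 100.0, 3000.0):    # assumed object distances in metres
    i = image_distance(f, o)
    print(f"o = {o:7.1f} m  ->  i = {i * 1000:.2f} mm")
# Output approaches 152 mm as o increases (154.35, 152.23, 152.01 mm).
```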

Types of Aerial Cameras

Single-lens mapping (metric) cameras: These provide the highest geometric and radiometric quality of aerial photography, and are used to map the planimetric (x, y) locations of features and to derive topographic contour maps. Individual exposures are typically 23 × 23 cm.

Multiple-lens cameras: Each lens simultaneously records a photograph of the same area, but using a different film and filter combination. This produces multiple-band photographs.

Panoramic camera: This type of camera uses a rotating lens (or prism) to produce a narrow strip of imagery perpendicular to the flight line. It is commonly used by the military but much less so in civilian applications because of its poor geometric integrity.

Digital camera: A digital camera uses charge-coupled device (CCD) detectors. These detectors are arranged in a matrix located at the film (focal) plane.

A digital camera takes light and focuses it via the lens onto a sensor made of silicon. The sensor is made up of a grid of tiny photosites that are sensitive to light; each photosite is usually called a pixel (“picture element”). There are millions of individual pixels in the sensor of a digital camera. The advantage of a digital camera is that it records and stores photographic images in digital form. These images can be stored directly in the camera or uploaded to a computer or printer later on. To replicate the spatial resolution of a standard 9 × 9 inch metric aerial photograph, a digital camera would require approximately 20,000 × 20,000 detectors.
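To see roughly where the 20,000 × 20,000 figure comes from, the arithmetic below divides the 9-inch (228.6 mm) film format by an assumed detector pitch of about 11 µm; the pitch value is an assumption chosen for illustration, not a figure stated in this note.

```python
format_mm = 9 * 25.4             # 9-inch film format side = 228.6 mm
pixel_pitch_um = 11.4            # assumed detector spacing in micrometres
pixels_per_side = format_mm * 1000 / pixel_pitch_um

print(round(pixels_per_side))                         # ~20,053 detectors per side
print(round(pixels_per_side ** 2 / 1e6), "MP total")  # ~402 megapixels
```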

BASIC OPTICS OF PHOTOGRAMMETRY


A simple converging lens will produce an image at a distance, i, from the lens (along the optic axis) of an object at a distance, o, on the opposite side of the lens (Figure 2.1a). The object distance and image distance are related by:

1/o + 1/i = 1/f

where f is the focal length of the lens. The size of the image is i tan θ and the size of the object is o tan θ. When discussing lenses, the relative size of the image and the object defines the magnification of the system, and the magnification, m, is given by:

m = i/o = image size / object size

Figure 2.1: Formation of an image by a simple converging lens: a) the object is relatively
close to the lens, b) the object distance is much greater than the image distance.

In remote sensing, the distance to the object is normally the altitude, h, of the aircraft or satellite, which is much greater than the image distance (o >> i); in that case the image distance approaches the focal length (i ≈ f). In such systems we refer to the scale of the image rather than its magnification, and define scale in terms of the focal length of the lens system:

S = f/h

Since scale is expressed as a ratio of image size to object size, objects in a "large scale" image (or map) are larger than in a "small scale" image. Conversely, the total area covered by a large-scale image is less than that of a small-scale image.

Figure 2.2: Illustration of imaging for a nadir-viewing camera.

For remote sensing systems, it is common to represent the imaging system as in Figure 2.2, where the camera is nadir viewing, the ground is flat, and the object distance is the altitude of the camera lens. In Figure 2.2 the ground length imaged is labelled FOV (field of view) and the image size is labelled w. The image scale may then be represented as:

S = f/h = w/FOV = image size / object size
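A small numerical sketch of the relation S = f/h = w/FOV is given below; the focal length, flying height and image format are assumed values chosen only to illustrate the formula.

```python
f_mm = 152.0        # focal length (assumed 152 mm metric camera)
h_m = 3800.0        # flying height above ground in metres (assumed)
w_mm = 230.0        # image format width, ~23 cm frame (assumed)

scale = f_mm / (h_m * 1000.0)           # S = f / h, both in millimetres
fov_ground_m = (w_mm / 1000.0) / scale  # FOV = w / S, ground width covered

print(f"scale = 1:{1 / scale:,.0f}")                         # 1:25,000
print(f"ground coverage per frame = {fov_ground_m:,.0f} m")  # 5,750 m
```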


Another fundamental characteristic of an imaging system is its spatial resolution. Spatial resolution defines the ability of an optical system to distinguish between two distinct points. For photographic systems, resolution is normally defined in terms of line pairs (lp) per unit length. When a bar pattern such as that shown in the figure below is imaged, the resolution of the system (including the lens system and the film, or the sample spacing of the detector) is the smallest separation, d, for which a pair of bars can still be distinguished on the film.

Figure: A bar pattern with spacing, d, between consecutive bars.

The spatial resolution of a digital frame camera depends on the spacing of the elements in the detector array: the closer the spacing, the higher the resolution for a given lens system and focal length. For film, the resolution depends on the size of the grains of the silver salts that form the image: the smaller the grain size, the higher the spatial resolution of the film and the slower the film speed. The highest spatial resolutions available for aerial photographic films are typically 200 lp mm⁻¹, which corresponds to a typical grain size of ~1 μm. The detector spacing of a comparable digital frame camera would be ~5-10 μm.

The line-pair definition of spatial resolution is reasonable for film, where image formation depends on grains of silver salts whose orientation is random. It is less appropriate for digital imaging systems, which have arrays of (usually) square detection elements. Since the spacing of the elements is different along the rows and columns of the array than along a diagonal, the spatial resolution usually varies with the orientation of the array. The spacing of the detectors in a high-resolution CCD array is roughly an order of magnitude greater than that of high-resolution film, and at present it is also difficult to manufacture very large arrays. For the time being, large-format (e.g. 9-inch) aerographic film is therefore inherently capable of both higher resolution and greater area coverage than a CCD array with an equivalent lens system. Digital systems are able to compensate, however, through lens design and adjustments to the focal distance.
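To connect line-pair figures to ground detail, the sketch below converts a system resolution in lp/mm into a ground resolved distance at a given photo scale. The 1:25,000 scale and 40 lp/mm system resolution are assumptions chosen for illustration, not values given in this note.

```python
def ground_resolved_distance(scale_denominator, resolution_lp_mm):
    """Smallest ground distance resolvable on the photo, in metres.

    One line pair occupies 1 / resolution millimetres on the film;
    multiplying by the scale denominator converts that to ground units."""
    d_photo_mm = 1.0 / resolution_lp_mm
    return d_photo_mm * scale_denominator / 1000.0

# Assumed example: 1:25,000 photography with a 40 lp/mm system resolution
print(ground_resolved_distance(25_000, 40))   # 0.625 m on the ground
```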

Scale of a photograph

The concept of scale for an aerial photograph is the same as for a map. Scale is the ratio of a distance on an aerial photograph to the distance between the same two points on the ground in the real world. It can be expressed in unit equivalents, for example 1 cm = 1 km (i.e. 1 cm on the photo represents 100,000 cm on the ground), or as a representative fraction (1:100,000). To determine dimensions during air photo interpretation it is necessary to make estimates of lengths and areas, which requires knowledge of the photo scale.

Scale may be expressed in three ways:

a) Scale ratio: also referred to as the proportional scale; 1:20,000 is read as "one to twenty thousand".
b) Equivalent scale: also known as the descriptive scale, for example one inch equals 5,280 feet (1 inch = 1 mile).
c) Graphic scale: also called a bar scale, used on maps and drawings to represent the length scale on paper in length units.

Large scale: Larger-scale photos (e.g. 1:25 000) cover small areas in greater detail. A large-scale
photo simply means that ground features are at a larger, more detailed size. The area of ground
coverage that is seen on the photo is less than at smaller scales.

Small scale: Smaller-scale photos (e.g. 1:50 000) cover large areas in less detail. A small-scale
photo simply means that ground features are at a smaller, less detailed size. The area of ground
coverage that is seen on the photo is greater than at larger scales.

The following methods are used to compute the scale of an aerial photograph, using different sets of information:

Method 1: Scale is the ratio of the distance between two points on a photo to the
actual distance between the same two points on the ground (i.e. 1 unit on the photo
equals "x" units on the ground). If a 1 km stretch of highway covers 4 cm on an air
photo, the scale is calculated as follows:

Scale = photo distance / ground distance = 4 cm / 1 km = 4 cm / 100,000 cm = 1/25,000

So, the scale is 1:25,000.

Method 2: Another method used to determine the scale of a photo is to find the ratio between the
camera's focal length and the plane's altitude above the ground being photographed.

If a camera's focal length is 152 mm, and the plane's altitude Above Ground Level (AGL) is 7 600
m, using the same equation as above, the scale would be:

Scale = focal length / altitude = 152 mm / 7,600 m = 152 mm / 7,600,000 mm = 1/50,000

So, the scale is 1:50,000.

Method 3: The scale of the photograph can also be calculated if we know the focal length of the camera, the flying height of the aircraft above sea level and the height of the ground above sea level:

Scale = f / (H − h)

where H is the flying height of the aircraft above sea level, h is the height of the ground above sea level and f is the focal length.
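The three methods can be written compactly in code. The sketch below reproduces the worked figures above (4 cm per 1 km, and 152 mm at 7,600 m AGL) and adds an assumed pair of heights for Method 3; the H = 4,000 m and h = 200 m values are assumptions for illustration.

```python
def scale_from_distances(photo_dist_cm, ground_dist_m):
    """Method 1: ratio of photo distance to ground distance (converted to cm)."""
    return photo_dist_cm / (ground_dist_m * 100.0)

def scale_from_agl(focal_mm, altitude_agl_m):
    """Method 2: focal length over flying height above ground level (in mm)."""
    return focal_mm / (altitude_agl_m * 1000.0)

def scale_from_asl(focal_mm, H_asl_m, h_ground_m):
    """Method 3: S = f / (H - h), heights above sea level (in mm)."""
    return focal_mm / ((H_asl_m - h_ground_m) * 1000.0)

print(f"Method 1: 1:{1 / scale_from_distances(4, 1000):,.0f}")   # 1:25,000
print(f"Method 2: 1:{1 / scale_from_agl(152, 7600):,.0f}")       # 1:50,000
print(f"Method 3: 1:{1 / scale_from_asl(152, 4000, 200):,.0f}")  # 1:25,000
```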
