
IPMV

Chapter 4:
Image Segmentation
Elements of Image Analysis

Preprocessing:
image acquisition, restoration, and enhancement

Intermediate processing:
image segmentation and feature extraction

High-level processing:
image interpretation and recognition
Preview
 Segmentation subdivides an image into its component regions or objects.
 Segmentation should stop when the objects of interest in an application have been isolated.
Importance of Image Segmentation
Image segmentation is used to separate an image into its constituent parts based on some image attributes. Image segmentation is an important step in image analysis.

Benefits
1. Image segmentation reduces a huge amount of unnecessary data while retaining only the data important for image analysis.
2. Image segmentation converts bitmap data into better-structured data that is easier to interpret.
Principal Approaches
 Segmentation algorithms are generally based on one of two basic properties of intensity values:
 discontinuity: partition an image based on sharp changes in intensity (such as edges).
 similarity: partition an image into regions that are similar according to a set of predefined criteria.
Image Attributes for Image Segmentation
1. Similarity properties of pixels inside the object are used to group pixels into the same set.
2. Discontinuity of pixel properties at the boundary between object and background is used to distinguish between pixels belonging to the object and those of the background.

Discontinuity: intensity changes at the boundary.
Similarity: internal pixels share the same intensity.
Detection of Discontinuities
 Detect the three basic types of gray-level discontinuities: points, lines, and edges.
 The common way is to run a mask through the image.
Point Detection
 A point has been detected at the location on which the mask is centered if

    |R| ≥ T

 where
 T is a nonnegative threshold
 R is the sum of products of the mask coefficients with the gray levels contained in the region encompassed by the mask.
Point Detection
 Note that the mask is the same as the mask of the Laplacian operation (in chapter 3).
 The only differences considered of interest are those large enough (as determined by T) to be considered isolated points:

    |R| ≥ T
Point Detection
 We can use Laplacian masks for point detection:

    -1 -1 -1         0 -1  0
    -1  8 -1        -1  4 -1
    -1 -1 -1         0 -1  0

 Laplacian masks have the largest coefficient at the center of the mask, while neighboring pixels have the opposite sign.
 This mask gives a high response to objects that have a shape similar to the mask, such as isolated points.
 Notice that the sum of all coefficients of the mask equals zero. This is due to the need that the response of the filter must be zero inside a constant-intensity area.
Point Detection
Point detection can be done by applying the thresholding function:

    g(x,y) = 1   if |R(x,y)| ≥ T
             0   otherwise

X-ray image of a turbine blade with porosity; the Laplacian image; after thresholding, showing the location of the porosity.
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)
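The scheme above can be sketched as follows (a minimal illustration, assuming NumPy and SciPy are available; the 3×3 mask is the Laplacian point-detection mask from the slide, while the test image and the threshold T are made-up values):

```python
import numpy as np
from scipy.ndimage import convolve

# Laplacian point-detection mask from the slide; its coefficients sum to zero,
# so the response R is zero inside any constant-intensity area.
mask = np.array([[-1, -1, -1],
                 [-1,  8, -1],
                 [-1, -1, -1]], dtype=float)

def detect_points(image, T):
    """g(x,y) = 1 where |R(x,y)| >= T, 0 otherwise."""
    R = convolve(image.astype(float), mask, mode="nearest")
    return (np.abs(R) >= T).astype(np.uint8)

# Flat image with a single isolated bright point (a stand-in for porosity).
img = np.full((7, 7), 10.0)
img[3, 3] = 200.0
points = detect_points(img, T=500)
```

With T = 500 only the isolated point survives: its neighbors also respond to the mask, but their |R| stays well below the threshold.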
Line Detection
 Similar to point detection, line detection can be performed using a mask whose shape looks similar to a part of a line.
 A line in a digital image can lie in several directions. For simple line detection, the 4 directions mostly used are horizontal, +45 degrees, vertical, and −45 degrees.

Line detection masks
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)
Line Detection
 Alternatively, if we are interested in detecting all lines in an image in the direction defined by a given mask, we simply run the mask through the image and threshold the absolute value of the result.
 The points that are left are the strongest responses, which, for lines one pixel thick, correspond closest to the direction defined by the mask.
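A sketch of this directional procedure (the four 3×3 masks are the standard horizontal, +45°, vertical, and −45° line masks that the omitted figure shows; the test image and the threshold are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import convolve

# The four classic 3x3 line-detection masks.
masks = {
    "horizontal": np.array([[-1, -1, -1],
                            [ 2,  2,  2],
                            [-1, -1, -1]], dtype=float),
    "+45":        np.array([[-1, -1,  2],
                            [-1,  2, -1],
                            [ 2, -1, -1]], dtype=float),
    "vertical":   np.array([[-1,  2, -1],
                            [-1,  2, -1],
                            [-1,  2, -1]], dtype=float),
    "-45":        np.array([[ 2, -1, -1],
                            [-1,  2, -1],
                            [-1, -1,  2]], dtype=float),
}

def detect_lines(image, direction, T):
    """Threshold |R| for the mask of the requested direction."""
    R = convolve(image.astype(float), masks[direction], mode="nearest")
    return (np.abs(R) >= T).astype(np.uint8)

# One-pixel-thick horizontal line (value 100) on a flat background.
img = np.zeros((5, 5))
img[2, :] = 100.0
# T = 400 keeps the strongest response (600, on the line itself) and rejects
# the weaker response (300) of the rows immediately above and below it.
out = detect_lines(img, "horizontal", T=400)
```

Thresholding the absolute response is what isolates the line: the rows adjacent to it also respond, just less strongly.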
What Are Edges?
(Slides: Robert Collins, CSE486, Penn State)

Simple answer: discontinuities in intensity.

 Boundaries of objects
 Boundaries of material properties
 Boundaries of lighting
Edge Detection
 We have discussed approaches for implementing the first-order derivative (gradient operator) and the second-order derivative (Laplacian operator).
 Here, we will talk only about their properties for edge detection.
 We introduced both derivatives in chapter 3.
Edge Models
Smoothed Step Edge and Its Derivatives

Gray-level profile: a smoothed (ramp) step edge in intensity.
The 1st derivative: maximum/minimum points at the edges.
The 2nd derivative: opposite-signed lobes (+ then −) with zero crossings at the edges.
Derivative-Based Edge Detection
From the previous slide, we can conclude that:
 Local maxima of the absolute value of the 1st derivative and zero crossings of the 2nd derivative occur at edges.
 The magnitude of the first derivative can be used to detect the presence of an edge at a point in an image.
 Similarly, the sign of the second derivative can be used to determine whether an edge pixel lies on the dark or light side of an edge.
 Two additional properties of the second derivative around an edge: (1) it produces two values for every edge in an image; and (2) its zero crossings can be used for locating the centers of thick edges.
Thick edge
 The slope of the ramp is inversely proportional to the degree of blurring in the edge.
 We no longer have a thin (one-pixel-thick) path.
 Instead, an edge point is now any point contained in the ramp, and an edge is then a set of such points that are connected.
 The thickness is determined by the length of the ramp.
 The length is determined by the slope, which is in turn determined by the degree of blurring.
 Blurred edges tend to be thick; sharp edges tend to be thin.
First and Second Derivatives
 The signs of the derivatives would be reversed for an edge that transitions from light to dark.
Noise Images
 First column: images and gray-level profiles of a ramp edge corrupted by random Gaussian noise of mean 0 and σ = 0.0, 0.1, 1.0 and 10.0, respectively.
 Second column: first-derivative images and gray-level profiles.
 Third column: second-derivative images and gray-level profiles.
Keep in mind
 Fairly little noise can have a significant impact on the two key derivatives used for edge detection in images.
 Image smoothing should be a serious consideration prior to the use of derivatives in applications where noise is likely to be present.
Edge point
 To determine a point as an edge point:
 the transition in gray level associated with the point has to be significantly stronger than the background at that point.
 Use a threshold to determine whether a value is "significant" or not.
 The point's two-dimensional first-order derivative must be greater than a specified threshold.
Gradient Operator

    ∇f = [Gx, Gy]ᵀ = [∂f/∂x, ∂f/∂y]ᵀ

 First derivatives are implemented using the magnitude of the gradient:

    ∇f = mag(∇f) = [Gx² + Gy²]^(1/2) = [(∂f/∂x)² + (∂f/∂y)²]^(1/2)

 commonly approximated as

    ∇f ≈ |Gx| + |Gy|

 (note that the magnitude is a nonlinear operator)
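As a sketch with one common choice of gradient masks (Sobel, shown in the following slide's figure), computing both the exact magnitude and the |Gx| + |Gy| approximation; the test image is an assumption:

```python
import numpy as np
from scipy.ndimage import convolve

# Sobel masks for the horizontal and vertical derivatives.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
sobel_y = sobel_x.T

# Vertical step edge: dark left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 100.0

Gx = convolve(img, sobel_x, mode="nearest")
Gy = convolve(img, sobel_y, mode="nearest")

grad = np.sqrt(Gx**2 + Gy**2)           # exact gradient magnitude
grad_approx = np.abs(Gx) + np.abs(Gy)   # common approximation |Gx| + |Gy|
```

The magnitude peaks on the two columns flanking the step and is zero in the flat regions; here Gy is zero everywhere, so the approximation coincides with the exact magnitude.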
Gradient Masks
Diagonal edges with Prewitt and Sobel masks
Examples
Laplacian
Laplacian operator (a linear operator):

    ∇²f = ∂²f/∂x² + ∂²f/∂y²

Discrete form:

    ∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4 f(x, y)
Laplacian of Gaussian
 Laplacian combined with smoothing to find edges via zero crossings.

    h(r) = −e^(−r²/(2σ²))

where r² = x² + y² and σ is the standard deviation.

    ∇²h(r) = −[(r² − σ²)/σ⁴] e^(−r²/(2σ²))

The LoG has a positive central term, surrounded by an adjacent negative region (a function of distance), and a zero outer region: the "Mexican hat" shape.
 The coefficients must sum to zero.


Linear Operation
 The second derivative is a linear operation.
 Thus, computing ∇²[G ∗ f] is the same as convolving the image with the Gaussian smoothing function first and then computing the Laplacian of the result.
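This interchangeability can be checked numerically (a sketch assuming SciPy; circular "wrap" boundaries are used so that the two convolution orders are exactly commutative, up to floating-point error):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

rng = np.random.default_rng(0)
img = rng.random((32, 32))

# Gaussian smoothing first, then the Laplacian ...
a = laplace(gaussian_filter(img, sigma=2.0, mode="wrap"), mode="wrap")
# ... or the Laplacian first, then Gaussian smoothing: because both are
# linear, shift-invariant convolutions, the order does not matter.
b = gaussian_filter(laplace(img, mode="wrap"), sigma=2.0, mode="wrap")
```

The two results agree to floating-point precision, which is the linearity property the slide states.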
Example
a). original image
b). Sobel gradient
c). spatial Gaussian smoothing function
d). Laplacian mask
e). LoG
f). thresholded LoG
g). zero crossings
Zero Crossing & LoG
 Approximate the zero crossings from the LoG image:
 threshold the LoG image by setting all its positive values to white and all negative values to black.
 The zero crossings occur between positive and negative values of the thresholded LoG.
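A sketch of this zero-crossing procedure on a synthetic step edge (assuming SciPy's `gaussian_laplace` as the LoG; the image size and σ are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Vertical step edge: dark left half, bright right half.
img = np.zeros((16, 16))
img[:, 8:] = 100.0
log = gaussian_laplace(img.astype(float), sigma=2.0)

# Threshold the LoG: positive values -> white (1), negative -> black (0).
bw = (log > 0).astype(np.uint8)

# Zero crossings lie between horizontally adjacent white/black pixels.
zc = np.zeros_like(bw)
zc[:, :-1] = (bw[:, :-1] != bw[:, 1:]).astype(np.uint8)
```

The LoG is positive on the dark side of the edge and negative on the bright side, so the white/black transition in `bw` marks the edge location in every row.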
Thresholding
Histogram of an image with a dark background and a light object (one threshold); histogram of an image with a dark background and two light objects (two thresholds).
Multilevel Thresholding
 A point (x,y) belongs
 to one object class if T1 < f(x,y) ≤ T2
 to another object class if f(x,y) > T2
 to the background if f(x,y) ≤ T1
 T depends on
 only f(x,y): gray-level values only → global threshold
 both f(x,y) and p(x,y): gray-level values and their neighbors → local threshold
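The classification rule above can be sketched directly (the threshold values and the test image are made-up illustrations):

```python
import numpy as np

def multilevel_threshold(image, T1, T2):
    """Label 0 = background (f <= T1), 1 = first object class (T1 < f <= T2),
    2 = second object class (f > T2)."""
    labels = np.zeros(image.shape, dtype=np.uint8)
    labels[(image > T1) & (image <= T2)] = 1
    labels[image > T2] = 2
    return labels

img = np.array([[ 10,  90],
                [160, 240]])
labels = multilevel_threshold(img, T1=80, T2=170)
```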
The Role of Illumination

f(x,y) = i(x,y) r(x,y)

a). computer-generated reflectance function
b). histogram of the reflectance function (object and background are well separated: easy to segment with a global threshold)
c). computer-generated illumination function (poor)
d). product of a). and c).
e). histogram of the product image (difficult to segment)
Basic Global Thresholding
Use T midway between the max and min gray levels to generate a binary image.
Basic Global Thresholding
 Based on visual inspection of the histogram:
1. Select an initial estimate for T.
2. Segment the image using T. This will produce two groups of pixels: G1, consisting of all pixels with gray-level values > T, and G2, consisting of pixels with gray-level values ≤ T.
3. Compute the average gray-level values μ1 and μ2 for the pixels in regions G1 and G2.
4. Compute a new threshold value: T = 0.5 (μ1 + μ2).
5. Repeat steps 2 through 4 until the difference in T in successive iterations is smaller than a predefined parameter T0.
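The steps above can be sketched as follows (using the global mean as the initial estimate is an assumed choice, and the bimodal test image is made up):

```python
import numpy as np

def iterative_threshold(image, T0=0.5):
    """Iterative global threshold following the five steps above."""
    T = image.mean()                        # step 1: initial estimate (assumed)
    while True:
        G1 = image[image > T]               # step 2: segment with current T
        G2 = image[image <= T]
        mu1 = G1.mean() if G1.size else T   # step 3: class means
        mu2 = G2.mean() if G2.size else T
        T_new = 0.5 * (mu1 + mu2)           # step 4: new threshold
        if abs(T_new - T) < T0:             # step 5: convergence test
            return T_new
        T = T_new

# Bimodal test image: dark background (value 20), small light object (value 200).
img = np.concatenate([np.full(90, 20.0), np.full(10, 200.0)])
T = iterative_threshold(img)
```

For this two-valued image the procedure converges to T = 110, midway between the two class means, regardless of the class proportions.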
Example: Heuristic Method
Note the clear valley of the histogram and the effectiveness of the segmentation between object and background.

T0 = 0; 3 iterations, with result T = 125.
Basic Adaptive Thresholding
 Subdivide the original image into small areas.
 Utilize a different threshold to segment each subimage.
 Since the threshold used for each pixel depends on the location of the pixel in terms of the subimages, this type of thresholding is adaptive.
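A minimal sketch over non-overlapping subimages; the slides only say that each subimage gets its own threshold, so using each block's mean as that threshold (and the block size) is an assumption for illustration:

```python
import numpy as np

def adaptive_threshold(image, block=8):
    """Threshold each non-overlapping block x block subimage separately,
    using that subimage's mean as its threshold (an assumed choice)."""
    out = np.zeros(image.shape, dtype=np.uint8)
    H, W = image.shape
    for i in range(0, H, block):
        for j in range(0, W, block):
            sub = image[i:i+block, j:j+block]
            out[i:i+block, j:j+block] = (sub > sub.mean()).astype(np.uint8)
    return out

# Unevenly illuminated image: background 50 on the left, 150 on the right,
# with one bright object pixel in each half (120 and 220).
img = np.full((16, 16), 50.0)
img[:, 8:] = 150.0
img[4, 4] = 120.0
img[4, 12] = 220.0
out = adaptive_threshold(img, block=8)
```

No single global threshold can isolate both object pixels here (it would need to be below 120 yet at least 150), but the per-subimage thresholds recover exactly the two objects.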
Example: Adaptive Thresholding
Further Subdivision
a). properly and improperly segmented subimages from the previous example
b)-c). corresponding histograms
d). further subdivision of the improperly segmented subimage
e). histogram of the small subimage at top
f). result of adaptively segmenting d).
Boundary Characteristics for Histogram Improvement and Local Thresholding
(light object on a dark background)

    s(x,y) = 0   if ∇f < T
             +   if ∇f ≥ T and ∇²f ≥ 0
             −   if ∇f ≥ T and ∇²f < 0

 The gradient gives an indication of whether a pixel is on an edge.
 The Laplacian can yield information regarding whether a given pixel lies on the dark or light side of an edge:
 all pixels that are not on an edge are labeled 0
 all pixels that are on the dark side of an edge are labeled +
 all pixels that are on the light side of an edge are labeled −
Example
Region-Based Segmentation: Region Growing
 Start with a set of "seed" points.
 Grow by appending to each seed those neighbors that have similar properties, such as specific ranges of gray level.

In the example, all seed points with gray level 255 are selected.
Region Growing
Criteria:
1. The absolute gray-level difference between any pixel and the seed has to be less than 65.
2. The pixel has to be 8-connected to at least one pixel in that region (if more regions meet, the regions are merged).
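The two criteria above can be sketched as breadth-first growth from each seed (a minimal version: the merging of regions that meet is left out, and the test image, seed, and use of the slide's difference bound of 65 as a parameter default are illustrative):

```python
import numpy as np
from collections import deque

def region_grow(image, seeds, diff=65):
    """Grow one region per seed: a pixel joins when |gray - seed_gray| < diff
    (criterion 1) and it is 8-connected to a pixel already in the region
    (criterion 2). Merging of touching regions is not handled in this sketch."""
    H, W = image.shape
    labels = np.zeros((H, W), dtype=np.int32)
    for region, (sy, sx) in enumerate(seeds, start=1):
        seed_val = float(image[sy, sx])
        queue = deque([(sy, sx)])
        labels[sy, sx] = region
        while queue:
            y, x = queue.popleft()
            for dy in (-1, 0, 1):           # all 8-connected neighbors
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < H and 0 <= nx < W and labels[ny, nx] == 0
                            and abs(float(image[ny, nx]) - seed_val) < diff):
                        labels[ny, nx] = region
                        queue.append((ny, nx))
    return labels

img = np.array([[255, 250,  10],
                [240, 245,  20],
                [ 15,  30, 250]])
labels = region_grow(img, seeds=[(0, 0)])
```

Note how the bright pixel at (2,2) joins the region even though it is diagonally separated from the seed: it is 8-connected to (1,1), which is already in the region.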
