
A Simple Image-based Object Velocity Estimation Approach

Hung-Chi Chu* and Hao Yang
Dept. of Information and Communication Engineering
Chaoyang University of Technology
Taichung, Taiwan
hcchu@cyut.edu.tw; srorzyang@hotmail.com.tw

Abstract—Image processing technology has recently been used for gesture recognition, face recognition, object tracking, augmented reality, 3D robot vision, and object modeling. Detecting object velocity is crucial to image processing research. This study used a webcam as the experimental equipment and designed two main processing modules, an object detection module and a background measurement module, to estimate the velocity of an object. The experimental results showed that the proposed method can be used to effectively estimate the velocity of an object with an average error of less than 6%.

Keywords—object tracking; object velocity

I. INTRODUCTION

Image processing technology is used in many aspects of everyday life, such as object tracking [1, 2], height measurements [3], area measurements [4], and laser radar velocimeters [5, 6]. These applications make people's lives more convenient. Because the camera focal length, object distance, backlight, and mobility affect the performance of image processing application systems, studies have examined issues such as optical flow detection [7], camera stereo vision [8], and particle image velocimetry [9]. Detecting object velocity is crucial to image processing research.

The traditional object velocity measurement methods are highly accurate. Their disadvantage is that they require special equipment, and these instruments are not readily available. Image processing techniques can also be used to measure object velocity: based on image characteristics, geometric theory can be used to solve the object velocity measurement problem. The advantage of this method is that it is simple and easy. However, its disadvantages are that it requires an additional reference object with known length information, and it exhibits a higher estimation error than the traditional methods.

Therefore, this paper proposes an image-based method to solve the object velocity measurement problem without using additional equipment. It can be performed using a simple webcam and a reference object with a known length. The proposed object detection module (ODM) and background measurement module (BMM) can be used to obtain sufficient information for object velocity measurements. The ODM captures an image of the object, and the BMM estimates the movement distance of the object. The proposed approach is simple and exhibits a low estimation error.

This paper is organized as follows: Section II describes the research related to velocity estimation, Section III presents the detailed design of the proposed approach, Section IV presents the experimental results, and Section V presents a summary of the paper and offers suggestions for future research.

II. RELATED WORKS

Velocity estimation is a critical issue. This section describes traditional velocity estimation methods and image processing techniques.

Laser infrared detection [5] is a traditional method commonly used for velocity estimation. It measures the reflection time of infrared laser light to determine distance. Because the speed of light is constant, a laser pulse is sent, and the target return time is proportional to the distance. If two pulses are transmitted at fixed intervals, two distances can be measured. The difference between the two distances is divided by the transmission time interval, producing the target speed. Another conventional method for velocity estimation is radio detection and ranging (radar) [6]. Radar uses the Doppler effect to perform velocity estimation. When the target approaches the radar antenna, the reflected signal frequency is higher than the transmitter frequency. When the target moves away from the antenna, the reflected signal frequency is lower than the transmitter frequency. From the change in frequency, the relative velocity of the target and radar can be calculated. These traditional methods are highly accurate at estimating velocity, but they require specific equipment.

Image processing research has focused on tracking objects and measuring object lengths. Ellis et al. [1] used dynamic resizing of a template image in a video as the target grew and shrank, and used a fast Fourier transform to improve the processing speed of object tracking. Stauffer and Grimson [2] modeled pixels as a mixture of Gaussian functions and used an on-line approximation to update the model. The Gaussian mixture model (GMM) was used to create a background image by determining which Gaussians most likely represented the background.

*Corresponding author. Fax: +886 4 23305539.


E-mail address: hcchu@cyut.edu.tw (H.-C. Chu).

978-1-4799-3106-4/14/$31.00 ©2014 IEEE


Fig. 1. An example of object movement (a webcam observes a moving object and a reference object at distance d).

Lee [3] and Chuang et al. [4] proposed spatial measurement and distance measurement methods, respectively. Lee [3] used a reference object with a known height to estimate the height of nearby objects. Chuang et al. [4] used a camera, with four infrared beams projected onto a measured space, and the geometric relationship between an infrared light spot and nearby pixels to obtain the length of a specific object. This was conducted using the known length of an object in an image, according to its percentage share of the number of pixels, to estimate the spatial distance of nearby objects. Although this method of estimating length exhibits errors, it requires much less equipment than traditional velocity estimation methods. Therefore, if the error is within an acceptable range, this type of method can be applied.

III. VELOCITY ESTIMATION APPROACH

In this study, the following assumptions were applied in the velocity estimation approach. It was assumed that the tracking object within the image moved from left to right (or right to left) for simplicity (Fig. 1). A reference object with a known length was also included in the image. This reference object could be deployed by the user, or an existing, familiar object could be used (e.g., a 30 cm square tile).

The proposed velocity estimation approach included four steps, and Fig. 2 shows the system flow chart. The following sections describe each step of the proposed method.

Fig. 2. System flow chart of the proposed approach.

Step 1—Object Detection: The GMM was used to create a background image, as shown in Fig. 3(a). For each real-time image (Fig. 3(b)), background subtraction was used to obtain the object in the foreground, as described in (1). Before the background subtraction process, the foreground and background images should undergo grayscale processing.

S(x, y) = |f(x, y) − b(x, y)|,  1 ≤ x ≤ N, 1 ≤ y ≤ M    (1)

where f(x, y) represents the foreground image pixel at location (x, y), and b(x, y) is the background image pixel at location (x, y). S(x, y) is the image pixel at location (x, y) after the background subtraction process is completed.

Equation (2) was then applied with threshold S to filter out the tracking object in the foreground (Fig. 3(c)).

h′(x, y) = 0, if h(x, y) < S; h(x, y), otherwise    (2)

Fig. 3(c) exhibits high levels of noise or object rupturing. A median filter and dilation and erosion processes were then used to remove the noise and generate a clear foreground object, as shown in Fig. 3(d).

A median filter is a nonlinear digital filtering technique that is often used to remove noise, such as salt-and-pepper noise. An n × n window (in this study, n = 3) is taken, and the median of the pixels in each window centered around [i, j] is computed by:

1. Sorting the pixels into ascending order according to their gray levels.

2. Selecting the value of the middle pixel as the new value for pixel [i, j].
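As a concrete illustration of Step 1, the per-pixel subtraction of (1), the thresholding of (2), and the 3 × 3 median filter can be sketched in Python. This is a minimal sketch, not the authors' implementation: images are plain nested lists of gray levels, and the function names and threshold value are illustrative.

```python
def subtract(fg, bg):
    # Eq. (1): absolute difference of foreground and background gray levels
    return [[abs(f - b) for f, b in zip(frow, brow)]
            for frow, brow in zip(fg, bg)]

def threshold(img, s):
    # Eq. (2): suppress pixels whose difference is below the threshold s
    return [[0 if p < s else p for p in row] for row in img]

def median3x3(img):
    # 3 x 3 median filter; border pixels are kept as-is for simplicity
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]  # middle of the 9 sorted values
    return out
```

For instance, `median3x3(threshold(subtract(fg, bg), 50))` suppresses small background fluctuations and isolated noise pixels, leaving the moving object in the foreground.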

Fig. 3. Foreground object detection: (a) background image; (b) real-time image; (c) background subtraction image; (d) noise removal image.

After the median filter process, erosion and dilation operations were performed. The erosion and dilation processes are fundamental to morphological image processing. In the erosion process, every object pixel that is close to a background pixel is changed into a background pixel. In the dilation process, every background pixel that is close to an object pixel is changed into an object pixel. This means that the erosion process makes objects smaller and can separate a single object into multiple objects, whereas the dilation process makes objects larger and can merge multiple objects into one object.

Fig. 4. The experimental environment.
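The erosion and dilation rules described above can be sketched directly for a binary foreground mask. This is a minimal sketch with illustrative names: object pixels are 1, background pixels are 0, and "close to" is taken to mean the 8-connected neighborhood.

```python
def neighbors(img, i, j):
    # gray levels of the 8-connected neighborhood (including the pixel itself),
    # clipped at the image border
    h, w = len(img), len(img[0])
    return [img[i + di][j + dj]
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if 0 <= i + di < h and 0 <= j + dj < w]

def erode(img):
    # an object pixel touching any background pixel becomes background
    return [[1 if img[i][j] == 1 and min(neighbors(img, i, j)) == 1 else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def dilate(img):
    # a background pixel touching any object pixel becomes object
    return [[1 if max(neighbors(img, i, j)) == 1 else 0
             for j in range(len(img[0]))] for i in range(len(img))]
```

Applying `dilate(erode(mask))` (a morphological opening) removes isolated noise pixels while roughly restoring the size of the remaining object.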

Step 2—Background Measurement: Fig. 4 shows the experimental environment. The camera was placed in front of the object. An image was obtained with pix(h) × pix(d) pixels that included a pix(h_ref) × pix(d_ref) reference object. However, in the real environment, the image shows an h_real × d_real space. Geometric theory was used in this step to determine the actual length of a pixel in the image. In the experiment, a rectangular reference object was included in the image. Step 1 was repeated for the foreground and background images, as shown in Figs. 5(a) and 5(b), which included the reference object, to obtain the background subtraction image (Fig. 5(c)).

Fig. 5. Foreground reference object detection: (a) background image; (b) real-time image; (c) background subtraction image; (d) estimate of the length d of the background image.

Equation (3) was then used to estimate the real distance (d_real) of the background image (Fig. 5(d)).
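The pixel-to-length conversion used in this step is a simple proportion: the real length spanned by a run of pixels follows from the known reference length. A minimal sketch with illustrative names:

```python
def real_length(pix_d, pix_d_ref, d_ref):
    # d_real = pix(d) / pix(d_ref) * d_ref
    # pix_d: pixels spanning the unknown length
    # pix_d_ref: pixels spanning the reference object
    # d_ref: known real length of the reference object (e.g., in cm)
    return pix_d / pix_d_ref * d_ref
```

For example, if a 5 cm reference object spans 50 pixels, then a 640-pixel image width corresponds to `real_length(640, 50, 5)`, i.e., 64 cm of real space.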

d_real = pix(d) / pix(d_ref) × d_ref    (3)

where pix(d) is the number of pixels in length d, pix(d_ref) is the number of pixels in length d_ref, and d_ref is the length of the reference object.

Step 3—Moving Duration: The moving duration of the object (i.e., object tracking) must be recorded. For example, in Fig. 1, the moving duration t_o is t_B − t_A. At time t_A, the object is detected at position P_A, and at time t_B, it is detected at position P_B.

Step 4—Velocity Estimation: The velocity of object O (V_o) is determined using (4),

V_o = d_o / t_o    (4)

where d_o is the moving distance from point P_A at time t_A to point P_B at time t_B. Therefore, d_o is |P_B − P_A| and t_o is t_B − t_A.
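Combining the background measurement and the moving duration, the velocity computation reduces to a scaled difference quotient. The sketch below assumes, for illustration, horizontal motion (so positions are x-coordinates in pixels) and a centimeters-per-pixel scale obtained from the background measurement; the names are illustrative.

```python
def velocity(p_a, t_a, p_b, t_b, cm_per_pixel):
    # V_o = d_o / t_o, with d_o = |P_B - P_A| converted to real units
    d_o = abs(p_b - p_a) * cm_per_pixel  # moving distance (cm)
    t_o = t_b - t_a                      # moving duration (s)
    return d_o / t_o                     # velocity (cm/s)
```

An object detected at pixel 100 at t = 0 s and at pixel 300 at t = 2 s, with a scale of 0.1 cm per pixel, yields `velocity(100, 0.0, 300, 2.0, 0.1)` = 10 cm/s.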

IV. EXPERIMENTAL RESULTS

In the experiment, a Logitech Webcam 120 was used to obtain an image with a resolution of 640 × 480 pixels. The ball was rolled along the wall, and the camera was placed 20 cm, 30 cm, and 40 cm away from the wall. The reference object was a 1 cm², 2 cm², 5 cm², or 10 cm² square. A 45-degree slope along the wall was used to allow the ball to roll down, and the speed measurement was then performed. Each experimental measurement was taken 30 times to evaluate the accuracy of the proposed method. Fig. 6 shows the development of the experimental platform. The image zone included four images: (a) a background image, (b) a grayscale background image, (c) a foreground image, and (d) a background removal image. The function zone included function buttons (open camera, catch image, grayscale, tracking, background measurement, and stop) and image processing information (image resolution and tracking time).

Fig. 6. Development of the experimental platform.

Fig. 7 shows the estimation error between the real and estimated object velocities when the camera was placed 20 cm away from the wall and the reference object was a 5 cm² square (Case 1). The average estimated error for Case 1 was 0.28 cm/s, with a 1.9 cm/s maximal error and a standard deviation of 0.196. Similarly, Figs. 8 and 9 show the estimation errors between the real and estimated object velocities when the camera was placed 30 cm (Case 2) and 40 cm (Case 3) away from the wall. The average estimated error for Case 2 was 1.66 cm/s, with a 5 cm/s maximal error and a standard deviation of 1.57. The average estimated error for Case 3 was 2.35 cm/s, with a 6.4 cm/s maximal error and a standard deviation of 1.99. When the detected object moved faster, the pixels representing the object in the image also moved faster; therefore, a one-pixel detection error produces a larger velocity estimation error. Fig. 10 shows the object velocity estimation error for each experiment with different d_wall values.

Fig. 7. Object velocity estimation (distance = 20 cm).

Figs. 11 and 12 show the average background measurements (repeated 30 times) and average velocity estimations for reference objects of various sizes with different distances between the camera and wall. When d_wall was 20 cm, the measurement error was less than 1.23 cm. As d_wall increased, the measurement error increased. This is because when the reference object was small, only a few pixels represented it, so a single-pixel measurement error had a greater impact. Similarly, when the distance between the camera and wall increased, the reference object within the image became smaller and the measurement error also increased. Hence, when the reference object was small and the distance between the camera and wall was large, the measurement error increased significantly (e.g., a 1 cm² reference object and a d_wall of 40 cm).

The experimental results reflected the following phenomena:

(1) When the camera was farther away from the wall, the error in the estimated actual length of the wall image was greater.

(2) When the detected object moved faster, the estimated error also increased.

V. CONCLUSIONS

This paper presents a simple image-based object velocity estimation approach. The proposed approach uses a webcam as the experimental equipment instead of complex, expensive, and specialized equipment. The experimental results show that the proposed approach is feasible and practical, with a low average error. Future studies should apply the proposed approach to smartphones and use a more intuitive reference object to estimate spatial location.

Fig. 8. Object velocity estimation (distance = 30 cm).

Fig. 9. Object velocity estimation (distance = 40 cm).

Fig. 10. Estimation error of velocity (d_wall = 20 cm, 30 cm, and 40 cm).

Fig. 11. Reference object size in the background measurement (reference objects of 1 cm × 1 cm, 2 cm × 2 cm, 5 cm × 5 cm, and 10 cm × 10 cm at d_wall = 20 cm, 30 cm, and 40 cm).

Fig. 12. Reference object size in the velocity measurements (reference objects of 1 cm × 1 cm, 2 cm × 2 cm, 5 cm × 5 cm, and 10 cm × 10 cm at d_wall = 20 cm, 30 cm, and 40 cm).

ACKNOWLEDGMENT

This research was supported in part by the National Science Council, Taiwan, ROC, under grant NSC 102-2221-E-324-023.

REFERENCES

[1] J.G. Ellis, K.A. Kramer, and S.C. Stubberud, "Image Correlation Based Video Tracking," 21st International Conference on Systems Engineering (ICSEng), pp. 132-136, 2011.
[2] C. Stauffer and W.E.L. Grimson, "Adaptive background mixture models for real-time tracking," IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2, pp. 246-252, 1999.
[3] K.Z. Lee, "A Simple Calibration Approach to Single View Height Estimation," Ninth Conference on Computer and Robot Vision (CRV), pp. 161-166, 2012.
[4] C.T. Chuang, W.Y. Wang, C.P. Tsai, Y.H. Chien, and M.C. Lu, "An image-based area measurement system," International Conference on System Science and Engineering (ICSSE), pp. 644-648, 2011.
[5] J. Chung, C.P. Grigoropoulos, and R. Greif, "Infrared Thermal Velocimetry in MEMS-Based Fluidic Devices," Journal of Microelectromechanical Systems, vol. 12, pp. 365-372, 2003.
[6] R. Alejandro, C. Adolfo, E. Antoni, G. Enrique, M.E. Jose Luis, and R. Gabino, "Low-power coherent laser radar velocimeter: applications to law enforcement," 25th European Microwave Conference, vol. 1, pp. 485-489, 1995.
[7] L. Chen, H. Yang, T. Takaki, and I. Ishii, "Real-time frame-straddling-based optical flow detection," IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 2447-2452, 2011.
[8] C. Holzmann and M. Hochgatterer, "Measuring Distance with Mobile Phones Using Single-Camera Stereo Vision," 32nd International Conference on Distributed Computing Systems Workshops (ICDCSW), pp. 88-93, 2012.
[9] K. Komiya, T. Kurihara, and S. Ando, "3D Particle Image Velocimetry Using Correlation Image Sensor," Proceedings of SICE Annual Conference (SICE), pp. 2774-2778, 2011.
