
Lecture Notes on Software Engineering, Vol. 1, No. 4, November 2013

Circle Marker Based Distance Measurement Using a Single Camera

Yu-Tao Cao, Jian-Ming Wang, Yu-Kuan Sun, and Xiao-Jie Duan


Abstract—A new distance measurement method using a single camera and a circle marker is presented. The measurement is based on the idea that a circle marker at a longer distance forms a smaller image when the parameters of the imaging system remain unchanged. First, the image region of the circle marker is segmented, and a rotation correction algorithm is designed and applied to recover the image region that would be observed if the circle marker were perpendicular to the camera optical axis. The distance is then calculated from the corrected image region area and the pinhole camera model. Finally, experimental results show that distance information can be retrieved by the proposed method.

Index Terms—Pinhole imaging, circle markers, area, distance.

I. INTRODUCTION

When tracking an object, the distance between a mobile robot and the object is required to control the action of the robot. There are many distance measurement approaches, such as ultrasonic ranging, millimeter-wave radar ranging, laser ranging, and vision-based ranging [1]-[4]. Because of its merits, such as low cost and non-contact operation, vision-based distance measurement has received more and more attention. Paper [5] proposed a vision-based global positioning method that uses an object-based and spatial-layout-based hybrid map; the horizontal centerline of the image, which plays a role similar to a 2D laser range finder, is used to restore depth information and determine the position and orientation of the robot. Paper [6] proposed a stereo-vision-based method, built mainly on the CamShift algorithm, which can track objects in real time and calculate their three-dimensional coordinates accurately. Paper [7] proposed a binocular tracking method that obtains distance through camera calibration and locates the targets' positions in the world coordinate system. Paper [8] proposed a monocular tracking method that uses the size of a rectangular marker, matched with the SURF algorithm, to decide whether the robot should move forward or backward.

The main contribution of this paper is a monocular-camera-based distance measurement algorithm that adopts a circle marker and retrieves distance information from the phenomenon that objects at farther positions form smaller images when the conditions of the imaging system remain unchanged.

Manuscript received June 15, 2013; revised August 26, 2013. The authors are with the School of Electronics and Information Engineering, Tianjin Polytechnic University, Tianjin, China (e-mail: caoyt1989@163.com). DOI: 10.7763/LNSE.2013.V1.80

II. MATHEMATICAL FOUNDATION

This method uses a monocular camera to obtain images of the circle marker and retrieves distance information by analyzing the images. The relationship between the distance and the images of the circle marker is analyzed and established in this section.

A. Distance Measurement Principle

Based on the pinhole imaging theory [9], the principle used in this paper is shown in Fig. 1.

Fig. 1. Principle of this paper.

In the world coordinate system, the radius of the marker is R and its area is S. In the camera coordinate system, the radius of the imaged marker is r and its area is s; the focal length is f, and the object distance to be calculated is d. Based on the pinhole imaging principle and similar triangles, the following relationships are obtained:

$$\frac{f}{d} = \frac{r}{R} \tag{1}$$

$$\frac{f^2}{d^2} = \frac{r^2}{R^2} = \frac{s}{S} \tag{2}$$

$$d = \frac{f}{\sqrt{s/S}} \tag{3}$$
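As an illustration of how formula (3) can be applied in code, the following minimal Python sketch estimates the distance from a measured marker area; the function name, the focal length expressed in pixels, and the marker radius are illustrative assumptions rather than values used in the experiments.

```python
import math

# Minimal sketch of formula (3): d = f / sqrt(s / S).
# f_pixels is the focal length expressed in pixels and marker_radius_cm is the
# known radius R of the circle marker; both are illustrative assumptions here.
def distance_from_area(area_pixels: float, f_pixels: float, marker_radius_cm: float) -> float:
    """Estimate object distance d (in cm) from the imaged marker area (in pixels^2)."""
    S = math.pi * marker_radius_cm ** 2          # true marker area S (cm^2)
    return f_pixels / math.sqrt(area_pixels / S)
```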
For a planar shape, computing the area can be regarded as evaluating the definite integral of a function, and it can be shown that formula (3) still holds in this case. In actual tracking the target may take various shapes, but according to the meaning of area, the area of an irregular shape can also be obtained by a definite integral.

Based on the definition of the definite integral, suppose the function f(x) is continuous on the interval [a, b]. Divide [a, b] into n subintervals [x_0, x_1], [x_1, x_2], …, [x_{n-1}, x_n], where a = x_0 < x_1 < … < x_n = b, and let the length of each subinterval be Δx_i = x_i − x_{i−1}. In each subinterval [x_{i−1}, x_i], take a point ξ_i (i = 1, 2, …, n) and form the summation, which gives

$$\frac{\sum_{i=1}^{n} f(\xi_i)\,\Delta x_i}{\sum_{i=1}^{n} F(\xi_i)\,\Delta x_i}
= \frac{\dfrac{b-a}{n}\sum_{i=1}^{n} f(\xi_i)}{\dfrac{b-a}{n}\sum_{i=1}^{n} F(\xi_i)}
= \frac{f^{2}}{d^{2}} \tag{4}$$

where f(x) describes the profile of the marker in the image and F(x) describes the profile of the marker in the world coordinate system. Denote

$$s = \int_{a}^{b} f(x)\,dx \tag{5}$$

$$S = \int_{a}^{b} F(x)\,dx \tag{6}$$

Suppose λ = max{Δx_1, Δx_2, …, Δx_n} (λ is the largest subinterval length). When λ → 0, the two summations in formula (4) converge to constants, and these constants are the areas s and S of f(x) and F(x) on the interval [a, b]. Therefore s/S = f²/d², and even if the target is an irregular shape its distance can still be calculated by formula (3).
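The argument can also be checked numerically. The following sketch (an arbitrary irregular profile F(x) and an assumed scale factor k = f/d, both purely illustrative) verifies that the ratio of the two Riemann-sum areas approaches (f/d)², as required by formula (3).

```python
import numpy as np

# Riemann-sum check of the area argument: if the image profile is the world
# profile scaled by k = f/d in both directions, i.e. f(x) = k * F(x / k) on
# [k*a, k*b], the ratio of the two areas tends to k^2, so formula (3) also
# applies to irregular shapes.  F(x) below is an arbitrary irregular profile.
def F(x):
    return 2.0 + np.sin(3 * x) + 0.5 * np.cos(7 * x)   # world-side profile, F(x) > 0

k = 0.2                                   # assumed scale factor f/d
a, b, n = 0.0, 4.0, 100_000
dx = (b - a) / n
xi = a + (np.arange(n) + 0.5) * dx        # sample points xi_i in each subinterval

S = np.sum(F(xi)) * dx                    # world area  S ~ integral of F(x)
s = np.sum(k * F(xi)) * (k * dx)          # image area  s ~ integral of k*F(x/k) over [k*a, k*b]
print(s / S, k ** 2)                      # both are approximately 0.04
```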
During the movement of the target, a deflection (rotation) transform may occur together with a translation along an axis. When the target undergoes only a translation transform, it moves within a plane and its distance from the camera does not change.
B. Rotational Movement of the Marker

Since photographs are two-dimensional, after the marker rotates, its projection in the image becomes an ellipse. A distance calculated directly from this area would have a large error, so a correction is needed to recover the area of the circle at that distance.

The three basic (gimbal-like) rotation matrices [10], which rotate vectors about the x, y, or z axis in three dimensions, are

$$R_x(\alpha) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix},\quad
R_y(\beta) = \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix},\quad
R_z(\gamma) = \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

When the marker rotates around any axis in three dimensions, it is easy to verify that the length of the longer axis of the projection obtained after the rotation equals the diameter of the circle at the same distance.

Assume the marker is rotated about the x, y, and z axes by angles α, β, and γ, respectively. After the rotation about the x axis, the new coordinates (x′, y′, z′) are

$$\begin{bmatrix} x' \\ y' \\ z' \end{bmatrix}
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \end{bmatrix}
= \begin{bmatrix} x \\ y\cos\alpha - z\sin\alpha \\ y\sin\alpha + z\cos\alpha \end{bmatrix} \tag{7}$$

After the subsequent rotation about the y axis, the new coordinates (x″, y″, z″) are

$$\begin{bmatrix} x'' \\ y'' \\ z'' \end{bmatrix}
= \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}
\begin{bmatrix} x' \\ y' \\ z' \end{bmatrix}
= \begin{bmatrix} x\cos\beta + y\sin\alpha\sin\beta + z\cos\alpha\sin\beta \\ y\cos\alpha - z\sin\alpha \\ -x\sin\beta + y\sin\alpha\cos\beta + z\cos\alpha\cos\beta \end{bmatrix} \tag{8}$$

After the final rotation about the z axis, the new coordinates (x‴, y‴, z‴) are

$$\begin{bmatrix} x''' \\ y''' \\ z''' \end{bmatrix}
= \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x'' \\ y'' \\ z'' \end{bmatrix}
= \begin{bmatrix} x\cos\beta\cos\gamma + y\sin\alpha\sin\beta\cos\gamma + z\cos\alpha\sin\beta\cos\gamma - y\cos\alpha\sin\gamma + z\sin\alpha\sin\gamma \\ x\cos\beta\sin\gamma + y\sin\alpha\sin\beta\sin\gamma + z\cos\alpha\sin\beta\sin\gamma + y\cos\alpha\cos\gamma - z\sin\alpha\cos\gamma \\ -x\sin\beta + y\sin\alpha\cos\beta + z\cos\alpha\cos\beta \end{bmatrix} \tag{9}$$

By the camera imaging principle, the projection of the rotated marker onto the plane z = 0 is an ellipse. For a point (x, y, 0) of the unit circle, the rotated coordinates are (IA, IB, IC). Since rotation preserves distances, calculating the maximum distance to the center gives the long axis of the ellipse:

$$l = \sqrt{(IA)^2 + (IB)^2 + (IC)^2} = \sqrt{x^2 + y^2} = 1 \tag{10}$$

where

$$\begin{aligned}
IA &= x\cos\beta\cos\gamma + y\sin\alpha\sin\beta\cos\gamma - y\cos\alpha\sin\gamma \\
IB &= x\cos\beta\sin\gamma + y\sin\alpha\sin\beta\sin\gamma + y\cos\alpha\cos\gamma \\
IC &= -x\sin\beta + y\sin\alpha\cos\beta
\end{aligned}$$
Therefore, when the circle marker rotates around its center, the area of the circle at this distance can be recovered from the measured length of the major axis.
III. ALGORITHM IMPLEMENTATION

To use the algorithm in a real application, its implementation is discussed in this section. The algorithm flowchart is shown in Fig. 2.

Fig. 2. Algorithm flowchart.

A. Image Data Acquisition

One monocular camera is used to obtain images of the circle marker. The camera can be shared with other functions, such as visual navigation; however, the camera's intrinsic parameters have to be known.
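The paper does not specify how the intrinsic parameters are obtained. One common option, sketched here under assumed board size and image file names, is a standard chessboard calibration with OpenCV, which yields the focal length in pixels used by formula (3).

```python
import cv2
import numpy as np

# Illustrative chessboard calibration sketch; the pattern size and file names
# are assumptions, not part of the original experimental setup.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for name in ["calib_01.jpg", "calib_02.jpg"]:          # illustrative file names
    gray = cv2.cvtColor(cv2.imread(name), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
f_pixels = K[0, 0]                                     # focal length in pixels for formula (3)
```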

B. Image Preprocessing

Before the images are used to recover distance information, image preprocessing algorithms are applied to improve image quality. The image data is then transformed into the HSI color space, which helps reduce the negative effect of lighting conditions on marker image segmentation.
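A minimal preprocessing sketch in this spirit is shown below. OpenCV provides HSV rather than HSI, but both separate chromatic information from intensity, which is what reduces the influence of lighting on segmentation; the input file name and the hue/saturation range for a red marker are assumptions.

```python
import cv2

# Preprocessing sketch: denoise, move to an HSV representation, and gate on the
# marker colour.  'marker.jpg' and the red hue range are illustrative values.
img = cv2.imread("marker.jpg")
img = cv2.GaussianBlur(img, (5, 5), 0)                 # mild denoising
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))  # hue/saturation gate for a red marker
```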
C. Marker Segmentation

The marker is segmented using its color characteristics. However, objects with the same color characteristics as the marker may be present in the environment. Therefore, according to the actual shooting environment, the largest region is selected after all objects of the same color have been segmented, and this region is used as the marker region.
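One way to realize this largest-region rule is a connected-component pass over the color mask from the previous sketch; this is an illustrative sketch rather than the exact implementation used in the paper.

```python
import cv2
import numpy as np

# Keep the largest connected region of the colour mask and treat it as the marker.
num, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
areas = stats[1:, cv2.CC_STAT_AREA]                    # skip label 0 (background)
marker_label = 1 + int(np.argmax(areas))
marker_mask = (labels == marker_label).astype(np.uint8) * 255
marker_area_px = float(areas[marker_label - 1])        # s in formula (3), in pixels^2
```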
D. Center Localization

Since the marker is circular, its image becomes an ellipse after rotation, and the pattern is regular. In order to obtain the long axis of the ellipse, which corresponds to the diameter of the circle, the center of the planar marker region is needed. By definition, the coordinates of the center are the first moments of x and y, namely

$$x_0 = \frac{\iint x\,F(x,y)\,dx\,dy}{\iint F(x,y)\,dx\,dy}, \qquad
y_0 = \frac{\iint y\,F(x,y)\,dx\,dy}{\iint F(x,y)\,dx\,dy} \tag{11}$$

Thus the center point of the ellipse is obtained as (x_0, y_0).
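In OpenCV, formula (11) corresponds to the image moments of the segmented marker mask; the sketch below reuses marker_mask from the segmentation sketch.

```python
import cv2

# Formula (11) as image moments: the centre (x0, y0) is the centroid of the
# binary marker mask produced by the segmentation step.
m = cv2.moments(marker_mask, binaryImage=True)
x0 = m["m10"] / m["m00"]
y0 = m["m01"] / m["m00"]
```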
E. Rotation Correction

Connecting the center of the ellipse to the boundary points, the length of each connection is calculated in turn, together with its angle to the x coordinate axis, and a curve is fitted to length versus angle. If the fitted graph shows distinct peaks and valleys, the marker has rotated, and the major-axis length must be used to obtain the correct area; if not, the detected area is used directly to calculate the distance.

The two boundary points whose angles differ by 180 degrees and whose total distance from the center is maximal are selected; the connection between these two points is the major axis. Results of the process are shown in Fig. 3.

Fig. 3. The curve fitting used to calculate the ellipse long axis: (a) the original picture; (b) the corresponding fitting graph.
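A possible realization of this step is sketched below, reusing marker_mask, x0, and y0 from the earlier sketches; the peak-and-valley threshold is an arbitrary assumption. It measures the center-to-boundary distance as a function of angle and takes the largest sum of two roughly opposite radii as the major-axis length.

```python
import cv2
import numpy as np

# Boundary distances from the centre (x0, y0) as a function of angle, and the
# major axis as the largest sum of two radii about 180 degrees apart.
contours, _ = cv2.findContours(marker_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
boundary = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)

dx, dy = boundary[:, 0] - x0, boundary[:, 1] - y0
radii = np.hypot(dx, dy)
angles = np.degrees(np.arctan2(dy, dx)) % 360.0

# for each boundary point, find the radius of the point closest to 180 deg away
opposite = np.array([radii[np.argmin(np.abs((angles - a + 180.0) % 360.0 - 180.0))]
                     for a in angles + 180.0])
major_axis = float(np.max(radii + opposite))           # major-axis length in pixels

# distinct peaks and valleys in the radius-vs-angle curve indicate rotation
rotated = (radii.max() - radii.min()) > 0.05 * radii.mean()   # assumed threshold
```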
F. Distance Calculation

From the major-axis length, the area of the circle is recovered, and the distance between the marker and the robot can then be calculated.
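Putting the pieces together, this final step can be sketched as follows, reusing major_axis from the previous sketch and distance_from_area from the sketch after formula (3); the focal length and marker radius are again illustrative values.

```python
import math

# Recover the circle area from the measured major axis (the circle diameter at
# this distance), then apply formula (3).  f_pixels and marker_radius_cm are
# illustrative values, not calibration results from the paper.
recovered_area_px = math.pi * (major_axis / 2.0) ** 2
d_cm = distance_from_area(recovered_area_px, f_pixels=1200.0, marker_radius_cm=10.0)
print(f"estimated distance: {d_cm:.1f} cm")
```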

IV. EXPERIMENTAL RESULTS AND ANALYSIS

In order to validate the proposed algorithm, several experiments were carried out. First, eight pictures were taken at different distances between the camera and the experimental marker to verify the distance measurement. Then three sets of pictures were taken at three distances, with the marker at different angles, to verify the rotation correction.

A. Distance Verification

The circle segmentation maps for the different distances are shown in Fig. 4. Since objects with color characteristics similar to the marker may exist in the environment during filming, the segmented images are somewhat affected. Nevertheless, as the data show, the error of the calculated distance is less than 2 cm, which is within an acceptable range for practical applications. The data are shown in Fig. 5 and Table I.

Fig. 4. Circle segmentation maps of different distances, panels (a)-(h).

Fig. 5. Measured value and the true value chart.

TABLE I: DISTANCE VERIFICATION DATA

Quantity                 Group1    Group2    Group3     Group4     Group5     Group6     Group7     Group8
Measured area (cm^2)     7326      5617      4476       3582       3034       2567       2207       1880
Measured distance (cm)   70.8039   80.8609   90.5828    101.2577   110.0229   119.6129   129.0000   139.7694
True distance (cm)       70.0000   80.0000   90.0000    100.0000   110.0000   120.0000   130.0000   140.0000
Relative error (%)       1.148     0.8609    0.5828     1.2577     0.0229     0.3871     1.0000     0.2306

TABLE II: ROTATION VERIFICATION DATA

Distance    Diameter    1         2         3         4         5         6         7
110 (cm)    32.3310     31.9531   32.311    32.5576   32.0156   33.1059   32.249    31.9061
120 (cm)    25.7099     25.0000   24.6982   25.2389   25.1794   25.318    24.7386   25.0200
140 (cm)    22.6274     22.4722   22.1359   23.1949   22.2036   22.4722   22.4722   21.5870

TABLE III: ERROR ANALYSIS

Relative error (%)   1          2          3          4          5          6          7
110 (cm)             1.168847   0.062592   0.701309   0.968745   2.420383   0.24769    1.31756
120 (cm)             2.761193   4.0468     1.907022   2.101914   1.556431   3.836401   2.788759
140 (cm)             0.685894   2.187147   2.563709   1.827126   0.698986   0.690631   4.62972
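The relative errors in Table III follow from Table II as |measured − reference| / reference × 100%; for example, the 110 cm row can be checked as follows (values copied from Table II).

```python
import numpy as np

# Relative error of each measured major-axis length against the reference
# diameter, for the 110 cm row of Table II; compare with Table III.
reference = 32.3310
measured = np.array([31.9531, 32.311, 32.5576, 32.0156, 33.1059, 32.249, 31.9061])
print(np.abs(measured - reference) / reference * 100.0)
```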

B. Rotation Verification

In order to verify whether the actual rotation of the marker complies with the above principles, three sets of data were collected at distances of 110 cm, 120 cm, and 140 cm. The marker was rotated by arbitrary angles, and the major-axis length was then measured. The data obtained are shown in Table II and Table III.

From these data it can be seen that the proposed method obtains the long axis and then uses the major-axis length to recover the circle; calculating the distance from the recovered area of the circle marker gives a more accurate result.
V. CONCLUSION

In order to solve the difficulties encountered when measuring distance during real-time tracking, a new distance measurement method using a single camera and a circle marker has been presented. First, a mathematical model of distance measurement based on a circular marker is established, and the impact of marker rotation is analyzed in order to eliminate its effect on the measurement result. An algorithm framework that can be used in an actual system implementation is then proposed; the algorithm has been programmed, and its validity and accuracy have been verified.
REFERENCES
[1] H. W. Zhao, Y. L. Zhang, H. Yue, and H. G. Cai, "Design and application of distributed ultrasonic ranging system for autonomous mobile robots," WSEAS Transactions on Circuits and Systems, vol. 6, no. 2, pp. 228-234, February 2007.
[2] Kamal and K. Aditya, "Millimeter wave radar," Onde Electrique, vol. 69, no. 6, pp. 70-77, Nov.-Dec. 1989.
[3] X. Liu et al., "Robot tracking using vision and laser sensors," in Proc. 4th IEEE Conference on Automation Science and Engineering (CASE 2008), 2008, pp. 169-174.
[4] S. Bahadori, L. Iocchi, G. R. Leone, D. Nardi, and L. Scozzafava, "Real-time people localization and tracking through fixed stereo vision," Lecture Notes in Computer Science, pp. 44-54, 2005.
[5] S.-Y. Park et al., "Coarse-to-fine vision-based localization for mobile robots using an object and spatial layout-based hybrid map," in Proc. International Conference on Control, Automation and Systems, 2008, pp. 2113-2116.
[6] Z. W. Zhou, M. Xu, W. Fu, and J. Z. Zhao, "Object tracking and positioning based on stereo vision," Sensor, Measurement and Intelligent Materials, 2013, pp. 313-317.
[7] K. Zhu, T. W. Yang, and Q. Q. Ruan, "Real-time tracking and measuring of moving objects based on binocular vision," Robot, vol. 31, no. 4, pp. 327-334, 2009.
[8] P. T. Huang, C. Y. Li, C. C. Hu, and C. M. Hong, "Object following based on SURF for mobile robots," in Proc. IEEE Global Conference on Consumer Electronics, 2012, pp. 382-386.
[9] R. E. Woods, Digital Image Processing Using MATLAB, Beijing: Electronic Industry Press, ch. 7, 2008.
[10] R. Szeliski. (December 23, 2008). Computer Vision: Algorithms and Applications. [Online]. Available: http://szeliski.org/Book/

Yu-Tao Cao graduated from Qingdao Technological University with a bachelor's degree in electronics and communication engineering. She is now a postgraduate student in the School of Electronics and Information Engineering, Tianjin Polytechnic University, and her main research interests are intelligent image processing, computer vision, and navigation.

Jian-Ming Wang received the Ph.D. degree in electrical engineering from Tianjin University in 2003. In 2004 he was a visiting scholar at Queen's University Belfast, and during 2007 and 2008 he was a visiting researcher at Carnegie Mellon University. Since 2010 he has been a full professor of electronic information engineering at Tianjin Polytechnic University. His research interests include electronic system design, signal processing, computer vision, and pattern recognition.

Yu-Kuan Sun graduated from Tianjin Polytechnic University with a bachelor's degree in electronic and information engineering. He is now a postgraduate student in the School of Electronics and Information Engineering, Tianjin Polytechnic University, and his main research interests are intelligent image processing, computer vision, and industrial control systems.

Xiao-Jie Duan is a lecturer in the School of Electronics and Information Engineering, Tianjin Polytechnic University, China. He obtained his Ph.D. from Tianjin University in 2013. His main research directions are photoelectronic testing technology and vision inspection.
