
2009 Fifth International Conference on Image and Graphics

Robot Body Guided Camera Calibration:


Calibration Using an Arbitrary Circle

Qian Chen, Haiyuan Wu and Toshikazu Wada


Wakayama University, Faculty of Systems Engineering,
930 Sakaedani, 640-8510, Wakayama, Japan
{chen, wuhy, twada}@sys.wakayama-u.ac.jp

Abstract

This paper presents a novel camera calibration method for a system consisting of a sensor-less mobile robot and an external camera. The method can estimate the focal length and the viewing direction of the camera relative to the plane on which the sensor-less mobile robot moves, from a single image of an arbitrary circle on that plane. Since it is easy to make a four-wheel mobile car move along a circular locus, our method does not require specially designed calibration objects. The method does not use the position of the center or the radius of the circle, and the circle is allowed to be partially occluded. Through computer simulations and experiments with a real camera and a robot, we have confirmed that our method is more robust than the Homography-based calibration using four point correspondences.

1 Introduction

A camera calibration is necessary whenever the camera setting has been changed. Therefore, much time is consumed on calibration before a robot can be controlled after a new setup. To eliminate this undesired time consumption, we developed a new technique for camera calibration that uses the robot body itself and that can be incorporated into the system as an initialization operation.

We are conducting research on environmental recognition using mobile robots. We use a system consisting of 1) a sensor-less mobile robot; 2) a radio wave controller to send instructions to the robot; 3) an external camera fixed in the environment; and 4) a computer. In this system, we employ a four-wheel radio-controlled car (hereafter called "RC car") moving on a plane as the calibration object. That is, we use the locus of its movement as a calibration pattern. Without a navigation system, it is difficult to make the RC car trace a straight-line locus. However, by keeping a low speed and a constant steering angle, it is easy to make the RC car draw a circular trajectory. The circular trajectory is projected as an ellipse onto the image plane of a fixed external camera (Figure 1). Based on this geometric property, we developed a new method for camera calibration that uses one single image of an arbitrary circle. This method can determine the focal length f and the tilt angle θ that describes the view direction of the camera relative to the working plane.

Figure 1. The system construction

Several studies of camera calibration, 3D structure estimation and motion analysis using circular patterns [1]-[4] or conic patterns [5]-[11] have been reported so far. They all assume multi-viewpoint images.

Meng et al. [1] proposed a method that uses at least three images, taken from different viewpoints, of a planar pattern containing a circle and lines passing through its center. Kim et al. [3] proposed a method using concentric circles; it requires two views. Yang et al. [4] proposed a similar method except that ellipses are used instead of circles. All of these are methods for estimating the camera intrinsic parameters.

Long [5] proposed a method to find the correspondence between conics in two views and to estimate the relative orientation of the optical axes of the two views.

Kanatani and Wu [8],[9] reported a method to extract 3D information from conics in images. The intrinsic and extrinsic camera parameters are supposed to be known.

Avidan et al. [6] proposed an approach to estimate the 3D
positions of a moving point along a conic section viewed by a monocular moving camera. It requires nine views. If the point is known to move along a circle, it needs seven views.

Homography can be used as a camera calibration method with a planar pattern. It requires at least four point correspondences to estimate the matrix that converts points on one image plane to the corresponding points on another image plane. By decomposing the Homography matrix, we can estimate the normal vector of the plane, the translation, and the rotation between the two cameras. However, the decomposition is not unique [12][13].

Figure 2. Conversion between two cameras (a real camera and a virtual camera, related by a Homography) that observe the same plane

By assuming that the roll angle of the camera is zero, we developed a unique method that can estimate the direction of the optical axis of the camera and its focal length simultaneously from a single image of an arbitrary circle on the plane. We consider that the assumption of a zero roll angle is not too strict a limitation. In the case of monitoring mobile robots that move on a planar floor, there is seldom a need for a non-zero roll angle, and adjusting a camera to have a zero roll angle is easy.

Although some of the extrinsic camera parameters (such as the pan angle and the complete translation of the camera) cannot be determined by our method, the estimated information is enough to convert the images taken by the external camera into a vertical view of the work plane, from which the correct movement of the robot can be observed (Figure 2). To be concrete, we first let the robot move along a circle on the work plane, which is observed by the external camera. Next, the focal length and the view direction of the camera relative to the work plane are determined from the circular trajectory projected onto the image plane.

In this method, the center of the circle and the radius of the circle are not used for the calibration, and the circle is allowed to be partially occluded. Furthermore, we do not use the speed of the robot, because it is difficult to measure. Through computer simulations and experiments with a real camera and an RC car, we have confirmed that our method is more robust than the Homography-based calibration using four point correspondences.

2 Basic Equations

2.1 Assumptions

In this paper, we assume that 1) the roll angle is zero; 2) the camera is a pinhole camera with unknown focal length; and 3) the speed of the robot, and the radius and the center of the circles drawn by the robot, are all unknown. As shown in Figure 3, 1) the world coordinate system (O-XYZ) and 2) the camera coordinate system (O'-X'Y'Z') are used in our research. The origins of both coordinate systems are set at the optical center of the fixed external camera. We let the normal vector of the working plane be parallel to the Y-axis of O-XYZ, and the optical axis be the Z'-axis of O'-X'Y'Z'. The X-axis of O-XYZ is set to be perpendicular to the Y-axis and the Z'-axis. Because we assume that the roll angle of the camera is zero, the X'-axis of O'-X'Y'Z' is parallel to the working plane.

Figure 3. The world coordinate system and the camera coordinate system; the center of the circle is at (xc, yc, zc)

The scene configuration is characterized by the following parameters: 1) the distance h between the working plane and the optical center; 2) the radius r of the circle; 3) the center of the circle (xc, yc, zc); and 4) the focal length f.

The relation between the world coordinate system and the camera coordinate system is a rotation about the X-axis by the tilt angle θ,

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} \qquad (1)$$

the circle is described by,

$$\begin{cases} (x - x_c)^2 + (z - z_c)^2 = r^2 \\ y = y_c = -h \end{cases} \qquad (2)$$

and the perspective projection of the camera is given by,

$$x_I = -f\,\frac{x'}{z'}, \qquad y_I = -f\,\frac{y'}{z'} \qquad (3)$$

where $(x_I, y_I)$ are the image coordinates of a scene point $(x', y', z')$ in camera coordinates.
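To make the camera model of Eqs. (1)-(3) concrete, the following Python sketch synthesizes the image points of a circle on the working plane. It is a minimal illustration assuming NumPy; the function name and argument conventions are ours, not part of the paper.

```python
import numpy as np

def project_circle(f, theta, h, xc, zc, r, n=360):
    """Project n points of the circle of Eq. (2) through the tilted
    pinhole camera of Eqs. (1) and (3); returns image points (x_I, y_I)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Points on the working plane y = -h, in world coordinates (Eq. (2)).
    P = np.stack([xc + r * np.cos(t),
                  np.full_like(t, -h),
                  zc + r * np.sin(t)])
    # Rotation about the X-axis by the tilt angle theta (Eq. (1)).
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[1, 0, 0],
                  [0, c, -s],
                  [0, s,  c]])
    Pc = R @ P
    # Perspective projection (Eq. (3)).
    return -f * Pc[0] / Pc[2], -f * Pc[1] / Pc[2]
```

Fitting a conic to the points returned by this function reproduces the projected ellipse studied in the next subsection.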
2.2 Formulation

In the general case, a circle on the working plane is projected onto the image plane as an ellipse. The ellipse detected from the image [16][17] is described by the following equation,

$$x_I^2 + B\,x_I y_I + C\,y_I^2 + D\,x_I + E\,y_I + F = 0. \qquad (4)$$
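As an illustration, the coefficients (B, C, D, E, F) of Eq. (4) can be recovered from detected edge points by a plain least-squares conic fit, normalized so that the $x_I^2$ coefficient is 1. The sketch below uses a simple SVD null-space solution; the numerically stable fitters of [16][17] are preferable in practice, and the function name is ours.

```python
import numpy as np

def conic_coefficients(xi, yi):
    """Fit a conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + g = 0 to image
    points by least squares, then normalize so the x^2 coefficient is 1,
    returning (B, C, D, E, F) of Eq. (4)."""
    M = np.column_stack([xi**2, xi * yi, yi**2, xi, yi, np.ones_like(xi)])
    # The coefficient vector is the null-space direction of M, i.e. the
    # right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(M)
    a, b, c, d, e, g = Vt[-1]
    return b / a, c / a, d / a, e / a, g / a
```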
From Eqs. (1), (2), (3) and (4), the equations that describe the relation between the parameters of the ellipse (B, C, D, E, F), the parameters of the camera (f, θ) and the circle (xc, yc, zc, r) are obtained as follows,

$$\begin{aligned}
B &= 2\,x'_c \cos\theta & (5\text{-}1)\\
C &= (\sin\theta - z'_c\cos\theta)^2 + (x'^2_c - r'^2)\cos^2\theta & (5\text{-}2)\\
D &= -2 f x'_c \sin\theta & (5\text{-}3)\\
E &= 2f\left[z'_c(\sin^2\theta - \cos^2\theta) + (1 - x'^2_c - z'^2_c + r'^2)\sin\theta\cos\theta\right] & (5\text{-}4)\\
F &= f^2\left[(z'_c\sin\theta + \cos\theta)^2 + (x'^2_c - r'^2)\sin^2\theta\right] & (5\text{-}5)
\end{aligned} \qquad (5)$$

where $x'_c = x_c/h$, $z'_c = z_c/h$, and $r' = r/h$.
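Eq. (5) can be checked symbolically by back-projecting an image point onto the plane y = -h, substituting into the circle equation (2), and collecting the conic coefficients. The following sketch of such a verification assumes SymPy and is our own construction, not part of the paper.

```python
import sympy as sp

f, th, h, xc, zc, r = sp.symbols('f theta h x_c z_c r', positive=True)
xI, yI = sp.symbols('x_I y_I')

# A point imaged at (x_I, y_I) lies on the plane y = -h at world
# coordinates x = x_n/den, z = z_n/den (from inverting Eqs. (1)-(3)).
den = yI * sp.cos(th) - f * sp.sin(th)
xn = -xI * h
zn = h * (f * sp.cos(th) + yI * sp.sin(th))

# Circle equation (2) with the denominator cleared; dividing by h^2
# makes the x_I^2 coefficient equal to 1, matching Eq. (4).
conic = sp.expand(((xn - xc * den)**2 + (zn - zc * den)**2
                   - (r * den)**2) / h**2)
poly = sp.Poly(conic, xI, yI)

xcp, zcp, rp = xc / h, zc / h, r / h      # x'_c, z'_c, r'
expected = {
    xI**2: sp.Integer(1),
    xI * yI: 2 * xcp * sp.cos(th),                               # (5-1)
    yI**2: (sp.sin(th) - zcp * sp.cos(th))**2
           + (xcp**2 - rp**2) * sp.cos(th)**2,                   # (5-2)
    xI: -2 * f * xcp * sp.sin(th),                               # (5-3)
}
for mono, val in expected.items():
    assert sp.simplify(poly.coeff_monomial(mono) - val) == 0
print("Eq. (5) coefficients verified")
```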
In order to determine f and θ, the unknowns $x'_c$, $z'_c$ and $r'$ need to be eliminated from Eq. (5). By computing Eq. (5-3)/Eq. (5-1), the following is derived,

$$f \tan\theta = -\frac{D}{B}. \qquad (6)$$

From Eqs. (5-4) and (5-2), the following equations can be derived,

$$\begin{cases} \sin 2\theta\,(x'^2_c + z'^2_c - r'^2) + 2\cos 2\theta\, z'_c = \sin 2\theta - \dfrac{E}{f} \\[4pt] (1 + \cos 2\theta)(x'^2_c + z'^2_c - r'^2) - 2\sin 2\theta\, z'_c = 2C - 1 + \cos 2\theta \end{cases} \qquad (7)$$

Solving Eq. (7), the following is obtained,

$$x'^2_c + z'^2_c - r'^2 = (1 - C)\tan^2\theta - \frac{E}{f}\tan\theta + C. \qquad (8)$$

By computing Eq. (5-2) $\times f^2$ + Eq. (5-5), we have,

$$x'^2_c + z'^2_c - r'^2 = \frac{f^2(C-1) + F}{f^2}. \qquad (9)$$

Substituting Eq. (9) for $x'^2_c + z'^2_c - r'^2$ in Eq. (8), the following equation is inferred,

$$\frac{f^2(C-1) + F}{f^2} = (1 - C)\tan^2\theta - \frac{E}{f}\tan\theta + C. \qquad (10)$$

From Eq. (6) and Eq. (10), f and θ can be determined as follows,

$$f = \frac{\sqrt{B^2 F - BDE + (C-1)D^2}}{|B|}, \qquad \theta = \arctan\left(-\frac{D}{fB}\right). \qquad (11)$$
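Eq. (11), together with Eq. (6), gives the calibration in closed form. A direct transcription in Python (assuming the conic has been normalized as in Eq. (4); the function name is ours, not the paper's):

```python
import numpy as np

def calibrate_from_conic(B, C, D, E, F):
    """Closed-form focal length f and tilt angle theta from the
    normalized conic coefficients of Eq. (4), following Eq. (11)."""
    f = np.sqrt(B**2 * F - B * D * E + (C - 1.0) * D**2) / abs(B)
    theta = np.arctan(-D / (f * B))
    return f, theta
```

Chaining this with the projection and fitting sketches above on a synthetic circle should recover f and θ up to numerical precision.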

3 Experimental Results using Simulation Images

In this simulation, we use the world coordinate system and the camera coordinate system shown in Figure 3. We assume that the size of the image sensor is 0.3 × 0.4 [inch] and that the image resolution is 640 × 480 [pixel]; the size of one pixel is therefore 0.015875 [mm]. The distance between the optical center of the camera and the working plane is 3 [meter].

3.1 Dependency between estimation error and position of the circle

Assuming that the focal length f is 550 [pixel] and the tilt angle θ is 55 [degree], circles with radii of 0.25, 0.50, or 0.75 [meter] were scattered over the working plane. We tested our method using each image of an ellipse synthesized with these parameter settings.

We first used 8 images of a circle with a 0.75 [meter] radius, where the center of the circle differs from image to image. The RMS error and the maximum error of the estimated focal length and tilt angle are shown in Table 1, denoted by f1 and θ1, respectively.

Table 1. Estimation errors for circles with different radii

                 maximum error    RMS error
  f1 (%)              3.1            0.7
  θ1 (degree)         0.21           0.066
  f2 (%)              4.0            0.7
  θ2 (degree)         0.38           0.055
  f3 (%)             24.7            2.0
  θ3 (degree)         2.29           0.30

The image that gives the maximum estimation error for both the focal length and the tilt angle is shown in Figure 4(a), where the size of the ellipse was 211 × 162 pixels. Similar experiments using circles of 0.50 [meter] and 0.25 [meter] radius were also performed; 28 and 72 images were synthesized, and the resulting RMS and maximum errors are shown in Table 1, denoted by f2, θ2 and f3, θ3, respectively. The images that give the maximum estimation error are shown in Figures 4(b) and (c), where the sizes of the ellipses were 127 × 84 and 90 × 42 pixels, respectively.

Figure 4. Images showing big errors: (a) r=0.75 (m), (x,z)=(-0.4,-2.6); (b) r=0.50 (m), (x,z)=(-0.3,-3.4); (c) r=0.25 (m), (x,z)=(-0.1,-3.6)

Two factors can be considered as causes of a large estimation error. One is a small ellipse in the image, in which case the quantization error becomes relatively large. The other is an estimated value of B that is too small; for example, B of the ellipse shown in Figure 4(c) was 0.03, so the calculation error of formula (11) becomes large.

3.2 Dependency between estimation error and tilt angle

In order to investigate the dependency of the estimation error on the tilt angle of the camera, we synthesized images by changing the tilt angle from 20 to 60 degrees in steps of 10 degrees, while the radius of the circles on the working plane was 0.5 [meter] and the focal length was 550 [pixel]. The RMS error of the focal length f was 1.2%, and that of the tilt angle θ was 0.06 degrees. In the case of circles with r = 0.25 [meter], the tilt angle θ was changed from 40 to 80 [degree] in steps of 10 degrees (Figure 5); the RMS error of the focal length f was 1.3%, and that of the tilt angle θ was 0.15 [degrees]. The estimation error can therefore be considered independent of the tilt angle.

Figure 5. Synthesized images using different tilt angles: (a) 40, (b) 60, (c) 80 [degree]

From these experimental results, we can say that accurate estimation results can be obtained with our method, except in the cases where 1) the center of the ellipse in the image is too close to the y-axis of the image coordinates, or 2) the circle being viewed is too far from the camera, both of which cause the value of B to become too small.

3.3 Dependency between estimation error and roll angle

Although we assumed that the roll angle is zero, here we investigate the estimation error in the case of a non-zero roll angle in order to test the sensitivity to it. We used circles of r = 0.5 [meter], a focal length f = 550 [pixel] and a tilt angle θ = 55 [degree], and changed the roll angle from 0 to 3 [degree] in steps of 1 degree to synthesize images for the experiment. In order to avoid the undesired effect of quantization error, we selected from the synthesized images those in which the minor axis of the ellipse is longer than 40 pixels, and estimated the parameters with the proposed method. The relation between the estimation error and the roll angle is summarized in Table 2.

Table 2. The estimation error caused by the roll angle

  Roll angle (degree)         0       1       2       3
  RMS error of θ (degree)   0.055   0.274   0.446   0.613

3.4 Comparison with the Homography

Using a camera setting with focal length f = 220 [pixel] and tilt angle θ = 38 [degree], we synthesized a perspective projection image consisting of a square of 2.0 × 2.0 [meter] and its inscribed circle (Figure 6(a)). We also synthesized an image of a vertical view of the same scene. Using these images, we compared our method with the Homography-based approach.

Assuming that the correspondence between the four vertexes of the square in the two images has been established, we calculated the Homography matrix and then decomposed it into the rotation matrix, the translation, and the normal vector of the plane.
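For reference, the Homography-based counterpart of this experiment can be sketched with OpenCV as below. The point coordinates are hypothetical placeholders, not the authors' data. Note that cv2.decomposeHomographyMat already requires an intrinsic matrix K, which reflects the limitation discussed next (the focal length cannot be recovered from the Homography alone), and that it returns several candidate {R, t, n} solutions, i.e. the alternative-decomposition ambiguity.

```python
import cv2
import numpy as np

# Four corresponding vertices of the 2.0 x 2.0 [meter] square in the
# perspective image (src) and the vertical view (dst); placeholder data.
src = np.array([[100, 200], [540, 210], [500, 420], [130, 430]], np.float32)
dst = np.array([[120, 120], [520, 120], [520, 520], [120, 520]], np.float32)

H, _ = cv2.findHomography(src, dst)

# Decomposition needs the intrinsics K, i.e. the focal length must
# already be known; it yields up to four {R, t, n} candidates.
K = np.array([[220.0, 0, 320], [0, 220.0, 240], [0, 0, 1]])
n_solutions, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
```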
Figure 6. (a) An image for comparing our method with the Homography method, and (b) the vertical-view image of the same scene.

The estimated results were θ1 = 38.18 [degree] and θ2 = 37.88 [degree]. We determined that the latter result is valid, because it has smaller errors. However, the focal length cannot be estimated because of the lack of information.
On the other hand, the results estimated from the same image with our method were θ = 37.99 [degree] and f = 206 [pixel]. From this result, we can claim that our method provides more precise results than the four-point-correspondence-based calibration.

Furthermore, Figure 6(a) was transformed into a vertical view (Figure 6(b)) using the f and θ estimated with our method. In Figure 6(b), the lengths of the four sides of the rectangle were 120, 119, 119 and 119 [pixel], respectively, and the adjoining angles of two sides deviated from 90 [degree] by 1.44 and 0.96 [degree], respectively. This shows that the camera parameters can be estimated very accurately with our method.

4 Experiments with real data

4.1 Environment Map Acquisition

Our calibration method has been implemented as the initialization procedure of a robot control system. The system consists of a four-wheel RC car with red and green LEDs mounted on its top, a fixed external camera (SONY DFW-VL500), and a PC with a radio controller, as shown in Figure 7(a). In the system, the motion trajectory of the RC car is observed by detecting the LEDs in the images at video rate with the color target detector [18].

Figure 7. Experiment scenery: (a) system, (b) test environment

When the system is started, it performs the camera calibration automatically. In this calibration, the RC car moves at low speed while keeping a constant steering angle. When the observed trajectory closes, the system considers the trajectory to be a circle, estimates the parameters of the ellipse observed by the camera, and then performs the camera calibration. When the calibration is finished, the planar scene of the working plane where the robot moves around can be transformed into a vertical view, from which the real movement of the RC car can be monitored. Figure 7(b) shows the test environment.

Figure 8(a) shows the robot trajectory during environment map generation using the robot body. That is, if we detect that the robot's movement is blocked regardless of the action command, then we know that there must be an obstacle; if the LEDs disappear from the image, then we know that there must be an object occluding the robot from the camera view. In this way, we can generate the environment map. A resulting map is shown in Figure 8(b). Both are shown in the virtual vertical view, and the real environment is shown in Figure 7(b).

Figure 8. (a) Loci of the robot. (b) An obtained environment map

4.2 Accuracy Evaluation based on Circle-Likeness

The focal length f and the tilt angle θ were estimated using 30 sets of real image data (including seven different tilt angles and 22 different focal lengths). In order to evaluate the accuracy of the estimated results, we calculated the Homography matrix that converts the view of the fixed external camera into that of a virtual camera giving a vertical view of the scene, using the estimated f and θ. After that, each real image was converted to the vertical view. The circle-likeness is then estimated by calculating the ratio of the minor axis a to the major axis b of the ellipse in the converted image.
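The paper does not spell out this conversion Homography; since the real camera and the virtual camera share the same optical center, one plausible construction is the pure-rotation homography H = K_v R K^{-1}. The sketch below (assuming NumPy and OpenCV; the function names and sign conventions are ours and may need adjustment for a particular setup) builds such an H and computes the circle-likeness a/b.

```python
import numpy as np
import cv2

def vertical_view_homography(f, theta, img_size, f_virtual=None):
    """Homography warping the tilted camera's image into that of a
    virtual camera at the same optical center looking straight down."""
    w, h = img_size
    fv = f_virtual or f
    K  = np.array([[f,  0, w / 2], [0, f,  h / 2], [0, 0, 1.0]])
    Kv = np.array([[fv, 0, w / 2], [0, fv, h / 2], [0, 0, 1.0]])
    # Rotate the camera about its X'-axis so the optical axis becomes
    # perpendicular to the working plane (tilt theta -> 90 degrees).
    a = np.pi / 2 - theta
    R = np.array([[1, 0, 0],
                  [0, np.cos(a), -np.sin(a)],
                  [0, np.sin(a),  np.cos(a)]])
    return Kv @ R @ np.linalg.inv(K)

def circle_likeness(points):
    """Ratio a/b of minor to major axis of the ellipse fitted to the
    warped trajectory points (needs at least 5 points); 1.0 is a circle."""
    (cx, cy), (d1, d2), ang = cv2.fitEllipse(points.astype(np.float32))
    return min(d1, d2) / max(d1, d2)
```

An image would then be warped with cv2.warpPerspective(img, H, (w, h)) before fitting the ellipse.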
The minimum value of the ratio was a/b = 0.995, the maximum value was a/b = 1.00, and the mean of the ratio was a/b = 0.998. An example of the locus of a robot's movement obtained from an image series, and the image after conversion to the vertical view, are shown in Figures 9(a) and (b), respectively. The ratio a/b of the ellipse in image 9(b) was 0.998. From these results, we confirmed that the locus of the robot in the converted virtual image is sufficiently close to a circle, which shows the accuracy of the proposed technique.

Figure 9. (a) The locus of a robot observed by the real camera. (b) Images synthesized from the real images based on the estimated camera parameters.

5 Conclusion

We have developed a new camera calibration method that uses a circular pattern drawn by a four-wheel robot. Since a circular pattern that fits the observation range can be made immediately, this approach can solve the problem of estimating the extrinsic camera parameters in wide, flat environments such as a sports ground. Moreover, the robot's moving speed is not used, so whenever an arbitrary circle is available, our proposed technique can be applied. Estimating a Homography matrix from point correspondence data suffers from the alternative (ambiguous) decomposition problem, whereas our method produces a unique solution. This means that we can perform the camera calibration without using additional planes or cameras. In addition, we can realize an automatic camera calibration system based on our method.

Acknowledgments

This research was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (C), 21500171.

References

[1] X. Meng and Z. Hu, "A New Easy Camera Calibration Technique Based on Circular Points", Pattern Recognition, Vol. 36, pp. 1155-1164, 2003.

[2] G. Wang, F. Wu and Z. Hu, "Novel Approach to Circular Points Based Camera Calibration".

[3] J. S. Kim, H. W. Kim and I. S. Kweon, "A Camera Calibration Method using Concentric Circles for Vision Applications", ACCV, pp. 23-25, 2002.

[4] C. Yang, F. Sun and Z. Hu, "Planar Conic Based Camera Calibration", ICPR, 2000.

[5] Q. Long, "Conic Reconstruction and Correspondence From Two Views", PAMI, Vol. 18, No. 2, pp. 151-160, 1996.

[6] S. Avidan and A. Shashua, "Trajectory Triangulation: 3D Reconstruction of Moving Points from a Monocular Image Sequence", PAMI, Vol. 22, No. 4, pp. 348-357, 2000.

[7] J. Heikkila and O. Silven, "A Four-step Camera Calibration Procedure with Implicit Image Correction".

[8] K. Kanatani and L. Wu, "3D Interpretation of Conics and Orthogonality", Image Understanding, Vol. 58, pp. 286-301, 1993.

[9] L. Wu and K. Kanatani, "Interpretation of Conic Motion and Its Applications", Int. Journal of Computer Vision, Vol. 10, No. 1, pp. 67-84, 1993.

[10] P. Sturm, "A Case Against Kruppa's Equations for Camera Self-Calibration", PAMI, Vol. 22, No. 10, 2000.

[11] P. Gurdjos, A. Crouzil and R. Payrissat, "Another Way of Looking at Plane-Based Calibration: the Centre Circle Constraint", ECCV, 2002.

[12] R. J. Holt and A. N. Netravali, "Camera Calibration Problem: Some New Results", CVIU, Vol. 54, No. 3, pp. 368-383, 1991.

[13] P. Sturm and S. Maybank, "On Plane-Based Camera Calibration: A General Algorithm, Singularities, Applications", CVPR, pp. 432-437, 1999.

[14] Z. Zhang, "A Flexible New Technique for Camera Calibration", PAMI, Vol. 22, No. 11, pp. 1330-1334, 2000.

[15] J. P. Barreto and H. Araujo, "Issues on the Geometry of Central Catadioptric Image Formation", CVPR, 2001.

[16] A. Fitzgibbon, M. Pilu and R. B. Fisher, "Direct Least Square Fitting of Ellipses", IEEE TPAMI, Vol. 21, No. 5, pp. 476-480, 1999.

[17] R. Halir and J. Flusser, "Numerically Stable Direct Least Squares Fitting of Ellipses", WSCG, 1998.

[18] T. Wada, "Color-Target Detection Based on Nearest Neighbor Classifier: Example Based Classification and its Applications", IPSJ SIG Notes, 2002-CVIM-134, pp. 17-24, 2002.
