Body Guided Camera Calibration
Abstract

This paper presents a novel camera calibration method for a system consisting of a sensor-less mobile robot and an external camera. The method estimates the focal length and the viewing direction of the camera relative to the plane on which the sensor-less mobile robot moves, from a single image of an arbitrary circle on that plane. Since it is easy to make a four-wheel mobile car move along a circular locus, our method does not require specially designed calibration objects. The method does not use the position of the center or the radius of the circle, and the circle is allowed to be partially occluded. Through computer simulations and experiments with a real camera and a robot, we have confirmed that our method is more robust than Homography-based calibration using four point correspondences.

978-0-7695-3883-9/09 $26.00 © 2009 IEEE    DOI 10.1109/ICIG.2009.175

By maintaining a low speed and a constant steering angle, it is easy to let the RC car draw a circular trajectory. The circular trajectory is projected as an ellipse onto the image plane by a fixed external camera (Figure 1). Based on this geometric property, we developed a new method for camera calibration using one single image of an arbitrary circle. This method can determine the focal length f and the tilt angle θ that describes the view direction of the camera relative to the working plane.
positions of a moving point along a conic section viewed by a monocular moving camera. It requires nine views. If the point is known to move along a circle, it needs seven views.

Homography can be used as a camera calibration method using a planar pattern. It requires at least four point correspondences to estimate the matrix that converts points on one image plane to the corresponding points on another image plane. By decomposing the Homography matrix, we can estimate the normal vector of the plane, the translation, and the rotation between the two cameras. However, the decomposition yields alternative solutions [12][13].

Figure 2. Conversion between two cameras which are observing the same plane (figure labels: virtual camera, camera, Homography, ground, image plane).

By assuming that the roll angle of the camera is zero, we developed a unique method that can estimate the direction of the axis of the camera and its focal length simultaneously from a single image of an arbitrary circle on the plane. We consider that the assumption of a zero roll angle is not too strict a limitation: in the case of monitoring mobile robots that move on a planar floor, there is seldom a need for a non-zero roll angle, and it is easy to adjust a camera to have a zero roll angle.

Although some of the extrinsic camera parameters (such as the pan angle and the complete translation of the camera) cannot be determined by our method, the estimated information is enough to convert the images taken by the external camera into a vertical view of the working plane, from which the correct movement of the robot can be observed (Figure 2). To be concrete, we first let the robot move along a circle on the working plane, which is observed by the external camera. Next, the focal length and the view direction of the camera relative to the working plane are determined from the circular trajectory projected onto the image plane.

In this method, the center and the radius of the circle are not used for the calibration, and the circle is allowed to be partially occluded. Furthermore, we do not use the speed of the robot, because it is difficult to measure. Through computer simulations and experiments with a real camera and an RC car, we have confirmed that our method is more robust than Homography-based calibration using four point correspondences.

2 Basic Equations

2.1 Assumptions

In this paper, we assume that 1) the roll angle of the camera is zero, 2) the camera is a pinhole camera with unknown focal length, and 3) the speed of the robot and the radius and center of the circles drawn by the robot are all unknown. As shown in Figure 3, 1) the world coordinate system (O-XYZ) and 2) the camera coordinate system (O'-X'Y'Z') are used in our research. The origins of both coordinate systems are set at the optical center of the fixed external camera. We let the normal vector of the working plane be parallel to the Y-axis of O-XYZ, and the optical axis be the Z'-axis of O'-X'Y'Z'. The X-axis of O-XYZ is set to be perpendicular to the Y-axis and the Z'-axis. Because we assume that the roll angle of the camera is zero, the X'-axis of O'-X'Y'Z' is parallel to the working plane.

Figure 3. The world coordinate and camera coordinate systems (figure label: the center of the circle, (x_c, y_c, z_c)).

The scene configuration is characterized by the following parameters: 1) the distance h between the working plane and the optical center; 2) the radius r of the circle; 3) the center of the circle (x_c, y_c, z_c); and 4) the focal length f.

The relation between the world coordinate system and the camera coordinate system can be described by the following equation,

\begin{pmatrix} X' \\ Y' \\ Z' \end{pmatrix} =
\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix},   (1)

the circle is described by,

\begin{cases} (x - x_c)^2 + (z - z_c)^2 = r^2 \\ y = y_c = -h \end{cases}   (2)

and the perspective projection of the camera is given by,

x' = -f\,\frac{X'}{Z'}, \qquad y' = -f\,\frac{Y'}{Z'}.   (3)
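As a concrete check of this camera model, the tilt-only rotation of Eq. (1) followed by the pinhole projection of Eq. (3) can be written as a short function. This is a minimal sketch under the paper's zero-roll assumption; the function name is ours:

```python
import math

def world_to_image(X, Y, Z, f, theta):
    """Map a world point to image coordinates.

    Eq. (1): tilt rotation about the X-axis (roll angle assumed zero),
    Eq. (3): pinhole projection with focal length f.
    """
    Xp = X                                            # X' = X
    Yp = math.cos(theta) * Y - math.sin(theta) * Z    # Y' = Y cos(t) - Z sin(t)
    Zp = math.sin(theta) * Y + math.cos(theta) * Z    # Z' = Y sin(t) + Z cos(t)
    return -f * Xp / Zp, -f * Yp / Zp                 # Eq. (3)

# Sanity check: a world point lying on the optical axis (X' = Y' = 0)
# must project to the image center (0, 0).
theta, f, d = math.radians(55.0), 550.0, 5.0
u, v = world_to_image(0.0, d * math.sin(theta), d * math.cos(theta), f, theta)
```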
2.2 Formulation

In the general case, a circle on the working plane is projected onto the image plane as an ellipse. The ellipse detected from the image [16][17] is described by the following equation,

x'^2 + Bx'y' + Cy'^2 + Dx' + Ey' + F = 0.   (4)

From Eqs. (1), (2), (3) and (4), the equations that describe the relation between the parameters of the ellipse (B, C, D, E, F), the parameters of the camera (f, θ) and the circle (x_c, y_c, z_c, r) are obtained as follows,

B = 2x'_c\cos\theta   (5-1)
C = (\sin\theta - z'_c\cos\theta)^2 + (x'^2_c - r'^2)\cos^2\theta   (5-2)
D = -2f x'_c\sin\theta   (5-3)
E = 2f\left[z'_c(\sin^2\theta - \cos^2\theta) + (1 - x'^2_c - z'^2_c + r'^2)\sin\theta\cos\theta\right]   (5-4)
F = f^2\left[(z'_c\sin\theta + \cos\theta)^2 + (x'^2_c - r'^2)\sin^2\theta\right]   (5-5)

where x'_c = x_c/h, z'_c = z_c/h, and r' = r/h.

In order to determine f and θ, the unknowns x'_c, z'_c, and r' need to be eliminated from Eq. (5). By computing Eq. (5-3)/Eq. (5-1), the following is derived,

f\tan\theta = -\frac{D}{B}.   (6)

3 Experimental Results using Simulation Images

In this simulation, we use the world coordinate system and the camera coordinate system shown in Figure 3. We assume that the size of the image sensor is 0.3 × 0.4 [inch] and the image resolution is 640 × 480 [pixel]; therefore, the size of one pixel is 0.015875 [mm]. The distance between the optical center of the camera and the working plane is 3 [meter].

3.1 Dependency between estimation error and position of the circle

Assuming the focal length f is 550 [pixel] and the tilt angle θ is 55 [degree], circles with a radius of 0.25, 0.50, or 0.75 [meter] are scattered on the working plane. We tested our method using each image of an ellipse synthesized with these parameter settings.

We first used 8 images of a circle having a 0.75 [meter] radius, the center of the circle being different in each image. The RMS error and the maximum error of the estimated focal length and tilt angle are shown in Table 1.
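The relations (5-1)–(5-5) and Eq. (6) can be verified numerically: synthesize a circle, project its sample points with Eqs. (1)–(3), and check that the projections satisfy Eq. (4) with the closed-form coefficients. The parameter values below are example choices, not the paper's exact test cases:

```python
import math

# Example scene parameters (in the spirit of the simulation section)
f, theta, h = 550.0, math.radians(55.0), 3.0
xc, zc, r = 0.5, -3.0, 0.75                # circle center (xc, -h, zc), radius r
xn, zn, rn = xc / h, zc / h, r / h         # normalized parameters used in Eq. (5)
s, c = math.sin(theta), math.cos(theta)

# Ellipse coefficients from Eqs. (5-1)-(5-5)
B = 2.0 * xn * c
C = (s - zn * c) ** 2 + (xn ** 2 - rn ** 2) * c ** 2
D = -2.0 * f * xn * s
E = 2.0 * f * (zn * (s ** 2 - c ** 2) + (1.0 - xn ** 2 - zn ** 2 + rn ** 2) * s * c)
F = f ** 2 * ((zn * s + c) ** 2 + (xn ** 2 - rn ** 2) * s ** 2)

# Project sample points of the circle with Eqs. (1)-(3) and evaluate Eq. (4)
max_residual = 0.0
for k in range(12):
    t = 2.0 * math.pi * k / 12.0
    X, Y, Z = xc + r * math.cos(t), -h, zc + r * math.sin(t)
    Zp = s * Y + c * Z                                # Eq. (1), tilt-only rotation
    x, y = -f * X / Zp, -f * (c * Y - s * Z) / Zp     # Eq. (3)
    residual = x * x + B * x * y + C * y * y + D * x + E * y + F
    max_residual = max(max_residual, abs(residual))
```

Every projected point satisfies the conic exactly (up to floating-point error), and the ratio of the coefficients reproduces Eq. (6).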
Figure 4. Images showing big errors: (a) r = 0.75 [m], (x, z) = (-0.4, -2.6); (b) r = 0.50 [m], (x, z) = (-0.3, -3.4); (c) r = 0.25 [m], (x, z) = (-0.1, -3.6).

For circles of r = 0.25 [meter], the tilt angle θ was changed from 40 to 80 [degree] at a step of 10 degrees (Figure 5); the RMS error of the focal length f was 1.3%, and that of the tilt angle θ was 0.15 [degree]. Therefore, the estimation error can be considered independent of the tilt angle.

From these experimental results, we can say that accurate estimation results can be obtained with our method, except in the cases where 1) the center of the ellipse in the image is too close to the y-axis of the image coordinates, or 2) the circle being viewed is too far from the camera; both cases cause the value of B to become too small.

We synthesized images with a non-zero roll angle to test the sensitivity to the roll angle. We used circles of r = 0.5 [meter], a focal length f = 550 [pixel], and a tilt angle θ = 55 [degree], and let the roll angle change from 0 to 3 [degree] in steps of 1 degree. To avoid undesired effects of quantization error, we selected from the synthesized images those in which the minor axis of the ellipse is longer than 40 pixels, and estimated the parameters with the proposed method. The relation between the estimation error and the roll angle is summarized in Table 2.

Figure 6. (a) An image for the comparison of our method and the Homography method, and (b) the vertical view image of the same scene.
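A vertical view such as Figure 6(b) can be produced by back-projecting every image pixel onto the working plane with the estimated f and θ. The following is a minimal sketch obtained by inverting Eqs. (1)–(3) for points on the plane; the function name is ours, and h acts only as a global scale factor:

```python
import math

def image_to_plane(u, v, f, theta, h=1.0):
    """Back-project image point (u, v) onto the working plane Y = -h.

    Derived by inverting Eqs. (1)-(3) for plane points; since h only
    fixes the overall scale, any positive value gives a vertical view
    up to scale.
    """
    denom = v * math.cos(theta) - f * math.sin(theta)
    x = -u * h / denom
    z = h * (v * math.sin(theta) + f * math.cos(theta)) / denom
    return x, z

# Round trip: project a plane point with Eqs. (1) and (3), then recover it
f, theta, h = 550.0, math.radians(55.0), 3.0
s, c = math.sin(theta), math.cos(theta)
x0, z0 = 0.7, -3.2                         # a point on the plane Y = -h
Zp = s * (-h) + c * z0                     # Eq. (1)
u, v = -f * x0 / Zp, -f * (c * (-h) - s * z0) / Zp    # Eq. (3)
x1, z1 = image_to_plane(u, v, f, theta, h)
```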
On the other hand, from the same image, the results estimated with our method were θ = 37.99 [degree] and f = 206 [pixel]. From this result, we can claim that our method provides more precise results than the four-point-correspondence based calibration.

Furthermore, Figure 6(a) was transformed into a vertical view (Figure 6(b)) using the f and θ estimated with our method. In Figure 6(b), the lengths of the four sides of a rectangle were 120, 119, 119 and 119 [pixel], respectively, and the adjoining angles between sides were 90 ± 1.44 and 90 ± 0.96 [degree]. This shows that the camera parameters can be estimated very accurately with our method.

…from which the real movement of the RC car can be monitored. Figure 7(b) shows the test environment.

Figure 8(a) shows the robot trajectory during environment map generation using the robot body. That is, if we detect that the robot's movement is blocked regardless of the action command, we know that there must be an obstacle; if the LEDs disappear from the image, we know that there must be an object occluding the robot from the camera view. In this way, we can generate the environment map. A resulting map is shown in Figure 8(b). Both are shown in the virtual vertical view, and the real environment is shown in Figure 7(b).
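The body-based map generation described here can be sketched as a simple grid update rule. Everything below (cell states, function and parameter names) is a hypothetical illustration of the two detection rules, not the paper's implementation:

```python
# Hypothetical cell states for an occupancy-style environment map
FREE, OBSTACLE, OCCLUDER, UNKNOWN = 0, 1, 2, -1

def update_cell(grid, cell, moved, leds_visible):
    """Classify one map cell from a body-based observation.

    moved:        whether the robot actually advanced after a move command
    leds_visible: whether the robot's LEDs are seen by the external camera
    """
    if not moved:
        grid[cell] = OBSTACLE    # movement blocked regardless of the command
    elif not leds_visible:
        grid[cell] = OCCLUDER    # robot hidden from the camera by some object
    else:
        grid[cell] = FREE
    return grid

grid = {(2, 3): UNKNOWN, (2, 4): UNKNOWN}
update_cell(grid, (2, 3), moved=False, leds_visible=True)
update_cell(grid, (2, 4), moved=True, leds_visible=False)
```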
…of the proposed technique.

5 Conclusion

We have developed a new camera calibration method using a circular pattern drawn by a four-wheel robot. Since a circle pattern that fits the observation range can be produced immediately, this approach can solve the problem of estimating the extrinsic camera parameters in wide, flat environments such as a sports ground. Moreover, the robot's moving speed is not used; whenever an arbitrary circle is available, our proposed technique can be applied. Estimating a Homography matrix from point correspondence data suffers from the alternative decomposition problem, whereas our method produces a unique solution. This means that we can perform the camera calibration without using additional planes or cameras. In addition, we can realize an automatic camera calibration system based on our method.

[3] J. S. Kim, H. W. Kim and I. S. Kweon, "A Camera Calibration Method using Concentric Circles for Vision Applications", ACCV, pp. 23-25, 2002.
[4] C. Yang, F. Sun and Z. Hu, "Planar Conic Based Camera Calibration", ICPR, 2000.
[10] P. Sturm, "A Case Against Kruppa's Equations for Camera Self-Calibration", PAMI, Vol. 22, No. 10, pp. 348-357, 2000.
[11] P. Gurdjos, A. Crouzil and R. Payrissat, "Another Way of Looking at Plane-Based Calibration: the Centre Circle Constraint", ECCV, 2002.
[12] R. J. Holt and A. N. Netravali, "Camera Calibration Problem: Some New Results", CVIU, Vol. 54, No. 3, pp. 368-383, 1991.
[13] P. Sturm and S. Maybank, "On Plane-Based Camera Calibration: A General Algorithm, Singularities, Applications", CVPR, pp. 432-437, 1999.
[14] Z. Zhang, "A Flexible New Technique for Camera Calibration", PAMI, Vol. 22, No. 11, pp. 1330-1334, 2000.
[15] J. P. Barreto and H. Araujo, "Issues on the Geometry of Central Catadioptric Image Formation", CVPR, 2001.