Machine Vision and Applications (1990) 3:159-167
© 1990 Springer-Verlag New York Inc.

A Camera Calibration Technique using Three Sets of Parallel Lines

Tomio Echigo
IBM Research, Tokyo Research Laboratory, Tokyo, Japan

Abstract. This paper presents a new method for three-dimensional camera calibration in which the rotation parameters are decoupled from the translation parameters. First, the rotation parameters are obtained by projecting three sets of parallel lines, independently of the translation parameters and the imaging distance from the lens to the image plane. The virtual line passing through the image center, which is calculated by perspective projection of a set of parallel lines, depends only on the rotation parameters. Next, the translation parameters and the imaging distance are analytically obtained. Experimental results are used to show how the camera model can be accurately reconstructed in an easily prepared environment.

Key Words: camera calibration, rotation parameters, vanishing point, algebraic solution, reconstructed geometric camera model

1 Introduction

The purpose of camera calibration is to establish the relationship between three-dimensional world coordinates and their corresponding two-dimensional image coordinates seen by the camera. Once this relationship is established, three-dimensional information can be inferred from two-dimensional images by using a stereo method or a structured lighting method. This paper describes a new method of three-dimensional camera calibration. A camera has 6 degrees of freedom; that is, the relationship between the three-dimensional world coordinates and the camera coordinates consists of six parameters: three rotation parameters that give the orientation of the camera and three translation parameters that give its location. The imaging distance from the lens to the image plane should also be defined as a calibration parameter, because it is adjusted whenever the camera is focused. The parameters must be calibrated every time the camera is fixed, and therefore it is necessary to establish a convenient method of camera calibration. The purpose of this research is to develop an easy and robust method of determining camera parameters.

The problem of camera calibration has been extensively studied. Sobel (1974) and Gennery (1979) solved this problem by using full-scale nonlinear optimization. Ganapathy (1984) derived a noniterative technique using linear equations. Martins, Birk, and Kelley (1981) developed the two-planes method, which does not assume that the camera model is a pinhole model. Tsai (1985) described a two-stage technique that decomposes the camera parameters into two groups: those that can be obtained by solving linear equations with five unknowns, and those that can be obtained by nonlinear optimization using an initial estimation given by solving linear equations with two unknowns. In these conventional methods the parameters are detected from points in three-dimensional space and from their corresponding projection points on the image plane. Such methods require accurate positioning of the projection points on the image plane for accurate detection of the parameters. However, as the positions of the projection points on the image plane are easily shifted by noise or changing light sources, it is difficult to measure them accurately. To determine accurate parameters, it is important to ignore the effects of isolated points that are in gross error.

In the method presented here the parameters are obtained from an image of parallel straight lines, whose features can be extracted more accurately than those of projection points. Furthermore, the influence of errors in the extraction of projected lines is suppressed, because this technique uses features calculated from several projected lines by the least square method. A characteristic of the new method is that the rotation parameters can be derived independently of the translation parameters and the imaging distance from the lens to the image plane, so that the rotation parameters are not influenced by errors in the translation parameters and the imaging distance.

Address reprint requests to: Tomio Echigo, IBM Research, Tokyo Research Laboratory, 5-19, Samban-cho, Chiyoda-ku, Tokyo 102, Japan.

2 Definition of the Camera Parameters and the Geometric Camera Model

The camera parameters are classified into intrinsic parameters, which are fixed by the camera, the lens, and the signal converter from analog image signals to digitized pixel data, and extrinsic parameters, which are changeable whenever the camera is set up.

The intrinsic parameters are defined as follows:

- Aspect ratio of an image frame
- Coordinates of the intersection of the optical axis of the lens with the two-dimensional image plane (image center)

The size of a discrete array device (such as a CCD or MOS) is specified by the device manufacturer, and its spacing between sensor elements is accurate. The vertical factor of the image plane is fixed, because of the television (TV) scanning signal. The horizontal factor of the image plane is determined by the camera and the analog-to-digital (A/D) converter from TV signals to pixel data. Data from the discrete array device are converted into TV signals consisting of sequences of scanning lines, with the internal clock of the camera as a horizontal synchronous signal. These signals are then transferred to image acquisition hardware. The analog value between horizontal synchronous pulses is sampled by the clock of an A/D converter in the image acquisition hardware and transformed into pixel data. Thus, the aspect ratio is obtained as the ratio of a row and a column of elements of the discrete array device and the timing ratio of the camera and A/D converter. The image center is the intersection of the optical axis of the lens and the image plane, which is the surface of the discrete device. The location of the image center depends on the accuracy with which the manufacturer attached the device to the camera and on the quality of the camera and lens. For the same camera, different lenses may yield different image centers.

The extrinsic parameters are defined as follows:

- Direction of the optical axis of the lens
- Location of the center of the lens
- The imaging distance from the lens to the image plane

The first group of parameters defines the orientation of the camera, which consists of three rotation parameters. The second defines the location of the camera, which consists of three translation parameters. The third is often adjusted so that the lens is focused on the image plane. In this method the imaging distance is measured in pixels. Consequently, the scaling factor of the discrete device is not used, and the camera coordinates are represented in pixels. Since the intrinsic parameters are fixed by the camera, the lens, and the image acquisition hardware, once calibrated, they should not be changed. However, the extrinsic parameters must be calibrated every time the camera is set up. This paper gives a method for determining the extrinsic parameters.

In this method the ideal camera is represented by a pinhole model. The image of a three-dimensional point is the intersection of the line-of-sight ray with a front image plane. The image plane is at a distance f (the imaging distance) from the lens center on the optical axis and intersects perpendicularly with the optical axis. All lines of sight intersect at the lens center. Figure 1 shows a schematic of the world coordinate system and the camera coordinate system. The world coordinate system is denoted as O-XYZ and the camera coordinate system as o-xyz; both are right-handed.

Let the origin of the camera coordinate system be the lens center and let the z-axis correspond to the optical axis of the lens. The relation of the two coordinate systems can be described as follows, using transformation matrices for rotation (R) and translation (T):

    [X, Y, Z]^T = R [x, y, z]^T + T                                     (1)

Figure 1. World coordinates (O-XYZ) and camera coordinates (o-xyz).
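Section 2's pinhole model can be sketched numerically: a world point is carried into camera coordinates by inverting Eq. (1), and the result is intersected with the front image plane at distance f. The function below is an illustrative sketch (the name `project` and the NumPy usage are not from the paper):

```python
import numpy as np

def project(P_world, R, T, f):
    """Pinhole projection of a 3-D world point (illustrative sketch).

    R : 3x3 rotation whose columns are Ex, Ey, Ez expressed in world
        coordinates, as in Eqs. (2)-(3).
    T : position vector of the lens center in world coordinates, Eq. (5).
    f : imaging distance from the lens to the image plane, in pixels.
    """
    # Invert Eq. (1), [X Y Z]^T = R [x y z]^T + T, using R^-1 = R^T.
    x, y, z = R.T @ (np.asarray(P_world, float) - np.asarray(T, float))
    # Intersect the line of sight with the front image plane at distance f.
    return np.array([f * x / z, f * y / z])
```

For example, with R = I, T = 0, and f = 10 pixels, the point (1, 2, 5) projects to (2, 4).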

When the unit vectors Ex, Ey, and Ez of the camera coordinate system are expressed in world coordinates, let their factors be

    Ex = [Exx, Exy, Exz]^T,  Ey = [Eyx, Eyy, Eyz]^T,  Ez = [Ezx, Ezy, Ezz]^T        (2)

The transformation matrix for rotation is then denoted by

    R = [ Exx  Eyx  Ezx ]
        [ Exy  Eyy  Ezy ]                                                           (3)
        [ Exz  Eyz  Ezz ]

Since the orientation of the camera is expressed by three rotation parameters, the rotation matrix R can be described as a function of the Euler angles yaw θ, pitch φ, and tilt ψ, as follows:

    R = [ cosθ cosφ cosψ − sinφ sinψ   −cosθ cosφ sinψ − sinφ cosψ   sinθ cosφ ]
        [ cosθ sinφ cosψ + cosφ sinψ   −cosθ sinφ sinψ + cosφ cosψ   sinθ sinφ ]    (4)
        [ −sinθ cosψ                    sinθ sinψ                     cosθ      ]

The transformation matrix T is equivalent to the position vector of the lens center, which is denoted as

    T = [Tx, Ty, Tz]^T                                                              (5)

3 Rotation Parameters

The problem of determining camera orientation is similar to that of determining the surface orientation of a scene from an a priori geometric model of the camera, for which several methods have been reported (Ohta et al. 1981; Ikeuchi 1984; Shakunaga and Kaneko 1986). By these methods the orientation of the camera can be obtained from an image of patterns. However, such methods require that the imaging distance be given. The new method differs in that the imaging distance is also an unknown factor. One of the characteristics of this method is that the rotation parameters can be obtained independently of the translation parameters and the imaging distance from the lens to the image plane.

To obtain the rotation parameters, sets of parallel lines are used. Figure 2 shows a set of parallel straight lines in the three-dimensional world and their corresponding projection image on the image plane. They are related by perspective transformation, so that the projected lines on the image plane lead to a vanishing point. A line passing through the image center and the vanishing point on the image plane is obtained from the projected lines. This is not the projection of a real line in three-dimensional space, but it is equivalent to the projection of a virtual line parallel to the other lines in three-dimensional space. This line does not change, even if the camera is translated, because the position of a vanishing point is not changed by translation. When the imaging distance from the lens to the image plane is changed, the vanishing point moves along the line; however, the line itself on the image plane does not change. Therefore, this line depends only on the rotation parameters and is independent of the translation parameters and the imaging distance from the lens to the image plane.

Figure 2. Detecting a virtual line passing through the image center and a vanishing point by projecting lines on the image plane of parallel lines in three-dimensional space.

The three rotation parameters can be obtained from three constraints. Three virtual lines passing through the image center and the vanishing point can be calculated from three sets of parallel lines with linearly independent direction vectors. (The method for determining the virtual lines is described in Appendix A.) For three virtual lines, three simultaneous equations are established and can be solved algebraically. As Figure 2 illustrates, a straight line g: ax + by = 0 passing through the image center and the vanishing point is obtained from a set of parallel lines in the scene, whose direction vector is G. The surface normal N of a plane including the straight line g and the lens center is then expressed as

    N = a Ex + b Ey                                                                 (6)

The straight line g on the image plane is parallel to the direction vector G of the parallel lines; thus, the relationship between the surface normal N and the direction vector G is N · G = 0. For example, let three vectors G1, G2, and G3 denote unit vectors of the world coordinates. Three straight lines ai x + bi y = 0 (i = 1, 2, 3), which are calculated from their projection lines, give the following simultaneous equations:

    a1 Exx + b1 Eyx = 0
    a2 Exy + b2 Eyy = 0                                                             (7)
    a3 Exz + b3 Eyz = 0

Since Ex and Ey are orthogonal unit vectors, Ex and Ey can be derived from Eqs. (7) and the following equations:

    Exx^2 + Exy^2 + Exz^2 = 1
    Eyx^2 + Eyy^2 + Eyz^2 = 1                                                       (8)

Ez can be calculated from its orthogonality to Ex and Ey. Details are given in Appendix B. Thus, all the factors of the rotation matrix R can be obtained from three sets of parallel lines such as those in Figure 3. In the new method the rotation parameters can be derived from three sets of parallel lines independently of the translation parameters and the imaging distance.

Another characteristic of the new method is that the influence of errors in the extraction of projected lines is suppressed. The new method is based on our earlier work (Kasai et al. 1983; Echigo and Yachida 1985), in which rotation parameters could be determined automatically from three projected lines in cases where the imaging distance was known. However, the parameters determined in this way were often in gross error, as it is difficult to extract projected lines on the image plane accurately. This is why the accuracy of the calculated rotation parameters depends on the accuracy of the extraction of projected lines on the image plane. The new method suppresses the influence of errors in the extraction of projected lines, because it uses three straight lines passing through the image center, which are calculated from several projected lines by the least square method. It also ignores the effects of extracted lines that are in gross error and can have a large effect on the criterion of the least square method. After the rotation parameters have been determined, the projection line (ai x + bi y = ci) that maximizes the value ei of the following equation is neglected, the three straight lines are calculated again, and the rotation parameters are then redetermined:

    ei = Σj |(ai − aj) Exx + (bi − bj) Eyx|                                         (9)

This procedure is repeated until the value ei is less than some preset threshold, or for a preset number of iterations.
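The Euler-angle form of Eq. (4) can be sanity-checked numerically. The sketch below (function name illustrative, angles in radians) builds R and lets one verify that it is a proper rotation:

```python
import numpy as np

def rotation_from_euler(theta, phi, psi):
    """Rotation matrix of Eq. (4) from yaw theta, pitch phi, tilt psi."""
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(phi), np.sin(phi)
    cs, ss = np.cos(psi), np.sin(psi)
    return np.array([
        [ct * cp * cs - sp * ss, -ct * cp * ss - sp * cs, st * cp],
        [ct * sp * cs + cp * ss, -ct * sp * ss + cp * cs, st * sp],
        [-st * cs,                st * ss,                 ct     ],
    ])
```

The columns of the result are Ex, Ey, and Ez of Eq. (2); for any angles, R Rᵀ should be the identity and det R should be +1.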
Figure 3. Rotation parameters from three sets of parallel lines.

4 Translation Parameters

The translation parameters can be determined from some known points in world coordinates and their corresponding projection points on the image plane. Figure 4 illustrates the parallel projection of a point (Xi, Yi, Zi) in a three-dimensional scene and its image point (xi, yi) onto the x-y plane in camera coordinates.

Figure 4. Parallel projection of a line of sight (to the x-y plane of the camera coordinates).

In this figure the world coordinates (Xi, Yi, Zi) are expressed in the following camera coordinates (Xi', Yi', Zi'):

    [Xi', Yi', Zi']^T = R^-1 [Xi, Yi, Zi]^T + T'                                    (10)

where

    R^-1 = [ Exx  Exy  Exz ]
           [ Eyx  Eyy  Eyz ]
           [ Ezx  Ezy  Ezz ]

    T' = −R^-1 T = [Tx', Ty', Tz']^T

From the ratio of (Xi', Yi') to (xi, yi),

    Xi'/xi = Yi'/yi                                                                 (11)

the following equation is derived:

    −yi Tx' + xi Ty' = yi (Exx Xi + Exy Yi + Exz Zi) − xi (Eyx Xi + Eyy Yi + Eyz Zi)    (12)

The factor Tz' of the translation parameters in the z direction and the imaging distance f from the lens to the image plane are also determined on the plane that includes the line of sight and the optical axis of the lens, as shown in Figure 5. In the same way, from the ratio

    f / sqrt(xi^2 + yi^2) = Zi' / sqrt(Xi'^2 + Yi'^2)                               (13)

the following equation is derived:

    sqrt(Xi'^2 + Yi'^2) f − sqrt(xi^2 + yi^2) Tz' = sqrt(xi^2 + yi^2) (Ezx Xi + Ezy Yi + Ezz Zi)    (14)

Figure 5. Plane including a line of sight and the optical axis of the lens (using the camera coordinates).

Since Eqs. (12) and (14) are both linear, Tx', Ty', Tz', and f are calculated from more than two points. The translation matrix T is then obtained from the following equation:

    T = −R T'                                                                       (15)

5 Accuracy Assessment

In order to evaluate the accuracy of the calculated camera parameters, the geometric camera model is reconstructed. When a three-dimensional point is given in world coordinates, its corresponding image point is projected onto the real image plane. When the same point is given to the calculated camera model, however, the point of intersection of the virtual image plane of the calculated camera model and the line connecting the three-dimensional point and the calculated lens center is obtained, as in Figure 6. The real image plane and the virtual image plane are overlapped, and the calculated point on the virtual plane is compared with the real projected point on the image plane. Assuming that the calculated camera model conforms to the geometry of the real camera, the calculated point on the virtual plane coincides with the real projected point. However, the calculated point obtained from a camera model containing errors may be plotted at a different position from the real projected point. In this paper the measure for evaluating the camera parameters determined by the proposed method is defined as the distance between the points.

Figure 6. Comparison of the real image point and the calculated point from the geometric camera model reconstructed by using calibrated parameters.

6 Experimental Results

6.1 Real Experiments

Before the camera was set up, the intrinsic parameters were determined.

1. Aspect ratio. The camera used was an NEC CCD camera with a 7.15909-MHz clock and a discrete device whose sensor element was 23 × 13.5 μm. The image acquisition hardware had an A/D converter with a sampling rate of 12.5 MHz. Thus, the aspect ratio was x : y = 0.975757 : 1.

2. Image center. The lens was a Nikon with a focal length of 50 mm. It has been confirmed experimentally that a lens with a focal length longer than 35 mm yields very little distortion, so this technique takes no account of lens distortion. The image center was determined from the point on the CCD irradiated by a laser beam whose radiation coincided with the reflection from the surface of the lens.

As shown in Figure 7, an object drawn in a grid pattern is used to determine the extrinsic parameters. The three edges of the object denote the X, Y, and Z axes in world coordinates. The image center is the mark (+) in Figure 8. The three straight lines in Figure 9 are calculated by the least square method from perspective projection of the three sets of parallel lines in Figure 8. After the rotation parameters have been determined, an extracted line (ai x + bi y = ci) in gross error that maximizes the value of Eq. (9) is neglected, the three straight lines are calculated again, and the rotation parameters are redetermined, so that errors in the parameters of lines extracted from the image do not influence the calculated values of the rotation parameters. In the experiments this procedure was repeated nine times.

Figure 7. An object for calibration.
Figure 8. Extraction of projecting lines of three sets of parallel lines in the scene.
Figure 9. Determination of three straight lines passing through the image center and vanishing points from three sets of projecting lines.

The results of calibration of the camera that captured the image shown in Figure 7 were as follows. The rotation parameters are expressed in Euler angles.

    θ =  61.44             Tx = −600.80
    φ = −35.42 [degrees]   Ty =  469.74 [mm]
    ψ =  18.55             Tz = −389.93

    f = 4154.09 [pixels]

To evaluate this method, the experimental results were verified by accuracy assessment, which is defined in section 5. The geometric camera model was reconstructed from the calculated results. Given some known three-dimensional points, the calculated points on the virtual plane were compared with real projection points on the real image plane.
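The calibrated translation and imaging distance above come from the two linear solves of section 4 (Eqs. (12), (14), and (15)). A sketch, assuming R is already known from section 3 (the function name and NumPy usage are illustrative, not from the paper):

```python
import numpy as np

def solve_translation(R, world_pts, image_pts):
    """Recover T and the imaging distance f from known point pairs.

    Stage 1 solves Eq. (12), linear in Tx', Ty'; stage 2 solves Eq. (14),
    linear in f and Tz'; finally T = -R T' as in Eq. (15).
    """
    q = (R.T @ np.asarray(world_pts, float).T).T          # R^-1 applied to each point
    x, y = np.asarray(image_pts, float).T
    # Eq. (12): -y_i Tx' + x_i Ty' = y_i (Ex . P_i) - x_i (Ey . P_i)
    A1 = np.column_stack([-y, x])
    b1 = y * q[:, 0] - x * q[:, 1]
    Txp, Typ = np.linalg.lstsq(A1, b1, rcond=None)[0]
    # Eq. (14): sqrt(X'^2+Y'^2) f - sqrt(x^2+y^2) Tz' = sqrt(x^2+y^2) (Ez . P_i)
    r_img = np.hypot(x, y)
    r_cam = np.hypot(q[:, 0] + Txp, q[:, 1] + Typ)
    A2 = np.column_stack([r_cam, -r_img])
    b2 = r_img * q[:, 2]
    f, Tzp = np.linalg.lstsq(A2, b2, rcond=None)[0]
    return -R @ np.array([Txp, Typ, Tzp]), f              # Eq. (15)
```

With noise-free synthetic points, the recovered T and f match the values used to generate the image points.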

When the real image plane was compared with the calculated virtual plane, as in Figure 10, the distances from the real image points to the calculated points were given, as shown in Figure 11. The image obtained for a different camera setting and the calculated points are shown in Figure 12. In the same way, the distances from the real points to the calculated points are shown in Figure 13. The results of the two experiments were similar, despite the different settings of the cameras. The new method was able to determine camera parameters robustly, independently of the setting of the camera. In the experiments this calibration method was able to determine the camera parameters from which the geometric camera model was reconstructed with an accuracy of coincidence between the real imaging position and the calculated position of about two pixels. In practice, the lens center was about 850 mm from the origin of the world coordinates. Therefore, a line of sight computed from the calculated geometric camera model was within 0.4 mm of the vertical plane of the optical axis of the camera in the field of view.

Figure 10. Overlapping of the real image plane and the virtual plane of the reconstructed geometric camera model.
Figure 11. Distances from the real image points to the calculated points.
Figure 12. Overlapping of the real image plane and the virtual plane of the reconstructed geometric camera model in another setting of the camera.
Figure 13. Distances from the real image points to the calculated points in another setting of the camera.

6.2 Simulation Experiments

In the simulation experiments the camera setting was the same as in the first real experiment described earlier. When the intrinsic and extrinsic parameters are selected to have the same values as in the first real experiment, the geometric camera model can be reconstructed. When a three-dimensional point is given in world coordinates, its corresponding projected point can be calculated on the virtual image plane. The data of the image lines for the simulation experiments can be calculated by the least square method from projections of three-dimensional points along parallel lines in the scene. The simulation experiments were performed using the image lines, which were determined from projected points by adding noise at random on the image plane. The way to add noise to projected points is described as (xi + α, yi + β), where α and β have subpixel values in the range [−n, +n]. On the X-Y plane of world coordinates, five lines parallel to the X-axis and five lines parallel to the Y-axis, and on the Y-Z plane, five lines parallel to the Y-axis and five lines parallel to the Z-axis, were used for the simulation experiments, and the seven extrinsic camera parameters were obtained from them. For evaluation, the simulation results were verified by accuracy assessment in the same way as in the real experiments. The intersections of the simulation data in the noise-free case were compared with projected points of the intersections of a 10-mm grid pattern on the X-Y plane and the Y-Z plane in world coordinates, using the camera model reconstructed from the simulation results. Table 1 shows the results of accuracy evaluation when the noise range is from within one pixel to within three pixels. The results of the simulation experiments show that the calculated parameters are reasonably accurate. The other results were obtained from a simulation using Tsai's single-plane calibration method (Tsai 1985), which uses image points as data for calibration. The data for the simulation were the intersections of the same lines as in the simulation of our method. The simulation results show that the method presented here, which uses image lines as features, is more robust when noise appears than the conventional method, which uses image points as features.

Table 1. Comparison of simulation results using our method and Tsai's method

    Added noise range                Our method       Tsai's method
    ±1 (pixels)     Max. error       0.569 (pixels)   1.415 (pixels)
                    Ave. error       0.246            0.369
    ±2 (pixels)     Max. error       1.817 (pixels)   2.868 (pixels)
                    Ave. error       0.510            0.847
    ±3 (pixels)     Max. error       1.549 (pixels)   4.113 (pixels)
                    Ave. error       0.618            1.073

7 Conclusions

This paper presented a new method of three-dimensional camera calibration. One of the characteristics of this method is that the rotation parameters can be derived independently of the translation parameters and the imaging distance from the lens to the image plane, so that the rotation parameters are not influenced by errors in other extrinsic parameters. Another characteristic is that the influence of errors in the extraction of projected lines is suppressed, because the method uses three straight lines passing through the image center, which are calculated from several projected lines by the least square method. Experimental results showed that the geometric camera model could be accurately reconstructed and used for three-dimensional data acquisition. Accuracy assessment is defined in this paper as finding the differences between real image points and points calculated by using the proposed model. It serves to clarify the limits of the area to be searched for correspondences of stereo images. Correspondences of stereo images exist within this area, which is centered on the coordinates calculated by the constraint of epipolar geometry.

Appendix A  Determining the Straight Line Passing Through the Image Center and the Vanishing Point from the Image Lines

The line passing through the image center and the vanishing point of a group of lines can be calculated from the image lines projected from a set of parallel lines in the scene by the least square method. Let the equation of an image line be

    ai x + bi y = ci                                                                (A.1)

The surface normal Ni of a plane including the lens center and the image line is then expressed as

    Ni = ai Ex + bi Ey − (ci / f) Ez                                                (A.2)
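In practice, this least-squares construction can be sketched in a simplified, equivalent form: fit the vanishing point (xv, yv) that minimizes the residuals of the extracted lines ai x + bi y = ci, then join it to the image center at the origin. This is an illustrative reformulation, not the paper's exact algebra:

```python
import numpy as np

def virtual_line(lines):
    """Line a*x + b*y = 0 through the image center, from image lines
    (a_i, b_i, c_i) with a_i*x + b_i*y = c_i sharing a vanishing point.
    """
    M = np.array([[a, b] for a, b, c in lines], float)
    rhs = np.array([c for a, b, c in lines], float)
    # Least-squares vanishing point: minimize sum (a_i*xv + b_i*yv - c_i)^2.
    xv, yv = np.linalg.lstsq(M, rhs, rcond=None)[0]
    # Line through the origin (image center) and (xv, yv).
    return yv, -xv
```

For instance, three lines meeting at the vanishing point (2, 3) yield coefficients (a, b) with 2a + 3b = 0.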

Let the real line in the scene corresponding to the image line be parallel to the X-axis of the world coordinates. Since the real line is perpendicular to the vector Ni, the following equation can then be derived:

    ai Exx + bi Eyx − (ci / f) Ezx = 0                                              (A.3)

However, the value of the right-hand side of Eq. (A.3) yields an error ei, because the parameters ai, bi, and ci of the image line have errors created during the extraction of the line. When the sum of the squared errors ei^2 in the same group is minimized, Eq. (A.3) can be transformed into

    {Σ ai^2 Σ bi ci − Σ ai bi Σ ai ci} Exx + {Σ ai bi Σ bi ci − Σ bi^2 Σ ai ci} Eyx = 0    (A.4)

Since Eqs. (A.3) and (A.4) can be taken as identical at ci = 0, the values of a and b of the virtual line can be obtained. Therefore, the straight line passing through the image center and the vanishing point can be derived from a set of image lines projected from a set of parallel lines. In the preceding situation the lines in the scene are parallel to the X-axis; however, the world coordinates may be fixed at any location, so that the image lines projected from a set of lines parallel to any vector derive identical equations, and the line passing through the image center and the vanishing point can be calculated from any set of image lines.

Appendix B  Deriving the Rotation Parameters

From Eqs. (7), Eyx = −(a1/b1) Exx, Eyy = −(a2/b2) Exy, and Eyz = −(a3/b3) Exz. Substituting these into Eq. (8) and the orthogonality condition Ex · Ey = 0, the following simultaneous equations can be derived:

    (a1/b1) Exx^2 + (a2/b2) Exy^2 + (a3/b3) Exz^2 = 0                               (A.5)

and

    (a1/b1)^2 Exx^2 + (a2/b2)^2 Exy^2 + (a3/b3)^2 Exz^2 = 1                         (A.6)

The components of Ex and Ey can be determined from the solution of simultaneous Eqs. (A.5) and (A.6) and Eq. (8). The signs of the components are then obtained from the signs of the image lines projected from the unit direction vectors of the lines in the scene. Similarly, the components of Ez can be determined from the vector product: Ez = Ex × Ey.

References

Echigo T, Yachida M (1985) A fast method for extraction of 3-D information using multiple stripes and two cameras. In: Proceedings of IJCAI-85, pp 1127-1130
Ganapathy S (1984) Decomposition of transformation matrices for robot vision. In: Proceedings of International Conference on Robotics and Automation, pp 130-139
Gennery DB (1979) Stereo-camera calibration. In: Proceedings of Image Understanding Workshop, November, pp 101-108
Ikeuchi K (1984) Shape from regular pattern. Artificial Intelligence 22:49-75
Kasai T, Asahi T, Yoshimori T, Tsuji S (1983) Measurement system of 3-D motion using a pair of position sensing detector cameras (in Japanese). SICE 19(12):997-1003
Martins HA, Birk JR, Kelley RB (1981) Camera models based on data from two calibration planes. Computer Graphics Image Processing 17:173-180
Ohta Y, Maenobu K, Sakai T (1981) Obtaining surface orientation from texels under perspective projection. In: Proceedings of IJCAI-81, pp 746-751
Shakunaga T, Kaneko H (1986) Perspective angle transform and its application to 3-D configuration recovery. In: Proceedings of CVPR-86, pp 594-601
Sobel I (1974) On calibrating computer controlled cameras for perceiving 3-D scenes. Artificial Intelligence 5:185-198
Tsai RY (1985) A versatile camera calibration technique for high accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IBM Research Report, RC51342, May 8
