Camera Calibration With A Simulated Three Dimensional Calibration Object
Hynek Bakstein and Radim Halíř
Peršlák, Czech Republic, February 2–4, 2000
Czech Pattern Recognition Society

Abstract. A new camera calibration method based on the DLT model is presented in this paper. A full set of camera parameters can be obtained from multiple views of a coplanar calibration object with coordinates of control points measured in 2D. The method consists of four steps which are iterated until convergence. The proposed approach is numerically stable and robust in comparison with calibration techniques based on nonlinear optimization of all camera parameters. A practical implementation of the proposed method has been evaluated on both synthetic and real data. A MATLAB toolbox has been developed and is available on the WWW.

1 Introduction

Camera calibration is one of the basic tasks in computer vision [4]. It is used to determine the optical characteristics of the camera (so called intrinsic parameters) and the position of the camera in the scene (extrinsic parameters). One or more images of an a priori known calibration object are acquired by a camera and several well-defined features (so called control points) are detected in the images. The coordinates of the features in the scene can be unknown (so called self-calibration [10]) or they can be measured in advance. In this paper, a known calibration object with coordinates of the features measured in 2D is assumed.

Various camera calibration methods have been introduced in the literature. The classical approach [9], based on methods used in photogrammetry, gives precise results but is computationally extensive. Several simplifications were made (such as [11] and [13]), but the nonlinear search used there may lead to computational instability. There are also methods which combine both minimization and a closed-form solution [4, 5, 7, 15]. All these methods are based on physical camera parameters. Implicit camera calibration methods [12], on the other hand, use an interpolation between coordinates of the points on multiple planes or images and do not explicitly compute camera parameters.

The calibration object can be three dimensional [1], two dimensional (also called coplanar) [7], or the method can use multiple views of a coplanar object to simulate a three dimensional one. A three dimensional object gives better results and the complete set of camera parameters can be acquired, but such an object is hard to manufacture. In addition, the coordinates of the detected features have to be measured in three dimensional space. In the case of multiple views of a coplanar object, the object can be moved freely between the views [5, 15] or the movement has to be a priori known [11]. A known movement of a coplanar object requires a robot, which may not be available.

The above mentioned facts lead to the conclusion that the most effective approach is to simulate a three dimensional calibration object by multiple views of a coplanar one. The movement of the coplanar target between image acquisitions can then be unconstrained.

2 Simulation of the 3D calibration target using multiple unconstrained views of a coplanar object

The proposed calibration method uses multiple views of a 2D calibration target (so called calibration plate) to simulate a 3D object, which is needed for precise and reliable calibration results. The calibration plate can be moved freely between the image acquisitions. An important assumption of the method is that the intrinsic parameters of the camera are constant for all views. This constraint allows the simulation of the 3D object to be based on the relative positions of the plate between particular acquisitions.

The method consists of the following four steps (see Figure 1):

1. an initial estimation of the intrinsic parameters of a camera,

2. an estimation of the extrinsic camera parameters for each view,

3. a construction of a virtual 3D calibration object from multiple 2D planes, and

4. a complete camera calibration based on the virtual 3D object.

As a result of the calibration, both intrinsic and extrinsic camera parameters are provided. Since the proposed method needs an initial guess of the internal characteristics of the camera, its accuracy can be improved as follows: the obtained intrinsic parameters are put back into the second step of the method and a new set of camera parameters is estimated (as depicted in Figure 1). The whole process is iterated until convergence.
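The iterated four-step loop described above can be sketched in code. This is a structural sketch only: the three estimator callables (`estimate_extrinsics`, `build_virtual_object`, `calibrate_full`) are hypothetical placeholders for whatever explicit coplanar (step 2) and 3D-based (step 4) methods are plugged in, and testing convergence on the change of the intrinsic parameters is one plausible choice, not necessarily the toolbox's.

```python
# Structural sketch of the four-step calibration loop (cf. Figure 1).
# The estimator callables are hypothetical placeholders: any explicit
# coplanar method can serve as step 2 and any 3D-based method as step 4.

def calibrate(views, plate_points, intrinsics_guess,
              estimate_extrinsics, build_virtual_object, calibrate_full,
              tol=1e-6, max_iter=50):
    """Iterate steps 2-4, feeding the intrinsics back, until they settle."""
    intrinsics = dict(intrinsics_guess)          # step 1: initial user guess
    poses = None
    for _ in range(max_iter):
        # step 2: extrinsics of each view w.r.t. the current intrinsics
        poses = [estimate_extrinsics(view, plate_points, intrinsics)
                 for view in views]
        # step 3: merge the plate positions into one virtual 3D object
        virtual_object = build_virtual_object(plate_points, poses)
        # step 4: complete calibration against the virtual 3D object
        new_intrinsics, poses = calibrate_full(views, virtual_object)
        # convergence check on the change of the intrinsic parameters
        delta = max(abs(new_intrinsics[k] - intrinsics[k])
                    for k in new_intrinsics)
        intrinsics = new_intrinsics
        if delta < tol:
            break
    return intrinsics, poses
```

The loop stops as soon as the re-estimated intrinsics stop moving, mirroring the convergence check in the scheme of Figure 1.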
Figure 1: A scheme of the proposed calibration method.

Due to the employment of a coplanar calibration target, an initial guess of the intrinsic camera parameters is required, because the coordinates of the control points are measured only in 2D. These parameters are supplied by the user in the first step of the proposed method.

In the second step, the extrinsic camera parameters are estimated for each view with respect to the provided intrinsic parameters. Any explicit coplanar camera calibration method can be applied here.

In the third step, the virtual 3D calibration object is constructed. The construction exploits the knowledge of the camera positions for all views provided by the previous step.

The pairs of scene and image coordinates of the control points of the simulated calibration object are finally passed to the fourth step of the proposed method. A complete set of the camera parameters is estimated there. Any explicit 3D-based camera calibration method can be used for this purpose.

3 Implementation of the method using the DLT camera model

The outline of the proposed calibration method presented in the previous section illustrates only the main ideas of the novel approach. All the estimation steps are relatively independent of the concrete calibration method used. A practical realization, however, has to be based on a particular camera model.

A coplanar variant of the DLT [1] camera model (so called CDLT) has been chosen for the implementation of the second step of the method. The model can be expressed in the form of a linear matrix equation which allows direct extraction of the camera parameters. Although it does not compensate for nonlinear distortions, it still gives a very good approximation of a camera. Since not all camera parameters can be extracted from the CDLT matrix [7], an initial estimate of at least three of them has to be provided to the first step of the method.

The CDLT model allows the image formation to be expressed in homogeneous coordinates by the following matrix equation:

    q_i = A p_i ,                                    (1)

where A is the coplanar DLT (CDLT) matrix of size 3 × 3, p_i = [x_i, y_i, 1]^T are the homogeneous scene coordinates of the i-th point and q_i = [w_i u_i, w_i v_i, w_i]^T are the corresponding homogeneous coordinates in the image plane. The CDLT matrix A of size 3 × 3 can be written as

    A = λ V^(-1) B^(-1) F M T ,                      (2)

where the matrix V compensates for the shift of the image origin, B represents the difference in scale and the lack of orthogonality between the image axes, F stands for the focal length, M for the rotation and T for the translation, respectively, and λ is a nonzero scalar.

An estimate of the extrinsic camera parameters is obtained by a decomposition of the CDLT matrix A in the second step. The values of these parameters have to be refined; in particular, the matrix of rotation M need not be orthonormal as required. A geometrical approach to transforming the matrix M into a proper rotation matrix was chosen in this particular realization of the method (see Figure 2): the first two columns m_1 and m_2 of M determine a 2D plane. Changing the angle between these vectors to the value π/2 and computing the third column of M as the vector product of m_1 and m_2 gives a proper rotation matrix similar to M. The translation is then computed directly from Equation 2.
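This geometrical refinement can be sketched numerically. The pure-Python sketch below is an illustrative reading of the construction, not the toolbox code; in particular, spreading the angle to π/2 symmetrically about the bisector of the two columns is an assumption, since the text does not state how the adjustment is distributed between m_1 and m_2.

```python
import math

def _norm(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def orthonormalize(m1, m2):
    """Build a proper rotation from two nearly orthogonal columns of M:
    set the angle between them to pi/2 (symmetrically about their
    bisector, within the plane they span; this symmetry is an assumed
    detail) and take the third column as their vector product."""
    m1, m2 = _norm(m1), _norm(m2)
    # bisector and the in-plane direction orthogonal to it; for unit
    # m1, m2 these two vectors are exactly orthogonal
    b = _norm([x + y for x, y in zip(m1, m2)])
    d = _norm([x - y for x, y in zip(m1, m2)])
    s = 1.0 / math.sqrt(2.0)
    r1 = [s * (p + q) for p, q in zip(b, d)]   # new first column
    r2 = [s * (p - q) for p, q in zip(b, d)]   # new second column
    r3 = _cross(r1, r2)                        # right-handed third column
    return r1, r2, r3
```

For two columns in the xy-plane that are a few degrees off orthogonal, the result is a pair of exactly orthonormal in-plane columns and a third column along the z axis.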
…the method, which can be demonstrated on the incorrect estimation of the camera parameters (see Table 1).

Table 1: Results of the calibration on synthetic data: ideal data, no noise. See text for a description of the table.

No. of views    Error     f      u0    v0
Initial guess   5.5349    1300   353   286
2               0.2683    1095   340   304
3               0.2806    1110   363   276
4               0.2295    1122   357   285
5               0.2208    1132   355   284
6               0.2119    1112   356   278
Ground truth    0.0       1136   363   280

Table 2 contains the results for the experiment with synthetic data blurred by Gaussian noise with a standard deviation of 2 pixels. The structure of the table is the same as in the previous case. Again, note that the error value decreases with a larger number of views.

…ally detected. The precision of the detection was about 0.5 pixels. A set of experiments was performed using combinations of these images.

Figure 5: An image of the calibration grid.
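The Error column in these tables is a mean reprojection error in the image plane. A minimal sketch of such a measure, assuming a 3 × 3 CDLT matrix A that maps homogeneous plate coordinates to homogeneous image coordinates as in Equation 1 (the matrix and point values below are illustrative, not the paper's data):

```python
import math

def project(A, x, y):
    """Apply Equation 1: q = A p with p = [x, y, 1]^T, then
    dehomogenise to pixel coordinates (u, v) = (q0/q2, q1/q2)."""
    p = (x, y, 1.0)
    q = [sum(A[r][c] * p[c] for c in range(3)) for r in range(3)]
    return q[0] / q[2], q[1] / q[2]

def mean_reprojection_error(A, scene_pts, image_pts):
    """Mean Euclidean distance (in pixels) between the detected
    control points and the reprojections of their scene coordinates."""
    total = 0.0
    for (x, y), (u, v) in zip(scene_pts, image_pts):
        pu, pv = project(A, x, y)
        total += math.hypot(pu - u, pv - v)
    return total / len(scene_pts)
```

For the illustrative data in the test below, each detected point is off by half a pixel, so the mean error is 0.5 pixels.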
Figure 4: Test with synthetic data: (a) the calibration error (note that the arrows are scaled, mean value is 0.63 pixels), (b) dependence of the calibration error on noise.
Figure 7: Experiments on real data: (a) calibration error for the reference view (note that the arrows are scaled, mean value is 0.89 pixels), (b) calibration error for different numbers of iterations (2 and 6 planes).
…of the table is the same as for Tables 1 and 2. A similar development of the error of reprojection in relation to the number of views as in the case of synthetic data can be observed.
Exact values of the camera parameters were not available for the tests. Only basic camera parameters like the nominal focal length and the camera resolution were obtained from the data sheet of the camera. The size of the CCD chip in millimeters, needed for the conversion of the focal length from pixels to millimeters, was unknown. Therefore the alternative values of the focal length f in Table 4 could not be computed.

Figure 8 compares the calibration approaches with respect to their sensitivity to noise in the data. This test was performed on synthetic data blurred with Gaussian noise with the standard deviation set to 0.25, 0.5, 1, 2, 3, 4 and 5 pixels. It can be seen that the proposed method gives precision of the same order as the method of Heikkilä and Silvén. This means that the proposed approach gives results of the same …

A new camera calibration method was presented in this paper. The approach is capable of extracting the full set of camera parameters from multiple views of a coplanar target. The target can be moved without any constraint between the image acquisitions. Tests with both synthetic and real data were performed. They showed that the method is robust and stable and gives reliable results. The method was compared with other approaches. The proposed method exhibits significant improvements in comparison to the technique of Tsai with a coplanar object, and better numerical stability than the methods based on nonlinear optimization of all camera parameters.

The method models only linear distortion, but it can be extended as shown in [7] or [5]. The main disadvantage
Figure 9: A comparison of the calibration methods ("Proposed method" vs. "Tsai basic"): (a) estimation of the focal length, (b) the distance of the estimated principal point from its ideal position in pixels.
is the time of the computation, which is due to the use of a general optimization routine in the second step of the method. Providing gradients should speed up the execution. Future work should be targeted mainly at decreasing the time needed for the estimation of the extrinsic camera parameters performed in the second step of the method.

References

[1] Y. I. Abdel-Aziz and H. M. Karara. Direct linear transformation into object space coordinates in close-range photogrammetry. In Proc. of the Symposium on Close-Range Photogrammetry, Urbana, Illinois, pages 1–18, 1971.
[2] H. Bakstein. Camera calibration toolbox, March 1999. Available at http://terezka.ufa.cas.cz/hynek/toolbox.html.
[3] H. Bakstein. A complete DLT-based camera calibration with a virtual 3D calibration object. Master's thesis, Charles University, Prague, 1999.
[4] O. Faugeras. Three-Dimensional Computer Vision: A Geometric Viewpoint. MIT Press, 1993.
[5] J. Heikkilä and O. Silvén. A four-step camera calibration procedure with implicit image correction. In Proc. of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'97), San Juan, Puerto Rico, 1997.
[6] A. Kuntsevitch and F. Kappel. SolvOpt: the solver for local nonlinear optimization problems, 1997. http://bedvgm.kfunigraz.ac.at:8001/alex/solvopt/index.html.
[7] T. Melen. Geometrical modelling and calibration of video cameras for underwater navigation. PhD thesis, Institutt for teknisk kybernetikk, Norges tekniske høgskole, Trondheim, November 1994.
[8] T. Moons. A guided tour through multiview relations. In R. Koch and L. Van Gool, editors, European Workshop SMILE'98, Freiburg, Germany, June 1999.
[9] C. C. Slama. Manual of Photogrammetry. American Society of Photogrammetry, Falls Church, Virginia, 4th edition, 1980.
[10] B. Triggs. Autocalibration from planar scenes. In H. Burkhardt and B. Neumann, editors, Proc. of the 5th European Conference on Computer Vision (ECCV'98), Freiburg, Germany, volume 1, pages 158–174. Springer-Verlag, June 1998.
[11] R. Y. Tsai. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE Journal of Robotics and Automation, 3(4):323–344, 1987.
[12] G. Q. Wei and S. D. Ma. A complete two-plane camera calibration method and experimental comparison. In Proc. of the 4th International Conference on Computer Vision (ICCV'93), Berlin, Germany, pages 439–446, 1993.
[13] J. Weng, P. Cohen, and M. Herniou. Camera calibration with distortion models and accuracy evaluation. IEEE Trans. on Pattern Analysis and Machine Intelligence, 14(10):965–980, October 1992.
[14] R. Willson. Tsai camera calibration software, 1995. http://www.cs.cmu.edu/~rgw/TsaiCode.html.
[15] Z. Zhang. A flexible new technique for camera calibration. Technical Report MSR-TR-98-71, Microsoft Research, 1998.