
Results in Physics 19 (2020) 103637

Line structured light calibration method and centerline extraction: A review


Xiaobin Xu a,b,*, Zhongwen Fei a,b, Jian Yang c, Zhiying Tan a,b, Minzhou Luo a,b

a College of Mechanical & Electrical Engineering, Hohai University, Changzhou 213022, China
b Jiangsu Key Laboratory of Special Robot Technology, Hohai University, Changzhou 213022, China
c College of Mechanical Engineering, Yangzhou University, Yangzhou 225127, China

* Corresponding author at: College of Mechanical & Electrical Engineering, Hohai University, Changzhou 213022, China. E-mail address: xxbtc@hhu.edu.cn (X. Xu).
https://doi.org/10.1016/j.rinp.2020.103637

ARTICLE INFO

Keywords: Line structured light; Calibration method; Centerline extraction; Deep learning

ABSTRACT

Line structured light systems have made rapid progress in recent years owing to improvements in camera resolution and image processing technology. The calibration of the line structured light system and the centerline extraction affect the imaging quality. Hence, numerous innovations have occurred in both line structured light calibration methods and centerline extraction methods. In this paper, key technologies of line structured light calibration are reviewed. Major calibration methods, such as the vector cross product, the solution of linear equations, cross ratio invariance, vanishing points and lines, and the Plücker matrix, are analyzed and compared in detail. Methods of centerline extraction are analyzed as well. The review concludes with a discussion of the future development of line structured light systems, providing a reference for researchers.

Introduction

Owing to its high anti-interference capability, fast scanning speed and high accuracy, line structured light has been widely used in various fields of industrial robotics [1–3]. A line structured light system emits a laser beam, which forms a fringe projection on the surface of the object. The camera then captures the laser stripe, and the center of the stripe is extracted. Finally, the system converts pixels into range information according to the transform matrix between the laser plane and the camera frame. If the system is mounted on a linear guide rail, line structured light can cover a complete object surface by controlling a motor. In image processing, it is crucial to enhance the accuracy of the transform relation and of the centerline extraction. Thus, calibration methods for line structured light systems and means of centerline extraction are the main subjects reviewed in this paper.

It is of significant importance to calibrate the relationship between the camera and the structured light plane of a line structured light system. The task can be divided into camera calibration and laser plane calibration. The technology of camera calibration is relatively mature. The well-known Zhang's method solves the calibration problem for the intrinsic and extrinsic parameters of the camera [4]. In addition, related research on camera calibration has been conducted. Aziz [5] proposed a novel method that requires neither fiducial marks nor initial approximations for the inner and outer orientation parameters of the camera. Hong [6] proposed an original non-iterative method that needs no initial guess, meaning there is no risk of local minima; this method is suitable for real-time online calibration. Niu [7] proposed a novel method for calibrating the relative orientation of a camera fixed on a rotation axis using a constrained global optimization algorithm.

On the other hand, a number of studies use different calibration methods to determine the parameters of the laser plane. These calibration methods can be categorized into the vector cross product [8,9], the solution of linear equations [10,11], cross ratio invariance [12,13], vanishing points and lines [14,15], the Plücker matrix [16,17], etc. In the vector cross product method, the normal vector of the light plane is calculated as the cross product of two non-parallel laser stripe vectors. For the solution of linear equations, the linear relationship between the 3-D points on the laser plane and their corresponding 2-D pixel points is obtained through the transformation matrix. As for cross ratio invariance, 3-D points on the laser plane are obtained from the property of cross ratio invariance, so the transformation matrix can be converted into the equation parameters of the laser plane. In the method of vanishing points and lines, the normal vector of the laser plane is acquired using the theory of vanishing points and lines in computer vision. In the Plücker matrix method, the light plane equation is solved directly from Plücker matrices. However, the derivation of the Plücker matrix method is difficult to understand, because it requires knowledge of computer vision and matrix theory.


Center extraction is an indispensable process to provide high-precision pixel points for 3-D reconstruction. There are mainly two typical methods, considering the extraction speed, anti-interference capability and the performance of sub-pixel extraction. The former is Steger's method [18], in which the sub-pixel centerline of the laser is obtained from the properties of the Hessian matrix. Due to its stability, this method is widely used in various line structured light calibrations. The latter is the improved gray centroid method. Izquierdo [19] employed this method to extract the centerline with a standard deviation of 0.37 pixels.
The remainder of this paper is organized as follows. Section 2 introduces the main calibration methods and calibration devices in detail. In Section 3, methods of centerline extraction are described. In Section 4, the future development of line structured light is discussed.

Calibration method

A line structured light system projects one or more special light patterns onto a scene. The system is typically composed of a monocular camera and a line laser. The camera captures the projection of the laser stripes on the surface of the object. Then, the profile of the object is reconstructed by calculating the projection coordinates.

With the development of line structured light sensor technology, a large number of calibration methods have been proposed. This section outlines calibration methods and the corresponding targets of line structured light visual sensors.

Vector cross product

It is crucial to calibrate the equation parameters of the light plane in the camera coordinate system, and the normal vector of the light plane can be calculated as the cross product of two non-parallel laser stripe vectors:

$$\vec{n}_l = \overrightarrow{M_1 M_2} \times \overrightarrow{M_1' M_2'} \quad (1)$$

where $\vec{n}_l$ is the normal vector of the light plane, and $\overrightarrow{M_1 M_2}$, $\overrightarrow{M_1' M_2'}$ are vectors of the laser stripes in the camera coordinate system. The diagram of the vector cross product method is shown in Fig. 1.

Fig. 1. The diagram of the vector cross product method.
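To make Eq. (1) concrete, the following is a minimal NumPy sketch: given one point on each stripe and the two (non-parallel) stripe direction vectors expressed in the camera frame, the plane normal follows from their cross product, and a point on either stripe fixes the plane offset. The function name, arguments and the example plane are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def light_plane_from_stripes(p1, d1, p2, d2):
    """Estimate the light plane from two non-parallel laser stripes.

    p1, p2 : 3-D points on each stripe (camera frame).
    d1, d2 : direction vectors of each stripe (camera frame).
    Returns (n, d) for the plane n.x + d = 0, following Eq. (1).
    """
    n = np.cross(d1, d2)           # Eq. (1): normal of the light plane
    n = n / np.linalg.norm(n)      # normalize for a stable plane equation
    d = -n.dot(p1)                 # fix the offset with a point on the plane
    return n, d

# Example with two synthetic stripes lying on the plane z = 2 + 0.1x:
n, d = light_plane_from_stripes(
    np.array([0.0, 0.0, 2.0]), np.array([1.0, 0.0, 0.1]),
    np.array([0.0, 1.0, 2.0]), np.array([0.0, 1.0, 0.0]))
```

In practice the stripe points and directions come from centerline extraction (Section 3), which is exactly why extraction errors propagate into the calibrated normal, as discussed below.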
Zheng [8] proposed a calibration method based on a checkerboard. The method consists of two steps: camera calibration and structured light plane calibration. Firstly, the intrinsic and extrinsic parameters of the camera are calibrated. Then the projection equation of the line structured light is obtained according to the constraint relationship between the line structured light projection and the camera. Finally, the normal vector of the light plane is directly calculated by the cross product of the laser stripe vectors, with an absolute error of about 0.09 mm. The checkerboard target is shown in Fig. 2.

Fig. 2. The checkerboard target.

Fan and Qi adopted Zheng's method, where Fan [9] used a skeleton method for extracting the centerline, and Qi [20] assumed that the intensity of the laser stripe obeys a Gaussian distribution. Furthermore, the calibration precision of Qi's method could meet the demands of robot welding, showing its application value in the industrial field.

Generally, the principle of the methods based on the normal vector is simple, and the requirements on the calibration targets are easily satisfied. However, the extraction of the centerlines of the laser stripes inevitably causes errors, because two straight lines cannot intersect ideally at one point. Thus it is difficult to guarantee the calibration accuracy. Moreover, multiple straight lines are not sufficiently employed to enhance the calibration precision.

In order to solve the problems above, Bi [21] modified the vector cross product method. Due to image noise, the intersection point of two spatial laser stripes does not exist; the normal vector of the light plane was therefore identified via the point that is closest to both laser stripes, and the exact plane equation was obtained by the least square method. In addition, the solution was calculated using multiple pairs of laser stripes, and a more accurate light plane equation was achieved with a weighted average of the calibration results of all stripe pairs. Table 1 shows the calibration accuracy of Fan [9]. Kiddee [22] applied the approach to a cross structured light system. The calibration result was made more precise by adding edge constraints of the rectangular checkerboard. Although Kiddee calibrated with the checkerboard at only two different positions, the normal vector can still be calculated.

Although the accuracy of the method based on the normal vector is limited, it is suitable for general applications due to its easy operation. However, to calculate the spatial linear equations with better accuracy, other measures, such as targets with higher precision and improved methods of centerline extraction, should be adopted.

Table 1
Fan [9] calibration accuracy.

Number | Distance of the reference points (mm) | Absolute error (mm) | Relative error (%)
1 | 10.18 | −0.07 | 0.688
2 | 10.16 | 0.08 | 0.787
3 | 10.20 | −0.11 | 1.078
4 | 10.19 | 0.06 | 0.589
5 | 10.16 | 0.05 | 0.492

Solution of linear equations

The linear equation is solved by establishing the linear relationship between camera pixels and world reference points. The relationship is:


$$s\begin{bmatrix} x_u \\ y_u \\ 1 \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & t \end{bmatrix}\begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix} \quad (2)$$

where s is the scale factor and r_i (i = 1, 2) is the ith column vector of the rotation matrix. The plane equation can be obtained in the camera coordinate system by solving the rotation matrix and the translation vector.

By defining $\tilde{m} = [x_u\ y_u\ 1]^T$, $\tilde{M} = [X_w\ Y_w\ 1]^T$ and $H = [r_1\ r_2\ t]$, Eq. (2) can be rewritten as:

$$s\tilde{m} = H\tilde{M} \quad (3)$$

By eliminating the scale factor, Eq. (3) is simplified as:

$$\begin{bmatrix} x_u \\ y_u \end{bmatrix} = \frac{1}{\hat{h}_3\tilde{M}}\begin{bmatrix} \hat{h}_1\tilde{M} \\ \hat{h}_2\tilde{M} \end{bmatrix} \quad (4)$$

where $\hat{h}_i$ denotes the ith row of H. Then Eq. (4) can be expressed as:

$$\begin{bmatrix} \tilde{M}_i^T & 0^T & -x_u\tilde{M}_i^T \\ 0^T & \tilde{M}_i^T & -y_u\tilde{M}_i^T \end{bmatrix}\begin{bmatrix} \hat{h}_1^T & \hat{h}_2^T & \hat{h}_3^T \end{bmatrix}^T = 0 \quad (5)$$

Thus, at least 4 groups of known target reference points and pixel points are needed to solve the transformation matrix between the camera coordinate system and the world coordinate system.

The methods based on linear equations are summarized in Table 2.

McIvor [10] put forward a calibration method for a laser fringe profilometer, which consisted of a laser projector, a camera and a linear motion platform. Its calibration target had three surfaces. However, the experimental results were not discussed in detail. Che [11], Xie [23,24] and Xiong [25] proposed line structured light imaging systems with ball targets. A constrained optimization calibration algorithm was proposed for the case where there are only 8 independent parameters in the transformation from the inclined sensor coordinate system to the orthogonal reference system. The structured light sensor was installed on a CMM, and the conjugate pairs were obtained from parallel slices of a spherical target with the aid of the CMM. The model of the slice circles in non-orthogonal scan calibration is shown in Fig. 3. The 3-D coordinates of the spherical center can be calculated by linear interpolation and least square fitting. Since a set of matching points can be expressed by three equations, the solution of the matrix requires at least three "conjugate pairs".

Fig. 3. Slice of a sphere in non-orthogonal scan calibration.
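Eq. (5) is the standard direct linear transform (DLT) for a plane-to-image homography; the following is a compact sketch of how the stacked system is solved with an SVD. It assumes planar target points $\tilde{M}_i = (X_w, Y_w, 1)$ and their pixel observations, and the helper name is hypothetical.

```python
import numpy as np

def homography_dlt(world_xy, pixels):
    """Solve H = [r1 r2 t] up to scale from Eq. (5) with >= 4 point pairs.

    world_xy : (N, 2) planar target coordinates (X_w, Y_w).
    pixels   : (N, 2) pixel coordinates (x_u, y_u).
    """
    rows = []
    for (X, Y), (x, y) in zip(world_xy, pixels):
        M = np.array([X, Y, 1.0])
        rows.append([*M, 0.0, 0.0, 0.0, *(-x * M)])  # first row of Eq. (5)
        rows.append([0.0, 0.0, 0.0, *M, *(-y * M)])  # second row of Eq. (5)
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)   # null-space vector = smallest right singular vector
    return Vt[-1].reshape(3, 3)   # H, up to scale
```

The smallest right singular vector minimizes ||A h|| subject to ||h|| = 1, which is why at least 4 non-degenerate point pairs (8 equations for 8 degrees of freedom) are required, matching the statement above.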

Table 2
Methods based on linear equations.

Year | Calibration target | Auxiliary device | Feature type | Accuracy | References
1999 | three-surface target | No device | circular white disks | – | McIvor [10]
2000 | ball target | CMM | conjugate pairs – sliced circles | Average error 0.07 mm | Che et al. [11]
2004 | ball target | CMM | conjugate pairs – sliced circles | Maximum error 0.035 mm | Xie et al. [23]
2004 | ball target | CMM and PH10 rotary head | conjugate pairs – sliced circles | Maximum error 0.03 mm | Xie et al. [24]
2009 | ball target | CMM | conjugate pairs – sliced circles | Average error 0.005194 mm | Xiong et al. [25]
2005 | single-tooth target | CMM and rotation table | the crossing point of the two straight lines | – | Xie et al. [26]
2014 | checkerboard pattern | No device | corner points | Maximum error 0.0066 mm | Sun et al. [27]
2014 | checkerboard pattern | No device | corner points without dark area | – | Walch et al. [28]
2015 | 1D target with a checkered pattern and a 3D calibration board | No device | intersection points between the laser plane and the centers; annular orbit of the intersection feature points | – | Xu et al. [29]
2018 | bi-planar references with checkerboard pattern | No device | corner points | mean reconstruction error 0.93 mm | Xu et al. [30]
2019 | no target | parallel guide rail push-broom devices | ORB feature points | – | Yin et al. [31]
2009 | high-precision gauge object | CMM | corner points | error within 0.03 mm | Santolaria et al. [32]
2009 | steel plate and a ball array | No device | sliced circles | error within the range [0.32 mm, 0.37 mm] | Zhao et al. [33]
2014 | calibration pattern with circles | No device | calibration circles | mean absolute error 0.066 mm | Usamentiaga et al. [34]
2018 | rectangle pattern | No device | corner points | accuracy around 0.0908 mm | Tran et al. [35]
2013 | combined target | No device | gauge block endpoints | repeatability precision within 0.04 mm | Huang et al. [36]
2009 | plane target with black and white triangles | No device | perpendicular constraint of intersection lines | absolute error within 0.35 mm | Zhang et al. [37]


Based on this method, Xie [24] improved the calibration accuracy by using a CMM, a PH10 rotary head and a ball target, where the maximum error was reduced to 0.03 mm. In addition to using a ball target, Xie [26] proposed a calibration method with a single-tooth target. The target is simple, and it is convenient to obtain enough reference points. Moreover, a novel algorithm was employed to extract the centerline instead of the traditional method and the Steger method. However, the laser plane must be adjusted to be parallel to a coordinate plane of the CMM.

Without an auxiliary device, Sun [27] and Walch [28] established linear equations between the camera and the laser sensor through known reference points on a checkerboard. Sun obtained sub-pixel points by modifying the method of centerline extraction. In order to diminish the influence of the environment, Walch eliminated the areas with too high or too low reflections according to the reflectivity [28]. In addition, the camera was used to obtain color information of the object, which was fused with the point cloud. Compared with methods using a single checkerboard pattern, Xu [29,30] put forward a novel calibration method with a combined measurement target. Xu extracted the annular orbit of the intersection feature points and the intersection points between the laser plane and the centers, using a 1D target with a checkered pattern and a 3D calibration board [29]. In Reference [30], Xu adopted two checkerboards as reference planes. The system error caused by the motion of the target can be avoided owing to the existence of dual targets. Moreover, the re-projection error was applied to build an optimization function to cut down the calibration error. The calibration system is shown in Fig. 4.

Yin utilized the information obtained by a binocular camera to obtain better location accuracy than traditional algorithms. In addition, Oriented FAST and Rotated BRIEF (ORB) features were used as feature information, and Random Sample Consensus (RANSAC) further eliminated mismatches [31].

More researchers have tried to acquire the transformation matrix with special targets. Santolaria calibrated the structured light system by a gauge target [32], which is shown in Fig. 5. It is a high-precision gauge object which materializes points of well-known nominal coordinates in its local coordinate system. Its points are distributed in different planes with an average dot diameter of 0.799 mm, and the gauge object has a maximum flatness error of 0.003 mm. Moreover, in order to enhance the relationship between the reference point and the light plane, the target reference point is extracted under the laser illumination. The reference point and centerline are shown in Fig. 6. The experimental results have shown that the accuracy can be within 0.03 mm while ensuring high precision of the target. Zhao proposed a system composed of two cameras and a laser [33]. The point cloud on the sphere and the spherical center could be obtained according to the triangulation principle and surface fitting, respectively. In the double-stripe structured light system [34], there is an additional light source beyond the traditional linear structured light. It can provide additional information to compensate for the vibration caused by the movement of the measuring device. Moreover, in order to obtain a better result, different exposure times are required for images of the target alone and of the laser combined with the target, shot at the same angle. Tran [35] proposed a calibration method for line-structured light sensors with multiple lines, based on observing a rectangular pattern and the viewed light lines projected on the planar surface. In Huang's method [36], the calibration target is similar to the target in reference [32]; the target illustration is shown in Fig. 7. Zhang [37] adopted a planar target with black and white triangles, as shown in Fig. 8. The innovation of this method lies in the use of the linear feature and the vertical constraint of the cross lines to replace the reference points on the traditional target.

In the method of solving linear equations, the key point is to obtain the correspondence of reference points between the world coordinate system and the camera coordinate system through a transformation matrix. Then, the transformation matrix is converted to calculate the light plane equation. Since the world coordinate system is defined artificially, some methods, such as Sun [27], directly use the reference points on the target to obtain the transfer matrix, while others, such as Zhang [37], calculate the reference points by using the constraint relationships of specific targets. The key of the calibration method lies in how to obtain the world coordinates and pixel coordinates of the control points with high accuracy. In this way, the transformation matrix can be solved more accurately and a light plane equation of better quality can be acquired. Image processing can recognize the coordinates of pixel points: Sun utilized Bouguet's method to extract precise corners of the targets [27], and Tran [35] detected corners with the Harris detector. Looking for improved methods to better extract corner positions is a good research direction. The world coordinates depend more on the accuracy of the target. The calibration accuracy can be further refined by a nonlinear optimization algorithm once the transformation matrix is known [27].

Fig. 4. Double checkerboard calibration system.
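As an illustration of the corner-based control point extraction discussed above, the following sketch uses OpenCV's chessboard detection with sub-pixel refinement. The image path and board size are placeholders; the specific papers cited ([27], [35]) use their own variants (Bouguet's method, Harris corners), so this is a generic stand-in rather than any author's exact pipeline.

```python
import cv2

img = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)      # placeholder image path
found, corners = cv2.findChessboardCorners(img, (9, 6))   # inner-corner grid size
if found:
    # Refine each corner to sub-pixel accuracy in an 11x11 search window.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(img, corners, (11, 11), (-1, -1), criteria)
```

The refined pixel coordinates, paired with the known world coordinates of the grid, feed directly into the DLT system of Eq. (5).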


Fig. 5. The target used by Santolaria [32].

Fig. 6. Reference point and centerline extraction.

Fig. 8. Target used in Zhang [37] method.

Cross ratio invariance

The cross ratio invariance method is a fundamental theory in stereo vision and is widely used in line structured light systems. The imaging system is shown in Fig. 9.

Fig. 7. Target used in Huang's method [36].


Fig. 9. Image system with cross ratio invariance.

According to the property of cross ratio invariance, the relationship can be expressed as:

$$\frac{AB}{BQ} : \frac{AC}{QC} = \frac{ab}{qb} : \frac{ac}{qc} \quad (8)$$

where AB, AC, BQ and QC are Euclidean distances in the world coordinate system, and ab, qb, ac and qc are the pixel distances of the corresponding points in the pixel coordinate system. In this way, the intersection point between the laser line and the target is calculated, and the plane equation of the light can be determined.

Methods based on cross ratio invariance are summarized in Table 3.

Table 3
Methods of cross ratio invariance.

Year | Calibration target | Auxiliary device | Feature type | Accuracy | References
2014 | planar target etched with grid lines | CMM | intersection points between the laser plane and the grid lines on the target | Maximum error 0.046 mm | Xie et al. [12]
2001 | cube frame | No device | corner points of the cube frame | Standard variance 0.0041 mm | Chu et al. [13]
2005 | a "wire-frame model" cube | No device | points on six different planes | Maximum error 0.0241 mm | Stöcher et al. [38]
2010 | 1-D (one-dimension) target | No device | intersection points between the laser plane and the target | Maximum error 0.065 mm | Wei et al. [39]
2005 | planar target containing a pattern of 3 × 3 squares | No device | intersection points of the feature line and the fitted lines | RMS error 0.085 mm | Zhou et al. [40]
2011 | planar target with concentric circle array | No device | the centers of the concentric circles | Average error 0.030 mm | Wu et al. [41]
2015 | specifically designed calibration board with sorted control points | No device | control points | absolute error less than 0.1 mm | Wei et al. [42]
2019 | 1D target | No device | intersection points of the laser plane and the target | Standard deviation less than 0.005 mm | Chen et al. [43]
2010 | planar target with square patterns | No device | cross points of the lines | Maximum error 0.0075 mm | Xie et al. [44]
2011 | gauge plane | articulated arm coordinate measuring machine (AACMM) | intersection points of the laser plane and the target | Maximum error 0.01 mm | Santolaria et al. [45]
2015 | checkerboard | No device | intersection points between the projector plane and the target horizontal line | RMS error 0.1068 mm | Lopes et al. [46]
2019 | planar target with control points | No device | intersection points between the laser and the target | RMS error 0.4 mm | Pan et al. [47]
1999 | steel plate and a ball array | No device | sliced circles | error within the range [0.32 mm, 0.37 mm] | Huynh et al. [48]
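The following sketch shows how Eq. (8) is used in practice to locate the laser-target intersection Q: with three known collinear points and their images, the cross ratio measured in pixels equals the cross ratio in the world, which is linear in the unknown world position of Q. Points are parameterized by signed 1-D coordinates along the feature line, and all names and numbers are illustrative.

```python
def cross_ratio(a, b, c, q):
    """Cross ratio of four collinear points, given as 1-D coordinates."""
    return ((a - c) * (b - q)) / ((a - q) * (b - c))

def solve_q(A, B, C, cr):
    """Invert the cross ratio: find Q with cross_ratio(A, B, C, Q) = cr.

    cr * (A - Q)(B - C) = (A - C)(B - Q) is linear in Q.
    """
    return ((A - C) * B - cr * (B - C) * A) / ((A - C) - cr * (B - C))

# Pixel coordinates of three known collinear points and the laser point q:
cr_img = cross_ratio(a=10.0, b=60.0, c=110.0, q=85.0)
# The same cross ratio holds in the world frame (cross ratio invariance):
Q = solve_q(A=0.0, B=30.0, C=60.0, cr=cr_img)   # -> 45.0 for this affine example
```

Repeating this for many feature lines yields a set of 3-D points on the laser plane, to which the plane equation is then fitted.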
Xie [12] and Chu [13] proposed novel methods to simultaneously calibrate the intrinsic and extrinsic parameters of the structured light sensor based on cross ratio invariance. Stöcher achieved a more accurate solution through multiple iterations [38]. Wei [39], Zhou [40], Wu [41], Wei [42] and Chen [43] adopted methods similar to Xie [12], where the difference lies in the auxiliary targets: Wei used a 1-D target [39], Zhou applied a planar target with a grid array [40], Wu employed a target with a concentric circle array [41], Wei adopted a target with sorted control points [42], and Chen applied a common checkerboard [43]. These targets are shown in Fig. 10.

Xie took into consideration the extraction error of the reference points, which leads to errors in the distortion coefficients [44]. Thus, the reference points were extracted from the target, and the lens distortion coefficients were obtained on the basis of cross ratio invariance. Jorge proposed a method to integrate a laser triangulation sensor (LTS) into articulated arm coordinate measuring machines (AACMM) [45]. The calibration target was consistent with the target of reference [39]. This method can be easily applied to the integration of a manipulator and a coordinate measuring machine. Flávio [46] proposed a calibration method of line structured light for the underwater environment. The SLSC-CR method, based on cross ratio invariance, and the SLSC-LP method, based on robust fitting of the laser line projection, were utilized to capture 3-D coordinates on the laser stripe; the least square method was then used to fit more accurate plane parameters. Pan recognized the pixel positions of the target feature points by a multi-scale method [47]. Moreover, an optimization function was established by considering the positioning uncertainty of the target feature points and the fringe points.

Obviously, although the targets applied are various, the calibration errors are basically similar. Therefore, in general, it is a good choice to use a simple target such as a checkerboard. Furthermore, this method requires the intersection between the laser fringe and the control point line, so extracting the centerline accurately is also a key point, which directly affects the calibration accuracy.

Vanish points and lines

Vanish points and vanish lines are also theories in stereo vision, and the principle is shown in Fig. 11.

After the perspective projection, parallel lines in space are no longer parallel in the camera frame. Ideally, the parallel lines in space intersect at one point in the image, which is defined as the vanish point of the parallel lines. As shown in Fig. 11(b), points v1 and v2 are the vanish points of the parallel lines in the vertical and horizontal directions of the target, respectively. The line l through the two points constitutes the vanish line of the target plane at that position. It should be noted that the placement of the targets should avoid the situation where the target plane and the camera's image plane are parallel to each other. Otherwise, the mapping between them becomes an affine transformation; under such a transformation the images of the parallel lines remain parallel and do not intersect at one point.


The line equation in the image is assumed to be ax + by + c = 0, and the corresponding plane equation is Ax + By + Cz + D = 0. Let the parameter vector of the vanishing line be l = [a, b, c]^T. The following relationship can be obtained:

$$l = K^{-T} n \quad (9)$$

where K is the camera intrinsic matrix and n represents the normal vector of the light plane. Hence, the normal vector of the light plane is solved from Eq. (9). However, D, as another important parameter, must be gained from a known reference point on the target.

The methods of vanishing points and lines are summarized in Table 4.

Rather than considering various targets, most articles focus on how to obtain the fourth parameter D of the laser plane. Wei obtained at least one 3-D feature point on the light plane to determine the parameter D [14]. Moreover, Wei determined the parameter D from the exactly known distance between each two adjacent parallel lines [15]. Meanwhile, simulation of the key influencing factors confirmed that image noise, the quality of centerline extraction and the parallelism of the parallel lines affect the calibration results. In addition to the methods above, Liu calculated D by using the coordinates of known points in the target coordinate system [49]. Xie reckoned that the vanishing point is not always at the intersection of multiple parallel lines, due to environmental noise [50], positioning error of the moving platform, lens distortion, image extraction error of the coding points, etc. The Levenberg-Marquardt (LM) optimization algorithm was utilized to obtain the coordinates of the optimal vanishing point. The optimized objective function is:

$$F(u_e, v_e) = \sum_{j=1}^{n\times k-1} \frac{(a_j u_e + b_j - v_e)^2}{a_j^2 + 1} \quad (10)$$

where u_e, v_e are the horizontal and vertical pixel coordinates of the vanishing point, respectively, and a_j, b_j are the parameters of the jth line equation v = a_j u + b_j in the pixel coordinate system.

Shao [51] calculated D by the principle of intersecting planes. Pei [52] adopted a method similar to Xie [50]; however, Pei [52] took the average of the samples as the calibration result. The innovation of Wei [53] was a new method in which the distance constraint of parallel lines is applied to calculate the 3-D coordinates of feature points on the light plane. The distance constraint is:

$$d_c = a_c x_{cp} + b_c y_{cp} + c_c z_{cp} \quad (11)$$

where [a_c, b_c, c_c] is the normal vector of the light plane and [x_cp, y_cp, z_cp] is a point on the laser plane. Chen [54] also improved the acquisition of the parameter D by using the perspective-three-point model, the schematic diagram of which is shown in Fig. 12. The distance from the origin of the camera coordinate system to the light plane is derived as:

$$D = \frac{|\overrightarrow{O_c A} \cdot n_L|}{\| n_L \|} \quad (12)$$

where n_L is the normal vector of the optical plane. The experimental results have shown that the calibration method has high accuracy, with an average measurement error of 0.0451 mm and a relative error of 0.2314%.

Similar to the cross ratio invariance principle, vanish points and lines are fundamental theories in stereo vision. The normal vector of the light plane can be obtained by calculating the vanish line equation and combining it with the intrinsic parameters of the camera. In most references, the normal vector is directly obtained according to the formula. However, due to the influence of image processing noise, lens distortion and other factors, the parallel lines barely intersect at an ideal point, which leads to inaccurate vanish points and lines. Chen [54] and Xie [50] have acquired vanish points more accurately through nonlinear optimization.
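A sketch of the LM refinement of Eq. (10) follows, using SciPy's least-squares solver. The array of fitted line parameters (a_j, b_j) is a toy stand-in for the lines fitted to the imaged parallel target lines, and the initial guess is arbitrary; the cited papers use their own LM implementations.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(v, lines):
    """Point-to-line distances whose squared sum is F(u_e, v_e) in Eq. (10)."""
    ue, ve = v
    a, b = lines[:, 0], lines[:, 1]
    return (a * ue + b - ve) / np.sqrt(a**2 + 1.0)

lines = np.array([[0.50, 10.0], [0.52, 5.0], [0.48, 15.0]])  # toy (a_j, b_j) pairs
v0 = np.array([200.0, 110.0])                                # rough intersection guess
sol = least_squares(residuals, v0, args=(lines,), method="lm")
u_e, v_e = sol.x   # optimal vanishing point minimizing Eq. (10)
```

Because each residual is a true perpendicular point-to-line distance, the optimum balances all near-parallel lines instead of trusting any single pairwise intersection.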


Fig. 10. (a) Target applied in Xie [12]; (b) target applied in Wei [39]; (c) target applied in Zhou [40]; (d) target applied in Wu [41]; (e) target applied in Wei [42].


Fig. 11. (a) planar target (b) image of a planar target.

However, the method based on vanish points and vanish lines is usually not suitable for non-professionals, owing to the difficulty of understanding the underlying stereo vision.

Plücker matrix

The line structured light system based on the Plücker matrix is shown in Fig. 13. The principle is as follows. The coefficient vector of the plane equation of the target plane in the plane target coordinate system is Q_t = [0, 0, 1, 0]^T, and the coefficient vector of the same plane in the camera coordinate system is:

$$Q_{c(i)} = T_{t(i),c} Q_t \quad (13)$$

The solution of T_{t(i),c} is obtained by Zhang's calibration method. The light stripe is mapped onto the camera image plane by the projection matrix M_(i) to generate the 2-D line l_im(i). l_im(i) and the camera center form a plane Q_l(i):

$$Q_{l(i)} = M^T l_{im(i)} \quad (14)$$

The 3-D stripe line L_(i) is determined by the intersection of the planes π_c(i) and π_l(i). The dual Plücker matrix L*_(i) of L_(i) is:

$$L^*_{(i)} = Q_{c(i)} (Q_{l(i)})^T - Q_{l(i)} (Q_{c(i)})^T \quad (15)$$

L_(i) can be obtained from L*_(i) by a simple rewriting rule:

$$l_{12(i)} : l_{13(i)} : l_{14(i)} : l_{23(i)} : l_{42(i)} : l_{34(i)} = l^*_{34(i)} : l^*_{42(i)} : l^*_{23(i)} : l^*_{14(i)} : l^*_{13(i)} : l^*_{12(i)} \quad (16)$$

By placing the plane target at m different angles, the lines L_(i) can be calculated. Since all the laser stripes lie on the same light plane, the light plane Q satisfies:

$$X^T Q = 0 \quad (17)$$

where X = [L_(1), L_(2), ···, L_(m)] is a 4 × 4m matrix, and Q is found by solving the null space of X^T.

Xu [16] derived the plane equation with the method above. Moreover, the validity of this calibration method was experimentally analyzed through the impact factors of noise magnitude and number of images. Xu [17] utilized laser plane coordinates to parameterize the projection plane, assuming that the projection plane obeys a Gaussian distribution; thus, a maximum likelihood function was established. The coordinates of the laser plane were obtained by LM optimization and compared with the original method proposed by Xu [16]. The experimental results have shown that the relative errors were reduced by 18.97%, 19.81%, 21.22% and 21.60%, respectively, at test distances of 200, 300, 400 and 500 mm.

Zhang accomplished the calibration of multi-sensor structured light systems based on the Plücker matrix [55]. The calibration accuracy and robustness were enhanced by optimizing the light plane with all the points on the centerlines. In addition, global calibration of multiple vision sensors was achieved by calibrating each of the other vision sensors with the base vision sensor.

These three papers are based on the Plücker matrix and adopt essentially the same method to solve the light plane equations [16,17,55]. However, in order to achieve better accuracy, researchers could integrate the Plücker matrix with other algorithms. The comparison of these methods is shown in Table 5.

Compared with other methods such as cross ratio invariance and the solution of linear equations, line structured light calibration based on the Plücker matrix does not need to obtain reference points on the light plane, which avoids the positioning errors caused by multiple spatial coordinate transformations of the reference points. The accuracy of this method is high because the optical plane equation can be determined simply by computing the Plücker matrices of the optical stripes. However, this complex method is not widely used, on account of its difficulty of understanding and solving.
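The following is a minimal NumPy sketch of Eqs. (15)-(17): for each target pose, the stripe line is represented by the dual Plücker matrix of the two planes that contain it, converted to the primal (point) form by the rewrite of Eq. (16), and the light plane is the common null vector of the stacked line matrices. Function names are illustrative, and no noise weighting or nonlinear refinement (as in [17,55]) is included.

```python
import numpy as np

def dual_pluecker(P, Q):
    """Dual Pluecker matrix of the line where planes P and Q meet, Eq. (15)."""
    P, Q = P.reshape(4, 1), Q.reshape(4, 1)
    return P @ Q.T - Q @ P.T

def primal_from_dual(Ld):
    """Rewrite rule of Eq. (16): l12<->l*34, l13<->l*42, l14<->l*23, etc."""
    L = np.zeros((4, 4))
    L[0, 1], L[0, 2], L[0, 3] = Ld[2, 3], Ld[3, 1], Ld[1, 2]
    L[1, 2], L[3, 1], L[2, 3] = Ld[0, 3], Ld[0, 2], Ld[0, 1]
    return L - L.T                      # antisymmetrize

def light_plane(plane_pairs):
    """Stack the stripe lines L_(i) and solve X^T Q = 0, Eq. (17), by SVD."""
    X = np.hstack([primal_from_dual(dual_pluecker(P, Q)) for P, Q in plane_pairs])
    _, _, Vt = np.linalg.svd(X.T)
    return Vt[-1]                       # light plane [A, B, C, D], up to scale

# Toy usage: every (target plane, viewing plane) pair intersects in a stripe
# that lies on the light plane returned above.
```

Any plane through a line is annihilated by that line's primal Plücker matrix, which is why stacking the matrices of all stripes and taking the null space of X^T recovers the one plane shared by every stripe.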


Table 4
Methods of vanishing points and lines.

Year | Calibration target | Auxiliary device | Innovation point | Accuracy | References
2010 | planar rectangle target | No device | only one 3-D feature point on the light plane is acquired to determine the parameter D of the light plane | RMS error 0.141 mm | Wei et al. [14]
2014 | plane with several parallel lines | No device | determines the parameter D when the distance between each two adjacent parallel lines is known exactly | Maximum error 0.09 mm | Wei et al. [15]
2018 | planar target with a set of orthogonal diameters and concentric circles of known relative distances | No device | the parameter D of the light plane is obtained from the coordinates of the known points in the target coordinate system | Maximum relative error 0.4% | Liu et al. [49]
2017 | 1-D (one-dimension) target | No device | the LM (Levenberg-Marquardt) optimization algorithm is utilized to obtain the coordinates of the optimal vanishing point | Maximum error 0.06242 mm | Xie et al. [50]
2019 | planar target with a pattern of two (or more) concentric circles | No device | the parameter D is confirmed from the principle of intersecting planes | Maximum error 0.07 mm | Shao et al. [51]
2006 | target containing a groove and a flume | No device | takes the average of the samples as the calibration result | Maximum error 0.015 mm | Pei et al. [52]
2014 | planar target with a set of parallel straight lines | No device | the parameter D is calculated from the distance constraint of parallel straight lines | RMS error 0.134 mm | Wei et al. [53]
2017 | target with circular sectors | No device | the distance from the origin of the camera coordinate system to the light plane is derived from the perspective-three-point model | Average error 0.0451 mm | Chen et al. [54]

Other methods

Various other calibration methods are summarized in Table 6.

Niola proposed another calibration method that utilizes a digital micrometer, which can move a target of known size to obtain the world coordinates of the target reference points at different positions [56]. The system diagram is shown in Fig. 14. The transformation between a point in the camera coordinate system and a point in the world coordinate system is obtained by:

$$[{}^{c}T] = \begin{bmatrix} 1&0&0&0\\ 0&\cos\psi_{lc}&-\sin\psi_{lc}&0\\ 0&\sin\psi_{lc}&\cos\psi_{lc}&0\\ 0&0&0&1 \end{bmatrix} \begin{bmatrix} \cos\theta_{lc}&0&\sin\theta_{lc}&0\\ 0&1&0&0\\ -\sin\theta_{lc}&0&\cos\theta_{lc}&0\\ 0&0&0&1 \end{bmatrix} \times \begin{bmatrix} \cos\varphi_{lc}&-\sin\varphi_{lc}&0&0\\ \sin\varphi_{lc}&\cos\varphi_{lc}&0&0\\ 0&0&1&0\\ 0&0&0&1 \end{bmatrix} \begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&1&\Delta z_{lc}\\ 0&0&0&1 \end{bmatrix} \times \begin{bmatrix} 1&0&0&0\\ 0&1&0&\Delta y_{lc}\\ 0&0&1&0\\ 0&0&0&1 \end{bmatrix} \begin{bmatrix} 1&0&0&\Delta x_{lc}\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&1 \end{bmatrix} \quad (18)$$

In addition, a method of defining the laser luminance coefficient and applying a luminance threshold convolution to the centerline extraction was proposed, to make the centerline and the calibration result more accurate.

Xu [57] deduced an empirical matching difference-height model, valid for the hardware configuration, to reduce the calibration time. The experimental results have shown that the proposed calibration method has a similar accuracy to the conventional method while the calibration duration is reduced to 27%. The advantage of this method is that there is no strict restriction on the placement of targets; the disadvantage is that it requires a ball with high precision and a high-resolution camera.

Xu proposed a global calibration method for the light plane [58,59]. The calibration target is a 3-D checkerboard, composed of three mutually orthogonal plane checkerboards and a height gauge. The calibration system is shown in Fig. 15. The sigmoid-Gaussian function was used to normalize the eigenvalues of the Hessian matrix in Xu [58]; thus, the centerline was more accurate and reached sub-pixel level. In addition, Xu [59] constructed an error function according to the actual height of the feature points and the reconstructed height, optimized the objective function through the local particle swarm optimization algorithm, and obtained the optimal coefficients of the laser plane equation.

The calibration error of a projector is greater than that of a camera. Hence, a structured light system without projector calibration can greatly diminish the system calibration error and simplify the calibration process. Luo used this method [60], where the projector's lines of sight serve as spatial vectors invariant to the environment. Through the camera model, the reflected light equation can be obtained:

$$L_c = O_c + \lambda (K_c R_c)^{-1} I' \quad (19)$$

where O_c is the representation of the camera origin in world coordinates, λ is an arbitrary value, K_c is the intrinsic parameter matrix of the camera, R_c is the rotation matrix between the camera and the world coordinate system, and I' is the homogeneous coordinate of the pixel point.

Fig. 12. Calibration model of Chen [54].

Fig. 13. Line structured light system based on the Plücker matrix.
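To make Eq. (19) concrete, the following sketch back-projects a pixel into a viewing ray and intersects it with a calibrated light plane, which is how a stripe pixel is turned into a 3-D point once the plane is known. The intrinsic matrix, pose and plane here are toy values, not parameters from any cited system.

```python
import numpy as np

def pixel_ray(Kc, Rc, Oc, pix):
    """Viewing ray of Eq. (19): Lc(lam) = Oc + lam * (Kc Rc)^-1 I'."""
    I = np.array([pix[0], pix[1], 1.0])     # homogeneous pixel I'
    d = np.linalg.solve(Kc @ Rc, I)         # ray direction (Kc Rc)^-1 I'
    return Oc, d

def intersect_plane(Oc, d, n, D):
    """Intersect the ray Oc + lam*d with the plane n.x + D = 0."""
    lam = -(n.dot(Oc) + D) / n.dot(d)
    return Oc + lam * d

Kc = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
point = intersect_plane(*pixel_ray(Kc, np.eye(3), np.zeros(3), (400, 260)),
                        n=np.array([0.0, 0.0, 1.0]), D=-500.0)  # plane z = 500
```

This ray-plane triangulation is the measurement step shared by all the calibration methods in this section; the methods differ only in how the plane parameters (n, D) are obtained.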

Liu proposed a fast calibration method for line structured light systems based on a single ball target [61]. By introducing the cone space of stereoscopic vision, the cone space equations can be obtained from the fringe equation and the spherical contour equation, respectively. Finally, under the maximum likelihood criterion, the optimal solution of the optical plane equation was obtained by nonlinear optimization. Afterwards, Liu [62] used two mutually parallel cylindrical targets to calibrate the line structured light. The essential difference between this method and the method based on a single ball is that no auxiliary information is needed; the Steger algorithm is utilized for centerline extraction. The proposed method can achieve a calibration accuracy of 0.07 mm under a field range of about 500 mm × 400 mm and a measuring distance of 700 mm. Wei adopted a spherical target to determine the equation of the line structured light [63]; this avoids the traditional defect of lacking enough feature points.

In most of these methods, targets with special shapes and complex formulas are utilized to calibrate the line structured light. Most of these methods require many formulas to be derived and much prior knowledge to be used, which makes them unsuitable for practical application in the industrial field. But as a topic of academic research, they are worth exploring further.

Table 5
Comparison of different methods based on the Plücker matrix.

Method | Fusion algorithm | Accuracy
Xu [16] | Plücker matrix and noise fusion | error reductions of about 33%
Xu [17] | Plücker matrix and nonlinear optimization | error reductions of about 20%
Zhang [55] | Plücker matrix and centerline optimization | 0.04 mm

Table 6
Various other calibration methods.

Year | Calibration target | Auxiliary device | Features | Accuracy | References
2011 | planar target | digital micrometer | a cylindrical lens and an optical filter project a plane of light and segment the laser stripe from the rest of the scene | – | Niola et al. [56]
2013 | a flat board with four balls | No device | improves the resolution and consistency by 3% and 13% in comparison with the traditional models; the calibration duration is reduced to 27% | Maximum error 0.4 mm | Xu et al. [57]
2014 | a 3D calibration board | height gauge | the sigmoid-Gaussian function is applied during centerline extraction to prevent missing centers or multi-centers | – | Xu et al. [58]
2016 | a 3D calibration board | height gauge | local particle swarm optimization is employed to optimize the objective function | error values concentrate in the range of −4 to +2 mm | Xu et al. [59]
2014 | planar board with ring patterns | No device | the calibration method does not need projector calibration | Maximum error 0.0925 mm | Luo et al. [60]
2015 | ball target | No device | suitable for narrow spaces or for the rapid on-site calibration of several line-structured light vision sensors from multiple perspectives | Maximum error 0.04 mm | Liu et al. [61]
2015 | parallel cylinder targets | No device | does not need any auxiliary information; suitable for on-site calibration in complex light environments | Maximum error 0.07 mm | Liu et al. [62]
2013 | a ball target and a reference board | No device | avoids the lack of calibration feature points | Maximum error 0.102 mm | Wei et al. [63]

Fig. 14. Niola [56] calibration system.

Fig. 15. Xu [58] line structured light system.

Centerline extraction

In the line structured light system, most calibration methods require obtaining the intersection points between the laser stripe and the target. Hence, the extraction of the laser centerline is extremely significant. Researchers have proposed a variety of algorithms for extracting the centerline of laser stripes. Centerline extraction methods can be categorized into the gray centroid method, the Steger algorithm and skeleton extraction.

When using the gray centroid method, it is important to pre-process the image first, for example by denoising. Izquierdo utilized the Canny algorithm to detect the normal vector of the centerline, and Gaussian-shaped windows were adopted to remove Gaussian noise [19]. Zhang used a rectangular step function to model the flat-top laser [65]; hence, the brightness of the background and the foreground can be estimated. Wang used the Sobel operator to extract the gradient vector of the centerline [67]. A regional growth statistics method further eliminates the influence of random laser speckle noise. In addition, this method is suitable for uneven reflective metal surfaces, considering the laser scattering and reflection models.

Rubén noted that variable brightness, reflected noise in the image and uneven surfaces affect the profile shape in the industrial field [70]. To solve these problems, a boundary linking process was adopted to extract the laser stripe. Li [72] calculated the initial coordinates of the centerlines by the gray centroid method. Then, the tangential vector, the normal vector and the radius of curvature were acquired by a least squares algorithm. Accurate centerline results can be obtained by re-calculating the normal vector in a rectangular region around the points. Furthermore, the center uncertainty was analyzed based on the Monte Carlo method.
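The core of the gray centroid family of methods is a per-column intensity-weighted mean. The following is a minimal sketch of that principle, assuming a roughly horizontal stripe and a simple fixed threshold; the improved methods above replace the threshold with denoising, normal-direction windows or region statistics.

```python
import numpy as np

def gray_centroid_centerline(img, thresh=30):
    """Per-column gray centroid of a laser stripe.

    img : 2-D array of gray levels. Returns the sub-pixel row center for
    each column (NaN where no pixel in the column exceeds `thresh`).
    """
    rows = np.arange(img.shape[0], dtype=float)[:, None]
    w = np.where(img >= thresh, img.astype(float), 0.0)   # suppress background
    mass = w.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        centers = (w * rows).sum(axis=0) / mass           # weighted mean row
    return np.where(mass > 0, centers, np.nan)
```

The weighting makes the estimate sub-pixel, but it also explains the method's sensitivity to specular highlights: any bright outlier in a column pulls the centroid toward it, which motivates the Hessian-based approach described next.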
Sun proposed a robust laser fringe center extraction method [64]. The laser stripe distribution model is as follows:

$$B(x) = \begin{cases} h_1 & x \in [0, l_1) \\ h_2 & x \in [l_1, l_2] \\ h_1 & x \in (l_2, 1] \end{cases} \quad (20)$$

where h_1 is the intensity of the background light, h_2 is the intensity of the laser stripe, and l_1 and l_2 are the two sides of the laser stripe. Based on the gray level moment, the closed-form solution of the center in each section can be achieved. By adopting Reinsch's smoothing spline algorithm to fit these scattered points, the noise is eliminated. Experiments have shown that this method has high accuracy and robustness in the presence of noise. The contrast experiment of centerline extraction based on Sun's [64] algorithm and the Gaussian method is shown in Fig. 16.

Fig. 16. (a) The centerline extraction based on Sun's [64] algorithm; (b) the centerline extraction based on the Gaussian method.

The Hessian matrix can represent the normal vector of the centerlines; hence, a number of articles have made efforts to exploit its properties. The Steger algorithm is based on the Hessian matrix and can realize sub-pixel positioning of the centerline [18]. Firstly, a Gaussian filter is applied to the image, with the value of the parameter σ related to the width of the stripe. The normal direction of the light is obtained through the Hessian matrix, and the sub-pixel points can be approximated by Taylor expansion in the normal direction. Moreover, an adjustable threshold range eliminates incorrect points. The sub-pixel coordinates of the laser stripe are:

$$(p_x, p_y) = (t n_x, t n_y) \quad (21)$$

where (n_x, n_y) is the eigenvector of the Hessian matrix corresponding to the eigenvalue of maximum absolute value, i.e., the normal direction of the light stripe, and t is given by:

$$t = -\frac{r_x n_x + r_y n_y}{r_{xx} n_x^2 + 2 r_{xy} n_x n_y + r_{yy} n_y^2} \quad (22)$$
On the basis of Steger [18], Xu [66] proposed a centerline extraction method using RANSAC. The laser stripe was fitted by RANSAC to avoid interference from outliers. In addition, RANSAC was compared with the Hough transform and least squares to verify the reliability of the method. In the cross-structured light system, He categorized the laser stripe center points into two kinds based on the properties of the Hessian matrix [74]. Du detected potential laser stripe regions on the basis of the Hessian matrix [75]. Then the regions were sorted according to the scene irradiance density after calibration, and columnar peak detection was carried out by region sequencing. This method is suitable for industrial applications with multiple reflections among reflective and uneven surfaces.
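A condensed sketch of the Hessian step of Eqs. (21)-(22) at a single pixel follows, with Gaussian derivatives from scipy.ndimage. The axis convention (x = column, y = row), the fixed σ and the lack of line linking and thresholding are simplifications compared with the full Steger algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def steger_subpixel(img, r, c, sigma=2.0):
    """Sub-pixel stripe center near pixel (row r, col c) via Eqs. (21)-(22)."""
    # Gaussian partial derivatives of the image (Steger's smoothing step);
    # order=(dy, dx) differentiates along (rows, columns).
    rx  = gaussian_filter(img, sigma, order=(0, 1))[r, c]
    ry  = gaussian_filter(img, sigma, order=(1, 0))[r, c]
    rxx = gaussian_filter(img, sigma, order=(0, 2))[r, c]
    ryy = gaussian_filter(img, sigma, order=(2, 0))[r, c]
    rxy = gaussian_filter(img, sigma, order=(1, 1))[r, c]
    H = np.array([[rxx, rxy], [rxy, ryy]])            # Hessian matrix
    vals, vecs = np.linalg.eigh(H)
    nx, ny = vecs[:, np.argmax(np.abs(vals))]         # stripe normal direction
    t = -(rx * nx + ry * ny) / (rxx * nx**2 + 2 * rxy * nx * ny + ryy * ny**2)
    return c + t * nx, r + t * ny                     # Eq. (21): sub-pixel (x, y)
```

In a full implementation the derivative images are computed once for the whole frame, candidate pixels are pre-selected by the eigenvalue magnitude, and points with |t·n| larger than half a pixel are rejected, which is the "adjustable threshold range" mentioned above.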


Skeleton extraction is also commonly used in centerline extraction. Jang introduced the concept of skeleton extraction to detect more general structures [68]. A candidate skeleton was extracted from the Euclidean distance map. This method has good robustness and can deal with comprehensive and natural images, but its real-time performance is poor. Liu [69] proposed an image binarization method based on wavelet-domain gray stretching. The method combines the discrete wavelet transform with a threshold and emphasizes the grayscale stretching and edge enhancement of the stripe image. Brandli proposed a combination of a bio-inspired, redundancy-suppressing dynamic vision sensor (DVS) with a pulsed line laser to allow fast terrain reconstruction [71]. Stable laser fringe extraction was realized by using the sensor to capture the temporal dynamics in the scene. Using the high-resolution temporal information of the DVS output, a scoring function was constructed to collect the relevant event information; the maximum score value of each column was then determined by averaging the score graph. The centerline is determined by whether the maximum peak is higher than a threshold θ; if adjacent pixels are also above the threshold, they are weighted-averaged to determine the center of the laser stripe.

Sun proposed a fast and robust algorithm for laser stripe center extraction based on Legendre moment theory [73]. An ideal 1-D light intensity model of the laser stripe was given based on the uniform distribution, and the center of each laser fringe section was derived from the closed-form solution of the centerline according to the conservation laws of the Legendre moment. Yin [76] proposed a two-step laser centerline extraction method for noisy environments. An adaptive convolution quality extraction method based on geometric information and the correlation coefficient was proposed, since the noise cannot be removed effectively by general preprocessing. Wang [77] proposed an accurate method for the extraction of the line structured light. According to the geometric characteristics of the structured light projection in the binary image, the structured light was separated and extracted precisely. Because of the obvious gap between the highlighted part and the fringe, most of the noise in the image can be removed by region segmentation. The contrast experiment is shown in Fig. 17.

Fasogbon [78] proposed a laser fringe extraction method for speckle noise caused by the surface morphology of metal objects. By assuming that the fringe is Gaussian distributed, the approximate centerline of the fringe is obtained by a parabolic estimator, and a sub-pixel interpolation based on three neighboring points is employed:

$$S(v) = u_0 + \frac{\tilde{I}_{+1} - \tilde{I}_{-1}}{\tilde{I}_{-1} + \tilde{I}_0 + \tilde{I}_{+1}} \quad (23)$$

where Ĩ_−1, Ĩ_0 and Ĩ_+1 are the neighboring pixel intensities after Gaussian filtering. Finally, the result is refined by further weighting with the gray centroid method.
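The three-point interpolation of Eq. (23) is essentially a one-liner around the integer peak of a filtered intensity profile; a sketch with a toy profile is shown below. Note that Eq. (23), as printed in the source, normalizes by the sum of the three neighbors rather than by the usual parabolic denominator, and the sketch follows the printed form.

```python
import numpy as np

def three_point_subpixel(profile, u0):
    """Sub-pixel peak around integer index u0, following Eq. (23)."""
    im1, i0, ip1 = profile[u0 - 1], profile[u0], profile[u0 + 1]
    return u0 + (ip1 - im1) / (im1 + i0 + ip1)

profile = np.array([2.0, 10.0, 40.0, 90.0, 70.0, 20.0, 3.0])  # filtered column
center = three_point_subpixel(profile, int(np.argmax(profile)))  # -> 3.15
```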
As can be seen, various algorithms have been proposed to better locate the centerline of the laser fringe. However, most of them are based on traditional image processing: the laser stripe is processed by filtering and morphological methods, yet these filters can only remove some of the noise caused by electronic devices. Although morphological methods can reduce various types of image noise, it is difficult to remove the noise caused by object occlusion or specular reflection. In addition, if the environment is too complex, mis-extraction may occur.

Fig. 17. (a) Structured light projection; (b) traditional threshold segmentation; (c) improved denoising segmentation.


In conclusion, although these methods improve the accuracy of the centerline extraction, there are still some limitations, such as the processing capacity for specular reflection and object occlusion, which restrict their applications.

Outlook

A lot of effort has been put into two key techniques of line structured light systems: the calibration algorithm and the centerline extraction algorithm. However, the applications of both algorithms are limited mostly due to strict requirements on the environment. For indoor environments, structured light systems can be affected by the reflectivity of the target surface or by interference from the ambient light. For example, metal surfaces have strong specular reflections that cannot be measured directly by standard laser scanners. In addition, due to object occlusion, the laser fringe may be missing, resulting in data loss or error. Traditionally, such noise problems are mainly treated by filtering or morphological processing [19,67,76]. Mask templates and self-adaptive convolution algorithms are utilized to manipulate images for better extraction. However, these methods tend to be strict with ambient light and have limited processing power for reflective metal surfaces and shaded areas. Table 7 lists the precisions of several representative methods, including their operation difficulties.

In recent years, deep learning technologies, represented by the convolutional neural network, have shown excellent performance in object classification, segmentation and image denoising. Over the past few years, several image denoising technologies have been developed to improve image quality, including different CNNs for image recovery based on the residual learning model (DnCNN-S [79], DnCNN-B, IDCNN [80]), the non-locality reinforced network (NN3D) [81], the fast and flexible network (FFDNet) [82], the deep shrinkage CNN (SCNN), and the denoising prior driven network (PDNN) [83].

For line structured light systems specifically, Fang [84] proposed a method of image denoising using a convolutional autoencoder. The laser images are divided into three types of noise and three types of environment to make it easier for the autoencoder to train on the images. The autoencoder has two parts: an input encoder based on hidden layers and an output decoder. The encoder function is trained on the LSIP data set, and AdaDelta is used as the gradient descent optimizer in the training of the CAE. Experiments showed that this method obtains good visual and quantitative performance in the denoising task.

It can be seen that the deep learning method has a wide range of applications in the field of image denoising, including biomedical images, remote sensing images, fuzzy images, hyperspectral images, and data security and encryption images. Gaussian noise generated in the process of image acquisition and transmission, such as sensor noise caused by low light, high temperature and electronic circuit noise, can be eliminated by deep learning methods. The deep learning denoising methods represented by DnCNN and NN3D can also be used for image dehazing [85], which has many applications in traffic surveillance. Similarly, as the camera produces mixed noise due to temperature fluctuations, transmission failures or poor ambient conditions, there are other suitable models [86] that can remove these noises. In structured light systems, these problems also exist, such as camera noise, too bright or too dark ambient light sources, reflections on metal surfaces, and light stripes blocked by other objects. For the line structured light system, an image of good quality is important for both calibration and imaging. If the problems above cannot be dealt with properly, the application will be severely limited, and the calibration accuracy and imaging quality will be compromised. However, most papers focus on how to perform image denoising better by adding complex algorithms, while few papers focus on how to eliminate the noise in line structured light systems by deep learning. Therefore, image denoising based on deep learning is of great significance for online structured light systems. The use of these methods to obtain good post-processed images in complex environments or under strong noise is promising for future applications.
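To make the residual-learning idea behind DnCNN [79] concrete, the following is a deliberately tiny PyTorch sketch: the network predicts the noise map, which is subtracted from the input. The depth and width are far smaller than in the published networks, and the training loop (pairs of noisy/clean stripe images, L2 loss) is omitted.

```python
import torch
import torch.nn as nn

class TinyDnCNN(nn.Module):
    """Minimal DnCNN-style residual denoiser: output = input - predicted noise."""
    def __init__(self, channels=1, features=32, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1),
                       nn.BatchNorm2d(features), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x - self.body(x)   # residual learning: clean = noisy - noise

# Usage on a (N, 1, H, W) float tensor of noisy stripe images:
# denoised = TinyDnCNN()(noisy_stripe_image)
```

Learning the residual rather than the clean image is what makes such networks effective at stripping sensor noise while leaving the narrow, high-contrast laser stripe intact for the downstream centerline extraction.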
its calibration accuracy is higher than the other two methods above.
In this paper, various methods for calibrating line structured light sensors and extracting laser stripe centerlines are reviewed. Basic principles of the different calibration methods are explained. Major methods include the normal vector (vector cross product), cross ratio invariance, the linear equation solution, vanishing points and lines, and the Plücker matrix. Some other methods exist as well, but they lack good reproducibility owing to complex formulations or precise tailor-made targets. Methods based on the normal vector are simple in principle and place no strict requirements on the targets, while the calibration results generally remain accurate. The method of solving linear equations obtains the correspondence between reference points in the world coordinate system and the camera coordinate system through a transformation matrix, which is then used to solve the light plane; however, the calibration accuracy is limited by the small number of reference points. The cross ratio invariance method applies the cross ratio invariance principle to calculate the coordinates of the intersection points between the laser stripe and the lines through the reference points on the target; it likewise suffers from insufficient calibration accuracy because of the small number of control points. For the method of vanishing points and lines, the normal vector of the light plane is obtained by calculating the vanishing line equation and combining it with the intrinsic parameters of the camera. Because this method avoids the requirement of control points, its calibration accuracy is higher than that of the two methods above; however, it relies on traditional stereovision concepts, which are difficult for non-professionals to comprehend. Similarly, since calibration based on the Plücker matrix does not need standard points on the light plane, it avoids the positioning error that such points introduce through multiple spatial coordinate transformations, so its calibration accuracy is also high; but it is likewise not easy for non-professionals to comprehend and operate. The comparison of these methods is summarized in Table 8.

Table 8
Comparison of typical line structured light calibration methods.

Method | Calibration accuracy | Operative difficulty | Professional knowledge required
Vector cross product | low | easy | basic geometry
Solution of linear equations | medium | easy | basic geometry
Cross ratio invariance | medium | easy | simple stereo knowledge
Vanishing points and lines | high | normal | vanishing points and lines related to perspective projection
Plücker matrix | high | normal | Plücker matrix and projective geometry
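As a simple illustration of the vector cross product principle listed first in Table 8, the sketch below recovers the light plane from two non-parallel stripe direction vectors lying in the plane; the sample vectors are arbitrary placeholders.

```python
import numpy as np

def light_plane_from_stripes(p0, d1, d2):
    """Light plane via the vector cross product principle.

    p0: a 3-D point on the laser plane; d1, d2: two non-parallel stripe
    direction vectors lying in the plane. Returns (n, d) for the plane
    n . x + d = 0, with n normalized.
    """
    n = np.cross(d1, d2)                  # normal is orthogonal to both stripes
    n = n / np.linalg.norm(n)
    return n, -float(np.dot(n, p0))

# arbitrary example: stripe directions observed on two target poses
n, d = light_plane_from_stripes(np.array([0.0, 0.0, 1.0]),
                                np.array([1.0, 0.0, 0.2]),
                                np.array([0.0, 1.0, 0.1]))
```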
In the aspect of laser stripe centerline extraction, most algorithms are based on traditional image processing: the laser stripe is processed by filtering and morphological methods. However, there are some limitations. Only the image noise caused by electronic equipment can be reduced, while noise arising from object occlusion and specular reflection is difficult to remove. In addition, extraction errors may occur when the scene is too complicated, which restricts the application of these algorithms in complex environments.
In view of the analysis above, we believe that the current major methods for line structured light calibration fit the requirements of future imaging well. However, since the calibration errors are mostly due to image processing, denoising the laser image can significantly reduce the calibration errors. As line structured light is mainly applied in demanding areas such as forging measurement and welding seam tracking, traditional image processing techniques can hardly meet the precision requirements. In recent years, a number of neural-network-based denoising methods have emerged in traffic and other fields, but few works have applied these methods to line structured light systems. The application of existing image denoising algorithms, especially those based on neural networks, together with improved denoising methods, shall play an important role in line structured light imaging systems.
CRediT authorship contribution statement

Xiaobin Xu: Investigation, Writing - review & editing. Zhongwen Fei: Investigation, Writing - original draft. Jian Yang: Writing - review & editing. Zhiying Tan: Investigation. Minzhou Luo: Investigation.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This research was funded by the Fundamental Research Funds for the Central Universities (Grant No. B200202221) and Jiangsu Key R&D Program (Grant Nos. BE2018004-1, BE2018004).

References

[1] Wei C, Sihai C, Dong L, et al. A compact two-dimensional laser scanner based on piezoelectric actuators. Rev Sci Instrum 2015;86(1):013102.
[2] Zhao H, Kruth JP, Van Gestel N, et al. Automated dimensional inspection planning using the combination of laser scanner and tactile probe. Measurement 2012;45(5):1057–66.
[3] Rahayem M, Werghi N, Kjellander J. Best ellipse and cylinder parameters estimation from laser profile scan sections. Opt Lasers Eng 2012;50(9):1242–59.
[4] Zhang Z. A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell 2000;22(11):1330–4.
[5] Abdel-Aziz YI, Karara HM. Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry. Photogramm Eng Remote Sens 2015;81(2):103–7.
[6] Hong Y, Ren G, Liu E. Non-iterative method for camera calibration. Opt Express 2015;23(18):23992–4003.
[7] Niu Z, Liu K, Wang Y, et al. Calibration method for the relative orientation between the rotation axis and a camera using constrained global optimization. Meas Sci Technol 2017;28(5):055001–11.
[8] Zheng F, Kong B. Calibration of linear structured light system by planar checkerboard. In: Proceedings of the International Conference on Information Acquisition (ICIA), Hefei; 2004. p. 344–6.
[9] Fan J, Jing F, Fang Z, et al. A simple calibration method of structured light plane parameters for welding robots. In: 2016 35th Chinese Control Conference (CCC). IEEE; 2016. p. 6127–32.
[10] McIvor AM. Calibration of a laser stripe profiler. In: Second International Conference on 3-D Digital Imaging and Modeling (Cat. No. PR00062). IEEE; 1999. p. 92–8.
[11] Che C, Ni J. A ball-target-based extrinsic calibration technique for high-accuracy 3-D metrology using off-the-shelf laser-stripe sensors. Precis Eng 2000;24(3):210–9.
[12] Xie Z, Wang X, Chi S. Simultaneous calibration of the intrinsic and extrinsic parameters of structured-light sensors. Opt Lasers Eng 2014;58:9–18.
[13] Chu CW, Hwang S, Jung SK. Calibration-free approach to 3D reconstruction using light stripe projections on a cube frame. In: Proceedings Third International Conference on 3-D Digital Imaging and Modeling. IEEE; 2001. p. 13–9.
[14] Wei Z, Xie M, Zhang G. Calibration method for line structured light vision sensor based on vanish points and lines. In: 2010 20th International Conference on Pattern Recognition; 2010. p. 794–7.
[15] Wei Z, Shao M, Zhang G, et al. Parallel-based calibration method for line-structured light vision sensor. Opt Eng 2014;53(3):033101.
[16] Xu G, Zhang X, Su J, et al. Solution approach of a laser plane based on Plücker matrices of the projective lines on a flexible 2D target. Appl Opt 2016;55(10):2653–6.
[17] Xu G, Zheng A, Li X, et al. Optimization solution of laser plane generated from maximum likelihood estimation of projection plane. Sens Mater 2018;30(5):1155–64.
[18] Steger C. An unbiased detector of curvilinear structures. IEEE Trans Pattern Anal Mach Intell 1998;20(2):113–25.
[19] Izquierdo MAG, Sanchez MT, Ibanez A, et al. Sub-pixel measurement of 3D surfaces by laser scanning. Sens Actuators A 1999;76(1–3):1–8.
[20] Qi Y, Jing F, Tan M. Line-feature-based calibration method of structured light plane parameters for robot hand-eye system. Opt Eng 2013;52(3):037202.
[21] Bi D, Lu X. A new flexible approach for single laser stripe profiler calibration. In: 2008 International Conference on Information and Automation; 2008. p. 76–80.
[22] Kiddee P, Fang Z, Tan M. A practical and intuitive calibration technique for cross-line structured light. Optik 2016;127(20):9582–602.
[23] Zexiao X, Chengguo Z, Qiumei Z. A simplified method for the extrinsic calibration of structured-light sensors using a single-ball target. Int J Mach Tools Manuf 2004;44(11):1197–203.
[24] Zexiao X, Qiumei Z, Guoxiong Z. Modeling and calibration of a structured-light-sensor-based five-axis scanning system. Measurement 2004;36(2):185–94.
[25] Hui-yuan X, You X, Zhi-jian Z. Accurate extrinsic calibration method of a line structured-light sensor based on a standard ball. IET Image Proc 2011;5(5):369–74.
[26] Zexiao X, Jianguo W, Qiumei Z. Complete 3D measurement in reverse engineering using a multi-probe system. Int J Mach Tools Manuf 2005;45(12–13):1474–86.
[27] Sun Q, Hou Y, Tan Q, et al. A flexible calibration method using the planar target with a square pattern for line structured light vision system. PLoS ONE 2014;9(9).
[28] Walch A, Eitzinger C. A combined calibration of 2D and 3D sensors: a novel calibration for laser triangulation sensors based on point correspondences. In: 2014 International Conference on Computer Vision Theory and Applications (VISAPP), vol. 1. IEEE; 2014. p. 89–95.
[29] Xu G, Hao Z, Li X, et al. An optimization solution of a laser plane in vision measurement with the distance object between global origin and calibration points. Sci Rep 2015;5(1):1–16.
[30] Xu G, Yuan J, Li X, et al. Optimization reconstruction method of object profile using flexible laser plane and bi-planar references. Sci Rep 2018;8(1):1–11.
[31] Yin L, Wang X, Ni Y. Flexible three-dimensional reconstruction via structured-light-based visual positioning and global optimization. Sensors 2019;19(7):1583–600.
[32] Santolaria J, Pastor JJ, Brosed FJ, et al. A one-step intrinsic and extrinsic calibration method for laser line scanner operation in coordinate measuring machines. Meas Sci Technol 2009;20(4):045107–18.
[33] Qiang Z, Wei W. Calibration of laser scanning system based on a 2D ball plate. Measurement 2009;42(6):963–8.
[34] Usamentiaga R, Molleda J, Garcia DF. Structured-light sensor using two laser stripes for 3D reconstruction without vibrations. Sensors 2014;14(11):20041–63.
[35] Tran TT, Ha C. Extrinsic calibration of a camera and structured multi-line light using a rectangle. Int J Precis Eng Manuf 2018;19(2):195–202.
[36] Huang Y, Li X, Chen P. Calibration method for line-structured light multi-vision sensor based on combined target. EURASIP J Wireless Commun Networking 2013;2013(1):92–8.
[37] Zhang M, Li D. An on-site calibration technique for line structured light 3D scanner. In: 2009 Asia-Pacific Conference on Computational Intelligence and Industrial Applications (PACIIA), vol. 2. IEEE; 2009. p. 30–3.
[38] Stocher W, Biegelbauer G. Automated simultaneous calibration of a multi-view laser stripe profiler. In: Proceedings of the 2005 IEEE International Conference on Robotics and Automation. IEEE; 2005. p. 4424–9.
[39] Wei Z, Cao L, Zhang G. A novel 1D target-based calibration method with unknown orientation for structured light vision sensor. Opt Laser Technol 2010;42(4):570–4.
[40] Zhou F, Zhang G. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations. Image Vis Comput 2005;23(1):59–67.
[41] Wu X, Li AG, Wu DF, et al. Calibration of line structured light sensor for robotic inspection system. Appl Mech Mater 2011;44:702–6.
[42] Wei J, Liu Z, Cheng F. Vision sensor calibration method based on flexible 3D target and invariance of cross ratio. In: First International Conference on Information Sciences, Machinery, Materials and Energy. Atlantis Press; 2015. p. 1478–84.
[43] Chen R, Li X, Wang X, et al. A planar-pattern-based calibration method for high-precision structured laser triangulation measurement. In: Optical Metrology and Inspection for Industrial Applications VI, vol. 11189. International Society for Optics and Photonics; 2019. p. 1118914.
[44] Zexiao X, Weitong Z, Zhiwei Z, et al. A novel approach for the field calibration of line structured-light sensors. Measurement 2010;43(2):190–6.
[45] Santolaria J, Aguilar JJ, Guillomía D, et al. A crenellated-target-based calibration method for laser triangulation sensors integration in articulated measurement arms. Rob Comput Integr Manuf 2011;27(2):282–91.
[46] Lopes F, Silva H, Almeida JM, et al. Structured light system calibration for perception in underwater tanks. Cham: Springer; 2015. p. 111–20.
[47] Pan X, Liu Z. High-accuracy calibration of line-structured light vision sensor by correction of image deviation. Opt Express 2019;27(4):4364–85.
[48] Huynh DQ, Owens RA, Hartmann PE. Calibrating a structured light stripe system: a novel approach. Int J Comput Vision 1999;33(1):73–86.
[49] Liu Y, Wu D, Li W, et al. A novel calibration method based on vanishing points and lines for linear structured light vision sensor. DEStech Trans Comput Sci Eng 2018 (AMMS).
[50] Zexiao X, Ruixin Z, Anqi Z. Extrinsic parameters calibration of three-dimensional measurement system for ultra-large scale line-structured light sensor. Chin J Lasers 2018;44(10):1004003.
[51] Shao M, Dong J, Madessa AH. A new calibration method for line-structured light vision sensors based on concentric circle feature. J Eur Opt Soc-Rapid Publ 2019;15(1):1–11.
[52] Yang P, Xu B, Wu L. Rapid calibration for line structured light vision sensors. Optoelectron Lett 2006;2(3):175–8.
[53] Wei Z, Li C, Ding B. Line structured light vision sensor calibration using parallel straight lines features. Optik 2014;125(17):4990–7.
[54] Chen T, Sun L, Zhang Q, et al. Field geometric calibration method for line structured light sensor using single circular target. Sci Program 2017;2017:1526706.
[55] Zhang G, Liu Z, Sun J, et al. Novel calibration method for a multi-sensor visual measurement system based on structured light. Opt Eng 2010;49(4):043602.
[56] Niola V, Rossi C, Savino S, et al. A method for the calibration of a 3-D laser scanner. Rob Comput Integr Manuf 2011;27(2):479–84.
[57] Xu J, Douet J, Zhao J, et al. A simple calibration method for structured light-based 3D profile measurement. Opt Laser Technol 2013;48:187–93.
[58] Xu G, Sun L, Li X, et al. Global calibration and equation reconstruction methods of a three dimensional curve generated from a laser plane in vision measurement. Opt Express 2014;22(18):22043–55.
[59] Xu G, Hao Z, Li X, et al. Calibration method of laser plane equation for vision measurement adopting objective function of uniform horizontal height of feature points. Opt Rev 2016;23(1):33–9.
[60] Luo H, Xu J, Binh NH, et al. A simple calibration procedure for structured light system. Opt Lasers Eng 2014;57:6–12.
[61] Liu Z, Li X, Li F, et al. Calibration method for line-structured light vision sensor based on a single ball target. Opt Lasers Eng 2015;69:20–8.
[62] Liu Z, Li X, Yin Y. On-site calibration of line-structured light vision sensor in complex light environments. Opt Express 2015;23(23):29896–911.
[63] Wei Z, Shao M, Wang Y, et al. A sphere-based calibration method for line structured light vision sensor. Adv Mech Eng 2013;5:580417.
[64] Sun Q, Chen J, Li C. A robust method to extract a laser stripe centre based on grey level moment. Opt Lasers Eng 2015;67:122–7.
[65] Xu Z, Tao Z. Center detection algorithm and knife plane calibration of flat top line structured light. Acta Photonica Sinica 2017;46(5):512001.
[66] Xu G, Yuan J, Li X, et al. Reconstruction method adopting laser plane generated from RANSAC and three dimensional reference. MAPAN 2018;33(3):307–19.
[67] Wang HF, Wang YF, Zhang JJ, et al. Laser stripe center detection under the condition of uneven scattering metal surface for geometric measurement. IEEE Trans Instrum Meas 2020;69(5):2182–92.
[68] Jang JH, Hong KS. Detection of curvilinear structures and reconstruction of their regions in gray-scale images. Pattern Recogn 2002;35(4):807–24.
[69] Liu L, Yang N, Lan J, et al. Image segmentation based on gray stretch and threshold algorithm. Optik 2015;126(6):626–9.
[70] Usamentiaga R, Molleda J, García DF. Fast and robust laser stripe extraction for 3D reconstruction in industrial environments. Mach Vis Appl 2012;23(1):179–96.
[71] Brandli C, Mantel T, Hutter M, et al. Adaptive pulsed laser line extraction for terrain reconstruction using a dynamic vision sensor. Front Neurosci 2014;7:275–83.
[72] Li Y, Zhou J, Huang F, et al. Sub-pixel extraction of laser stripe center using an improved gray-gravity method. Sensors 2017;17(4):814–26.
[73] Sun Q, Liu R, Yu F. An extraction method of laser stripe centre based on Legendre moment. Optik 2016;127(2):912–5.
[74] He L, Wu S, Wu C. Robust laser stripe extraction for three-dimensional reconstruction based on a cross-structured light sensor. Appl Opt 2017;56(4):823–32.
[75] Du J, Xiong W, Chen W, et al. Robust laser stripe extraction using ridge segmentation and region ranking for 3D reconstruction of reflective and uneven surface. In: 2015 IEEE International Conference on Image Processing (ICIP). IEEE; 2015. p. 4912–6.
[76] Yin XQ, Tao W, Feng YY, et al. Laser stripe extraction method in industrial environments utilizing self-adaptive convolution technique. Appl Opt 2017;56(10):2653–60.
[77] Wang J, Li Y, Jian Z, et al. An accurate method for the extraction of line structured light stripe. J Robot Netw Artif Life 2017;4(1):1–4.
[78] Fasogbon P, Duvieubourg L, Macaire L. Fast laser stripe extraction for 3D metallic object measurement. In: IECON 2016 - 42nd Annual Conference of the IEEE Industrial Electronics Society. IEEE; 2016. p. 923–7.
[79] Zhang K, Zuo W, Chen Y, et al. Beyond a Gaussian denoiser: residual learning of deep CNN for image denoising. IEEE Trans Image Process 2017;26(7):3142–55.
[80] Zhang F, Cai N, Wu J, et al. Image denoising method based on a deep convolution neural network. IET Image Proc 2017;12(4):485–93.
[81] Cruz C, Foi A, Katkovnik V, et al. Nonlocality-reinforced convolutional neural networks for image denoising. IEEE Signal Process Lett 2018;25(8):1216–20.
[82] Zhang K, Zuo W, Zhang L. FFDNet: toward a fast and flexible solution for CNN-based image denoising. IEEE Trans Image Process 2018;27(9):4608–22.
[83] Thakur RS, Yadav RN, Gupta L. State-of-art analysis of image denoising methods using convolutional neural networks. IET Image Proc 2019;13(13):2367–80.
[84] Fang Z, Jia T, Chen Q, et al. Laser stripe image denoising using convolutional autoencoder. Results Phys 2018;11:96–104.
[85] Li B, et al. Benchmarking single-image dehazing and beyond. IEEE Trans Image Process 2019;28(1):492–505.
[86] Islam MT, Rahman SM, Ahmad MO, Swamy MN. Mixed Gaussian-impulse noise reduction from images using convolutional neural network. Signal Process Image Commun 2018:26–41.