
Available online at www.sciencedirect.com

ScienceDirect
Procedia Manufacturing 11 (2017) 132-140

27th International Conference on Flexible Automation and Intelligent Manufacturing, FAIM2017,
27-30 June 2017, Modena, Italy

3D metrology using a collaborative robot with a laser triangulation sensor

Gil Boyé de Sousa*, Adel Olabi, Jorge Palos, Olivier Gibaru
Arts et Métiers ParisTech, 8 Boulevard Louis XIV, 59000 Lille, France

Abstract

Industrial robots are a key element in Smart Manufacturing systems. They can perform many different tasks such as assembly,
pick-and-place, or even 3D metrology operations. In order to perform 3D metrology, the robot is equipped with a 2D laser
triangulation sensor. The accuracy of the measurements made by this system depends on an accurate TCP (Tool Centre
Point) calibration and on the accuracy of the robot. In this paper, a TCP calibration method is applied to a collaborative robot. The
hand-guiding feature of this kind of robot is used to establish a human-robot interaction to obtain the laser sensor TCP using a
calibration sphere. Experimental results are presented to validate the procedure and evaluate the quality of the measurements.

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of the 27th International Conference on Flexible Automation and
Intelligent Manufacturing.

Keywords: TCP calibration; Collaborative robot; Hand-guiding; Human-robot interaction; 2D Laser triangulation sensor; 3D metrology;
Calibration sphere.

* Corresponding author. Tel.: +33 (0)3 20 62 22 10; fax: +33 (0)3 20 62 27 59.


E-mail address: gil.boye-de-sousa@ensam.eu

doi:10.1016/j.promfg.2017.07.211

1. Introduction

Industrial robots have an important role in the automated manufacturing of vehicles, aircraft and even robots
themselves. "Cobots" (collaborative robots) are the new era of industrial robots and their use is increasing in every
type of industry. These robots present interesting properties in the industrial field:

• they do not need safety fences around them like traditional industrial robots;
• they allow teaching robot paths by manual guidance (hand-guiding);
• they allow an operator to interact with the "cobot" to lighten loads and thereby improve the operator's
ergonomic conditions.

In fact, at the beginning of this new generation of robots, they were mainly used as weight compensators [1]. They
are now used in many other applications such as pick-and-place operations, assembly lines and also as a quality
control tool. One of the new applications is to perform measurements as a 3D metrology system. This application
offers great flexibility to the industry. A few years ago, measuring objects with precision required a
dedicated 3D measurement system such as a laser tracker [2], metrology arms or a CMM (Coordinate Measuring
Machine) [3]. These specific measuring tools offer very good results, but they can only be used for metrology
purposes [4].
A collaborative robot can be considered a universal machine which can, for example, in the same robotic cell,
measure objects, pick tools to perform different operations or even take workpieces to assemble [5]. In addition to
this, collaborative robots allow an interaction between the human and the machine. One example of a
collaborative robot is the Kuka LBR iiwa 14 R820. This robot is a seven-joint robot equipped with torque sensors
in each of its joints. This makes it possible not only to detect any contact with the robot body (safety feature), but also to
monitor the forces during an assembly application (process feature). The torque sensors can also be used to measure the
efforts on each joint, allowing online position error correction [6].
One of the most important aspects to consider for this metrology application is the accuracy of robots. In terms of
accuracy, the Kuka iiwa robot can produce a position error of 2.5 mm, which can be a drawback for 3D metrology
applications [6]. A set of factors contributes to this accuracy problem: geometric and nongeometric factors
[7]. The geometric ones concern the geometric parameters, the joint offsets and the TCP definition [8]. The origin
of these deviations is the manufacturing process of the robot, which means that the real geometry of the robot
components does not match the previously designed and projected geometry [9]. In addition to this, gearboxes and
transmissions can produce errors due to their backlash and their manufacturing errors. The nongeometric errors are
mostly dependent on the robot configuration. These errors are caused by the compliance of the robot links and joints,
gearbox backlash, kinematic errors, encoder resolution and thermal effects [10].
TCP calibration is one of the important aspects concerning the improvement of robot accuracy. Classical
calibration methods consist of using a special cylindrical-type pointer to record at least three non-collinear TCP positions
while changing the orientation of the robot, always touching the same single point in the robot workspace [11]. Some work
has been done to obtain the TCP using contactless technologies such as laser sensors. In [12], a methodology is
applied to a standard sphere with the particularity of comparing all the robot positions to a single reference position.
Another work [13] obtains the TCP in an inverse way: the laser sensor is fixed in the robot
workspace, independent from the robot, and the calibration sphere is held by the robot.
In this paper, the TCP identification of a 2D laser triangulation sensor with a collaborative robot is demonstrated.
In the next section, the methodology for obtaining the TCP is described. Section 3 presents experimental
results to validate the method previously described. Finally, in Section 4 the conclusions of this study are
presented.

2. TCP Calibration

To perform the TCP calibration, it is necessary to place a calibration sphere inside the robot workspace. The
robot must be able to move around the calibration sphere to point the 2D laser triangulation sensor at the
sphere. This operation is performed by a human who places the sphere and guides the robot in hand-guiding mode
to verify that it is possible to acquire data from the calibration sphere with the 2D laser sensor installed on the robot
flange (at the end of the robot arm).
Below are the data available at the beginning of the method (inputs) and the data to be obtained
(outputs), i.e. the laser sensor TCP:
Inputs: rotation matrix between the robot base frame and the robot-flange frame, and the associated position vector

• ${}^{b}T_f = \begin{bmatrix} {}^{b}R_f & {}^{b}t_f \\ 0_{1\times 3} & 1 \end{bmatrix}$;

${}^{b}T_f$: 4x4 homogeneous transformation matrix between the robot base frame and the robot-flange frame
${}^{b}R_f$: 3x3 rotation matrix between the robot base frame and the robot-flange frame
${}^{b}t_f$: 3x1 position vector between the robot base frame and the robot-flange frame

• $N$: number of measured profiles (800 points/profile), $i = 1, \dots, N$.

Outputs: rotation matrix ${}^{f}R_s$ and translation vector ${}^{f}t_s$ between the robot-flange frame and the laser sensor frame

• ${}^{f}T_s = \begin{bmatrix} {}^{f}R_s & {}^{f}t_s \\ 0_{1\times 3} & 1 \end{bmatrix}$;

${}^{f}T_s$: 4x4 homogeneous transformation matrix between the robot-flange frame and the laser sensor frame
${}^{f}R_s$: 3x3 rotation matrix between the robot-flange frame and the laser sensor frame
${}^{f}t_s$: 3x1 position vector between the robot-flange frame and the laser sensor frame

Fig. 1. Setup of the experiment and the transformation matrices between the different frames.

Fig. 1 illustrates the setup of the experiment. The collaborative robot is used with the laser sensor, mounted on the
robot flange, and the calibration sphere, as represented in Fig. 1. The different transformation matrices
between the frames can be expressed by the following equation:

$$ {}^{b}P = {}^{b}T_f \; {}^{f}T_s \; {}^{s}P \qquad (1) $$

${}^{b}P$: position of the calibration sphere in the robot base frame
${}^{b}T_f$: 4x4 homogeneous transformation matrix between the robot base frame and the robot-flange frame
${}^{f}T_s$: 4x4 homogeneous transformation matrix between the robot-flange frame and the laser sensor frame
${}^{s}P$: position of the calibration sphere in the laser sensor frame
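As an illustration of equation (1), a minimal numpy sketch is given below; the transform values and the helper `homogeneous` are placeholders for this sketch, not data from the paper:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a 3-vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder poses: base->flange from the robot controller, flange->sensor
# from the calibration described in this paper (values are illustrative).
T_bf = homogeneous(np.eye(3), np.array([500.0, 0.0, 300.0]))  # [mm]
T_fs = homogeneous(np.eye(3), np.array([0.0, 0.0, 120.0]))    # [mm]

# Sphere centre measured in the laser sensor frame, in homogeneous coordinates.
P_s = np.array([2.0, -1.0, 95.0, 1.0])

# Equation (1): the same sphere centre expressed in the robot base frame.
P_b = T_bf @ T_fs @ P_s
print(P_b[:3])
```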

2.1. Calculation of the position of the calibration sphere in the laser sensor frame

Equation (1) expresses the measured position of the centre of the sphere in the robot base frame. This position
of the centre of the sphere corresponds to the product of the homogeneous
transformation matrix between the robot base frame and the robot-flange frame (obtained from the robot controller),
the homogeneous transformation matrix between the robot-flange frame and the laser sensor frame (the unknown
matrix to be calculated), and the position of the calibration sphere in the laser sensor frame, expressed in
homogeneous coordinates:

$$ {}^{s}P = \begin{bmatrix} x_s & y_s & z_s & 1 \end{bmatrix}^{T} \qquad (2) $$
This position of the calibration sphere in the laser sensor frame must be calculated from each one of the 2D
profiles obtained from the laser sensor controller (see Fig. 2(a)). Firstly, each arc obtained from the laser sensor is
fitted by a least squares method to obtain a circle with its centre coordinates and radius. The centre obtained
corresponds to two of the coordinates of ${}^{s}P$: $x_s$ and $z_s$. Secondly, the radius obtained from the circle fit of the profile
is used to calculate the third coordinate ($y_s$) of the sphere position in the laser sensor frame, as shown in
Fig. 2(b). The equation to obtain $y_s$ is:

$$ y_s = \pm \sqrt{R^2 - r^2} \qquad (3) $$

where $R$ is the radius of the calibration sphere (19.05 mm), $r$ is the radius obtained from the circle fit of the laser
line, and the sign $\pm$ is defined by the relative position of the laser line on the calibration sphere.
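The paper does not name the exact circle-fitting formulation; the sketch below, a minimal illustration, uses the algebraic (Kåsa) least squares fit on one profile and then applies equation (3). The function names and the sign-selection flag are assumptions of this sketch:

```python
import numpy as np

R_SPHERE = 19.05  # calibration sphere radius [mm]

def fit_circle(x, z):
    """Algebraic (Kasa) least squares circle fit to one 2D laser profile.
    Solves x^2 + z^2 + a*x + b*z + c = 0 and recovers centre and radius."""
    A = np.column_stack([x, z, np.ones_like(x)])
    rhs = -(x**2 + z**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    xc, zc = -a / 2.0, -b / 2.0
    r = np.sqrt(xc**2 + zc**2 - c)
    return xc, zc, r

def sphere_centre_in_sensor_frame(x, z, line_above_centre=True):
    """Equation (3): the in-plane fit gives x_s and z_s; the out-of-plane
    coordinate y_s follows from the sphere radius, with a sign given by
    the side of the sphere centre on which the laser line lies."""
    xs, zs, r = fit_circle(x, z)
    ys = np.sqrt(max(R_SPHERE**2 - r**2, 0.0))
    return np.array([xs, ys if line_above_centre else -ys, zs])
```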

Fig. 2. (a) Profile of an arc measured by the laser sensor; (b) representation of the method to obtain the third dimension of the sphere centre.

2.2. Calculation of the rotation matrix between the robot-flange frame and the laser sensor frame (${}^{f}R_s$)

The following equations represent the starting point of the methodology presented to obtain the TCP:

$$ {}^{b}P = {}^{b}T_{f,i} \; {}^{f}T_s \; {}^{s}P_i \qquad (4) $$

$$ {}^{b}P = {}^{b}R_{f,i} \, {}^{f}R_s \, {}^{s}P_i + {}^{b}R_{f,i} \, {}^{f}t_s + {}^{b}t_{f,i} \qquad (5) $$

To obtain better results, $N$ profiles were measured. The minimum number of profiles required is two, but some
more profiles are needed to reach an accurate result. In this first step, the robot in hand-guiding mode is free only in
translations: every rotation around the robot-flange is blocked. This constraint simplifies the operations below.

$$ {}^{b}P = {}^{b}T_{f,i} \; {}^{f}T_s \; {}^{s}P_i, \quad \text{with} \;\; {}^{b}R_{f,i} = {}^{b}R_f \;\; \forall i \qquad (6) $$

$$ {}^{b}P = {}^{b}R_f \, {}^{f}R_s \, {}^{s}P_i + {}^{b}R_f \, {}^{f}t_s + {}^{b}t_{f,i} \qquad (7) $$

As the rotations are blocked, there is no variation in ${}^{b}R_{f,i}$, which means that a single value ${}^{b}R_f$ is considered.
In addition to this, the position of the calibration sphere being always the same, it is possible to establish the
following relation between two consecutive measures $i$ and $i+1$:

$$ {}^{b}R_f \, {}^{f}R_s \, {}^{s}P_i + {}^{b}R_f \, {}^{f}t_s + {}^{b}t_{f,i} = {}^{b}R_f \, {}^{f}R_s \, {}^{s}P_{i+1} + {}^{b}R_f \, {}^{f}t_s + {}^{b}t_{f,i+1} \qquad (8) $$

$$ {}^{f}R_s \left( {}^{s}P_i - {}^{s}P_{i+1} \right) = {}^{b}R_f^{T} \left( {}^{b}t_{f,i+1} - {}^{b}t_{f,i} \right) \qquad (9) $$

As can be noticed at this stage, there is no need to know the exact position of the calibration sphere in the
robot base frame. This represents an important advantage: otherwise it would be necessary to measure the position
of the calibration sphere with an external measuring system (like a laser tracker) or to impose precisely the position of
the calibration sphere in the robot base frame.

In this step, we use the difference between two consecutive measures in order to obtain a good result for the TCP.
Making all the differences with respect to a single reference measure can be less precise; for that reason, in our method the
comparison is always between two consecutive measures.

Considering $A_i = {}^{s}P_i - {}^{s}P_{i+1}$ and $B_i = {}^{b}R_f^{T} \left( {}^{b}t_{f,i+1} - {}^{b}t_{f,i} \right)$, there is the following equation system to
solve:

$$ {}^{f}R_s \, A_i = B_i, \quad i = 1, \dots, N-1 \qquad (10) $$

${}^{f}R_s$ can then be calculated using the SVD (Singular Value Decomposition), multiplying the right singular matrix $V$ and the left
singular matrix $U$ of $A B^{T}$, respectively, where $A$ and $B$ stack the vectors $A_i$ and $B_i$ as columns:

$$ {}^{f}R_s = V U^{T}, \quad \text{with} \quad A B^{T} = U \Sigma V^{T} \qquad (11) $$
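A minimal sketch of this SVD step, assuming the difference vectors are stacked as columns of 3xM arrays; the reflection guard is a standard refinement of the orthogonal Procrustes solution that the paper does not spell out:

```python
import numpy as np

def rotation_from_differences(A, B):
    """Solve R @ A ~ B for a rotation R, as in equations (10)-(11).
    A and B are 3xM arrays whose columns are the vectors A_i and B_i."""
    H = A @ B.T                      # covariance of the paired difference vectors
    U, _, Vt = np.linalg.svd(H)      # H = U @ diag(s) @ Vt
    R = Vt.T @ U.T                   # equation (11): R = V U^T
    if np.linalg.det(R) < 0:         # guard against an improper (reflected) solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R
```

Stacking all N-1 consecutive differences makes the problem over-determined, so the SVD solution averages out the measurement noise of individual profiles.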

2.3. Calculation of the position vector between the robot-flange frame and the laser sensor frame (${}^{f}t_s$)

$$ {}^{b}P = {}^{b}T_{f,i} \; {}^{f}T_s \; {}^{s}P_i \qquad (12) $$

$$ {}^{b}P = {}^{b}R_{f,i} \, {}^{f}R_s \, {}^{s}P_i + {}^{b}R_{f,i} \, {}^{f}t_s + {}^{b}t_f \qquad (13) $$

For this second step, the movements of the robot are not the same. The human points the laser sensor at
the calibration sphere. After validating one position, the robot is free only in rotations around the robot-flange
frame; all the translations are blocked during this second step. As a result, all the vectors ${}^{b}t_{f,i}$ are equal to a single ${}^{b}t_f$, so
they cancel out.
Once again, as the calibration sphere still has the same position in the robot base frame, the following relation
comes:

$$ {}^{b}R_{f,i} \, {}^{f}R_s \, {}^{s}P_i + {}^{b}R_{f,i} \, {}^{f}t_s = {}^{b}R_{f,i+1} \, {}^{f}R_s \, {}^{s}P_{i+1} + {}^{b}R_{f,i+1} \, {}^{f}t_s \qquad (14) $$

$$ \left( {}^{b}R_{f,i} - {}^{b}R_{f,i+1} \right) {}^{f}t_s = {}^{b}R_{f,i+1} \, {}^{f}R_s \, {}^{s}P_{i+1} - {}^{b}R_{f,i} \, {}^{f}R_s \, {}^{s}P_i \qquad (15) $$

In this step, we need the rotation matrix ${}^{f}R_s$ calculated before to solve the equation system.
Considering $C_i = {}^{b}R_{f,i} - {}^{b}R_{f,i+1}$ and $D_i = {}^{b}R_{f,i+1} \, {}^{f}R_s \, {}^{s}P_{i+1} - {}^{b}R_{f,i} \, {}^{f}R_s \, {}^{s}P_i$, there is a system of
linear equations ($C_i \, {}^{f}t_s = D_i$) to solve to obtain ${}^{f}t_s$.
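A hedged sketch of this second step, stacking equation (15) for all consecutive pairs and solving the over-determined system by least squares (function and variable names are illustrative):

```python
import numpy as np

def translation_from_rotations(R_bf, P_s, R_fs):
    """Solve the stacked linear system C @ t_fs ~ D built from equation (15).
    R_bf: list of 3x3 base->flange rotations, one per measure.
    P_s: list of sphere centres in the sensor frame (3-vectors).
    R_fs: flange->sensor rotation found in the previous step."""
    C_blocks, D_blocks = [], []
    for i in range(len(R_bf) - 1):
        C_blocks.append(R_bf[i] - R_bf[i + 1])
        D_blocks.append(R_bf[i + 1] @ R_fs @ P_s[i + 1] - R_bf[i] @ R_fs @ P_s[i])
    C = np.vstack(C_blocks)        # shape (3(N-1), 3)
    D = np.concatenate(D_blocks)   # shape (3(N-1),)
    t_fs, *_ = np.linalg.lstsq(C, D, rcond=None)
    return t_fs
```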

3. Validation and experimental results

To validate the methodology described above, some measurements were performed.

For this experimental procedure, the obtained laser sensor TCP was implemented into the robot controller. The
calibration sphere (R = 19.05 mm) was then used as an object to measure, placed randomly in the robot workspace.
With the hand-guiding feature, a human pointed the laser sensor at the sphere to
perform measures. To validate the accuracy of the obtained TCP, two translational paths were performed to measure
the same sphere with two different robot configurations. Two separate data sets were obtained and each one was processed to
obtain a sphere (see Fig. 3). Table 1 compares the coordinates of the two sphere centres
and their radii.
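The sphere fit itself is not detailed in the paper; a minimal algebraic (Kåsa-style) least squares sketch, under that assumption:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least squares sphere fit to an Nx3 point cloud.
    Solves x^2 + y^2 + z^2 + a*x + b*y + c*z + d = 0, then recovers
    the centre (-a/2, -b/2, -c/2) and the radius."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, z, np.ones_like(x)])
    rhs = -(x**2 + y**2 + z**2)
    (a, b, c, d), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    centre = -0.5 * np.array([a, b, c])
    radius = np.sqrt(centre @ centre - d)
    return centre, radius
```

Comparing the centres and radii returned for the two data sets gives deviations of the kind reported in Table 1.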

Table 1. Comparison of the position and radius of two spheres measured with different robot configurations.

Results   Sphere 1 [mm]   Sphere 2 [mm]   Δ [mm]
x         542.3245        542.3889        0.0644
y         -10.5513        -10.5554        0.0041
z         292.9964        292.9401        0.0563
Radius    19.0225         19.0658         0.0433

Fig. 3. Images of the two spheres obtained by the least squares method to validate the methodology (red and blue profiles visible to illustrate the
different robot configurations during data acquisition).

These results demonstrate a good accuracy, independent of the configuration of the robot measuring the sphere.
One of the relevant aspects is the good accuracy in terms of 3D metrology, as the radius obtained is correct within
0.03 mm and the position of the sphere is almost the same within 0.07 mm. These results demonstrate that it is
possible to use a non-calibrated robot to perform 3D metrology with precise geometric results. One of the most
difficult aspects of 3D measurement is the geometry extraction and mathematical representation,
especially if the workpiece to measure is large and has complex shapes. Another important aspect to take
into account is the measurement noise, which can negatively influence the results; to avoid this it is necessary to
make a proper selection of the surface points measured by the laser sensor.
In addition to this, another validation measurement was performed on a tube (see Fig. 4). It consisted in measuring its
external diameter from the acquired data. The measured tube is a stainless-steel pipe with a nominal diameter of 60.3 mm.
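The paper does not describe how the diameter was extracted from the scan; one simple approach, sketched here under the assumption that an approximate cylinder axis direction is known, is to project the measured points onto a plane perpendicular to that axis and reuse the 2D circle fit of Section 2.1:

```python
import numpy as np

def cylinder_diameter(points, axis):
    """Estimate a cylinder diameter from an Nx3 point cloud given an
    approximate axis direction: project onto the plane perpendicular to
    the axis, then fit a circle (Kasa method) to the projected points."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    # Orthonormal basis (u, v) of the plane perpendicular to the axis.
    seed = np.array([1.0, 0.0, 0.0])
    if abs(axis @ seed) > 0.9:   # axis nearly parallel to x: pick another seed
        seed = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, seed)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    p, q = points @ u, points @ v
    A = np.column_stack([p, q, np.ones_like(p)])
    rhs = -(p**2 + q**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt((a / 2.0)**2 + (b / 2.0)**2 - c)
    return 2.0 * r
```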

Fig. 4. Cylinder measured by the collaborative robot and the laser sensor, with rotational paths around the cylinder axis to obtain its diameter.

Table 2. Measured external diameter of the cylinder (nominal diameter 60.3 mm).

Results              External cylinder [mm]
Measured diameter    60.3427
Δ                    0.0427

This result confirms that the robot is able to perform 3D metrology with a 2D laser triangulation sensor. The
relevant aspect is the 0.05 mm error in the cylinder diameter, which is possibly due to the fact that, for this last
measurement, the robot movement was a rotation around the cylinder axis; this implies complex configurations of
the robot in which all the joints are solicited.

4. Conclusion

This paper presents a precise TCP calibration method for a 2D laser triangulation sensor installed on a
collaborative robot. The laser sensor can be fixed directly to the robot flange or coupled to it
with a tool changer. The use of a tool changer increases the flexibility of the robotic cell, allowing it
to perform completely different tasks with the same robot.
The hand-guiding capability of the collaborative robot represents an advantage for teaching the different positions
and orientations of the robot-flange needed to perform the TCP calibration.
Another advantage of this method is that it is not necessary to know the precise position of the calibration sphere
in the robot base frame; the only condition imposed on the calibration sphere is to be inside the robot's workspace.
Experimental results show that with the collaborative robot it is possible to do precise metrology, with 0.05 mm of
error in the sphere radius and also in the cylinder diameter. Those shapes were selected because they are
simple shapes whose radius is well known. To perform 3D metrology on more complex shapes it is necessary to
select the points on the surface taking into account measurement noise and geometry extraction.
The translational paths may introduce less error than the rotational paths due to the precision of the robot, but this
possible negative influence does not change the accuracy of the geometric measurements concerning 3D metrology,
as shown in the previous section.

Acknowledgements

This work was supported in part by the European Union’s Horizon 2020 research and innovation program under
grant agreement No 6888807.

References

[1] A. Cherubini, R. Passama, A. Crosnier, A. Lasnier and P. Fraisse, "Collaborative manufacturing with physical human-robot interaction," Robotics and Computer-Integrated Manufacturing, vol. 40, pp. 1-13, 2016.
[2] A. Nubiola and I. A. Bonev, "Absolute calibration of an ABB IRB 1600 robot using a laser tracker," Robotics and Computer-Integrated Manufacturing, vol. 29, pp. 236-245, 2013.
[3] A. Joubair, M. Slamani and I. A. Bonev, "Kinematic calibration of five-bar planar parallel robot using all working modes," Robotics and Computer-Integrated Manufacturing, vol. 29, pp. 15-25, 2013.
[4] R. Acero, A. Brau, J. Santolaria, M. Pueo and C. Cajal, "Evaluation of the use of a laser tracker and an indexed metrology platform as gauge equipment in articulated arm coordinate measuring machine verification procedures," Procedia Engineering, vol. 132, pp. 740-747, 2015.
[5] H. Gu, Q. Li and J. Li, "Quick robot cell calibration for small part assembly," in International Federation for the Theory of Mechanisms and Machines World Congress, Taipei, Taiwan, 2015.
[6] P. Besset, A. Olabi, R. Béarée and O. Gibaru, "Advanced geometric and non-geometric calibration applied to a collaborative robot," in IEEE International Conference on Power Electronics and Motion Control, Varna, Bulgaria, 2016.
[7] W. Khalil and E. Dombre, Modeling, Identification & Control of Robots, Butterworth Heinemann, 2004.
[8] A. Elatta, L. P. Gen, F. L. Zhi, Y. Daoyuan and L. Fei, "An overview of robot calibration," Information Technology Journal, vol. 3, no. 1, pp. 74-78, 2004.
[9] A. Olabi, M. Damak, R. Béarée, O. Gibaru and S. Leleu, "Improving the accuracy of industrial robots by offline compensation of joints errors," in IEEE International Conference on Industrial Technology, Island of Kos, Greece, 2012.
[10] C. Gong, J. Yuan and J. Ni, "Nongeometric error identification and compensation for robotic system by inverse calibration," International Journal of Machine Tools & Manufacture, vol. 40, pp. 2119-2137, 2000.
[11] F. S. Cheng, "Calibration of robot reference frames for enhanced robot positioning accuracy," in Robot Manipulators, M. Ceccarelli (Ed.), 2008.
[12] S. Yin, Y. Guo, Y. Ren, J. Zhu, S. Yang and S. Ye, "A novel TCF calibration method for robotic visual measurement system," Optik, vol. 125, pp. 6920-6925, 2014.
[13] X. Xu, D. Zhu, H. Zhang, S. Yan and H. Ding, "TCP-based calibration in robot-assisted belt grinding of aero-engine blades using scanner measurements," The International Journal of Advanced Manufacturing Technology, 2016.
