Autonomous Seam Acquisition and Tracking System


Int J Adv Manuf Technol (2013) 69:451–460

DOI 10.1007/s00170-013-5034-6

ORIGINAL ARTICLE

Autonomous seam acquisition and tracking system for multi-pass welding based on vision sensor

W. P. Gu · Z. Y. Xiong · W. Wan

Received: 16 July 2012 / Accepted: 29 April 2013 / Published online: 16 May 2013
© Springer-Verlag London 2013

Abstract  Automatic welding technology is a solution to increase welding productivity and improve welding quality, especially in thick plate welding. In order to obtain high-quality multi-pass welds, it is necessary to maintain a stable welding bead in each pass. In multi-pass welding, it is difficult to obtain a stable weld bead by using a traditional teaching-and-playback arc welding robot. To overcome these traditional limitations, an automatic welding tracking system for an arc welding robot is proposed for multi-pass welding. The developed system includes an image acquisition module, an image processing module, a tracking control unit, and their software interfaces. The vision sensor, which includes a CCD camera, is mounted on the welding torch. In order to minimize the inevitable misalignment between the center line of the welding seam and the welding torch for each welding pass, a robust welding image processing algorithm is proposed, which was proved to be suitable for the root pass, the filling passes, and the cap passes. In order to accurately track the welding seam, a Fuzzy-P controller is designed to control the arc welding robot to adjust the torch. Microsoft Visual C++ 6.0 is used to develop the application programs and the user interface. Welding experiments are carried out to verify the validity of the multi-pass welding tracking system.

Keywords  Multi-pass welding · Laser vision sensor · Image processing · Seam tracking

W. P. Gu · Z. Y. Xiong (*) · W. Wan
School of Aeronautical Manufacturing Engineering, Nanchang Hangkong University, Nanchang, China
e-mail: xzyww@nchu.edu.cn
W. P. Gu, e-mail: WangpingGu@gmail.com
W. Wan, e-mail: nhww@163.com

1 Introduction

Welding is a common but important process in the shipbuilding industry. As ultra-large containerships get even bigger and the steel plates used to build them become thicker, thick plate welding becomes much more time consuming and costly. An automatic multi-pass welding system is a solution to increase welding productivity [1–3]. In general, a multi-pass weld consists of three phases (root, fill, and cap), with each phase requiring different process control.

At present, there is extensive development in the field of welding science and technology. Numerous techniques for arc welding process automation have been studied [4–6], and many new intelligent welding robots have been developed. Various sensors have been used, such as inductive sensors, through-the-arc sensors, ultrasonic sensors, and vision sensors [7–9]. Vision sensors attract the most attention due to their non-contact nature, high accuracy, and rich information [10, 11]. The key to the successful implementation of a vision sensor system is to capture stable, high-quality images that are suitable for the analysis of the welding process.

Many researchers, with specific interest in the interferences and features of different welding conditions, have designed seam tracking systems. Sung et al. [12] developed a multiline laser vision sensor for joint tracking in high-speed welding; in experiments at speeds of 10, 15, and 20 m/min, the mean error was 0.3 mm and the maximum error was only 0.6 mm. Villán et al. [13] developed a laser-stripe system used in the automatic welding process in heavy industries; this system presented satisfactory tracking results even when the welding gap geometry varied greatly or when the weld image was distorted by noise. Papers [14, 15] presented a vision system used for tracking I-butt welding joints. In paper [16], a circular laser trajectory was used to detect the weld location and to track the weld seam.

This paper describes a seam tracking system based on a laser visual sensor for multi-pass metal active gas (MAG) welding. A method is developed to extract the feature points in a multi-pass welding image. To make the seam tracking system precise and steady, a Fuzzy-P controller is designed to regulate the welding torch. A series of experiments was performed to test the seam tracking system, which helped to achieve a uniform weld bead with a high level of acceptability.

2 Overall seam tracking system

2.1 Hardware of robotic welding system

The overall seam tracking system (shown in Fig. 1) consists of a welding robot, a control cabinet (with a correction board), an industrial computer (with an image processing card and a D/A card), a laser vision sensor, and a welding machine. A 6-axis UP20 welding robot is used for the trajectory planning during the tests. A welding torch is fixed at the end joint of the robot, while the laser vision sensor is placed ahead of the torch. Welding images acquired by the vision sensor are transferred to the control PC and transformed by the image processing card for further image processing, so that the weld location information can be obtained. The control PC sends commands to the correction board through an AC1343 D/A card, and the control cabinet then regulates the robot to track the center of the seam.

2.2 Vision sensor development

The vision sensor is placed in front of the welding torch. It consists of a charge coupled device (CCD) camera, a stripe laser, and a filter. The setup of the laser vision sensor is illustrated in Fig. 2.

Fig. 2 Diagram of the laser vision sensor

The CCD camera is the most critical component, as it has a direct and significant influence on the quality and speed of image acquisition. A JC-629 industrial camera with a 16-mm focal length was used, together with a 650-nm stripe laser with a 20-mW output. A narrow-band optical filter with its pass band centered at 648.7 nm and a transmissivity of 35 % is placed in front of the focus lens. This laser wavelength was selected based on the characteristics of the MAG welding arc light spectrum of steel [17]. Only the reflected laser light can reach the CCD camera, while the disturbance and noise from ambient light and arc light are filtered out. In order to protect the inside components of the laser vision sensor, a transparent glass cover was put in the front of the metal box.

The vision sensor is installed at a certain angle relative to the surface of the workpiece so that the CCD camera is capable of obtaining high-quality images of the joint during the welding process [18]. A series of tests was performed to determine the optimum position of the vision sensor. The CCD camera was installed at 160 mm from the plane of the workpiece; the stripe laser is mounted at 150 mm and at a 67° angle to the plane of the workpiece.

2.3 Seam tracking software

The software is created in Microsoft Visual C++ 6.0. The block diagram and user interface of the software are shown in Fig. 3. It is multi-thread software, with the joint image capturing, the image processing, and the welding torch motion controlled by separate threads running side by side.
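The three-thread structure described above (capture, processing, and motion control running side by side) can be illustrated with a minimal queue-based pipeline. This is a hypothetical Python sketch, not the authors' Visual C++ code; the frame values and the fixed reference column 160 are made up for illustration.

```python
import queue
import threading

image_q = queue.Queue(maxsize=4)    # raw frames from the vision sensor
offset_q = queue.Queue(maxsize=4)   # seam offsets from image processing

def capture_loop(frames):
    for frame in frames:            # stand-in for the frame-grabber driver
        image_q.put(frame)
    image_q.put(None)               # sentinel: no more frames

def process_loop():
    while True:
        frame = image_q.get()
        if frame is None:
            offset_q.put(None)
            break
        offset_q.put(frame - 160)   # stand-in for feature-point extraction

def control_loop(log):
    while True:
        offset = offset_q.get()
        if offset is None:
            break
        log.append(offset)          # stand-in for the D/A correction command

log = []
threads = [threading.Thread(target=capture_loop, args=([150, 160, 170],)),
           threading.Thread(target=process_loop),
           threading.Thread(target=control_loop, args=(log,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(log)                          # [-10, 0, 10]
```

The queues decouple the stages, so a slow image-processing step does not stall frame capture, mirroring the side-by-side threading described in the text.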
Fig. 1 Weld tracking system setup

3 Image capture and processing

An image can be described by a two-dimensional matrix as follows:
Int J Adv Manuf Technol (2013) 69:451–460 453

Fig. 3 The seam tracking software of multi-pass MAG process: (a) user interface, (b) process of the system

           | f(0,0)  f(0,1)  …  f(0,n) |
           | f(1,0)  f(1,1)  …  f(1,n) |
f(i, j) =  |   ⋮       ⋮     ⋱    ⋮    |
           | f(m,0)  f(m,1)  …  f(m,n) |

where f(i, j) represents the gray level value and i and j represent the space coordinates (position). The digitized value of a pixel is represented by one byte: f(i, j) = 0 represents black, f(i, j) = 255 represents white, and intermediate values of f(i, j) represent the graduated variation from black to white [19]. All images analyzed in this paper are of 320×240 pixels, representing an area of 0.917 × 0.453 mm2. To extract the feature points of the joint image with high accuracy and reliability, a new image processing approach is needed. Fig. 4a shows the image of a root joint.

Fig. 4 Images of processing procedures: (a) original image, (b) median filtering, (c) thresholding, (d) denoising, (e) thinning, (f) curve fitting
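The matrix view above maps directly onto an array in code. A small sketch (using NumPy as a convenience, which the paper itself does not use) of a 320×240 single-byte image:

```python
import numpy as np

# A grayscale joint image as a 2D matrix f(i, j): one byte per pixel,
# 0 = black, 255 = white. A 320x240 frame becomes a 240-row by
# 320-column array (row index i, column index j).
f = np.zeros((240, 320), dtype=np.uint8)   # all black
f[100:140, :] = 255                        # a white horizontal band, e.g. a laser stripe
print(f.shape)                             # (240, 320)
print(int(f[120, 5]))                      # 255: row 120, column 5 lies on the stripe
```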

3.1 Image processing

The images captured by the CCD camera include not only the laser line but also noise from spatter, arc light, or overlapped reflections. Fig. 4 shows an example of joint edge detection using the methods developed in this paper.

Some of the noise in the image is eliminated by a median filter, which is especially effective against salt, pepper, and impulse noise. The sliding window of the median filter is 3×3; the median filtering result is shown in Fig. 4b.

In this study, a uniform threshold algorithm was used to isolate the points of interest. This method can be described as follows:

f1(i, j) = 1 if f(i, j) > T, and 0 otherwise    (1)

where f denotes the gray level value of the original image, f1 is the thresholded image, and T is the threshold. The threshold T is 252, which is set from an image histogram. The thresholding result is shown in Fig. 4c.

Because of the interlaced scanning method of the camera [19] and the fact that spatter passes through the capture region instantaneously, only one of the two fields captures the image track of a spatter particle. Based on this characteristic, the following algorithm is used for denoising:

f2(i, j) = 0 if f1(i−1, j) + f1(i+1, j) = 0, and 1 otherwise    (2)

where f2 denotes the image after denoising.

After the above process, some continuous noise still remains. Since this noise is regular in the horizontal direction and the bandwidth of the laser is less than 12 pixels, any light region whose bandwidth is greater than 16 pixels is removed in order to avoid the influence of reflection and scattering. The denoising result is shown in Fig. 4d.

Thinning is an image processing operation in which binary-valued image regions are reduced to lines that approximate their center lines. The method presented in [20] is adopted here to obtain a skeleton of the laser stripe.

In the first subiteration, the point P1 is deleted if it satisfies the following conditions:

(a) 2 ≤ B(P1) ≤ 6
(b) A(P1) = 1
(c) P2P4P6 = 0
(d) P4P6P8 = 0

where the pattern of Pi (i = 1, 2, …, 9) is shown in Fig. 5, Pi = 0 or 1, A(P1) is the number of 01 patterns in the ordered set P2, P3, …, P8, P9, and B(P1) is the number of nonzero neighbors of P1, that is,

B(P1) = P2 + P3 + … + P9    (3)

Fig. 5 The pattern of Pi

In the second subiteration, the same operations are carried out, except that conditions c and d are changed as follows:

(c') P2P4P8 = 0
(d') P2P6P8 = 0

The thinned result is shown in Fig. 4e.

After denoising, the structure of the stripe laser line may be broken. In order to reduce its influence on feature point extraction, curve fitting is used. The algorithm is given as follows:

yj = yj−1 + [(yk − yj−1) / (k − (j − 1))]    (4)

where [·] denotes rounding, j is a column without a laser point in the thinned image, k is the nearest subsequent column that contains a laser point, and yj is the stripe row in column j. Fig. 4f shows the curve fitting result.

Fig. 6 Feature points: 1 top left point, 2 bottom left point, 3 bottom right point, 4 top right point; a point is denoted (a(j), j)
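The two thinning subiterations and conditions (a)–(d), (c')–(d') can be sketched as follows. This is a minimal Python rendition of the parallel thinning algorithm of [20] written directly from the conditions in the text, not the authors' implementation; P2–P9 are the eight neighbors of P1, ordered clockwise from the pixel above.

```python
import numpy as np

def neighbours(img, i, j):
    # P2, P3, ..., P9, clockwise starting from the pixel above P1 = img[i, j]
    return [img[i-1, j], img[i-1, j+1], img[i, j+1], img[i+1, j+1],
            img[i+1, j], img[i+1, j-1], img[i, j-1], img[i-1, j-1]]

def A(p):   # number of 01 patterns in the ordered (circular) set P2, P3, ..., P9
    return sum((p[k] == 0) and (p[(k + 1) % 8] == 1) for k in range(8))

def B(p):   # number of nonzero neighbours of P1, Eq. (3)
    return sum(p)

def thin_once(img):
    """One pass of both subiterations over a 0/1 image."""
    out = img.copy()
    for first in (True, False):
        to_delete = []
        for i in range(1, out.shape[0] - 1):
            for j in range(1, out.shape[1] - 1):
                if out[i, j] != 1:
                    continue
                p = neighbours(out, i, j)
                P2, P4, P6, P8 = p[0], p[2], p[4], p[6]
                if first:
                    c = P2 * P4 * P6 == 0 and P4 * P6 * P8 == 0   # (c), (d)
                else:
                    c = P2 * P4 * P8 == 0 and P2 * P6 * P8 == 0   # (c'), (d')
                if 2 <= B(p) <= 6 and A(p) == 1 and c:
                    to_delete.append((i, j))
        for i, j in to_delete:      # delete in parallel after scanning
            out[i, j] = 0
    return out

# A 3-pixel-thick horizontal stripe thins toward its centre line.
stripe = np.zeros((7, 12), dtype=np.uint8)
stripe[2:5, 1:11] = 1
thinned = thin_once(stripe)
print(int(thinned.sum()) < int(stripe.sum()))   # True: boundary pixels removed
```

Repeating `thin_once` until the image stops changing yields the one-pixel-wide skeleton used for curve fitting.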

Fig. 7 Flow chart of feature points determination
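The flow chart of Fig. 7 determines the four feature points from the thinned stripe a(j) (the row of the laser line in column j). Since the flow chart itself is not reproduced here, the following Python sketch uses an assumed stand-in corner test: the top two points are taken where the stripe first departs from the flat-plate baseline, and the bottom two where it reaches its deepest rows.

```python
import numpy as np

def extract_feature_points(a, tol=1):
    """Hypothetical two-part extraction: top two points, then bottom two.

    Returns (top_left, bottom_left, bottom_right, top_right),
    each as a (row, column) pair in the (a(j), j) notation of Fig. 6.
    """
    a = np.asarray(a)
    base = int(np.median(a[:5]))            # baseline row on the flat plate
    off = np.where(np.abs(a - base) > tol)[0]
    jl, jr = int(off[0]), int(off[-1])      # columns where the groove begins/ends
    top_left = (int(a[jl - 1]), jl - 1)
    top_right = (int(a[jr + 1]), jr + 1)
    deep = np.where(a == a.max())[0]        # deepest rows = groove bottom
    bottom_left = (int(a[deep[0]]), int(deep[0]))
    bottom_right = (int(a[deep[-1]]), int(deep[-1]))
    return top_left, bottom_left, bottom_right, top_right

# A synthetic V-groove profile: flat at row 100, dipping to row 140.
a = ([100] * 10 + list(range(102, 142, 2)) + [140] * 3
     + list(range(140, 100, -2)) + [100] * 10)
pts = extract_feature_points(a)
print(pts)   # four (row, column) points: top left, bottom left, bottom right, top right
```

In root welding the two bottom points coincide (Sect. 3.2); with a deeper, narrower synthetic profile the sketch would return the same (row, column) pair for both.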



Fig. 8 Image of extraction of feature points

Fig. 9 Schematic of welding sequence: 13 passes in five layers in a 70° groove on a 20-mm plate

3.2 Extraction of feature points

In order to define a closed-loop seam tracking system, the feature point of each image must be extracted. In this paper, four feature points are marked: a top left point, a bottom left point, a top right point, and a bottom right point. The image coordinates and feature points are shown in Fig. 6.

The algorithm for determining the feature points is shown in Fig. 7, where a(j) represents the stripe row of column j. The extraction algorithm has two parts: in the first part, the top two feature points are extracted, and in the second part, the bottom two feature points are extracted.

In root welding, the bottom left point and the bottom right point coincide in this study. The extraction result can be seen in Fig. 8.

3.3 Image processing results

A series of tests was performed to assess the effectiveness of the image processing method. The welding conditions are shown in Table 1; the layers of the multi-pass weld and the welding parameters are presented in Fig. 9 and Table 2. Fig. 10 shows the image processing results for the first, third, and tenth beads. In Fig. 10, A1, B1, and C1 are the original images of the welding joint; A2, B2, and C2 are the results of thresholding; A3, B3, and C3 are the results of denoising; A4, B4, and C4 are the results of thinning; and A5, B5, and C5 are the results of feature point extraction. The image processing results show the accuracy of the designed methods. After extracting the feature points of the joint image, the position error information can be obtained from the offset between the current feature point and the start position.

4 Seam tracking controller

The main goal of seam tracking is to keep the torch at the center of the joint. Seam tracking control is a nonlinear system, and a mathematical model for it is difficult to develop [21]. A Fuzzy-P controller, combining a fuzzy controller with a proportional controller, is utilized to control the welding process. The proportional controller is used in the large offset region and the fuzzy controller is used in the small offset region. Fig. 11 shows the flow chart of the seam tracking controller.

In the large offset region, the output of the proportional controller is given by

u(t) = Kp · e(t)    (5)

Table 1 Experiment conditions of MAG welding

Parameter               Value
Welding speed           5 mm/s
Gas-flow rate (CO2)     15 L/min
Welding wire type       H08Mn2SiA
Welding wire diameter   Φ1.2 mm

Table 2 Welding passes and voltage of each layer in multi-pass MAG

Layer    Pass             Welding current (A)   Welding voltage (V)
First    1                128                   20
Second   2, 3             128                   20
Third    4, 5, 6          128                   20
Fourth   7, 8, 9          143                   23
Fifth    10, 11, 12, 13   128                   20
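The position-error computation at the end of Sect. 3.3 (the offset between the current benchmark feature point and its start position) can be sketched as follows. The point names and the pass-to-benchmark mapping follow the experiments of Sect. 5; the helper itself is a hypothetical sketch, not the authors' code, and the coordinate values are made up.

```python
# Which feature point serves as the benchmark for each pass type (Sect. 5):
# the bottom point for the root pass, a top point once the bottom is submerged.
BENCHMARK = {"root": "bottom", "fill": "top_left", "cap": "top_left"}

def seam_offset(current_points, start_points, pass_type):
    """Lateral offset (in pixel columns) of the benchmark feature point."""
    key = BENCHMARK[pass_type]
    return current_points[key][1] - start_points[key][1]   # compare column j

# (row, column) positions at the start of the pass and in the current frame.
start = {"bottom": (180, 160), "top_left": (120, 130), "top_right": (120, 190)}
now = {"bottom": (180, 166), "top_left": (120, 127), "top_right": (120, 187)}

print(seam_offset(now, start, "root"))   # 6: seam drifted 6 columns right
print(seam_offset(now, start, "fill"))   # -3
```

This signed offset is the error e fed to the seam tracking controller of Sect. 4.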

Fig. 10 Image processing and feature points

where e represents the error and Kp is a proportionality factor. The error information is computed as the difference between the reference image feature R and the feedback image feature F as follows:

e(t) = F(t) − R(t)    (6)

The proportionality factor Kp is determined by a variety of experiments; the experimental results are shown in Fig. 12. As the figure shows, when Kp is smaller, the response time is longer, and if Kp is too big, the system overshoots. Therefore, in this paper, Kp is set to 3.6.

The fuzzy controller includes a fuzzifier, a fuzzy inference engine, a defuzzifier, and a knowledge base. The structure of the developed fuzzy controller is shown in Fig. 13, where yr is the input, u is the output, e represents the error, and ec is the change rate of the error.

In this study, the fuzzy controller generates a signal u to drive the welding robot; e is the output of the seam recognition, and ec is the difference between the current error ei and the error of the last sampling cycle ei−1, i.e., ec = ei − ei−1 [22]. The basic domain of the fuzzy values of the error E and the error change rate EC is between −6 and +6, and that of the control value U is between −12 and +12.

Fig. 11 Flow chart of seam tracking controller
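The region-switching structure of the Fuzzy-P controller can be sketched as follows. The switching threshold and the rule surface used here (a clipped sum of the quantized error and error rate) are placeholders, since the paper does not list its rule base; only Kp = 3.6, Eq. (5), and the domains of E, EC, and U come from the text.

```python
KP = 3.6          # proportionality factor determined experimentally (Fig. 12)
SWITCH = 6.0      # assumed |e| threshold between the two regions

def quantize(x, lo=-6, hi=6):
    """Map a value onto the fuzzy domain [-6, +6] used for E and EC."""
    return max(lo, min(hi, round(x)))

def fuzzy_u(e, ec):
    """Placeholder rule surface: output U confined to [-12, +12]."""
    u = quantize(e) + quantize(ec)
    return max(-12, min(12, u))

def fuzzy_p(e, e_prev):
    ec = e - e_prev                    # ec = e_i - e_{i-1}
    if abs(e) > SWITCH:
        return KP * e                  # proportional control, Eq. (5)
    return fuzzy_u(e, ec)              # fuzzy control near the seam centre

print(fuzzy_p(10.0, 9.0))   # 36.0: large offset, proportional region
print(fuzzy_p(2.0, 3.0))    # 1: small offset, fuzzy region (E = 2, EC = -1)
```

A real implementation would replace `fuzzy_u` with membership functions and the knowledge base of Fig. 13; the sketch only shows how the two regions hand over.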

Fig. 12 Seam tracking results for different proportionality factors: Kp = 2.6, 3.6, and 4.6

Fig. 13 Structure of fuzzy control

Fig. 14 Result of seam tracking during root welding: (a) tracking result, (b) tracking trajectory

Fig. 15 Result of seam tracking during filled welding: (a) tracking result, (b) tracking trajectory

Fig. 16 Result of seam tracking during cosmetic welding: (a) tracking result, (b) tracking trajectory

5 Seam tracking results

The robot is guided along the teaching line; the laser vision sensor is used to monitor the whole welding process, while the Fuzzy-P controller is adopted to track the joint. The experimental conditions are shown in Table 1.

Fig. 14 displays a root welding result. In root welding, the CCD camera runs ahead of the torch, the groove is regular, and the bottom feature of the laser line is at the center of the welding joint. In this experiment, the bottom feature point was chosen as the benchmark point.

Fig. 15 displays a filled welding result. The bottom feature point is submerged after root welding, the shape of the groove becomes irregular, and the surface of the welding seam is uneven. The top left or the top right feature point can be used as the benchmark point; in this experiment, the top left point was chosen.

Fig. 16 shows a cosmetic welding result. It is difficult to track the joint if the edge line of the groove has been submerged by the former welding seams. Therefore, in cosmetic welding, one of the remaining edge lines serves as the reference for the last welding pass. In this experiment, the 13th welding pass shown in Fig. 9 was welded last, and the top left point was chosen as the benchmark point.

The experimental results show that the proposed seam tracking system is capable of producing a uniform weld bead at a high level of acceptable quality. The tracking errors in all the experiments remained within 0.3 mm.

6 Conclusions

The software of the seam tracking system for multi-pass welding is written in Visual C++ 6.0 on the Windows XP operating system. It includes an image gathering module, an image processing module, a feature point extraction module, and a seam tracking module. Multi-thread technology is used to improve the accuracy of this system.

Aiming at the problems of multi-pass welding seam tracking, this paper puts forward a practical image processing method that can successfully extract the feature points and obtain the position information. Based on the accurate extraction of the welding deviation information, a Fuzzy-P controller was used to control the torch to move along the joint with constant velocity.

To validate the system performance, a V-grooved multi-pass joint on flat plates was used. Different trajectories and offsets in root welding, filled welding, and cosmetic welding were tested. The experimental results have shown that the system can precisely control MAG welding, and the tracking error is within 0.3 mm.

Acknowledgments The authors would like to acknowledge the support of the S&T plan project (no. GJJ11499) of the Jiangxi Provincial Education Department of China.

References

1. Kim YB, Kim JG, Jang WT, Park J, Moon HS, Kim JO (2008) Development of automatic welding system for multi-layer and multi-pass welding. Proceedings of the 17th World Congress of the International Federation of Automatic Control, Seoul, Korea, July 6–11, 2008, pp 4290–4291
2. Kang MG, Kim JH, Park YJ, Woo GJ (2007) Laser vision system for automatic seam tracking of stainless steel pipe welding machine. International Conference on Control, Automation and Systems, Seoul, Korea, pp 1046–1051
3. Haug K, Pritschow G (1998) Robust laser-stripe sensor for automated weld seam tracking in the shipbuilding industry. Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society (IECON '98) 2:1236–1241

4. Kuo HC, Wu LJ (2002) An image tracking system for welded seams using fuzzy logic. J Mater Process Technol 120:169–185
5. Huang YW, Tung PC, Wu CY (2007) Tuning PID control of an automatic arc welding system using a SMAW process. Int J Adv Manuf Technol 34:56–61
6. Moon HS, Kim YB, Beattie RB (2006) Multi sensor data fusion for improving performance and reliability of fully automatic welding system. Int J Adv Manuf Technol 28:286–293
7. Bae KY, Park JH (2006) A study on development of inductive sensor for automatic weld seam tracking. J Mater Process Technol 176:111–116
8. Bingul Z, Cook GE, Strauss AM (2000) Application of fuzzy logic to spatial thermal control in fusion welding. IEEE Trans Ind Appl 36(6):1523–1530
9. Pritschow G, Mueller G, Horber H (2002) Fast and robust image processing for laser stripe-sensors in arc welding automation. Proc IEEE Int Symp Ind Electron 2:651–656
10. Zhou L, Lin T, Chen SB (2006) Autonomous acquisition of seam coordinate for arc welding robot based on visual servoing. J Intell Robot Syst 47:239–255
11. Chokkalingham S, Chandrasekhar N, Vasudevan M (2011) Predicting the depth of penetration and weld bead width from the infra red thermal image of the weld pool using artificial neural network modeling. J Intell Manuf 385–392
12. Sung K, Lee H, Choi YS (2009) Development of a multiline laser vision sensor for joint tracking in welding. Weld J 88(4):79–85
13. Villán AF, Acevedo RG, Alvarez EA, Lopez AC, Garcia DF, Fernandez RU, Meana MJ (2011) Low-cost system for weld tracking based on artificial vision. Ind Res 47(3):1159–1167
14. Kim JW, Bae HS (2005) A study on a vision sensor system for tracking the I-butt weld joints. J Mech Sci Technol 19(10):1856–1863
15. Fang ZJ, Xu D, Tan M (2010) Visual seam tracking system for butt weld of thin plate. Int J Adv Manuf Technol 49:519–526
16. Xu PQ, Tang XH, Yao S (2008) Application of circular laser vision sensor (CLVS) on welded seam tracking. J Mater Process Technol 205:404–410
17. Chen W (1990) Monitoring joint penetration using infrared sensing techniques. Weld J 69(5):181–185
18. Huang W, Kovacevic R (2012) Development of a real-time laser-based machine vision system to monitor and control welding process. Int J Adv Manuf Technol 63:235–248
19. Nixon MS, Aguado AS (2010) Feature extraction and image processing. Newnes, Oxford, pp 10–11
20. Zhang TY, Suen CY (1984) A fast parallel algorithm for thinning digital patterns. Commun ACM 27(3):236–239
21. Yu JY, Na SJ (1998) A study on vision sensors for seam tracking of height-varying weldment. Part 2: applications. Mechatronics 8:21–36
22. Shi YH, Wang GR (2006) Vision-based seam tracking system for underwater flux cored arc welding. Sci Technol Weld Join 11(3):271–277
