Autonomous Seam Acquisition and Tracking System
DOI 10.1007/s00170-013-5034-6
ORIGINAL ARTICLE
Received: 16 July 2012 / Accepted: 29 April 2013 / Published online: 16 May 2013
© Springer-Verlag London 2013
$$
f(i,j) =
\begin{bmatrix}
f(0,0) & f(0,1) & \cdots & f(0,n) \\
f(1,0) & f(1,1) & \cdots & f(1,n) \\
\vdots & \vdots & \ddots & \vdots \\
f(m,0) & f(m,1) & \cdots & f(m,n)
\end{bmatrix}
$$

where f(i, j) represents the gray-level value and i and j represent the spatial coordinates (position). The digitized value of each pixel is represented by one byte: f(i, j) = 0 represents black, f(i, j) = 255 represents white, and the intermediate values of f(i, j) represent the graduated variation from black to white [19]. All images analyzed in this paper are 320×240 pixels, representing an area of 0.917×0.453 mm². To extract the feature point of the joint image with high accuracy and reliability, a new image processing approach is needed. Fig. 4a shows the image of the root joint.
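As an illustration of this byte-per-pixel representation, the sketch below builds a 240×320 gray-level frame and applies a simple fixed threshold (the first stage of the processing pipeline). The random frame and the threshold value 128 are illustrative assumptions, not the paper's actual data or tuned threshold:

```python
import numpy as np

# A grayscale frame f(i, j): each pixel is one byte,
# 0 = black, 255 = white (m = 240 rows, n = 320 columns).
rng = np.random.default_rng(0)
f = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)

# Fixed-threshold binarization; T = 128 is an assumed value
# chosen only for illustration.
T = 128
binary = np.where(f >= T, np.uint8(255), np.uint8(0))
```

Indexing `f[i, j]` then returns the gray-level value at row i, column j, exactly as in the matrix above.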
Fig. 9 Layers of the multi-pass welding (passes 1–13 in a 70° groove on a 20-mm plate)
Fig. 8 Image of extraction of feature point

3.2 Extraction of feature points

In order to define a closed-loop seam tracking system, the feature point of each image must be extracted. In this paper, four feature points have been marked: a top left point, a bottom left point, a top right point, and a bottom right point. The image coordinates and feature points are shown in Fig. 6.

The algorithm for determining the feature points is shown in Fig. 7, where a(j) represents the rows of column j. The extraction algorithm has two parts: in the first part, the top two feature points are extracted; in the second part, the bottom two feature points are extracted.

In root welding, the bottom left point and the bottom right point coincide in this study. The extraction result can be seen in Fig. 8.

3.3 Image processing results

A series of tests were performed to assess the effectiveness of the image processing method. The welding conditions are shown in Table 1. The layers of the multi-pass welding and the welding parameters are presented in Fig. 9 and Table 2. Fig. 10 shows the image processing results for the first, third, and tenth beads: A1, B1, and C1 are the original images of the welding joints; A2, B2, and C2 are the results of thresholding; A3, B3, and C3 are the results of denoising; A4, B4, and C4 are the results of thinning; and A5, B5, and C5 are the results of feature point extraction. The image processing results demonstrate the accuracy of the designed methods. After extracting the feature points of the joint image, the position error information can be obtained from the offset between the current feature point and the start position.

4 Seam tracking controller

The main goal of seam tracking is to keep the torch at the center of the joint. Seam tracking control is a nonlinear system, and a mathematical model for it is difficult to develop [21]. A Fuzzy-P controller, combining a fuzzy controller with a proportional controller, is used to control the welding process: the proportional controller is used in the large offset region, and the fuzzy controller is used in the small offset region. Fig. 11 shows the flow chart of the seam tracking controller.

In the large offset region, the output of the proportional controller is given by

$$u(t) = K_p\, e(t) \qquad (5)$$
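The switching logic of such a Fuzzy-P scheme might be sketched as follows. The gain `K_P`, the region boundary `E_SWITCH`, the triangular membership functions, and the rule table are all illustrative assumptions, not the paper's tuned values:

```python
# Sketch of Fuzzy-P switching: proportional control for large offsets
# (Eq. 5), a small fuzzy rule base near the joint center.
# K_P, E_SWITCH, and FUZZY_RULES are assumed values for illustration.

K_P = 0.8          # proportional gain (assumed)
E_SWITCH = 0.5     # mm; boundary between large and small offset (assumed)

# Toy singleton rule base: (lower_mm, upper_mm, correction_mm)
FUZZY_RULES = [
    (-0.5, -0.2, -0.15),
    (-0.2, -0.05, -0.05),
    (-0.05, 0.05, 0.0),
    (0.05, 0.2, 0.05),
    (0.2, 0.5, 0.15),
]

def triangular(e, lo, hi):
    """Membership of error e in a triangular set peaking at the midpoint."""
    mid = (lo + hi) / 2.0
    if e <= lo or e >= hi:
        return 0.0
    if e <= mid:
        return (e - lo) / (mid - lo)
    return (hi - e) / (hi - mid)

def control(e_mm):
    """Torch correction for a lateral offset e_mm (current - start position)."""
    if abs(e_mm) >= E_SWITCH:
        return K_P * e_mm            # large offset: u = Kp * e (Eq. 5)
    num = den = 0.0                  # small offset: weighted-average defuzzification
    for lo, hi, out in FUZZY_RULES:
        w = triangular(e_mm, lo, hi)
        num += w * out
        den += w
    return num / den if den else 0.0
```

In this sketch a 1.0-mm offset falls in the proportional region and returns 0.8 mm of correction, while offsets inside ±0.5 mm are smoothed by the rule base.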
Fig. 10 Image processing and feature points (panels A1–A5, B1–B5, C1–C5)
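As a rough illustration of the column-scan idea behind the feature-point extraction, where a(j) is the row of the stripe in column j, the sketch below locates the feature points on a thinned laser-stripe image. It is one plausible reading of the approach, not the exact algorithm of Fig. 7; the baseline heuristic and the 1-pixel tolerance are assumptions:

```python
import numpy as np

def extract_feature_points(binary):
    """Locate the joint feature points on a thinned laser-stripe image.

    binary: 2-D uint8 array, 255 on the one-pixel-wide stripe, 0 elsewhere.
    Returns (top_left, top_right, bottom_left, bottom_right) as (row, col);
    the two bottom points coincide for a root-pass V-groove.
    """
    rows, cols = binary.shape
    # a[j]: row index of the stripe in column j (-1 if the column is empty)
    a = np.full(cols, -1, dtype=int)
    for j in range(cols):
        hits = np.flatnonzero(binary[:, j])
        if hits.size:
            a[j] = hits[0]

    valid = np.flatnonzero(a >= 0)
    base = int(np.median(a[valid]))   # row of the flat stripe outside the groove

    # Part 1: top points = first/last columns where the stripe leaves the baseline.
    off_base = valid[np.abs(a[valid] - base) > 1]
    top_left = (int(a[off_base[0]]), int(off_base[0]))
    top_right = (int(a[off_base[-1]]), int(off_base[-1]))

    # Part 2: bottom points = deepest stripe pixel (superposed at the root pass).
    deepest = valid[np.argmax(a[valid])]
    bottom = (int(a[deepest]), int(deepest))
    return top_left, top_right, bottom, bottom
```

Feeding this a synthetic V-shaped stripe returns the two corners where the stripe bends away from the flat plate and the apex of the V.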
5 Seam tracking results

The robot is guided along the teaching line; the laser vision sensor monitors the whole welding process, while the Fuzzy-P controller is used to track the joint. The experimental conditions are shown in Table 1.

Fig. 14 displays a root welding result. In root welding, the CCD camera is ahead of the torch, the groove is regular, and the bottom feature of the laser line lies at the center of the welding joint. In this experiment, the bottom feature point was chosen as the benchmark point.

Fig. 15 displays a filled welding result. The bottom feature point is submerged after root welding, the shape of the groove becomes irregular, and the surface of the welding seam is uneven. The top left or top right feature point can then be used as the benchmark point; in this experiment, the top left point was chosen.

Fig. 16 shows a cosmetic welding result. It is difficult to track the joint if the edge line of the groove has been submerged by the former welding seams. Therefore, in cosmetic welding, one of the edge lines serves as the last welding pass. In this experiment, the 13th welding pass shown in Fig. 9 was welded last, and the top left point was chosen as the benchmark point.

The experimental results show that the proposed seam tracking system is capable of producing a uniform weld bead of acceptable quality. The tracking errors in all the experiments remained within 0.3 mm.

6 Conclusions

The software of the seam tracking system for multi-pass welding was written in Visual C++ 6.0 on the Windows XP operating system. It includes an image gathering module, an image processing module, a feature point extracting module, and a seam tracking module. Multi-thread technology is used to improve the accuracy of the system.

Aiming at the problems of multi-pass welding seam tracking, this paper put forward a practical image processing method that can successfully extract the feature points and obtain the position information. Based on the accurately extracted welding deviation information, a Fuzzy-P controller was used to drive the torch along the joint at a constant velocity.

To validate the system performance, a multi-pass V-grooved joint on flat plates was used. Different trajectories and offsets were tested in root welding, filled welding, and cosmetic welding. The experimental results show that the system can precisely control MAG welding, with a tracking error within 0.3 mm.

Acknowledgments The authors would like to acknowledge the support of the S&T plan project (no. GJJ11499) of the Jiangxi Provincial Education Department of China.

References

1. Kim YB, Kim JG, Jang WT, Park J, Moon HS, Kim JO (2008) Development of automatic welding system for multi-layer and multi-pass welding. Proceedings of the 17th World Congress, The International Federation of Automatic Control, Seoul, Korea, July 6–11, 2008, pp 4290–4291
2. Kang MG, Kim JH, Park YJ, Woo GJ (2007) Laser vision system for automatic seam tracking of stainless steel pipe welding machine. International Conference on Control, Automation and Systems, Seoul, Korea, pp 1046–1051
3. Haug K, Pritschow G (1998) Robust laser-stripe sensor for automated weld seam tracking in the shipbuilding industry. IECON '98, Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society 2:1236–1241
4. Kuo HC, Wu LJ (2002) An image tracking system for welded seams using fuzzy logic. J Mater Process Technol 120:169–185
5. Huang YW, Tung PC, Wu CY (2007) Tuning PID control of an automatic arc welding system using a SMAW process. Int J Adv Manuf Technol 34:56–61
6. Moon HS, Kim YB, Beattie RB (2006) Multi sensor data fusion for improving performance and reliability of fully automatic welding system. Int J Adv Manuf Technol 28:286–293
7. Bae KY, Park JH (2006) A study on development of inductive sensor for automatic weld seam tracking. J Mater Process Technol 176:111–116
8. Bingul Z, Cook GE, Strauss AM (2000) Application of fuzzy logic to spatial thermal control in fusion welding. IEEE Trans Ind Appl 36(6):1523–1530
9. Pritschow G, Mueller G, Horber H (2002) Fast and robust image processing for laser stripe-sensors in arc welding automation. Proc IEEE Int Symp Ind Electron 2:651–656
10. Zhou L, Lin T, Chen SB (2006) Autonomous acquisition of seam coordinate for arc welding robot based on visual servoing. J Intell Robot Syst 47:239–255
11. Chokkalingham S, Chandrasekhar N, Vasudevan M (2011) Predicting the depth of penetration and weld bead width from the infra red thermal image of the weld pool using artificial neural network modeling. J Intell Manuf 385–392
12. Sung K, Lee H, Choi YS (2009) Development of a multiline laser vision sensor for joint tracking in welding. Weld J 88(4):79–85
13. Villán AF, Acevedo RG, Alvarez EA, Lopez AC, Garcia DF, Fernandez RU, Meana MJ (2011) Low-cost system for weld tracking based on artificial vision. Ind Res 47(3):1159–1167
14. Kim JW, Bae HS (2005) A study on a vision sensor system for tracking the I-butt weld joints. J Mech Sci Technol 19(10):1856–1863
15. Fang ZJ, Xu D, Tan M (2010) Visual seam tracking system for butt weld of thin plate. Int J Adv Manuf Technol 49:519–526
16. Xu PQ, Tang XH, Yao S (2008) Application of circular laser vision sensor (CLVS) on welded seam tracking. J Mater Process Technol 205:404–410
17. Chen W (1990) Monitoring joint penetration using infrared sensing techniques. Weld J 69(5):181–185
18. Huang W, Kovacevic R (2012) Development of a real-time laser-based machine vision system to monitor and control welding process. Int J Adv Manuf Technol 63:235–248
19. Nixon MS, Aguado AS (2010) Feature extraction and image processing. Newnes, Oxford, pp 10–11
20. Zhang TY, Suen CY (1984) A fast parallel algorithm for thinning digital patterns. Commun ACM 27(3):236–239
21. Yu JY, Na SJ (1998) A study on vision sensors for seam tracking of height-varying weldment. Part 2: applications. Mechatronics 8:21–36
22. Shi YH, Wang GR (2006) Vision-based seam tracking system for underwater flux cored arc welding. Sci Technol Weld Join 11(3):271–277