Braided Parameter Measurement Using Faster R-CNN
Abstract: Pitch length and surface braiding angle are two important parameters of braided composite preforms. In this paper, a method based on Faster R-CNN is proposed to measure the two parameters. First, after image acquisition, a fabric image database including initial cropped images, augmented images, and target images is established. Second, the target images are classified into four categories according to their gray-change characteristics. Third, a Faster R-CNN fabric detection model is trained on the fabric image database. Fourth, targets are detected by the trained network, and corners are detected based on the detected targets. Finally, pitch lengths and surface braiding angles are measured based on the detected corners. Experimental results show that the proposed method achieves automatic measurement of the pitch lengths and surface braiding angles of 2D and 3D braided composite preforms with high accuracy.
Keywords: Braided composite preform, Pitch length, Surface braiding angle, Deep learning, Faster R-CNN
Figure 1. Pitch length and surface braiding angle; (a) in ideal scenario, (b) in practical scenario (2D fabric), and (c) in practical scenario (3D fabric).
Braided Parameter Measurement Using Faster R-CNN Fibers and Polymers 2020, Vol.21, No.3 591
Experimental
Materials
The materials include 2D and 3D fabrics. The 2D fabrics are biaxial braided by carbon-fiber (6k, 6,000 filaments), carbon-fiber (3k, 3,000 filaments), glass-fiber and aramid-fiber, separately, which are braided tubes provided by Shuomin Technology Co. Ltd., China (see Figure 2(a)). The 3D fabrics are 3D, four-directional braided by carbon-fiber (12k, 12,000 filaments) and carbon-fiber (24k, 24,000 filaments), separately, supplied by the Institute of Textile Composites of Tianjin Polytechnic University (see Figure 2(b)). The average diameters of the 2D braided carbon-, glass- and aramid-fiber filaments are, respectively, 5.70 µm, 7.91 µm and 11.02 µm, tested by the College of Textiles of Tianjin Polytechnic University using a fiber fineness analyser.

Figure 3. Image acquisition system.

Image Acquisition
The original images are acquired by the image acquisition system displayed in Figure 3. First, the fabric is put on the surface of the measurement platform with a micrometer on one side of the fabric, keeping the surface of the fabric and the micrometer on the same horizontal plane. Second, the CCD camera (Vieworks VA-8MC-M16AO), with a camera lens (Nikkor 28 mm f/2.8D) and circular polarizing filter (CPF) (NiSi CPL 52 mm) attached, is vertically installed above the fabric, and the dome light source (DLS) (OPT-RID180-RGB) is put on the surface of the fabric, making the camera's optical centre pass through the central axis of the DLS light hole and the centre of the fabric surface. Finally, the camera exposure value and the rotation angle of the CPF are adjusted, and a clear image shown on the computer screen is selected.

The image acquisition system can obtain clear and accurate gray-level fabric images, for two reasons. First, the CCD camera is an industrial black-and-white CCD camera with a high resolution of 3296×2472 pixels, which has less distortion and clearer image quality than normal cameras. Second, the DLS can uniformly illuminate the fabric surface [37], and the CPF can filter reflections with certain vibration directions [38,39], which helps reduce reflections on the fabric image surface.

Corner Detection based on Faster R-CNN
The acquired images are then processed by corner detection based on Faster R-CNN.

Faster R-CNN [23] contains the Zeiler and Fergus (ZF) model [40] and VGG-16 model [41] pre-trained with specific image datasets and corresponding annotations, such as the Microsoft COCO dataset [42] and the Pascal VOC dataset [43]. Faster R-CNN is composed of an RPN (Region Proposal Network) module and a Fast R-CNN module. The RPN module, which is a deep fully convolutional network, proposes regions, and the Fast R-CNN module uses the proposed regions. The detailed steps of corner detection based on Faster R-CNN are described as follows.

Step 1: Establish the fabric image database, including initial cropped images, augmented images and target images.

Step 1.1: Crop initial fabric images from the original images acquired in the Image Acquisition Section. The cropped initial fabric images contain eight types of fabrics, as shown in Figure 4; Figure 4(a) to (c) are, respectively, 2D carbon-fiber (6k), carbon-fiber (3k) and carbon-fiber (3k) fabrics with different pitch lengths and surface braiding angles; Figure 4(d) to (e) are, respectively, 2D glass- and aramid-fiber fabrics; and Figure 4(f) to (h) are, respectively, 3D carbon-fiber (12k), carbon-fiber (24k) and carbon-fiber (24k) fabrics with different pitch lengths and surface braiding angles.

Step 1.2: Augment the initial fabric images by image
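The pipeline above ends with measuring the two parameters from detected corners. As a minimal geometric sketch only (not the authors' implementation; the corner coordinates are invented), the surface braiding angle can be taken as half the angle between the two yarn directions meeting at a corner, following the θm = (1/2)θ convention used later in the paper:

```python
import math

def surface_braiding_angle(p, q, r):
    """Half the angle at corner p between directions p->q and p->r, in
    degrees, following the theta_m = (1/2) * theta convention."""
    a1 = math.atan2(q[1] - p[1], q[0] - p[0])
    a2 = math.atan2(r[1] - p[1], r[0] - p[0])
    theta = abs(math.degrees(a1 - a2))
    if theta > 180.0:  # keep the smaller angle between the two yarns
        theta = 360.0 - theta
    return theta / 2.0

# Hypothetical corners: yarns running at +/-45 degrees meet at the origin,
# so the full angle between them is 90 degrees and theta_m is 45 degrees.
angle = surface_braiding_angle((0.0, 0.0), (1.0, 1.0), (1.0, -1.0))
```

The same three-corner geometry applies to any detected corner triple, so the function can be reused across the corner map of a whole image.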
Figure 6. Training sample images and related target images; (a) F2dc1 (6k) with labeled corners and cropped rectangular box, (a-1) target image of (a) (Class one), (a-2) target image of (a) (Class two), (b) F2dc2 (3k), (b-1) target image of (b) (Class one), (b-2) target image of (b) (Class two), (c) F2dc3 (3k), (c-1) target image of (c) (Class one), (c-2) target image of (c) (Class two), (d) F2dg (3k), (d-1) target image of (d) (Class three), (d-2) target image of (d) (Class three), (e) F2da (3k), (e-1) target image of (e) (Class three), (e-2) target image of (e) (Class three), (f) F3dc1 (12k), (f-1) target image of (f) (Class four), (f-2) target image of (f) (Class four), (g) F3dc2 (24k), (g-1) target image of (g) (Class four), (g-2) target image of (g) (Class four), (h) F3dc3 (24k), (h-1) target image of (h) (Class four), and (h-2) target image of (h) (Class four).
Table 1. Number of images in the training, validating, testing and target sets of the fabric image database

Image number    F2dc1  F2dc2  F2dc3  F2dg  F2da  F3dc1  F3dc2  F3dc3
Total set         800    800    800   800   800    700    700    700
Training set    3,660 (all fabrics combined)
Validating set  1,220 (all fabrics combined)
Testing set     1,220 (all fabrics combined)
Target set    388,026
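The training/validating/testing counts in Table 1 correspond to a 60/20/20 partition of the 6,100 cropped images. A plausible sketch of such a split (the file-name pattern is an invented placeholder, not the authors' naming scheme):

```python
# Illustrative 60/20/20 split of the 6,100 fabric images in Table 1.
import random

counts = {"F2dc1": 800, "F2dc2": 800, "F2dc3": 800, "F2dg": 800,
          "F2da": 800, "F3dc1": 700, "F3dc2": 700, "F3dc3": 700}
images = [f"{fabric}_{i:04d}.png" for fabric, n in counts.items()
          for i in range(n)]

random.seed(0)          # fixed seed so the split is reproducible
random.shuffle(images)

n_train = len(images) * 60 // 100   # 3,660 images, as in Table 1
n_val = len(images) * 20 // 100     # 1,220 images
train = images[:n_train]
val = images[n_train:n_train + n_val]
test = images[n_train + n_val:]     # remaining 1,220 images
```

Integer arithmetic is used for the cut points so the set sizes match Table 1 exactly, avoiding floating-point truncation surprises.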
Figure 8. PR curves; (a) PR curve of class 1, (b) PR curve of class 2, (c) PR curve of class 3, and (d) PR curve of class 4.
Figure 9. Partial tested results; (a) tested results of F2dc1 (2D, 6k, carbon-fiber), (b) tested results of F2dc2 (2D, 3k, carbon-fiber), (c) tested results of F2dc3 (2D, 3k, carbon-fiber), (d) tested results of F2dg (2D, glass-fiber), (e) tested results of F2da (2D, aramid-fiber), (f) tested results of F3dc1 (3D, 12k, carbon-fiber), (g) tested results of F3dc2 (3D, 24k, carbon-fiber), and (h) tested results of F3dc3 (3D, 24k, carbon-fiber).
Figure 11. Process of initial corner detection; (a) FCOj, (b) FBM3Dj, (c) Fpcj, (d) FBj, and (e) adjusted corner.
Figure 13. Tested images with labeled angles; (a) image F1 with labeled angles and part corners, (b) image F2 with labeled angles, (c) image F3 with labeled angles, (d) image F4 with labeled angles, (e) image F5 with labeled angles, (f) image F6 with labeled angles, (g) image F7 with labeled angles
Figure 14. Pitch length and surface braiding angle measured by hand; (a) pitch length measured by hand, (b) part of (a), and (c) surface
braiding angle measured by hand.
(24k) and carbon-fiber (24k) composite preforms cropped from the original images acquired in the Image Acquisition Section.

With manual measurements, the pitch length dplm is measured by clicking the relevant points on the image, and the surface braiding angle θm is measured by protractor. As shown in Figure 14(a), for example, the coordinates of corners C1(188, 238) and C2(190, 399) are acquired by clicking points on the image shown on the computer screen, and the pitch length is computed as dplm = dc × √((188−190)² + (238−399)²), where dc is the calibration value obtained in image acquisition. The accuracy of the corner position can reach the sub-pixel level, and the corner points are clicked on a magnified image (see Figure 14(b)) to improve the accuracy of the corner positions. In Figure 14(c), the angle θ measured by protractor is 49.2°, so the surface braiding angle θm is (1/2)θ = (1/2)×49.2° = 24.6°. Each pitch length and surface braiding angle measured by hand is the average of ten measurements. The measured angles are labeled in Figure 13(a)-(h). In Figure 13(a), for example, the measured surface braiding angles are (1/2)θi (i=1, 2, 3, ..., 35), and the measured pitch lengths are dAiAjpl = dc × dAiAj (i=1, 2, ...; j=i+1), where dAiAj is the pixel distance between corners Ai and Aj. Similarly, the pitch length dBiBjpl, pitch lengths in other lines, and pitch lengths in other images can be measured.

From Table 3, the standard deviations of pitch length and surface braiding angle measured by hand are small. It is thus reasonable to use the manual measurement results to benchmark the results of our proposed method.

Figure 15 shows the detected corner maps, pitch length relative errors, and surface braiding angle relative errors, where the relative errors are arranged in ascending order. Table 4 gives the average measurement values and relative errors of pitch length and surface braiding angle, where d and θ denote the values of pitch length and surface braiding angle, respectively. The subscripts ah, adp, and aac denote the average value measured by hand, by our method and by auto-correlation (Thx; see the Corner Detection based on Faster R-CNN Section, Step 4), respectively. eddp, edac and eθdp are relative errors calculated by:

eddp = (dah − dadp)/dah × 100%  (6)

edac = (dah − daac)/dah × 100%  (7)

eθdp = (θah − θadp)/θah × 100%  (8)
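The manual pitch-length computation and the relative errors of equations (6)-(8) can be sketched as follows. The corner coordinates are the Figure 14(a) example values; the calibration value dc = 1.0 is an assumed placeholder (so the result is in pixels, not mm), and the helper names are invented for illustration:

```python
import math

def pitch_length(c1, c2, d_c):
    """Pitch length from two clicked corner coordinates, scaled by the
    calibration value d_c."""
    return d_c * math.hypot(c1[0] - c2[0], c1[1] - c2[1])

def relative_error(reference, measured):
    """Relative error in percent; the magnitude is taken, matching the
    nonnegative values reported in Table 4."""
    return abs(reference - measured) / reference * 100.0

# Corners C1(188, 238) and C2(190, 399) from the Figure 14(a) example.
d_plm = pitch_length((188, 238), (190, 399), d_c=1.0)
e = relative_error(44.20, 45.75)  # theta_ah vs theta_adp of F1 in Table 4
```

With these inputs, the relative error rounds to 3.51 %, agreeing with the eθdp value reported for F1 in Table 4.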
Table 3. Standard deviation of pitch length and surface braiding angle measured by hand

      Standard deviation of pitch length (mm)   Standard deviation of surface braiding angle (°)
      Minimum   Maximum   Average               Minimum   Maximum   Average
F1    0.0072    0.079     0.021                 0.25      0.78      0.52
F2    0.0045    0.042     0.016                 0.28      0.82      0.56
F3    0.0084    0.035     0.018                 0.30      0.75      0.52
F4    0.0054    0.035     0.017                 0.25      0.88      0.46
F5    0.0054    0.040     0.014                 0.17      0.62      0.40
F6    0.011     0.093     0.041                 0.22      0.77      0.53
F7    0.016     0.089     0.042                 0.14      0.77      0.53
F8    0.015     0.096     0.048                 0.24      0.77      0.53
Figure 15. Corner maps and relative errors; (a) corner map of F1, (b) pitch length relative errors of F1, (c) surface braiding angle relative errors of F1, (d) corner map of F2, (e) pitch length relative errors of F2, (f) surface braiding angle relative errors of F2, (g) corner map of F3, (h) pitch length relative errors of F3, (i) surface braiding angle relative errors of F3, (j) corner map of F4, (k) pitch length relative errors of F4, (l) surface braiding angle relative errors of F4, (m) corner map of F5, (n) pitch length relative errors of F5, (o) surface braiding angle relative errors of F5, (p) corner map of F6, (q) pitch length relative errors of F6, (r) surface braiding angle relative errors of F6, (s) corner map of F7, (t) pitch length relative errors of F7, (u) surface braiding angle relative errors of F7, (v) corner map of F8, (w) pitch length relative errors of F8
Table 4. Average measurement values and relative errors of pitch length and surface braiding angle

      dah (mm)   dadp (mm)   daac (mm)   θah (°)   θadp (°)   eddp (%)   edac (%)   eθdp (%)
F1    2.49       2.49        2.56        44.20     45.75      0          2.81       3.51
F2    2.37       2.37        1.78        31.60     31.88      0          24.9       0.89
F3    4.10       4.10        4.18        35.31     34.89      0          1.95       1.19
F4    1.82       1.82        1.81        43.40     43.71      0          0.55       0.71
F5    1.97       1.97        1.97        35.17     34.96      0          0          0.60
F6    3.89       3.88        3.91        24.01     23.20      0.26       0.51       3.37
F7    3.89       3.88        3.92        32.65     33.92      0.26       0.77       3.89
F8    5.48       5.47        5.58        24.48     24.02      0.18       1.82       1.88

From Figure 15 and Table 4, some observations can be made:
• Our method can not only measure individual pitch lengths and surface braiding angles but also measure average values of the two parameters, and it generally achieves smaller average relative errors than the existing autocorrelation-based method. In particular, image F2, i.e., the aramid-fiber fabric image, has larger relative errors when measured by autocorrelation than by our method, because the aramid-fiber fabric image has weaker contrast in textile edges than other types of images, and the autocorrelation-based method is sensitive to image contrast.
• Our method achieves higher accuracy for pitch length than for surface braiding angle, for two reasons: first, an individual pitch length depends on two corners, while an individual surface braiding angle depends on four corners, which adds affecting factors; second, the influence of corner position changes on angle is greater than that on distance.
• Our method achieves higher accuracy for the pitch length and surface braiding angle of 2D braided fabrics than of 3D braided fabrics, because the fabric bundle crimp of 3D braided fabrics is larger than that of 2D braided fabrics, and thus the contrast of neighborhoods centered at the corners of 3D braided fabrics is weaker than that of 2D braided fabrics.

Conclusion

In this paper, we achieve the automatic measurement of the pitch length and surface braiding angle of braided composite preforms based on Faster R-CNN. The conclusions are as follows:
1. Our method has good robustness to luminance and contrast, has good applicability for the automatic measurement of pitch lengths and surface braiding angles of 2D and 3D fabrics, and is also suitable for measuring fabrics braided from different fibers, such as carbon-, glass- and aramid-fiber fabrics.
2. In the fabric database, the corner-centered target matting method efficiently improved the accuracy of the initial corner positions.
3. The Faster R-CNN network used for initial corner detection improves the applicability of the corner detection algorithm to weak-contrast images, which efficiently avoids missing corners.

Acknowledgements

The authors wish to thank Prof. Jae Ryoun Youn for handling the review of the paper and the two anonymous reviewers for their helpful comments. This work was supported by Applied Basic Research Programs of China National Textile and Apparel Council (No. J201509) and the Program for Innovative Research Team in University of Tianjin (No. TD13-5034).

References

1. O. Bacarreza, P. Wen, and M. H. Aliabadi, “Micromechanical Modelling of Textile Composites” in “Woven Composites” (M. H. Aliabadi Ed.), Computational and Experimental Methods in Structures, Vol. 6, pp.1-74, World Scientific Publishing Co., Hackensack, 2015.
2. S. Rana and R. Fangueiro, “Advanced Composites in Aerospace Engineering” in “Advanced Composite Materials for Aerospace Engineering”, Woodhead Publishing, 2016.
3. A. Fouladi, R. J. Nedoushan, J. Hajrasouliha, M. Sheikhzadeh, Y. M. Kim, W. J. Na, and W. R. Yu, Appl. Compos. Mater., 26, 479 (2019).
4. X. Gao, B. Sun, and B. Gu, Aerosp. Sci. Technol., 82, 46 (2018).
5. W. Ye, W. Li, Y. Shan, J. Wu, H. Ning, D. Sun, N. Hu, and S. Fu, Compos. Part B-Eng., 156, 355 (2019).
6. S. Rana and R. Fangueiro, “Braided Structures and Composites: Production, Properties, Mechanics and Technical Applications”, Vol. 3, CRC Press, 2015.
7. S. Rana, S. Parveen, and R. Fangueiro in “Advanced Carbon Nanotube Reinforced Multi-scale Composites” (Bakerpur Ehsan Ed.), Advanced Composite Materials: Manufacturing, Properties, and Applications, De Gruyter Open, 2015.
8. G. Balokas, S. Czichon, and R. Rolfes, Compos. Struct., 183, 550 (2018).
9. A. Rawal, H. Saraswat, and A. Sibal, Text. Res. J., 85, 2083 (2015).
10. Q. Guo, G. Zhang, and J. Li, Mater. Des., 46, 291 (2013).
11. J. Sun, Y. Wang, G. Zhou, and X. Wang, Polym. Compos., 39, 1076 (2018).
12. H. Zhou, W. Zhang, T. Liu, B. Gu, and B. Sun, Compos. Part A: Appl. Sci. Manuf., 79, 52 (2015).
13. Y. Wang, Z. Liu, N. Liu, L. Hu, Y. Wei, and J. Ou, Compos. Struct., 136, 75 (2016).
14. X. Li, C. Chu, L. Zhou, J. Bai, C. Guo, F. Xue, P. Lin, and P. K. Chu, Compos. Sci. Technol., 142, 180 (2017).
15. B. Shi, S. Liu, A. Siddique, Y. Du, B. Sun, and B. Gu, Int. J. Damage Mech., 28, 404 (2019).
16. Z. Wan, J. Shen, and X. Wang, J. Text. Res. (in Chinese), 25, 42 (2004).
17. L. Gong and Z. Wan, Comput. Measurement Control (in Chinese), 14, 730 (2006).
18. Z. Wan and J. Li, AUTEX Res. J., 6, 30 (2006).
19. M. Kang, X. Leng, Z. Lin, and K. Ji, International Workshop on Remote Sensing with Intelligent Processing (RSIP), IEEE, pp.1-4, 2017.
20. H. Jiang and E. Learned-Miller, arXiv preprint arXiv:1606.03473 (2017).
21. R. B. Girshick, J. Donahue, T. Darrell, and J. Malik, CVPR, pp.580-587, 2014.
22. R. B. Girshick, ICCV, pp.1440-1448, 2015.
23. S. Ren, K. He, R. Girshick, and J. Sun, NIPS, pp.91-99, 2015.
24. R. Sa, W. Owens, R. Wiegand, M. Studin, D. Capoferri, K. Barooha, A. Greaux, R. Rattray, A. Hutton, J. Cintineo, and V. Chaudhary, 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp.564-567, 2017.
25. W. Fan, H. Jiang, L. Ma, J. Gao, and H. Yang, 10th International Conference on Digital Image Processing (ICDIP), Vol. 10806, p.108065A, 2018.
26. D. Zhang, W. Zhu, H. Zhao, F. Shi, and X. Chen, Medical Imaging: Image Processing, International Society for Optics and Photonics, Vol. 10574, p.105741U, 2018.
27. E. Yahalomi, M. Chernofsky, and M. Werman, Intelligent Computing - Proceedings of the Computing Conference, pp.971-981, Springer, Cham, 2019.
28. X. Gao, L. Peng, and M. Sun, J. Nucl. Med., 60, 284 (2019).
29. J. E. Espinosa, S. A. Velastin, and J. W. Branch, 9th International Conference on Pattern Recognition Systems (ICPRS), Valparaíso, Chile, 2018.
30. C. C. Tsai, C. K. Tseng, H. C. Tang, and J. I. Guo, Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), IEEE, pp.1605-1608, 2018.
31. M. Zhang, C. Gao, Q. Li, L. Wang, and J. Zhang, Multimed. Tools Appl., 77, 3303 (2018).
32. L. Wu, H. Li, J. He, and X. Chen, J. Phys.: Conf. Ser., 1176, 032045 (2019).
33. H. Xie, Y. Chen, and H. Shin, Appl. Intell., 49, 1200 (2019).
34. D. Siegmund, A. Prajapati, F. Kirchbuchner, and A. Kuijper, International Workshop on Artificial Intelligence and Pattern Recognition, pp.77-84, Springer, Cham, 2018.
35. B. Wei, K. Hao, X. S. Tang, and L. Ren, International Conference on Artificial Intelligence on Textile and Apparel, pp.45-51, Springer, Cham, 2018.
36. Z. Lin, Z. Guo, and J. Yang, Proceedings of the 11th International Conference on Machine Learning and Computing, ACM, pp.429-433, 2019.
37. Opt Machine Vision Web. http://www.optmv.net/index.php/Productionservice/pro_cdetail/pro_id/64 (Accessed July 18, 2019).
38. E. Liepinsh, J. Kuka, and M. Dambrova, J. Pharmacol. Toxicol., 67, 98 (2013).
39. S. Umeyama and G. Godin, IEEE Transactions on Pattern Analysis and Machine Intelligence, 26, 2338 (2004).
40. M. D. Zeiler and R. Fergus, European Conference on Computer Vision (ECCV), p.818, 2014.
41. K. Simonyan and A. Zisserman, International Conference on Learning Representations (ICLR), 2015.
42. T. Y. Lin, M. Maire, S. Belongie, J. Hays, P. Perona, D. Ramanan, P. Dollár, and C. L. Zitnick, European Conference on Computer Vision, p.740, 2014.
43. M. Everingham, L. Van Gool, C. K. I. Williams, J. Winn, and A. Zisserman, Int. J. Comput. Vision, 88, 303 (2010).
44. J. J. Hernández-López, A. L. Quintanilla-Olvera, J. L. López-Ramírez, F. J. Rangel-Butanda, M. A. Ibarra-Manzano, and D. L. Almanza-Ojeda, Procedia Technology, 3, 196 (2012).
45. P. Kovesi, Australian Pattern Recognition Society Conference: DICTA, p.309, 2003.
46. K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, Proceedings of SPIE, Image Processing: Algorithms and Systems, Neural Networks, and Machine Learning, Vol. 6064, p.606414, 2006.
47. Z. Xiao, L. Pei, F. Zhang, L. Geng, J. Wu, J. Tong, J. Xi, and P. O. Ogunbona, Text. Res. J., 88, 2641 (2018).