Article
Computer Vision System for Welding Inspection of
Liquefied Petroleum Gas Pressure Vessels Based on
Combined Digital Image Processing and Deep
Learning Techniques
Yarens J. Cruz 1, *, Marcelino Rivas 1 , Ramón Quiza 1 , Gerardo Beruvides 2
and Rodolfo E. Haber 3
1 Centro de Estudio de Fabricación Avanzada y Sostenible, Universidad de Matanzas, Matanzas 40100, Cuba;
[email protected] (M.R.); [email protected] (R.Q.)
2 Social Innovation Business, Hitachi Europe Ltd., 40547 Düsseldorf, Germany; [email protected]
3 Centro de Automática y Robótica, CSIC-UPM, 28500 Madrid, Spain; [email protected]
* Correspondence: [email protected]; Tel.: +53-53434814

Received: 7 July 2020; Accepted: 10 August 2020; Published: 12 August 2020 

Abstract: One of the most important operations during the manufacturing process of a pressure
vessel is welding. The result of this operation has a great impact on the vessel integrity; thus, welding
inspection procedures must detect defects that could lead to an accident. This paper introduces a
computer vision system based on structured light for welding inspection of liquefied petroleum gas
(LPG) pressure vessels by using combined digital image processing and deep learning techniques.
The inspection procedure applied prior to the welding operation was based on a convolutional neural
network (CNN), and it correctly detected the misalignment of the parts to be welded in 97.7% of
the cases during the method testing. The post-welding inspection procedure was based on a laser
triangulation method, and it estimated the weld bead height and width, with average relative errors
of 2.7% and 3.4%, respectively, during the method testing. This post-welding inspection procedure
allows us to detect geometrical nonconformities that compromise the weld bead integrity. By using
this system, the quality index of the process was improved from 95.0% to 99.5% during practical
validation in an industrial environment, demonstrating its robustness.

Keywords: computer vision; digital image processing; deep learning; welding inspection

1. Introduction
Pressure vessels are closed containers, tanks, or pipelines designed to receive and store a fluid at a
pressure greater than outer ambient pressure conditions. They are used in a large number of industries,
such as power generation, chemical, petroleum, petrochemical, and nuclear industries. The fluids
contained in pressure vessels may have specific characteristics, such as volatility, compressibility,
flammability, or radioactivity. Cylindrical vessels are generally preferred, because they present simpler
manufacturing problems and make better use of the available space [1].
Modern industries need to reduce production time in order to meet their customers’
demands and to achieve higher profits; industries that produce pressure vessels are no exception.
As a consequence, it is still a challenge to apply systematic methodologies for monitoring and preventing
defect occurrences within the manufacturing shop floors, due to the increasing complexity of both
products and production systems [2]. If the defects originating during the manufacturing process are
not detected, the vessel can break during operation. A ruptured pressure vessel can be hazardous,
possibly leading to poison gas leaks, fires, or explosions, which may cause significant losses of human
lives and properties [3]. Thus, the application of more efficient monitoring strategies on the production
lines at critical stages is required to avoid the generation or propagation of defects [4].
Various methodologies can be adopted to manufacture a vessel. Nonetheless, the basic
manufacturing process of a pressure vessel consists of the following stages: forming, pressing, spinning,
bending, welding, post-weld heat treatment, assembly, and finishing. During the manufacturing of
liquefied petroleum gas (LPG) pressure vessels, a large number of defects originate in the welding stage.
The use of non-destructive testing (NDT) techniques can contribute to the early detection of these
defects, allowing the deployment of cost-effective line monitoring and control systems that reduce
expensive off-line measure-rework-assess loops. NDT techniques, such as radiography, ultrasonic
testing, penetrant liquid testing, magnetic particle testing, phased arrays, time-of-flight diffraction, and
multi-element eddy current, are increasingly being applied. Tomography, acoustic emissions,
ultrasonic guided waves, and laser ultrasonic techniques continue to be strong topics of interest [5].
Lately, NDT techniques based on computer vision are becoming integral parts of many production
systems, due to the increasing computing power, hyper connectivity, and easy installation of digital
cameras in production lines. Many 3-D light detection and ranging (LiDAR) techniques have been
explored and reported in the literature [6,7]; however, these sensing systems are expensive and require
a high computational cost. Nowadays, computer vision technologies have demonstrated unprecedented
benefits in the industry; they allow detection of defects unnoticeable to human operators, automate
extremely tedious measuring tasks, perform visual inspection in risky environments, and replace costly
end-of-line product quality inspection procedures with multi-stage inspection systems.
A wall thickness of 2 mm is typically used for 10 kg LPG pressure vessels. Due to the thinness of
the wall, it is critical to achieve a high precision in the butt joint alignment before welding. The use of
structured light vision is widespread for alignment evaluation, since it allows reduction of the effect
of visual perturbations commonly found on shop floors.
A multi-functional monocular visual sensor, based on the combination of cross-lines and single-line
laser structured lights, was proposed by Guo et al. [8]. This method allowed determination of the
dimensions of the V-type butt groove based on the processing of a single image. Wang et al. [9] developed a
robust weld seam recognition method under heavy noise based on structured light vision. By using
this algorithm, butt joints, T joints, and lap joints were accurately recognized. Shao et al. [10] used
combined passive and active vision sensors to measure dimensions in butt joints with seam gap less
than 0.1 mm. Later, Shao et al. [11] continued to develop the previously mentioned system by using a
particle filter in order to make it more robust. Fan et al. [12] introduced a method based on digital
image processing for initial point alignment in narrow robotic welding. This method allowed detection
of the weld seam center point when a laser stripe line was projected over a junction, and it was used
later by Fan et al. [13] to develop a weld seam tracking system. Robertson et al. [14] used a Keyence
blue laser profilometer to digitize the geometry of the weld seam; this permitted them to formulate an
automated welding plan. Recently, Chen et al. [15] introduced a method for the inspection of complex
joints, which allowed them to assign different parameters for the welding process, depending on the
features of the region. Du et al. [16] proposed a convolutional neural network (CNN) to perform
the feature area recognition and weld seam identification. This CNN analyzed the projection of a
659 nm laser stripe over different types of junctions, obtaining a validation accuracy of 98.0% under
strong noise.
Welding defects and nonconformities are not only caused by pre-welding conditions. Some of
these defects are intrinsic to the welding operation itself. For this reason, it is necessary to assess the
weld bead integrity, especially in the case of pressure vessels, where the weld bead is going to be under
constant stress. The use of structured light vision is the core of many studies, while other computer
vision approaches propose the analysis of X-ray images, infrared images, ultraviolet images, etc.
Pinto-Lopera et al. [17] proposed a system for measuring weld bead geometry by using a single
high-speed camera and a long-pass optical filter. Soares et al. [18] introduced a method for weld bead
edge identification that allowed them to recognize discontinuities. Han et al. [19] and Zhou et al. [20]
used the RANSAC algorithm for fitting linear functions to a laser stripe over a welded surface, which
allowed them to measure weld bead dimensions. Ye et al. [21] investigated the use of a model-based
classification method to automatically segment the bead from the welding surface, regardless of
the distance and the angle of the scanner to the welding surface. An approach based on pixel
intensity analysis was proposed by Singh et al. [22] for studying weld beads in P-91 steel plates.
Leo et al. [23] introduced a system for detecting and following the internal weld bead in stainless steel
kegs. Zeng et al. [24] proposed a weld bead detection method based on light and shadow feature
construction, using directional lights. Dung et al. [25] compared three deep learning methods for
crack detection in gusset plate welded joints. Khumaidi et al. [26] and Zhang et al. [27] used CNN for
weld inspection, reaching accuracies of 95.8% and 93.9%, respectively, for the classification of three
different types of defects, while the CNN developed by Bacioiu et al. [28] achieved a 93.4% accuracy
during the classification of five different types of defects. The transfer learning approach was used by
Yang et al. [29] for optical inspection of laser welding. A deep neural network model that extracted
the intrinsic features of X-ray images was used by Hou et al. [30] for automatic weld defect detection,
reaching an accuracy of 97.2%.
Although several studies on pre-welding and post-welding inspection exist, they are usually
focused on only one of these tasks. There is a lack of studies on the combined use of computer vision
techniques to provide an integral welding inspection under practical shop floor conditions, which
may include variations in illumination, presence of fumes, and mechanical vibrations. To deal with
the aforementioned shortcomings, this paper proposes an integrated system for pre-welding and
post-welding inspection, based on computer vision techniques, for industrial applications.

2. Materials and Methods


In the welding stage of the LPG pressure vessels analyzed, most defects were related to butt joint
misalignment and incorrect weld bead geometry. For this reason, the proposed system should be able
to detect both types of imperfections. The welding inspection system proposed in this paper includes
a 650 nm line laser projector with a fan angle of 90°, a Raspberry Pi (RPi) 3B, an 8 MP Raspberry
Pi Camera Module V2, an Omron E6B2 rotary encoder, a display, and a server. The sensors were
installed in the welding stage of the LPG pressure vessel production process. In this stage, the vessels
were mounted in an automated device that allowed them to be rotated. The laser line was projected
orthogonally to the joint, with an incidence angle of 30° over the vessel surface from a distance of
500 mm. The camera was conveniently placed to capture the laser incidence area from a distance of
72 mm. The area captured by the camera was 65 × 87 mm. Rotating the vessel allowed the capture of
images along the circumferential junction before and after the welding operation. A rotational offset
of 6° between images is proposed. The rotary encoder was used in order to determine the precise
moment for capturing the images. All the processing of the data was carried out in the RPi. Relevant
information was shown on the display and also stored in the server for further analysis. Figure 1
shows an overview of the system, while Figure 2 shows images captured before and after the welding
operation. The images in Figure 2 were cropped vertically for displaying the area of interest; however,
the horizontal size remained the same, corresponding to 65 mm on the vessel surface.
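As a rough illustration of how the rotary encoder can gate the camera, the following sketch derives the trigger spacing from the 6° offset. It is a minimal sketch, not the authors' code: the 1800 P/R resolution is an assumed E6B2 variant, and read_encoder()/grab_frame() are hypothetical stand-ins for the GPIO pulse counter and the camera interface.

```python
# Encoder-gated capture loop: a 6 degree offset yields 60 images per revolution.
PULSES_PER_REV = 1800                         # assumed encoder resolution
PULSES_PER_SHOT = PULSES_PER_REV * 6 // 360   # 30 pulses between captures

def capture_ring(n_images=60):
    frames, next_trigger = [], 0
    while len(frames) < n_images:
        if read_encoder() >= next_trigger:    # precise moment to capture
            frames.append(grab_frame())
            next_trigger += PULSES_PER_SHOT
    return frames
```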
In the production line, the inspection system was implemented in two separate steps: a pre-welding
inspection for detecting misalignment in the parts to be joined, and a post-welding inspection directed
to estimate the weld bead dimensions and to verify if they met the technical requirements.
Figure 3 presents the block diagrams for both inspection tasks. In the following subsections, the steps
of these algorithms are explained.

2.1. Pre-Processing
The pre-processing step was the same for both inspection tasks. After being captured, images
were, firstly, resized to 640 × 480 pixels, and then cropped to 320 × 480 pixels. Next, they were
converted from RGB color space to grayscale, because information about hue and saturation is not
significant for processing. A third transformation was conducted to separate the laser line from the
background. A binarization based on an adaptive threshold allowed us to reach this objective. The
threshold was calculated using the method proposed by Bradley and Roth [31], based on the local mean
intensity in the neighborhood of each pixel. The resulting images have just one value per pixel: 0 or 1,
indicating black or white, respectively. After this transformation, only the biggest object of each image
was preserved, which contributed to eliminating smaller objects produced by ambient noise. In order to
simplify the analysis, the laser line was reduced to a one-pixel width line. An example demonstrating
the previously described transformations can be seen in Figure 4. After these operations, images were
ready to be processed.
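The whole chain can be condensed into a few library calls. Below is a minimal sketch assuming OpenCV and scikit-image; the function name, the centered crop placement, and the exact threshold parameters are illustrative choices, not taken from the paper.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def preprocess(image_bgr):
    resized = cv2.resize(image_bgr, (640, 480))       # resize to 640 x 480
    cropped = resized[:, 160:480]                     # crop to 320 x 480 (centered, assumed)
    gray = cv2.cvtColor(cropped, cv2.COLOR_BGR2GRAY)  # hue/saturation discarded
    # Adaptive binarization from the local mean intensity, in the spirit of
    # Bradley and Roth [31]; block size and offset are assumed values
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY, 31, -5)
    # Keep only the biggest connected object to suppress ambient noise
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    if n > 1:
        biggest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        binary = np.where(labels == biggest, 255, 0).astype(np.uint8)
    return skeletonize(binary > 0)                    # one-pixel width line
```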

Figure 1. System overview.

Figure 2. Images captured with the proposed system: (a) before welding operation; (b) after welding operation.

Figure 3. Block diagrams of the inspection tasks: (a) pre-welding inspection; (b) post-welding inspection.

Figure 4. Images pre-processing: (a) RGB color space; (b) grayscale; (c) binarized image; (d) one-pixel
width laser profile.

2.2. Pre-Welding Image Processing and Decision Making

In the last decade, artificial intelligence has been widely applied in pattern recognition. Among
available techniques, image classification using CNN has been reported in many studies [32–35].
This method has been shown to learn interpretable and powerful image features after correct training.
For this reason, it was proposed as the processing method for evaluating the pre-welding images,
which needed to be classified into two categories: “correct alignment” and “incorrect alignment”.
Due to the characteristics of the images, a CNN architecture containing few layers was
proposed. This architecture is shown in Figure 5.

Figure 5. Convolutional neural network (CNN) architecture.

The purpose of the first layer is to serve as an input to the network by mapping each pixel of
every image received into a matrix. The 2-D convolutional layer performs a convolution operation
on the output of the preceding layer, using a set of filters in order to learn the features relevant to
distinguishing a “correct alignment” image from an “incorrect alignment” image. The convolution
operation is described by the following equation:
$$G[m, n] = (f \ast h)[m, n] = \sum_{j} \sum_{k} h[j, k] \, f[m - j, \, n - k], \qquad (1)$$

where f is the input image mapped into a matrix, h is a filter, and m and n are the rows and columns,
respectively. In this case, a total of 64 filters were used, and the activation function was “relu”, described
by the following equation:
$$f(x) = \max(0, x), \qquad (2)$$

which is commonly used in deep neural networks.
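For readers who want to see Equations (1) and (2) in executable form, a direct (unoptimized) sketch follows; it evaluates the convolution over the valid window only, without the padding and stride options a real CNN layer would offer.

```python
import numpy as np

def conv2d(f, h):
    # G[m, n] = sum_j sum_k h[j, k] * f[m - j, n - k]  (Equation (1)),
    # computed by correlating the flipped kernel over each valid window
    kr, kc = h.shape
    flipped = h[::-1, ::-1]
    g = np.zeros((f.shape[0] - kr + 1, f.shape[1] - kc + 1))
    for m in range(g.shape[0]):
        for n in range(g.shape[1]):
            g[m, n] = np.sum(flipped * f[m:m + kr, n:n + kc])
    return g

def relu(x):
    return np.maximum(0, x)    # Equation (2)
```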


The pooling layer downsamples the results of the preceding layer by discarding features, which
helps make the model more general, and reduces computation time. The pooling technique used
was “max pooling”, which examines squares of 2 × 2 features, and keeps only the maximum value.
Dropout layers randomly remove certain features by setting them to zero in each training epoch;
this is a commonly used technique to prevent overfitting. The flatten layer allows reshaping of the
three-dimensional data received from the first dropout layer into one dimension. Dense layers or
fully-connected layers learn the relationships among features, and perform classification by connecting
every neuron in the preceding layer to all the neurons they contain. The first dense layer contained
128 neurons and their activation function was “relu”, while the second dense layer contained only
2 neurons, corresponding to the number of categories that needed to be classified, and their activation
function was “softmax”, described by the following equation:

$$\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad \text{for } i = 1, \ldots, K, \qquad (3)$$

where z is, in this case, the output of the preceding layer. This function allows us to interpret the output
of the network as probabilities, where the category with the highest value will be the predicted class.
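Assembled end to end, the described architecture could look as follows in Keras. The 64 filters, the “relu” activations, the 2 × 2 max pooling, the 128-neuron dense layer, and the 2-neuron softmax output come from the text; the input shape, kernel size, dropout rates, and training settings are assumptions of this sketch.

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(480, 320, 1)),            # grayscale input (assumed orientation)
    layers.Conv2D(64, (3, 3), activation="relu"), # 64 filters; kernel size assumed
    layers.MaxPooling2D(pool_size=(2, 2)),        # keeps the max of each 2 x 2 block
    layers.Dropout(0.25),                         # first dropout; rate assumed
    layers.Flatten(),                             # 3-D feature maps to 1-D
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.25),                         # second dropout; rate assumed
    layers.Dense(2, activation="softmax"),        # "correct" vs. "incorrect" alignment
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```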
For pre-welding inspection, a total of 60 images per vessel are taken and analyzed one at a time.
The CNN classifies each image as “aligned” or “misaligned”. When the first misaligned image is
detected, the inspection process is interrupted, the vessel is labeled as defective, and it is removed from
the production line. If all the images are classified as aligned, the vessel is labeled as acceptable, and is
submitted to the welding operation. All the information generated by the system is stored in the server.
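The stop-at-first-failure rule above amounts to a short loop; capture_image() and classify() below are hypothetical stand-ins for the encoder-triggered capture and the CNN forward pass.

```python
def inspect_pre_welding(vessel, n_images=60):
    for i in range(n_images):
        image = capture_image(vessel, index=i)   # one image every 6 degrees
        if classify(image) == "misaligned":
            return "defective"                   # interrupt at first misaligned image
    return "acceptable"                          # all images aligned: proceed to welding
```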

2.3. Post-Welding Image Processing and Decision Making


CNN capabilities for image classification are remarkable. However, the processing of post-welding
images implies a different task: accurately estimating the weld bead dimensions. For this reason, another
approach based on the examination of the laser profile was proposed. In order to determine where the
weld bead is located in the image, the first and second derivative of the one-pixel width laser profile
are calculated using the numeric approximations described in the following equations, respectively:

$$f'(x_0) \approx \frac{f(x_0 + h) - f(x_0 - h)}{2h}, \qquad (4)$$

$$f''(x_0) \approx \frac{f(x_0 + h) - 2 f(x_0) + f(x_0 - h)}{h^2}, \qquad (5)$$
where h is a step value.
The minimum value of the second derivative proved to be a representative feature of the weld
bead center. The maximum values of the second derivative on both sides of the weld bead center
proved to be representative features of the weld bead edges. Figure 6 shows the mentioned points in the
second derivative graphic representation, while Figure 7 shows the same points over the RGB image.
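A sketch of this feature-point search is given below, assuming the one-pixel-width profile is stored as an array with one row coordinate per image column; the implementation details are illustrative, not the authors' code.

```python
import numpy as np

def bead_feature_points(profile, h=1.0):
    # Central-difference approximations, Equations (4) and (5); d2[i]
    # corresponds to profile index i + 1
    d1 = (profile[2:] - profile[:-2]) / (2 * h)   # computed as in the text; features come from d2
    d2 = (profile[2:] - 2 * profile[1:-1] + profile[:-2]) / h**2
    center = int(np.argmin(d2)) + 1               # bead center: minimum of d2
    left = int(np.argmax(d2[:center - 1])) + 1    # edge: maximum of d2 left of center
    right = center + int(np.argmax(d2[center:])) + 1   # edge: maximum of d2 right of center
    return left, center, right
```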
Figure 6. Second derivative of the one-pixel width laser profile after the welding operation.

The estimations of the weld bead dimensions are performed using a laser triangulation algorithm.
Calculating the distance between the weld bead edges feature points (left and right feature points of
Figure 7) allows estimation of the weld bead width, W, in pixels. Calculating the distance between
the weld bead center feature point and the line joining the weld bead edges feature points allows
estimation of the weld bead height, H, in pixels. These values are converted to millimeters by applying
a scale factor depending on the distance between the camera and the examined area.
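Given the three feature points and the same profile representation as above, the width and height estimates reduce to two distance computations; the mm_per_pixel argument stands for the camera-distance-dependent scale factor mentioned in the text, and the helper is a sketch rather than the authors' implementation.

```python
import numpy as np

def bead_dimensions(left, center, right, profile, mm_per_pixel):
    p_l = np.array([left, profile[left]], dtype=float)
    p_c = np.array([center, profile[center]], dtype=float)
    p_r = np.array([right, profile[right]], dtype=float)
    d = p_r - p_l
    width_px = np.linalg.norm(d)                  # W: distance between edge points
    # H: distance from the center point to the line joining the edge points
    height_px = abs(d[0] * (p_c - p_l)[1] - d[1] * (p_c - p_l)[0]) / width_px
    return width_px * mm_per_pixel, height_px * mm_per_pixel
```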
Figure 7. Feature points detected over the RGB image.

For post-welding inspection, 60 images per vessel are also analyzed, one at a time. For each of
them, the weld bead height and width are computed. If, for all the images, these values are in the
corresponding intervals (i.e., 0.5 mm ≤ H ≤ 1.5 mm and 7.0 mm ≤ W ≤ 9.0 mm), the vessel is labeled as
acceptable, and is submitted to further operations. On the contrary, if, for a given image, any of these
values is below the lower limit (i.e., H < 0.5 mm or W < 7.0 mm), the inspection process is interrupted,
the vessel is classified as defective, and is removed from the production line. Finally, if, for a given
image, any of the dimensions is above the upper limit of the corresponding intervals (i.e., H > 1.5 mm
or W > 9.0 mm), the vessel is still labeled as acceptable, but a warning is emitted to the operator to
consider a possible readjustment of the welding parameters, preventing material and energy waste.
All the information generated by the system is stored in the server.

3. Results

A dataset containing 1090 images, divided into 872 images for training and 218 for validation,
was used for the parametrization of the CNN. Of the total images, 565 had a misalignment value
inferior to 1 mm and were labeled as “correct alignment”; and 525 images had a misalignment value
superior to 1 mm and were labeled as “incorrect alignment”. During training, the accuracy of the CNN
rapidly grew in the first epochs, and after the 35th epoch was superior to 95.0%, as shown in Figure 8.
To prevent overfitting, the model was validated after each training epoch by evaluating it with unseen
data. The validation loss reached its minimum value in the 44th epoch, as shown in Figure 9. After this
epoch, there was no improvement of the validation loss during 20 consecutive epochs, and for this
reason, the training was stopped early. The network parameters of the epoch where the best validation
was achieved were recovered and set to the CNN.

Figure 8. CNN accuracy during training.

Figure 9. CNN validation loss for every training epoch.

A new dataset containing 109 “correct alignment” images and 111 “incorrect alignment” images
was used for testing the model. Table 1 shows the results obtained during testing. The accuracy
obtained was 97.7%, which demonstrates that the CNN was able to learn the relevant features for
classifying the images with few errors, and it was concluded that this algorithm was satisfactory
for the pre-welding inspection task. Additionally, the system was designed to store all the data in
the server; this will allow the creation of a larger dataset in the future for retraining the model and
obtaining a better accuracy.
Table 1. Confusion matrix for test dataset.

Total Images = 220                    Actual “Correct Alignment”    Actual “Incorrect Alignment”
Predicted “correct alignment”         105                           1
Predicted “incorrect alignment”       4                             110
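From this matrix, the reported accuracy follows directly: 105 + 110 = 215 correct predictions out of 220 test images, i.e., 215/220 ≈ 97.7%.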
With the objective of testing the accuracy of the post-welding inspection method, 10 vessels were
analyzed. For each vessel, a total of 60 measurements were verified. The results are summarized in
Table 2. The values correspond to the maximum absolute error (MAE) and maximum relative error
(MRE) per variable.

Table 2. Maximum absolute error and maximum relative error per variable.

Vessel     MAE Width    MRE Width    MAE Height    MRE Height
1          0.29 mm      3.9%         0.05 mm       3.5%
2          0.24 mm      3.4%         0.04 mm       2.8%
3          0.30 mm      4.2%         0.03 mm       3.0%
4          0.21 mm      2.5%         0.03 mm       2.0%
5          0.25 mm      3.8%         0.03 mm       2.4%
6          0.26 mm      4.1%         0.04 mm       3.2%
7          0.28 mm      3.9%         0.02 mm       1.7%
8          0.31 mm      4.2%         0.03 mm       2.5%
9          0.17 mm      2.0%         0.03 mm       2.7%
10         0.19 mm      2.5%         0.05 mm       3.4%
Average    0.25 mm      3.4%         0.04 mm       2.7%

Table 3 shows the comparison between the results obtained by the method used in the proposed
system and others found in the literature, demonstrating that the errors produced during the estimation
of the weld bead dimensions are within the common range for both variables. After this comparison,
it was concluded that the proposed algorithm was satisfactory for the post-welding inspection task.


Table 3. Methods comparison.

System                                 Average Absolute Error for Width    Average Absolute Error for Height
Method used in the proposed system     0.25 mm                             0.04 mm
Method A [20]                          0.25 mm                             0.05 mm
Method B [36]                          0.15 mm                             0.11 mm

4. Discussion

In order to evaluate the performance of the proposed integrated system under actual shop floor
conditions, the behavior of a batch containing 1000 vessels was analyzed. In every step, all the items
were evaluated to determine if they conformed to the requirements of the corresponding operations.
A comparison was carried out between the previously existing human-based inspection system and
the proposed computer vision approach. Figure 10 shows the obtained results.

Figure 10. Practical validation results of the proposed system vs. human-based inspection.

As can be seen, in the pre-welding inspection, the computer vision system showed a better
performance, not only rejecting a higher number of nonconforming items (96.3%) than the human-based
system (47.5%), but also wrongly rejecting a lower number of conforming items (0.4% vs. 1.6%).
As the welding of misaligned parts produced dimensionally incorrect weld joints, the fraction of
nonconforming items after the welding operation was also higher for the human-based inspection
system (11.1% vs. 7.3%). Finally, in the post-welding inspection, the ratio of true positives was
remarkably higher for the computer vision system (94.0% vs. 59.0%), while presenting a slightly lower
false negatives ratio (1.6% vs. 3.3%).
From an overall analysis of the whole system, it can be noted that introducing the computer
vision inspection increased the process quality index (i.e., the ratio between the conforming items and
the total produced items) from 95.0% to 99.5%. It should be remarked that this is an important step
toward a zero defects production. From the productivity point of view, the welding conforming index
(i.e., the ratio between the final conforming items and the welded items) was notably higher for the
computer vision system (91.2% vs. 86.0%). This last index is especially important, as the welding
operation has a significant economic and environmental impact in the lifecycle assessment of the
produced vessels.

5. Conclusions
As the main outcome of the paper, a computer vision inspection system, for detecting joint
misalignment and geometrical defects in a welding process, was designed and implemented. In spite
of its low cost, the used hardware was shown to be effective for achieving the proposed goal, and
demonstrated a robust performance under the shop floor conditions where it was tested. In the
pre-welding inspection, the used CNN was capable of detecting misalignment in 97.7% of the cases
during the method testing. On the other hand, the laser triangulation approach used in the post-welding
inspection estimated the weld bead dimensions with an average relative error of 3.4% for the weld
bead width and 2.7% for the height during the method testing. The improvement of the overall quality
index of the process during practical validation, from 95.0% to 99.5%, supported the technical feasibility
of the industrial introduction of the proposed system.
As future development of the present work, the incorporation of connectivity capabilities into the
implemented modules, through the concepts of the Industrial Internet of Things, might be considered.
This addition will be an important step toward the integration of the considered welding process into a
modern manufacturing environment with interconnected production stations and lines.

Author Contributions: Conceptualization: M.R., G.B., and R.E.H.; methodology: M.R. and R.Q.; software:
Y.J.C.; validation: Y.J.C.; formal analysis: Y.J.C. and M.R.; investigation: Y.J.C. and M.R.; writing—original draft
preparation: Y.J.C. and M.R.; writing—review and editing: R.Q., G.B., and R.E.H.; visualization: Y.J.C. and R.Q.
All authors have read and agreed to the published version of the manuscript.
Funding: This research has been partially funded by the European H2020 research and innovation program, ECSEL
Joint Undertaking, and Spanish National Funding Authority, entitled Power2Power: Providing next-generation
silicon-based power solutions in transport and machinery for significant decarbonisation in the next decade, under
grant agreement No. 826417 and PCI2019-103361.
Acknowledgments: The authors want to thank the Forming Enterprise of Matanzas (Cuba) for supporting the execution
of the experimental work.
Conflicts of Interest: The authors declare no conflicts of interest.

References
1. Khobragade, R.P.; Gandhe, R.R. Design & Analysis of Pressure Vessel with Hemispherical & Flat Circular
End. Int. J. Innov. Res. Sci. Technol. 2017, 4, 62–84.
2. Iarovyi, S.; Martinez, J.L.; Haber, R.E.; del Toro, R.M. From artificial cognitive systems and open architectures
to cognitive manufacturing systems. In Proceedings of the 2015 IEEE 13th International Conference on
Industrial Informatics, Cambridge, UK, 22–24 July 2015; IEEE: Piscataway, NJ, USA, 2015. [CrossRef]
3. Toudehdehghan, A.; Hong, T.W. A critical review and analysis of pressure vessel structures. In Proceedings
of the 1st International Postgraduate Conference on Mechanical Engineering, Pahang, Malaysia, 31 October
2018; IOP Publishing: Bristol, UK, 2019. [CrossRef]
4. Eger, F.; Coupek, D.; Caputo, D.; Colledani, M.; Penalva, M.; Ortiz, J.A.; Freiberger, H.; Kollegger, G. Zero
Defect Manufacturing Strategies for Reduction of Scrap and Inspection Effort in Multi-Stage Production
Systems. Procedia CIRP 2018, 67, 368–373. [CrossRef]
5. Chauveau, D. Review of NDT and process monitoring techniques usable to produce high-quality parts by
welding or additive manufacturing. Weld. World 2018, 62, 1097–1118. [CrossRef]
6. Castaño, F.; Beruvides, G.; Haber, R.E.; Artuñedo, A. Obstacle recognition based on machine learning for
on-chip lidar sensors in a cyber-physical system. Sensors 2017, 17, 2109. [CrossRef]
7. Castaño, F.; Beruvides, G.; Villalonga, A.; Haber, R.E. Self-tuning method for increased obstacle detection
reliability based on internet of things LiDAR sensor models. Sensors 2018, 18, 1508. [CrossRef]
8. Guo, J.; Zhu, Z.; Sun, B.; Yu, Y. Principle of an innovative visual sensor based on combined laser structured
lights and its experimental verification. Opt. Laser Technol. 2019, 111, 35–44. [CrossRef]
9. Wang, N.; Zhong, K.; Shi, X.; Zhang, X. A robust weld seam recognition method under heavy noise based on
structured-light vision. Robot. Comput. Integr. Manuf. 2020, 61. [CrossRef]
10. Shao, W.J.; Huang, Y.; Zhang, Y. A novel weld seam detection method for space weld seam of narrow butt
joint in laser welding. Opt. Laser Technol. 2018, 99, 39–51. [CrossRef]
11. Shao, W.; Liu, X.; Wu, Z. A robust weld seam detection method based on particle filter for laser welding by
using a passive vision sensor. Int. J. Adv. Manuf. Technol. 2019, 104, 1–10. [CrossRef]
12. Fan, J.; Jing, F.; Yang, L.; Long, T.; Tan, M. An initial point alignment method of narrow weld using laser
vision sensor. Int. J. Adv. Manuf. Technol. 2019, 102, 201–212. [CrossRef]
13. Fan, J.; Deng, S.; Jing, F.; Zhou, C.; Yang, L.; Long, T.; Tan, M. An initial point alignment and seam tracking
system for narrow weld. IEEE Trans. Ind. Inform. 2019, 16, 877–886. [CrossRef]
14. Robertson, S.; Penney, J.; McNeil, J.L.; Hamel, W.R.; Gandy, D.; Frederick, G.; Tatman, J. Piping and Pressure
Vessel Welding Automation through Adaptive Planning and Control. JOM 2020, 72, 526–535. [CrossRef]
15. Chen, W.; Xiong, W.; Cheng, J.; Gu, Y.; Li, Y. Robotic Vision Inspection of Complex Joints for Automatic
Welding. In Proceedings of the 17th International Conference on Computer and Information Science,
Singapore, 6–8 June 2018; IEEE: Piscataway, NJ, USA, 2018. [CrossRef]
16. Du, R.; Xu, Y.; Hou, Z.; Shu, J.; Chen, S. Strong noise image processing for vision-based seam tracking in
robotic gas metal arc welding. Int. J. Adv. Manuf. Technol. 2019, 101, 2135–2149. [CrossRef]
17. Pinto-Lopera, J.E.; Motta, J.M.S.T.; Alfaro, S.C.A. Real-Time Measurement of Width and Height of Weld
Beads in GMAW Processes. Sensors 2016, 16, 1500. [CrossRef]
18. Soares, L.B.; Weis, A.A.; Guterres, B.V.; Rodrigues, R.N.; Botelho, S.S.C. Computer Vision System for Weld
Bead Analysis. In Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and
Computer Graphics Theory and Applications, Madeira, Portugal, 27–29 January 2018; SciTePress: Setubal,
Portugal, 2018. [CrossRef]
19. Han, Y.; Fan, J.; Yang, X. A structured light vision sensor for on-line weld bead measurement and weld
quality inspection. Int. J. Adv. Manuf. Technol. 2020, 106, 2065–2078. [CrossRef]
20. Zhou, K.; Ye, G.; Gao, X.; Zhong, K.; Guo, J.; Zhang, B. Weld Bead Width and Height Measurement Using
RANSAC. In Proceedings of the 4th International Conference on Control and Robotics Engineering, Nanjing,
China, 20–23 April 2019; IEEE: Piscataway, NJ, USA, 2019. [CrossRef]
21. Ye, G.; Guo, J.; Sun, Z.; Li, C.; Zhong, S. Weld bead recognition using laser vision with model-based
classification. Robot. Comput. Integr. Manuf. 2018, 52, 9–16. [CrossRef]
22. Singh, A.K.; Dey, V.; Rai, R.N.; Debnath, T. Weld Bead Geometry Dimensions Measurement based on Pixel
Intensity by Image Analysis Techniques. J. Inst. Eng. (India): Series C 2017, 100, 379–384. [CrossRef]
23. Leo, M.; Coco, M.D.; Carcagnì, P.; Spagnolo, P.; Mazzeo, P.L.; Distante, C.; Zecca, R. Automatic visual
monitoring of welding procedure in stainless steel kegs. Opt. Lasers Eng. 2017, 104, 220–231. [CrossRef]
24. Zeng, J.; Chang, B.; Du, D.; Hong, Y.; Zou, Y.; Chang, S. A visual weld edge recognition method based on light
and shadow feature construction using directional lighting. J. Manuf. Process. 2016, 24, 19–30. [CrossRef]
25. Dung, C.V.; Sekiya, H.; Hirano, S.; Okatani, T.; Miki, C. A vision-based method for crack detection in gusset
plate welded joints of steel bridges using deep convolutional neural networks. Autom. Constr. 2019, 102,
217–229. [CrossRef]
26. Khumaidi, A.; Yuniarno, E.M.; Purnomo, M.H. Welding Defect Classification Based on Convolution Neural
Network (CNN) and Gaussian Kernel. In Proceedings of the International Seminar on Intelligent Technology
and Its Application, Surabaya, Indonesia, 28–29 August 2017; IEEE: Piscataway, NJ, USA, 2017. [CrossRef]
27. Zhang, Y.; You, D.; Gao, X.; Zhang, N.; Gao, P.P. Welding defects detection based on deep learning with
multiple optical sensors during disk laser welding of thick plates. J. Manuf. Syst. 2019, 51, 87–94. [CrossRef]
28. Bacioiu, D.; Melton, G.; Papaelias, M.; Shaw, R. Automated defect classification of SS304 TIG welding process
using visible spectrum camera and machine learning. NDT E Int. 2019, 107. [CrossRef]
29. Yang, Y.; Pan, L.; Ma, J.; Yang, R.; Zhu, Y.; Yang, Y.; Zhang, L. A High-Performance Deep Learning Algorithm
for the Automated Optical Inspection of Laser Welding. Appl. Sci. 2020, 10, 933. [CrossRef]
30. Hou, W.; Wei, Y.; Jin, Y.; Zhu, C. Deep features based on a DCNN model for classifying imbalanced weld
flaw types. Measurement 2019, 131, 482–489. [CrossRef]
31. Bradley, D.; Roth, G. Adaptive Thresholding using the Integral Image. J. Graph. Tools 2007, 12, 13–21.
[CrossRef]
32. Itu, R.; Danescu, R. A self-calibrating probabilistic framework for 3d environment perception using monocular
vision. Sensors 2020, 20, 1280. [CrossRef]
33. König, C.; Helmi, A.M. Sensitivity analysis of sensors in a hydraulic condition monitoring system using cnn
models. Sensors 2020, 20, 3307. [CrossRef]
34. Lo, C.-C.; Lee, C.-H.; Huang, W.-C. Prognosis of bearing and gear wears using convolutional neural network
with hybrid loss function. Sensors 2020, 20, 3539. [CrossRef]
35. Yang, K.; Chu, R.; Zhang, R.; Xiao, J.; Tu, R. A novel methodology for series arc fault detection by temporal
domain visualization and convolutional neural network. Sensors 2020, 20, 162. [CrossRef]
36. Khanna, P.; Maheshwari, S. Development of Mathematical Models for Prediction and Control of Weld Bead
Dimensions in MIG Welding of Stainless Steel 409M. Mater. Today: Proc. 2018, 5, 4475–4488. [CrossRef]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
