2D Robot With AI To Replicate The Path
doi:10.1017/S026357472100059X
RESEARCH ARTICLE
Abstract
Welding is a complex manufacturing process whose quality depends on the welder's skills, especially on complex paths. For consistency in modern industries, an arm robot is used to accomplish this task. However, its programming and reprogramming are time-consuming, costly, and require an expert programmer. These limitations restrict the use of robots in medium and small industries. This paper introduces a new supervised learning technique for programming a 4-degree-of-freedom (DOF) welding arm robot with an automatic feeding electrode. This technique is based on capturing the welding-path control points and the motion behavior of an expert welder. This is achieved by letting the welder move the robot end effector, which represents the welding torch, through the welding path. At the path control points, the position and speed are recorded using a vision system. Later, these data are retrieved by the robot to replicate the welding path. Several 2D paths are tested to assess the accuracy, programming time, and ease of use of the proposed approach in comparison with the common one. The results show that the proposed approach involves fewer steps and consumes less programming time. Moreover, programming can be accomplished by the welder without the need for an expert programmer. These enhancements will improve the share of robots in welding and similar industries.
1. Introduction
Welding is one of the most important and widely used manufacturing processes. It joins metals by melting and fusing them, which adds value to the product and the economy. In the manual welding process, the weld quality depends on the skillfulness of the welder. However, it is a physically demanding process and leads to repetitive injuries with bad effects on the welder's health due to the heat, radiation, and toxic gases. Owing to these disadvantages of manual operation, robotic welding has played an important role in industrial manufacturing in recent years [1]. This is because of its flexibility and intelligence, with the potential of performing tasks as well as humans, or even better, at acceptable cost and quality levels. The welding robot needs a program to set the trajectory profile and follow the predefined planned path. Robot programming approaches are classified into the off-line method, the teach-pendant method, and the lead-through approach [2].
The off-line programming method includes text-based and graphical programming environments. Text-based programming relies on conventional programming languages; it is time-consuming, costly, and needs a well-trained expert programmer. The graphical programming environment provides an alternative to its text-based counterpart, as it enables the use of simulation technology tools and robot kinematics [3, 4]. Therefore, the robot is programmed and simulated before moving to the real world. However, this method requires extra time and an expert programmer [5].
In the teach-pendant programming method, the operator uses the teach pendant buttons to move
the robot arm from point-to-point and save every point position. The robot can redo the process once
Downloaded from https://www.cambridge.org/core. University of Glasgow Library, on 11 Aug 2021 at 21:48:00, subject to the Cambridge Core terms of use, available
at https://www.cambridge.org/core/terms. https://doi.org/10.1017/S026357472100059X
2 Mohamed Hosni Mohamed Ali and Mostafa Rostom Atia
the whole program has been generated using these points. In welding and painting, defining the desired trajectory is difficult due to the distance between the programmer and the workplace [6].
Therefore, the teach-pendant programming method is not intuitive, especially for workpieces with complex geometry, where several teaching rounds are needed [7, 8]. More challenges arise when the application involves a tool under shared force control between the robotic arm and a human, such as peg-in-hole insertion and surface grinding [9].
When compared to the other approaches, the lead-through method is useful for teaching the robot arm the control points and the speed between them along the welding path [10, 11]. It is based on the physical movement of the manipulator by a skillful operator [12], while its motion is recorded and later replicated by the robot. However, one of the existing challenges in generating a robotic arm trajectory is the observation of a human demonstration. To replicate this motion, the programmer requires deep knowledge of robot motion modes and programming steps [13]. Therefore, it is not guaranteed that the tip of the robotic arm reaches the desired point.
The literature review reveals several difficulties in using robots in processes such as welding: the long programming and reprogramming time, the need for an expert in both the process and programming, and the consequently high cost. This limits the propagation of robots, especially in medium and small industries. Therefore, developing a fast and simple programming method is one of the main goals for researchers.
This paper aims to introduce a new programming approach based on lead-through programming and a vision system. Its contribution is to simplify the programming process to the level of technicians and to reduce the programming and reprogramming time. It is expected that this will reduce cost and complexity and hence increase the share of robots in the welding industry.
The experimental setup is explained in detail in Section 2. The method details are described in Section 3. The experimental work, including a discussion of the results and a comparison between the proposed approach and common practice, is in Section 4. Section 5 presents the conclusions and contributions.
2. Experimental setup
This section describes the experimental setup used in this research. The setup is used to test the proposed programming approach experimentally. Figure 1 shows the main components of the setup. It consists of a 4-degree-of-freedom (DOF) robotic arm with a 120 mm long pen as its end effector. The pen mimics the welding torch for safety and simplifies the test procedure. Besides, the setup includes a personal computer (PC) and a machine vision system. The test paths are drawn on a whiteboard 590 mm wide and 440 mm high, which contains a matrix of black dots with 5 mm spacing in the X and Y directions. Figure 2 shows an example of a test path.
The robot is a PhantomX Pincher robotic arm, which has four revolute joints. Every joint has a smart servo motor operating from 0° to 300°. Its reach is 350 mm vertically and 310 mm horizontally. Its smart actuators are connected in a daisy chain through a USB2Dynamixel adapter and controlled by an ArbotiX-M
robocontroller. The smart actuators can record, save, and play back their motions. The robot controller is connected directly to the PC by a USB 2.0 port. The PC has an Intel Core i7-7500U CPU @ 2.70 GHz and 32.0 GB of RAM.
The machine vision device is a Microsoft Kinect One for Windows, connected to the PC through a USB 3.0 port in an eye-to-hand (fixed) configuration and developed using the Software Development Kit (SDK) 2.0 and the graphical programming G language. It perceives depth based on its time-of-flight (ToF) technology [14] and includes a red, green, and blue (RGB) camera to capture videos and images. The captured images are processed using LabVIEW robotics software to identify the pen-back location as an indication of the pen tip, as shown in Fig. 3. The machine vision is calibrated using the National Instruments (NI) vision calibration tool. The aim of this calibration is to give the measurements in pixels a value in mm. The camera's axis should be as perpendicular as possible to the planned path to reduce distortion error, and the background is dark to enhance light contrast.
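The pixel-to-mm mapping produced by this calibration can be illustrated with a minimal Python sketch. This is our own illustration, not the NI tool's API; it uses the board's known 5 mm dot spacing to fix the image scale:

```python
import math

# Illustrative helpers (not the NI vision calibration API): the known
# 5 mm spacing of the board's dots fixes the image scale.
def mm_per_pixel(dot_a_px, dot_b_px, spacing_mm=5.0):
    """Estimate the scale from two adjacent calibration dots in the image."""
    dx = dot_b_px[0] - dot_a_px[0]
    dy = dot_b_px[1] - dot_a_px[1]
    return spacing_mm / math.hypot(dx, dy)

def px_to_mm(point_px, origin_px, scale):
    """Convert a pixel coordinate to mm relative to a chosen origin dot."""
    return ((point_px[0] - origin_px[0]) * scale,
            (point_px[1] - origin_px[1]) * scale)

# Two dots 10 px apart that are 5 mm apart in reality -> 0.5 mm/px.
scale = mm_per_pixel((100, 100), (110, 100))
print(px_to_mm((140, 100), (100, 100), scale))  # (20.0, 0.0)
```

In practice the NI tool also corrects lens and perspective distortion, which this constant-scale sketch ignores; hence the requirement above that the camera axis be near-perpendicular to the board.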
3. Method description
In this section, the proposed lead-through programming approach is described in steps. The programming is accomplished in two main steps: record and playback. In the record step, the operator grasps the stylus pen, as the robot end effector, and follows the desired path on the board as if welding it, as shown in Fig. 4. The back of the stylus pen is used as a pointer to the welding tip, as the pen length is known and the two ends are coaxial. During this motion, the joint angles and angular velocities are recorded by the robot controller. These data are transferred to and recorded on the PC. Moreover, the machine vision system records the path control points as the desired reference.
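Because the pen length (120 mm) is known and the two ends are coaxial, the tip location follows from the tracked back location by a fixed offset along the pen axis. A minimal NumPy sketch of this offset (our illustration, not the paper's code; the axis direction is assumed to be available, e.g. from the robot pose):

```python
import numpy as np

PEN_LENGTH_MM = 120.0  # pen length stated in the setup description

def tip_from_back(back_xyz, axis_unit):
    """Back marker and tip are coaxial: the tip lies one pen length
    along the pen axis from the tracked back point."""
    back = np.asarray(back_xyz, dtype=float)
    axis = np.asarray(axis_unit, dtype=float)
    axis = axis / np.linalg.norm(axis)  # tolerate a non-unit direction
    return back + PEN_LENGTH_MM * axis

print(tip_from_back((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)).tolist())  # [0.0, 0.0, -120.0]
```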
In the playback step, these data are used to generate the robot move-program code using the developed software. This program is intended to replicate the original move. The machine vision system observes and records the control points of the actual end-effector path. These data are later used to assess the system's accuracy.
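The record/playback cycle can be sketched as follows. This is a simplified illustration of the idea only: `read_joints` and `write_joints` are hypothetical placeholders for the actual controller I/O (the smart actuators' record and playback facility), which the paper does not list:

```python
import time

class LeadThroughRecorder:
    """Sketch of the two-step method: sample joint angles while the
    operator hand-guides the arm, then stream the samples back to
    the controller to replicate the motion."""

    def __init__(self, read_joints, write_joints, period_s=0.05):
        self.read_joints = read_joints    # callable -> list of joint angles
        self.write_joints = write_joints  # callable(list of joint angles)
        self.period_s = period_s          # sampling/replay period
        self.samples = []

    def record(self, n_samples):
        """Record step: store joint angles at a fixed rate."""
        for _ in range(n_samples):
            self.samples.append(self.read_joints())
            time.sleep(self.period_s)

    def playback(self):
        """Playback step: send the stored angles back in order."""
        for joints in self.samples:
            self.write_joints(joints)
            time.sleep(self.period_s)
```

Replaying the position samples at the original sampling rate also reproduces the demonstrated speed, which is why the recorded angular velocities matter for weld quality.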
In Fig. 6, all of the experiments applied to the second set are complex moves in two directions, such as a square shape. In this case, the robot's mechanical backlash is observed at the turn points. In path #4, the absolute error ranges from 0.0809 to 0.0866 mm, and points 4, 9, 15, and 21 are the turn points, with 0.0810, 0.0812, 0.0809, and 0.0814 mm absolute errors in the X and Y positive and negative directions. Similarly, in path #5, the absolute error ranges from 0.1007 to 0.1082 mm, and points 7, 13, 19, and 22 are the turn points, with 0.1030, 0.1020, 0.1030, and 0.1007 mm absolute errors in the X and Y positive and negative directions. Finally, in path #6, the absolute error ranges from 0.1717 to 0.1908 mm, and points 3, 9, 16, and 21 are the turn points, with 0.1770, 0.1718, 0.1717, and 0.1769 mm absolute errors in the X and Y positive and negative directions.
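The paper does not spell out its error formula; one common choice, sketched here as an assumption, is the Euclidean distance between each reference control point (recorded by the vision system during lead-through) and the corresponding replayed point:

```python
import math

def absolute_errors(reference_pts, replayed_pts):
    """Per-control-point Euclidean error between the reference path and
    the path replayed by the robot (both as (x, y) tuples in mm).
    Euclidean distance is our assumption, not the paper's stated formula."""
    return [math.dist(r, p) for r, p in zip(reference_pts, replayed_pts)]

ref = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]   # illustrative points, mm
rep = [(0.1, 0.0), (10.0, 0.1), (10.1, 10.0)]   # illustrative replay, mm
print([round(e, 4) for e in absolute_errors(ref, rep)])  # [0.1, 0.1, 0.1]
```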
For welding applications, the results show small, acceptable absolute errors in retrieving the original control points on the path. These errors have several sources. One source is mechanical inaccuracies such as backlash and inertia. This is clear in the error at the first control point, where the robot starts from an arbitrary position before reaching the desired point. It also appears at every sharp turn and
when the movement direction is reversed. Another source is the deviation of the camera axis from perpendicularity to the path plane. The accuracy of the points degrades with the distance from the focus point. This is clear in straight-line paths. Human error is one of the major sources of error, as the operator follows the path using his own vision and motor ability during lead-through programming.

Figure 7. (a) PhantomX-Pincher CAD model and (b) coordinate frame assignment, where θ1 = Base, θ2 = Shoulder, θ3 = Elbow, and θ4 = Wrist.
The Robotics Toolbox [16] is used for solving the kinematics problem of the robot. For example, Fig. 8 shows the user interface while displaying the numerical solution of the inverse kinematics problem for the end-effector coordinates at a certain point. Based on the DH convention, the transformation matrix from frame i − 1 to frame i is given by:
$$T_i^{\,i-1} = \begin{pmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & a_i \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & a_i \sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{pmatrix} \tag{1}$$
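Equation (1) translates directly into code. A short NumPy sketch of the per-joint transform and the forward-kinematics chain (our illustration; the Pincher's actual DH parameter values are not reproduced here):

```python
import numpy as np

def dh_transform(theta, alpha, a, d):
    """Homogeneous transform from frame i-1 to frame i, per Eq. (1)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms for a serial arm; dh_rows is a
    list of (theta, alpha, a, d) tuples, one per joint."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# One joint rotated 90 deg with a 1-unit link puts the tip at (0, 1, 0).
tip = forward_kinematics([(np.pi / 2, 0.0, 1.0, 0.0)])[:3, 3]
print(np.round(tip, 6))  # [0. 1. 0.]
```

The Robotics Toolbox performs the same chaining internally; a numerical inverse-kinematics solver then searches the joint angles so that this chain reaches a target end-effector pose.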
Figure 9. Robot simulation and the executed paths in the real world.
Figure 10. Process sequences of the proposed approach and the common practice.
Table IV. Comparison between the proposed approach and the common practice.
Proposed approach Common practice
Time of programming Shorter time Longer time
Expert programmer Not required Required
CAD simulation Not required Required
Kinematics solutions Not required Required
Product Small piece Large piece
the comparison between the two approaches. Table V shows the source of error between the proposed
approach and common practice. Figure 11 shows a comparison between the programming times of the
two approaches for three paths. The programming time reduction (TR%) in the proposed approach is
significant.
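TR% here is the usual relative reduction. A one-line sketch with purely illustrative timings (the measured values are those plotted in Fig. 11 and are not reproduced here):

```python
def time_reduction_pct(t_common_s, t_proposed_s):
    """Programming-time reduction of the proposed approach relative to
    common practice, as a percentage."""
    return 100.0 * (t_common_s - t_proposed_s) / t_common_s

# Hypothetical timings only: 1 h of conventional programming vs. a
# 30 s lead-through demonstration.
print(round(time_reduction_pct(3600.0, 30.0), 1))  # 99.2
```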
5. Conclusion
In conclusion, this paper introduces a new lead-through programming approach with the assistance of a vision system. The proposed approach is applied to a 4-DOF welding arm robot with an automatic feeding electrode and is tested experimentally. A comparison between the proposed and common approaches is carried out. The proposed approach has fewer steps and less complexity in programming and reprogramming, and it reduces programming time by more than 99%. As the programming can be done by the skilled welder, there is no need for the rare and
expensive expert programmer. Path-following accuracy is expected to increase, as the welder follows the actual path with the desired offset and with all manufacturing deviations. However, since the inputs are based on human senses, the proposed approach is limited to this range of accuracy. Therefore, it is suitable for programming and reprogramming robots used in small and medium enterprises with acceptable accuracy. This will increase the robot share in global industries.
References
[1] Z. Wenming, D. Zhihai and L. Zhanqi, “Present Situation and Development Trend of Welding Robot,” 2nd International
Conference on Materials Science, Machinery and Energy Engineering (MSMEE) (2017).
[2] M. H. A. L. Wei and L. S. Yong, “An Industrial Application of Control of Dynamic Behavior of Robots - A Walk
Through Programmed Welding Robot,” Proceedings of the IEEE, International Conference on Robotics and Automation,
San Francisco (2000).
[3] H.-C. Lin, Y. Fan, T. Tang and M. Tomizuka, “Human Guidance Programming on a 6-DoF Robot with Collision Avoidance,”
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea (2016).
[4] M. Karlsson, A. Robertsson and R. Johansson, “Autonomous Interpretation of Demonstrations for Modification of
Dynamical Movement Primitives,” IEEE International Conference on Robotics and Automation (ICRA), Singapore (2017).
[5] L. Bascetta, G. Ferretti, G. Magnani and P. Rocco, “Walk-through programming for robotic manipulators based on
admittance control,” Robotica 31(7), 1143–1153 (2013).
[6] L. Qi, D. Zhang and J. Li, “A Lead-Through Robot Programming Approach Using A 6-DOF Wire-based Motion Tracking
Device,” Proceedings of the 2009 IEEE, International Conference on Robotics and Biomimetics, Guilin, China (2009).
[7] T.-W. Kim, K.-Y. Lee, J. Kim, M.-J. Oh and J. H. Lee, “Wireless Teaching Pendant for Mobile Welding Robot in Shipyard,”
Proceedings of the 17th World Congress, The International Federation of Automatic Control, Seoul, Korea (2008).
[8] I. W. Muzan, T. Faisal, H. Al-Assadi and M. Iwan, “Implementation of Industrial Robot for Painting Applications,”
International Symposium on Robotics and Intelligent Sensors (IRIS) (2012).
[9] H.-C. Lin, T. Tang, Y. Fan, Y. Zhao, M. Tomizuka and W. Chen, “Robot Learning from Human Demonstration with Remote
Lead Through Teaching,” European Control Conference (ECC) (2016) pp. 388–394.
[10] A. A. Mohammed and M. Sunar, “Kinematics Modeling of a 4-DOF Robotic Arm,” International Conference on Control,
Automation and Robotics, Singapore (2015).
[11] A. Somasundar and G. Yedukondalu, “Robotic Path Planning and Simulation by Jacobian Inverse for Industrial Applications,”
International Conference on Robotics and Smart Manufacturing (RoSMa) (2018).
[12] L. A. Ferreira, Y. L. Figueira, I. F. Iglesias and M. Á. Souto, “Offline CAD-based Robot Programming and Welding
Parametrization of a Flexible and Adaptive Robotic Cell Using Enriched CAD/CAM System for Shipbuilding,” 27th
International Conference on Flexible Automation and Intelligent Manufacturing, Modena, Italy (2017).
[13] H.-I. Lin, Y.-C. Liu and Y.-H. Lin, “Intuitive kinematic control of a robot arm via human motion,” Procedia Eng. 79, 411–416
(2014). https://www.sciencedirect.com/science/article/pii/S1877705814009412.
[14] E. Lachat, H. Macher, M. A. Mittet, T. Landes and P. Grussenmeyer, “First Experiences with Kinect v2 Sensor for Close
Range 3D Modeling,” 6th International Workshop 3D-ARCH, Avila, Spain (2015).
[15] B. Prashanth and A. R. S, “Design, Fabrication and Control of Four Degrees of Freedom Serial Manipulator,” International
Conference on Smart Systems and Inventive Technology (ICSSIT) (2018).
[16] P. Corke, “Using the Robotics Toolbox with a real robot,” In: Robotics, Vision & Control (Springer, 2013).
Cite this article: M. H. M. Ali and M. R. Atia. “A lead through approach for programming a welding arm robot using machine
vision”, Robotica. https://doi.org/10.1017/S026357472100059X