
Industrial Tracking Camera and Product Vision Detection System

2019, Journal of Mechanical Engineering Research and Developments

Many industrial and commercial applications today are beginning to use autonomous systems to increase productivity and to reduce production and manpower costs. Most of these applications are only semi-autonomous: they still need assistance from a human to start up or to receive continual instructions. Productivity can be improved by using image processing techniques that exploit the processing capabilities of the camera together with more efficient vehicles. This research paper describes a vision tracking platform with a USB camera that performs a distinguishing operation through real-time video tracking of moving products. The industrial camera tracking system is designed to track and sort products based on a shape quality criterion, which means that products of low quality (bad shape) are rejected. The platform is used to distinguish different shapes of products and to carry out the tracking operation. The acquired video is displayed on the computer by means of a video acquisition and image processing toolkit. The system determines the position of the part, detects the product, and then sends information to the control system to remove unwanted products.

Journal of Mechanical Engineering Research and Developments (JMERD) 42(4) (2019) 277-280
ISSN: 1024-1752, CODEN: JERDFO, DOI: http://doi.org/10.26480/jmerd.04.2019.277.280

RESEARCH ARTICLE

INDUSTRIAL TRACKING CAMERA AND PRODUCT VISION DETECTION SYSTEM

Wisam T. Abbood1, Hiba K. Hussein1 and Oday I. Abdullah2,3*

1 Department of Automated Manufacturing Engineering, University of Baghdad, Baghdad, Iraq
2 Energy Engineering Department, University of Baghdad, Baghdad-Aljadria 47024, Iraq
3 Hamburg University of Technology, Hamburg, Germany
* Corresponding author e-mail: [email protected]

This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Article history: Received 14 July 2019; Accepted 20 August 2019; Available online 12 September 2019

Keywords: industrial tracking, objects tracking, moving products tracking, sorting products

1. INTRODUCTION

Vision systems are applied to inspect products in industrial factories. They speed up product detection and reduce inspection time. A vision system consists of a vision sensor (camera), in the form of line or area sensors, that feeds digital images or video into a host computer. An image processing algorithm is then applied to analyse the digital image or video so that defects, shapes or colours can be distinguished. Analysis filters are often applied for this distinguishing operation, such as edge detection, surface defect detection, and detection of the intensity of light reflected from the product surface. Video processing algorithms are more complicated than image processing algorithms, and video tracking applies such algorithms to detect the product. Recently, video tracking has been used in a wide range of engineering applications; it appears in different fields such as face tracking, colour tracking and object tracking. In the industrial sector, tracking has been used for products, factory workers and workers' clothing. Video tracking is based on image processing of mono video and on object detection algorithms.
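As an illustration of the frame-level operations mentioned above (converting each captured frame to a mono image and applying an analysis filter such as edge detection), a minimal Python/OpenCV sketch is given below. The camera index, the Canny thresholds and the window names are illustrative assumptions and are not taken from the developed system.

```python
import cv2

# Open the USB camera (device index 0 is an assumption; adjust for the actual camera).
capture = cv2.VideoCapture(0)

while True:
    ok, frame = capture.read()          # grab one colour frame from the video stream
    if not ok:
        break

    # Convert the colour frame to a mono (grey-level) image before analysis.
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Example analysis filter: edge detection (threshold values are illustrative only).
    edges = cv2.Canny(grey, 100, 200)

    cv2.imshow("mono frame", grey)
    cv2.imshow("edges", edges)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to stop
        break

capture.release()
cv2.destroyAllWindows()
```

In a production setting, the same acquisition loop would feed the filtered frames to the detection and tracking algorithms described in the following sections.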
The aims of this research are to detect the products, to track the good products with the camera, and to reject the bad products. In addition, the following steps are carried out:

1. An image set of the product, taken at the selected illumination angle, is saved as the template.
2. A mono (grey-level) video is produced to avoid colour analysis when analysing the video.
3. An analysis algorithm is applied to detect the matching of pixels between the saved template and the mono video of the product.
4. The good product detected by the matching operation is tracked.

Probabilistic tracking has been carried out with network-enabled smart cameras for object tracking [1]. An automatic tracking system based on real-time machine vision has been applied to motion analysis in the industrial field [2]. Inter-camera tracking with non-overlapping fields of view addresses many targets, such as maintaining the identities of people as they move from one camera to another [3]. High-accuracy model tracking is easy to use and easy to integrate under difficult environmental conditions [4]. A tracking system based on an infrared laser device has been built for very large indoor environments of several hundred square metres to obtain six degrees of freedom [5]. A camera-probe based fuel tracking system has been applied to quantify the motion of fuel particles at the surface of fluidized beds operated under hot conditions [6]. An optical tracking system has been used in surgical procedures, consisting of a robot system that autonomously positions tools at points correlated with imaging techniques such as MR and CT [7]. Real-time tracking of a predefined object using a single wireless camera has been demonstrated, for example object tracking based on shape and colour analysis [8]. Object tracking with a multi-agent robot system can handle operations using stereo vision in an unstructured laboratory environment [9]. Path tracking for a robot controlled with camera-space manipulation has been used to define a trajectory over an arbitrary surface [10]. Precise robot positioning and path tracking are based on vision and calibration-free robot control methods such as camera-space manipulation, with tracking performed by an industrial robot over large surfaces of arbitrary shape, size and orientation [11]. Real-time multi-vehicle tracking and counting has been performed under a fisheye camera based on point tracking [12]. A novel real-time 3D-model tracking method using a monocular camera can provide an accurate 3D location of the tracked object [13]. Real-time iris tracking has been implemented with a smart camera, LabVIEW and vision software tools that utilise tracking algorithms [14]. One important type of visual tracking technology works on the basis of an effective image processing method to capture the dynamic movement of an overhead crane [15]. A tracking camera mounted on a three-joint RRP (Revolute-Revolute-Prismatic) robot structure provides position, velocity and acceleration control of the vehicle [16]. Multiple symmetric and non-symmetric objects have been tracked in real time in a dynamic environment [17]. Visual trajectory tracking has also been achieved with iterative learning [18]. An industrial robot has been used to track an object using laser beam projection, with the object moving in the 2D plane of the workspace [19]. Many investigations deal with human tracking approaches applied in industrial environments [20-22], sometimes using an RGB-D smart camera [23-25] or using high-visibility clothing for tracking [26].
In some cases, multiple cameras have been used for tracking [27], or a wireless camera network has been employed [28]. A flexible camera inspection system has been used for sensing and tracking of 3D position [29]. A face tracking system with a high-quality classifier has been proposed [30]. Multiple cameras have been used for tracking a mobile object [31]. An active smart node has been used to detect an object and to provide the relation between cameras for tracking the object [32]. Real-time camera tracking has been performed using depth maps [33].

In this paper, a product vision tracking system was designed based on a camera and on the shape quality of the product. The system tracks the moving products and detects the shape quality of each product (standard shape and dimensions). The camera has the ability to move in different directions (right, left, up and down). The developed system can determine the percentage of good-quality products; 100 samples of two products of different shape and size were used.

2. TRACKING CAMERA AND VISION DETECTION SYSTEM

The tracking camera and vision detection system consists of a camera, an Arduino microcontroller, a conveyor belt and a rejection arm. The camera is fixed on two servo motors: the first servo motor controls the camera movement to the right or left, and the second servo motor controls the movement up or down. These motors are connected to the Arduino microcontroller, as shown in Figure 1. The conveyor belt is driven by a 24 V DC motor. The rejection arm consists of a metal plate, 250 mm wide and 1200 mm long, connected to a servo motor that rotates the arm to reject the bad products, as shown in Figure 2. The basic information of the MG996R servo motors used for the tracking camera and the rejection part is listed in Table 1.

Figure 1: Tracking camera with servo motors connected to the Arduino microcontroller

Figure 2: Tracking camera with conveyor belt and rejection part

Table 1: Basic information of the MG996R servo motor
Modulation: Analog
Torque: 4.8 V: 25.00 oz-in (1.80 kg-cm)
Speed: 4.8 V: 0.12 sec/60°
Weight: 0.32 oz (9.0 g)
Dimensions: Length 0.91 in (23.0 mm), Width 0.48 in (12.2 mm), Height 1.14 in (29.0 mm)
Motor Type: 3-pole
Gear Type: Plastic
Rotation/Support: Bushing
Pulse Width: 500-2400 µs
Connector Type: JR

3. THE TRACKING AND DETECTION OF THE PRODUCT

The tracking and detection of the object (product) is based on an algorithm that matches a reference picture of the product shape or size while the product is moving in the real-time video; this reference is called the template. The template is a common target representation, which holds the positional information for all pixel values, for grey level or colour, within the target area [34-36]. The target area can be bounded with a rectangle or an ellipse, defined as [34]:

$X_i = (u_i, v_i, h_i, w_i, \theta_i)$  (1)

where $X_i$ is the state of the target area in the video, $Y_i = (u_i, v_i)$ defines the centre, $h_i$ is the height, $w_i$ is the width and $\theta_i$ is the (optional) clockwise rotation. The distance functions between the template and a candidate region are the L1 norm [34]:

$d_{L1}(X_i, I_T, I) = \sum_{w \in I_T} \left| I(A(w, X_i)) - I_T(w) \right|$  (2)

where $A$ is a transformation that, given the state $X_i$, maps a pixel position $w$ in the coordinate system of the template $I_T$ onto the coordinate system of the input image $I$ from the video; the L2 norm [34]:

$d_{L2}(X_i, I_T, I) = \sum_{w \in I_T} \left( I(A(w, X_i)) - I_T(w) \right)^2$  (3)

and the normalised cross-correlation coefficient [34]:

$d_{C}(X_i, I_T, I) = 1 - \frac{1}{|I_T|} \sum_{w \in I_T} \frac{\left( I(A(w, X_i)) - \bar{I} \right)\left( I_T(w) - \bar{I}_T \right)}{\sigma_I \, \sigma_{I_T}}$  (4)

where $\bar{I}_T$ and $\sigma_{I_T}$ denote the mean and the standard deviation of the pixel values of the template $I_T$, and $\bar{I}$ and $\sigma_I$ those of the image region.
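To make the role of Equations (2)-(4) concrete, the following NumPy sketch (a minimal illustration, not the authors' implementation) evaluates the three template-distance measures for a candidate image region that has already been mapped into the coordinate system of the template, i.e. after the transformation A(w, Xi) has been applied. The function names and the small epsilon guard against a zero standard deviation are assumptions added for the example.

```python
import numpy as np

def l1_distance(region, template):
    """Eq. (2): sum of absolute grey-level differences over the template support."""
    return np.sum(np.abs(region.astype(float) - template.astype(float)))

def l2_distance(region, template):
    """Eq. (3): sum of squared grey-level differences over the template support."""
    diff = region.astype(float) - template.astype(float)
    return np.sum(diff ** 2)

def ncc_distance(region, template, eps=1e-9):
    """Eq. (4): one minus the normalised cross-correlation coefficient."""
    r = region.astype(float)
    t = template.astype(float)
    r_zero_mean = r - r.mean()
    t_zero_mean = t - t.mean()
    # Normalise by the number of template pixels and by both standard deviations.
    ncc = np.sum(r_zero_mean * t_zero_mean) / (t.size * (r.std() + eps) * (t.std() + eps))
    return 1.0 - ncc
```

A small value of d_L1, d_L2 or d_C indicates a good match, so the candidate state X_i that minimises the chosen distance is taken as the current position of the tracked product.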
The operation of tracking the product is based on matching the product template $I_T$ with the real-time video of the target product, represented by $X_i$ as in Equation (1), after the video has been converted to grey code. The analysis of the video with the first-order and second-order difference functions of the template, as in Equations (2) and (3), gives an indication of the movement of the target in video pixel coordinates. Equation (4) gives the cross-correlation coefficient of the template function, which expresses the variation of the pixel values of the target compared with the template image.

The flowchart of the developed approach to tracking the product is shown in Figure 3. The template is input at the beginning, and the counters for tracked products and for rejected products are initialised to zero. The product video is then converted from RGB to grey code, and the system checks whether the template matches the target product. If it matches, the product tracking counter is increased by 1 and a boundary box is created around the target product in the real-time video until the product passes through the system; if it does not match, the rejected-product counter is increased by 1. Finally, the rejection part pushes the bad (rejected) product away from the conveyor belt.

Figure 3: Flowchart of the developed approach to tracking the product based on matching (open the video input; load the template; initialise the counters Ip = 0 and In = 0; convert the RGB video to grey-code video; if the product in the video matches the template, increment the tracking counter, create a boundary box around the target product and continue tracking; otherwise increment the rejection counter and reject the product)

The operation of the system starts when the product arrives at the conveyor belt; the camera then detects the shape of the product and tracks it from the beginning so that bad-quality products can be rejected. The rejection servo motor rotates when the product has a low-quality (bad) shape or does not have the standard dimensions, in order to remove it from the conveyor belt.
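The matching-and-rejection loop summarised in Figure 3 can be sketched as below. This is a minimal illustration under several assumptions: OpenCV's normalised cross-correlation template matching (cv2.matchTemplate with TM_CCOEFF_NORMED) stands in for the matching step of Equation (4); the acceptance threshold, the serial port name and the "REJECT" command string sent to the Arduino are hypothetical; and the per-frame counting shown here would in practice be done once per product as it crosses the field of view.

```python
import cv2
import serial  # pyserial, used here to signal the Arduino-driven rejection arm

MATCH_THRESHOLD = 0.8                            # illustrative acceptance threshold
arduino = serial.Serial("/dev/ttyUSB0", 9600)    # port name is an assumption

template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # saved product template
capture = cv2.VideoCapture(0)

tracked_count = 0    # products matched to the template and tracked
rejected_count = 0   # products that did not match and were rejected

while True:
    ok, frame = capture.read()
    if not ok:
        break

    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # RGB video -> grey-code video

    # Normalised cross-correlation between the template and the current frame.
    result = cv2.matchTemplate(grey, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)

    if max_val >= MATCH_THRESHOLD:
        tracked_count += 1
        h, w = template.shape
        # Create the boundary box around the tracked product and continue tracking.
        cv2.rectangle(frame, max_loc, (max_loc[0] + w, max_loc[1] + h), (0, 255, 0), 2)
    else:
        rejected_count += 1
        arduino.write(b"REJECT\n")   # tell the rejection arm to push the product away

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
arduino.close()
cv2.destroyAllWindows()
```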
4. RESULTS AND DISCUSSIONS

The results of product detection and tracking are obtained from the matching between the video pixels of the product and the pixels of the product sample; the detection and tracking therefore depend on the size and shape of the standard product. The tracking camera can detect and track multiple products in one field of view, and only the undesired (defective) parts, i.e. the products that do not match the reference product, are rejected. This operation is performed in real time, and the conveyor belt speed depends on the tracking servo motors on which the camera is fixed. Two different sample products were used, a Pepsi can and a plastic gear, with 100 items of each product used to verify the accuracy of the results of the developed system. The conveyor belt speed is 4 cm/s. Based on the results, the system efficiency in detecting the product from its shape and size is 100%. The results of the system are shown in Figures 4-8. The acceptance and tracking of the Pepsi can, which continues moving to the filling part, is shown in Figure 4. The defective Pepsi cans are rejected by the rejection part of the system, as shown in Figure 5. The system can distinguish the accepted product from the rejected product when both are in the same field of view of the camera, as shown in Figure 6. The system can detect a small defect of about 3 mm × 2 mm in the plastic gear (dimensions: 20 mm diameter and 2 mm thickness). The accepted gear is shown in Figure 7 and the rejected gear in Figure 8, where the dimensions of the defect can be seen relative to the size of the gears. The target products were matched with the reference products by the detection process of the developed system among all the defective products.

Figure 4: Product detection and tracking (Pepsi can)

Figure 5: Defective products (Pepsi can) that the system does not match and track

Figure 6: Detection and tracking of the accepted products among the products that have defects

Figure 7: Detection and tracking of the gear product

Figure 8: Defective gear products that the system does not match and track

5. CONCLUSIONS

In this research work, a new system was developed that has the ability to detect and track any kind of product according to its shape and size, even products with complex shapes. For a good product, the camera tracks the product to the end of the conveyor belt. If a bad product is detected, the system sends the rejection decision to the rejection part at the suitable time and in real time. The only disadvantage of the developed system is its limited ability to detect the product if it moves beyond a certain distance from the tracking camera or if the illumination angle of the camera changes, because the pixels of the template picture change when the illumination angle changes. In future work, the present system will be enhanced to operate faster so that it is more suitable for fast production processes.

REFERENCES

[1] Fleck, S., Lanwer, S., Straßer, W. 2005. A smart camera approach to real-time tracking. In 2005 13th European Signal Processing Conference, IEEE, 1-4.
[2] Li, S., Liu, W., Xin, D., Qiao, S. 2011. An automatic identification tracking system applied to motion analysis of industrial field. In 2011 International Conference on Electric Information and Control Engineering, IEEE, 1151-1154.
[3] Cai, Y., Medioni, G. 2014. Exploring context information for inter-camera multiple target tracking. In IEEE Winter Conference on Applications of Computer Vision, IEEE, 761-768.
[4] Wuest, H., Engekle, T., Wientapper, F., Schmitt, F., Keil, J. 2016. From CAD to 3D tracking - Enhancing & scaling model-based tracking for industrial appliances. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), IEEE, 346-347.
[5] Scheer, F., Müller, S. 2012. Indoor tracking for large area industrial mixed reality. In ICAT/EGVE/EuroVR, 21-28.
[6] Sette, E., Vilches, T.B., Pallarès, D., Johnsson, F. 2016. Measuring fuel mixing under industrial fluidized-bed conditions - A camera-probe based fuel tracking system. Applied Energy, 163, 304-312.
[7] Šuligoj, F., Jerbić, B., Švaco, M., Šekoranja, B., Mihalinec, D., Vidaković, J. 2015. Medical applicability of a low-cost industrial robot arm guided with an optical tracking system. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 3785-3790.
[8] Gornea, D., Popescu, D., Stamatescu, G., Fratila, R. 2014. Monocamera robotic system for tracking moving objects. In 2014 9th IEEE Conference on Industrial Electronics and Applications, IEEE, 1820-1825.
[9] Šuligoj, F., Šekoranja, B., Švaco, M., Jerbić, B. 2014. Object tracking with a multiagent robot system and a stereo vision camera. Procedia Engineering, 69, 968-973.
[10] Bonilla, I., Mendoza, M., Gonzalez-Galvan, E.J., Chavez-Olivares, C., Loredo-Flores, A., Reyes, F. 2012. Path-tracking maneuvers with industrial robot manipulators using uncalibrated vision and impedance control. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42(6), 1716-1729.
[11] González-Galván, E.J., Chávez, C.A., Bonilla, I., Mendoza, M., Raygoza, L.A., Loredo-Flores, A., Zhang, B. 2011. Precise industrial robot positioning and path-tracking over large surfaces using non-calibrated vision. In 2011 IEEE International Conference on Robotics and Automation, IEEE, 5160-5166.
[12] Wang, W., Gee, T., Price, J., Qi, H. 2015. Real time multi-vehicle tracking and counting at intersections from a fisheye camera. In 2015 IEEE Winter Conference on Applications of Computer Vision, IEEE, 17-24.
[13] Zhu, W., Wang, P., Li, F., Su, J., Qiao, H. 2015. Real-time 3D model-based tracking of work-piece with monocular camera. In 2015 IEEE/SICE International Symposium on System Integration (SII), IEEE, 777-782.
[14] Mehrubeoglu, M., Pham, L.M., Le, H.T., Muddu, R., Ryu, D. 2011. Real-time eye tracking using a smart camera. In 2011 IEEE Applied Imagery Pattern Recognition Workshop (AIPR), IEEE, 1-7.
[15] Chang, C.Y., Lie, H.W. 2012. Real-time visual tracking and measurement to control fast dynamics of overhead cranes. IEEE Transactions on Industrial Electronics, 59(3), 1640-1649.
[16] Altan, A., Hacioğlu, R. 2014. The controller of the camera used in target tracking for unmanned vehicle with model predictive controller. In 2014 22nd Signal Processing and Communications Applications Conference (SIU), IEEE, 1686-1689.
[17] Akkaladevi, S., Ankerl, M., Heindl, C., Pichler, A. 2016. Tracking multiple rigid symmetric and non-symmetric objects in real-time using depth data. In 2016 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 5644-5649.
[18] Jia, B., Liu, S., Liu, Y. 2015. Visual trajectory tracking of industrial manipulator with iterative learning control. Industrial Robot: An International Journal, 42(1), 54-63.
[19] Aouf, N., Rajabi, H., Rajabi, N., Alanbari, H., Perron, C. 2004. Visual object tracking by a camera mounted on a 6DOF industrial robot. In IEEE Conference on Robotics, Automation and Mechatronics, IEEE, 1, 213-218.
[20] Najmaei, N., Kermani, M.R., Al-Lawati, M.A. 2011. A new sensory system for modeling and tracking humans within industrial work cells. IEEE Transactions on Instrumentation and Measurement, 60(4), 1227-1236.
[21] Mosberger, R., Andreasson, H. 2013. An inexpensive monocular vision system for tracking humans in industrial environments. In 2013 IEEE International Conference on Robotics and Automation, IEEE, 5850-5857.
[22] Papaioannou, S., Markham, A., Trigoni, N. 2017. Tracking people in highly dynamic industrial environments. IEEE Transactions on Mobile Computing, 16(8), 2351-2365.
[23] Carraro, M., Munaro, M., Menegatti, E. 2016. Cost-efficient RGB-D smart camera for people detection and tracking. Journal of Electronic Imaging, 25(4), 041007.
[24] Munaro, M., Lewis, C., Chambers, D., Hvass, P., Menegatti, E. 2016. RGB-D human detection and tracking for industrial environments. In Intelligent Autonomous Systems 13, Springer, Cham, 1655-1668.
[25] Lieberknecht, S., Huber, A., Ilic, S., Benhimane, S. 2011. RGB-D camera-based parallel tracking and meshing. In 2011 10th IEEE International Symposium on Mixed and Augmented Reality, IEEE, 147-155.
[26] Mosberger, R., Andreasson, H., Lilienthal, A.J. 2013. Multi-human tracking using high-visibility clothing for industrial safety. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 638-644.
[27] Ragaglia, M., Bascetta, L., Rocco, P. 2014. Multiple camera human detection and tracking inside a robotic cell: an approach based on image warping, computer vision, k-d trees and particle filtering. In 2014 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO), IEEE, 2, 374-381.
[28] Devi, G.U., Priyan, M.K., Gokulnath, C. 2018. Wireless camera network with enhanced SIFT algorithm for human tracking mechanism. International Journal of Internet Technology and Secured Transactions, 8(2), 185-194.
[29] Hatcher, C., Ruhge, F.R. 2017. U.S. Patent No. 9,681,107. Washington, DC: U.S. Patent and Trademark Office.
[30] Bigioi, P., Pososin, A., Gangea, M., Petrescu, S., Corcoran, P. 2012. U.S. Patent No. 8,155,397. Washington, DC: U.S. Patent and Trademark Office.
[31] Iwase, Y., Imaizumi, M. 2012. U.S. Patent No. 8,115,814. Washington, DC: U.S. Patent and Trademark Office.
[32] Cheng, S.P., Jang, L.G., Kuo, J.Y., Chuang, J.H. 2012. U.S. Patent No. 8,218,011. Washington, DC: U.S. Patent and Trademark Office.
[33] Newcombe, R., Izadi, S., Molyneaux, D., Hilliges, O., Kim, D., Shotton, J.D.J., Kohli, P., Fitzgibbon, A., Hodges, S.E., Butler, D.A. 2016. Real-time camera tracking using depth maps. U.S. Patent No. 9,242,171. Microsoft Technology Licensing LLC.
[34] Maggio, E., Cavallaro, A. 2011. Video Tracking: Theory and Practice. John Wiley & Sons.
[35] Shi, J., Tomasi, C. 1994. Good features to track. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE, 593-600.
[36] Koller, D., Daniilidis, K., Nagel, H.H. 1993. Model-based object tracking in monocular image sequences of road traffic scenes. International Journal of Computer Vision, 10(3), 257-281.