Strojniški vestnik - Journal of Mechanical Engineering 62(2016)4, 207-212
© 2016 Journal of Mechanical Engineering. All rights reserved. DOI:10.5545/sv-jme.2015.3227
Original Scientific Paper
Received for review: 2015-11-19; Received revised form: 2016-03-17; Accepted for publication: 2016-03-24

Adaptive Robotic Deburring of Die-Cast Parts with Position and Orientation Measurements Using a 3D Laser-Triangulation Sensor

Hubert Kosler1 - Urban Pavlovcic2,* - Matija Jezersek2 - Janez Mozina2
1 Yaskawa Motoman, Slovenia
2 University of Ljubljana, Faculty of Mechanical Engineering, Slovenia

A system for adaptive robotic deburring with correction of errors in workpiece positioning is presented. The correction is based on 3D measurements of the workpiece's surface and its registration to the target surface, measured on a reference workpiece. The surface measurement is performed with a laser-triangulation profilometer. The reference tool path is determined by robot teaching on the reference, already deburred, workpiece. The positioning errors of the currently processed workpiece are compensated by adapting the tool path, i.e., rotating and translating it in accordance with the registration results. The experiments showed that the average precision of the localization is 0.06 mm and the average bias between the true and measured values is 0.23 mm. The developed adaptive system is also applicable in other similar applications where it is difficult to ensure repeatable clamping of the workpiece.

Keywords: adaptive robotic machining, deburring, laser triangulation, position error correction, localization

Highlights
• A new, simplified method for workpiece position and orientation correction is presented.
• The 3D measurement of the workpiece is based on a laser-triangulation technique.
• The differences in position and orientation are calculated using an iterative closest point algorithm.
• The tool path of the current workpiece is adapted by rotating and translating the reference tool path in accordance with the measured positioning error.
• The method is especially applicable to manufacturing lines with large batch sizes, such as the robotic deburring of die-cast parts.

0 INTRODUCTION

Burr formation (see Fig. 1) is a major concern in the surface and edge finishing of workpieces, since it may injure a human worker and reduce the quality of the workpieces, [1] and [2]. To remove the burr, another process must be introduced into the manufacturing line, i.e., deburring. Unfortunately, manual deburring is time consuming, costly and demands a very high level of skill and experience to maintain consistency [3]. However, it is still common, even in today's most fully automated factories [4]. The process itself adds little or no value to the product, but for some parts its cost can be as high as 35 % of the total cost of the part, [5] and [6]. Therefore, the need to automate the process is obvious.

Most attempts at deburring automation are based on the use of robots to manipulate the workpiece or the deburring tool along a predefined path. Usually, the robots are programmed manually by their operators, a process often referred to as robot teaching. The conventional robot-deburring approach is based on the assumptions that the workpiece has no defects and is located at a known position [7]. Consequently, the robot can travel along a rigidly programmed path [8]. Unfortunately, those assumptions are not always true: die-cast workpieces may vary in geometry to a certain degree, and position and orientation errors occur when the workpiece is fixed in the jig or grasped by the robot. While we can realistically assume that workpieces with defects are removed from the production line in steps prior to deburring, differences in the orientation and position between grasped workpieces are inevitable.
Minimal misalignments can be neglected when active force control is introduced, [5] and [9]. However, with increasing misalignment the accuracy of the deburring decreases and, consequently, too much or too little material is removed. That is why minimizing these differences prior to deburring is helpful [10]. This issue is traditionally solved either by specially designed fixtures that ensure a repeatable workpiece position or by measuring the position and orientation of the workpiece's key features [7]. The latter approach is more suitable when the batch size is large.

Several authors have developed systems to correct these misalignments in robot deburring or other robotized processes. Song et al. [10] presented an approach where the tool path is generated based on a CAD model; the taught points are then matched to the generated tool path to obtain the modified tool path. Habibi and Pescaru [11] registered a patent for a system that trains a robot to recognize and localize objects using 3D vision. Biegelbauer and Vincze [12] measured the surface in 3D, segmented the range image and fitted a cylinder to detect the actual position of a bore. A system that scans and localizes workpieces in 3D for assembly and pick-and-place operations was presented by Skotheim et al. [13]. Rajaraman et al. [14] developed a laser-scanner-based localization system for automatic robot welding.

In contrast to the presented approaches, our method preserves the robot teaching step, which is made on the reference workpiece. During normal operation the workpiece misalignment relative to the reference is measured and the robot tool path is adapted accordingly.

*Corr. Author's Address: University of Ljubljana, Faculty of Mechanical Engineering, Aškerčeva 6, 1000 Ljubljana, Slovenia, urban.pavlovcic@fs.uni-lj.si
A custom laser-triangulation sensor for a harsh industrial environment was developed for 3D measurements of the surface of the currently processed workpiece. The positional misalignment is then determined using a registration algorithm that calculates the misalignment relative to the reference workpiece. This approach requires minimal adjustment of the system and only one additional software module.

1 EXPERIMENTAL SETUP

The deburring of a die-cast part is presented in Fig. 1. The burr (indicated by orange arrows) is located at the contact of the upper and lower parts of the mold (the X-Z plane of the workpiece). The die-cast part is processed with the system schematically presented in Fig. 2. The robot first grasps it at the bottom side. Then it makes a 3D scan of the side that is visible in Fig. 1. For that purpose the robot moves the part under the laser-triangulation sensor, which is fixed in space relative to the floor. The 3D data is then used to calculate the position and orientation of the current workpiece relative to the reference one (see Section 2 for details). Finally, the part is deburred with the deburring tool, which is also fixed relative to the floor.

A Yaskawa Motoman MA1800 robot [15] with a DX100 controller is used for the workpiece manipulation. It is a vertical jointed-arm robot able to handle a payload of 15 kg within 3.2 m of vertical and 1.8 m of horizontal reach, with a repeatability of 0.08 mm.

The 3D measurements are based on the laser-triangulation principle [16], which is used in a custom-developed laser-triangulation sensor (Yaskawa MOTOSense profilometer). The sensor consists of a laser line projector (Flexpoint MVnano, wavelength 660 nm, power 100 mW, line thickness 0.1 mm) and a CCD camera (Basler Ace acA645-100gm, 659×494 pixels, 100 FPS), which are mounted in a housing. The aluminum housing protects the projector and the camera from dust and any potential collision. Its dimensions and the measuring range are shown in Fig. 3.
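A common way to extract the projected laser line from each camera image with sub-pixel accuracy is an intensity-weighted centroid per image column. The following is a minimal, illustrative sketch of such an estimator; the paper does not disclose the exact peak detector used in the MOTOSense sensor.

```python
import numpy as np

def detect_profile(image, threshold=30):
    """Locate the laser line in every image column with sub-pixel accuracy.

    Illustrative intensity-weighted centroid estimator; the actual
    MOTOSense detector is not specified in the paper.
    """
    img = image.astype(float)
    img[img < threshold] = 0.0                 # suppress background pixels
    rows = np.arange(img.shape[0])[:, None]    # row index of each pixel
    weight = img.sum(axis=0)                   # total intensity per column
    centroid = np.full(img.shape[1], np.nan)   # NaN where no laser signal
    valid = weight > 0
    centroid[valid] = (rows * img).sum(axis=0)[valid] / weight[valid]
    return centroid                            # sub-pixel row per column
```

Stacking one such profile per camera frame, spaced by the known robot motion, yields the 3D surface described below.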
The precision of the measured profile is 0.1 mm and the typical scanning resolution is 0.2 mm, which means that the measurement time for a 20-mm-long surface is 1 s.

Fig. 1. Die-cast part used in the experiments; the location of the burr is indicated with orange arrows and is approximately the X-Z sectioning plane

Fig. 2. Schematic of the system with the 3D measuring and deburring stages

The images acquired by the triangulation sensor are processed with a PC. In the first step the profiles are detected from each image with sub-pixel accuracy. In the next step the series of profiles is transformed into a 3D surface according to the known positions of the camera, the laser projector and the robot arm. Finally, the workpiece misalignment is calculated and the correction is sent to the robot controller.

Fig. 3. Dimensions of the laser-triangulation sensor and its measuring range

2 POSITION AND ORIENTATION MEASUREMENT

The position and orientation measurement method is based on the assumption that the shape of each part does not change significantly, i.e., it is expected to be within the allowable tolerances. Thus, only the misalignment of the current workpiece with respect to the reference workpiece is determined and the transformation of the taught tool path is calculated. A flowchart of the method is presented in Fig. 4.

The robot teaching is performed on one workpiece that is manually deburred prior to the teaching. Hereinafter, it is referred to as the reference workpiece, but note that it is not a special workpiece, since any workpiece from the same batch can serve as a reference.
The surface of the reference workpiece is measured in 3D after the robot teaching and, most importantly, during the same clamping.

When other workpieces are about to be deburred with the robot system, their surface is first measured in 3D. Then the relative rotation and translation with respect to the reference workpiece are calculated by registration using the iterative closest point (ICP) algorithm [17], implemented as the ICP function in the trimesh2 SDK [18]. The algorithm minimizes the distance between the target (the surface of the reference workpiece) and the source (the surface of the currently processed workpiece) sets of points. The registration is done by translating and rotating the latter. The results of the algorithm are a transformation matrix (translation and rotation) and a goodness-of-fit estimator.

Fig. 4. Flowchart of the adaptive deburring process

The tool path taught on the reference workpiece is then transformed to match the position and orientation of the currently processed workpiece using the transformation matrix.

The central region of the workpiece was selected for the 3D measurement, as shown in Fig. 5. The geometry of this region constrains all translations and rotations, which is the required condition for reliable and stable registration [17].

3 EXPERIMENTS

The version of the ICP algorithm used in the experiments has four parameters: the maximum distance between corresponding points (MPD), the maximum number of iterations (MNI), and two values that determine whether convergence has been reached: the third parameter (NNE) specifies in how many of the last N iterations the error must have increased for the algorithm to terminate, and the value of N is set by the fourth parameter [18].
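The registration and tool-path adaptation steps can be sketched as a minimal point-to-point ICP in NumPy. This is illustrative only: the system itself uses the trimesh2 ICP implementation, and only the MPD and MNI parameters are modeled here (the convergence test is a simplified stand-in for the NNE/N criterion).

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, mpd=10.0, mni=100, tol=1e-7):
    """Minimal point-to-point ICP; returns a 4x4 transform (source -> target)."""
    src = source.copy()
    T = np.eye(4)
    for _ in range(mni):               # MNI: iteration cap
        # brute-force nearest neighbours (fine for small clouds)
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(axis=2)
        idx = d2.argmin(axis=1)
        keep = np.sqrt(d2[np.arange(len(src)), idx]) < mpd   # MPD gate
        R, t = best_rigid_transform(src[keep], target[idx[keep]])
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
        if np.linalg.norm(t) < tol and np.allclose(R, np.eye(3), atol=tol):
            break                      # converged: the update became negligible
    return T

def adapt_tool_path(path_xyz, T):
    """Rotate and translate the taught tool-path points with the ICP result."""
    return path_xyz @ T[:3, :3].T + T[:3, 3]
```

Applying `adapt_tool_path` to the taught points is exactly the "rotate and translate the reference tool path" step of the flowchart, with the inverse transform used as appropriate for the chosen source/target convention.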
Two experiments were conducted to characterize the proposed method. With the first one we characterized its precision. A 3D measurement of the same workpiece was repeated 10 times: the first measurement served as the target surface, to which all the other surfaces (sources) were registered. The workpiece was not detached between the measurement repetitions.

With the second experiment the accuracy of the proposed method was assessed. In this case systematic offsets along the X, Y and Z axes were incrementally added to the robot's scanning path. The offsets were applied in 0.5 mm steps in each direction separately, from 0 mm to 4 mm. The same workpiece was measured after each offset and was not detached during the test. The measured surfaces were registered in order to determine the offsets, which should be equal to the offsets introduced by the robot.

For the statistical analysis of the differences between the results of the registration of the full and reduced point clouds, an analysis of variance (ANOVA) at the 95 % level of significance was used. Except where noted otherwise, the results are reported as average value (standard deviation). Geomagic Studio (Raindrop Geomagic) was used for the visualization of the results.

4 RESULTS

The laser-triangulation sensor measured a profile for every 0.2 mm of the move, so that each measurement was composed of about 500 profiles containing 659 points each. After the exclusion of the background points, the average number of points in each point cloud was 77582 (493). A typical 3D measurement of the workpiece is shown in Fig. 5a.

The MPD parameter of the ICP algorithm was set to 10 mm based on the maximal expected displacement. The other three parameters were set to their default values, MNI = 100, NNE = 5 and N = 7, as advised in [18]. The influence of the geometry and of the parameters required for correct alignment is further described in [19].
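The one-way ANOVA used above reduces to a single F statistic comparing between-group and within-group variance. A minimal NumPy sketch follows; it is illustrative, as the paper does not state which ANOVA implementation was used.

```python
import numpy as np

def anova_f(*groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    n = sum(len(g) for g in groups)          # total sample count
    k = len(groups)                          # number of groups
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

The resulting F value is compared against the critical value of the F(k-1, n-k) distribution at the chosen significance level to decide whether the group means differ significantly.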
To verify whether the registration was successful, the deviation between the reference and the current surface was checked. The average deviation was 0.06 mm (0.08 mm), and a typical result is visualized in Fig. 5b.

Fig. 5. a) Measured surface of the workpiece; and b) map of the deviations between the registered target and the source surface

The average calculation time for the registration was 1.52 s (0.10 s). In order to speed up the calculation, we investigated the influence of point reduction on the point cloud. The results show that if the number of points in the cloud is reduced to 6 % of the original value, the calculation time is reduced to 0.22 s (0.03 s), while the change in the accuracy of the registration is not statistically significant (the ANOVA p-values were 0.25, 0.92 and 0.64 for the translations in the X, Y and Z directions, respectively).

The results of the repeated measurements of the same workpiece without any intentionally introduced offset indicate that the calculated offset (bias) was 0.03 mm (0.06 mm), 0.01 mm (0.01 mm) and -0.07 mm (0.12 mm) in the X, Y and Z directions, respectively.

The results of the second experiment are shown in Fig. 6. The diagram shows the Euclidean distance between the measured and the robot offsets versus the offset from the origin. The mean Euclidean distance over all the points is 0.23 mm (0.12 mm). The average distances in the individual directions are 0.05 mm (0.10 mm), -0.02 mm (0.03 mm) and -0.03 mm (0.15 mm) in the X, Y and Z directions, respectively. Although the workpiece was not deliberately rotated during the experiment, the registration algorithm returned minimal rotations of -0.02° (0.02°), 0.01° (0.02°) and -0.02° (0.06°) around the X, Y and Z axes, respectively.
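The point reduction and the offset-error statistics described above can be sketched as follows. The random decimation shown here is an assumption for illustration; the paper does not state which reduction scheme was used.

```python
import numpy as np

def decimate(points, fraction, seed=0):
    """Randomly keep a given fraction of the cloud (one simple way to
    realize the ~6 % point reduction; the exact scheme is not specified
    in the paper)."""
    rng = np.random.default_rng(seed)
    n = max(1, int(len(points) * fraction))
    return points[rng.choice(len(points), size=n, replace=False)]

def offset_error(measured, commanded):
    """Mean and standard deviation of the Euclidean distance between the
    measured offsets and the offsets commanded to the robot."""
    d = np.linalg.norm(np.asarray(measured) - np.asarray(commanded), axis=1)
    return d.mean(), d.std()
```

Running the registration on the decimated cloud trades a small, statistically insignificant accuracy change for a roughly sevenfold reduction in computation time, as reported above.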
Fig. 6. Euclidean distance between the measured and reference points versus the displacement in each direction

5 DISCUSSION

The results of the registration were compared with the results acquired with commercially available software for 3D surfaces, using the Global Registration function in Geomagic Studio. Although the results using the ICP algorithm from our program were slightly better (by approximately 0.01 mm), the differences were not statistically significant (ANOVA, p = 0.05).

We estimate that the accuracy of determining the positional misalignment of the workpiece is mainly affected by the accuracy of the robot arm. In our case it is approximately 0.1 mm, which means that about 50 % of the distance between the true and measured origins originates from the robot. On the other hand, since the same robot with the same accuracy is used for the deburring, the same accuracy can be attributed to the deburring system as a whole.

The second parameter that affects the precision of the whole system is the laser-triangulation sensor. Reference [20] shows that improving the sensor precision from 1.6 mm to 0.3 mm improves the registration precision from 1.60° to 0.12°. In our case the sensor precision is 0.1 mm, which is very close to the standard deviations between the measured and true offsets.

Since the expected offsets between the true and the current workpiece position are less than 4 mm, we implemented the ICP algorithm, which finds the local minimum in terms of deviation. In cases where larger displacements are expected, the introduction of coarse registration using feature detection and matching (e.g., the SIFT [21], MSER [22] or SURF [23] feature detectors) should be considered.
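For such larger displacements, even a simple centroid-and-principal-axes pre-alignment can bring the clouds within the convergence basin of ICP. The sketch below is an assumed, generic alternative to the feature-based detectors cited above, not a method used in the paper.

```python
import numpy as np

def coarse_align(source, target):
    """Coarse rigid pre-alignment: match centroids and principal axes.

    Works when the clouds overlap substantially and the shape has
    distinct principal axes; the axis-sign ambiguity is only partially
    handled here (generic illustrative sketch).
    """
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    _, Vs = np.linalg.eigh(np.cov((source - cs).T))   # source principal axes
    _, Vt = np.linalg.eigh(np.cov((target - ct).T))   # target principal axes
    R = Vt @ Vs.T
    if np.linalg.det(R) < 0:          # keep a proper rotation
        Vs[:, 0] *= -1
        R = Vt @ Vs.T
    return R, ct - R @ cs             # apply as p -> R @ p + t
```

The result would then seed the fine ICP registration, which resolves the remaining small misalignment.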
6 CONCLUSIONS

An adaptive robotic system was developed for the deburring of die-cast parts with position and orientation error correction. It uses a custom-developed laser-triangulation sensor to measure the 3D shape of the workpiece's surface and the ICP registration algorithm to determine the orientation and position error with respect to the reference workpiece, on which the robot was taught. The experiments show that errors in the position and orientation in the range up to 4 mm are reduced to 0.23 mm (0.12 mm). The remaining errors can be further reduced by selecting a more accurate robot and triangulation sensor. The developed adaptive system is also applicable in other similar applications where it is difficult to ensure the repeatable clamping of a workpiece.

7 REFERENCES

[1] Song, H.C., Song, J.B. (2013). Precision robotic deburring based on force control for arbitrary shaped workpiece using CAD model matching. International Journal of Precision Engineering and Manufacturing, vol. 14, no. 1, p. 85-91, DOI:10.1007/s12541-013-0013-2.
[2] Ton, T., Park, H., Ko, S. (2011). Experimental analysis of deburring process on inclined exit surface by new deburring tool. CIRP Annals - Manufacturing Technology, vol. 60, no. 1, p. 129-132, DOI:10.1016/j.cirp.2011.03.124.
[3] Valente, C.M.O., Oliveira, J.F.G. (2004). A new approach for tool path control in robotic deburring operations. ABCM Symposium Series in Mechatronics, vol. 1, p. 124-133.
[4] Bagde, S.T. (2014). Development of combined deburring and inspection system. IOSR Journal of Mechanical and Civil Engineering, vol. 9, p. 63-68.
[5] Kazerooni, H. (1988). Automated robotic deburring using impedance control. IEEE Control Systems Magazine, vol. 8, no. 1, p. 21-25, DOI:10.1109/37.464.
[6] Asakawa, N., Toda, K., Takeuchi, Y. (2002). Automation of chamfering by an industrial robot; for the case of hole on free-curved surface. Robotics and Computer-Integrated Manufacturing, vol. 18, no. 5-6, p. 379-385, DOI:10.1016/s0736-5845(02)00006-6.
[7] Srinivasan, H., Harrysson, O.L.A., Wysk, R.A. (2015). Automatic part localization in a CNC machine coordinate system by means of 3D scans. The International Journal of Advanced Manufacturing Technology, vol. 81, no. 5, p. 1127-1138, DOI:10.1007/s00170-015-7178-z.
[8] Jaweera, N., Webb, P. (2010). Measurement assisted robotic edge deburring of aero engine components. WSEAS Transactions on Systems and Control, vol. 5, no. 3, p. 174-183.
[9] Nagata, F., Kusumoto, Y., Fujimoto, Y., Watanabe, K. (2007). Robotic sanding system for new designed furniture with free-formed surface. Robotics and Computer-Integrated Manufacturing, vol. 23, no. 4, p. 371-379, DOI:10.1016/j.rcim.2006.04.004.
[10] Song, H.C., Kim, B.S., Song, J.B. (2012). Tool path generation based on matching between teaching point and CAD model for robotic deburring. IEEE/ASME International Conference on Advanced Intelligent Mechatronics, p. 890-895, DOI:10.1109/AIM.2012.6265921.
[11] Habibi, B., Pescaru, S. (2004). Method and apparatus for single camera 3D vision guided robotics. US Patent 6,816,755, US Patent & Trademark Office, New York.
[12] Biegelbauer, G., Vincze, M. (2006). 3D vision-guided bore inspection system. ICVS IEEE International Conference on Computer Vision Systems, p. 1-22, DOI:10.1109/ICVS.2006.1.
[13] Skotheim, Ø., Lind, M., Ystgaard, P., Fjerdingen, S.A. (2012). A flexible 3D object localization system for industrial part handling. IEEE/RSJ International Conference on Intelligent Robots and Systems, p. 3326-3333, DOI:10.1109/IROS.2012.6385508.
[14] Rajaraman, M., Dawson-Haggerty, M., Shimada, K., Bourne, D. (2013). Automated workpiece localization for robotic welding. IEEE International Conference on Automation Science and Engineering, p. 681-686, DOI:10.1109/CoASE.2013.6654062.
[15] Yaskawa Motoman Robotics. MA1800 Arc Welding, from www.motoman.com/datasheet/MA1800.pdf, accessed on 2015-10-22.
[16] Jezeršek, M., Možina, J. (2003). A laser anamorph profilometer. Strojniški vestnik - Journal of Mechanical Engineering, vol. 49, no. 2, p. 76-89.
[17] Besl, P.J., McKay, N.D. (1992). Method for registration of 3-D shapes. Proceedings of SPIE 1611, Sensor Fusion IV: Control Paradigms and Data Structures, DOI:10.1117/12.57955.
[18] Rusinkiewicz, S. (2015). trimesh2, from gfx.cs.princeton.edu/proj/trimesh2/, accessed on 2015-10-22.
[19] Rusinkiewicz, S., Levoy, M. (2001). Efficient variants of the ICP algorithm. Proceedings of the 3rd International Conference on 3-D Digital Imaging and Modeling, p. 145-152, DOI:10.1109/IM.2001.924423.
[20] Pavlovčič, U., Diaci, J., Možina, J., Jezeršek, M. (2013). Characterization of the head-to-trunk orientation with handheld optical 3D apparatus based on the fringe projection technique. BioMedical Engineering OnLine, vol. 12, art. num. 96, DOI:10.1186/1475-925X-12-96.
[21] Lowe, D.G. (1999). Object recognition from local scale-invariant features. Proceedings of the 7th International Conference on Computer Vision, vol. 2, p. 1150-1157, DOI:10.1109/ICCV.1999.790410.
[22] Matas, J., Chum, O., Urban, M., Pajdla, T. (2002). Robust wide-baseline stereo from maximally stable extremal regions. Image and Vision Computing, vol. 22, no. 10, p. 761-767, DOI:10.1016/j.imavis.2004.02.006.
[23] Bay, H., Ess, A., Tuytelaars, T., Van Gool, L. (2008). Speeded-Up Robust Features (SURF). Computer Vision and Image Understanding, vol. 110, no. 3, p. 346-359, DOI:10.1016/j.cviu.2007.09.014.