Simultaneous Kinematic Calibration and 3D Mapping for Industrial Manipulators Using RGB-D SLAM Technology

李 景輝, Yokohama National University. DOI: 10.18880/00013926

2021.06.17

Abstract

Like many other technologies in human history, advanced robotic technologies are changing the way people live and work. Since the robotic revolution began in earnest in the 1950s, a wide variety of robots have come to play different roles in our lives, and industrial robots in particular play a crucial role in social productivity. Recently, the industrial community has demanded more accurate, more productive, and more economical industrial manipulators. Industrial manipulators offer lower manufacturing costs and better quality in repetitive tasks, and the accuracy of a manipulator is one of the most important factors influencing product quality. However, a manipulator acquires kinematic errors during its manufacture. Manual calibration is costly and of limited quality and cannot meet production demands, so a more effective automatic kinematic calibration is required. Moreover, environmental mapping serves the motion planning of the manipulator, yet reconstructing the workspace is also time-consuming work.

The main purpose of this thesis is to propose a more effective and economical method for improving the accuracy of industrial manipulators. I introduce a simultaneous kinematic calibration, localization, and mapping (SKCLAM) method, which calibrates the kinematic parameters of an industrial robot manipulator and, at the same time, reconstructs an environment map of its workspace by using an RGB-D camera attached to the end effector. First, I introduce the background of this study. Then I define the SKCLAM problem and propose the basic idea of the SKCLAM method for solving it. Two SKCLAM approaches are reported in this dissertation; the main difference between them is whether kinematic calibration is performed with or without calibration markers. RGB-D SLAM techniques are employed for the 3D reconstruction of the workspace. I conduct simulations and experiments to confirm the effectiveness of the SKCLAM method and evaluate the results of the kinematic calibration and 3D reconstruction. Both SKCLAM approaches perform the two tasks successfully; however, the marker-free approach, while requiring less preparatory work, can calibrate fewer kinematic parameters and achieves lower calibration accuracy than the marker-based approach. Finally, I discuss the SKCLAM method and the future of this work.
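
To make the core idea concrete, the following is a minimal Python sketch, not the implementation reported in this thesis: joint angles are logged while the end-effector-mounted camera is tracked, the camera positions recovered by RGB-D SLAM are compared with the positions predicted by forward kinematics, and corrections to the Denavit-Hartenberg parameters are estimated by nonlinear least squares. The three-link toy model, the simulated SLAM data, and all names in the code are assumptions made for illustration only; the sketch also ignores the camera-to-flange (hand-eye) transform and orientation errors that a full implementation would have to handle.

import numpy as np
from scipy.optimize import least_squares

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform of a single Denavit-Hartenberg link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(q, dh):
    """Predicted pose of the end-effector-mounted camera for joint angles q."""
    T = np.eye(4)
    for qi, (d, a, alpha) in zip(q, dh):
        T = T @ dh_transform(qi, d, a, alpha)
    return T

def residuals(x, joint_log, slam_positions, nominal_dh):
    """Position error between SLAM estimates and the kinematic prediction."""
    dh = nominal_dh + x.reshape(nominal_dh.shape)  # apply candidate corrections
    return np.concatenate([forward_kinematics(q, dh)[:3, 3] - p
                           for q, p in zip(joint_log, slam_positions)])

# Hypothetical 3-link arm: nominal (d, a, alpha) per link and the unknown
# manufacturing errors that the calibration is supposed to recover.
nominal_dh = np.array([[0.30, 0.05, np.pi / 2],
                       [0.00, 0.40, 0.0],
                       [0.00, 0.35, 0.0]])
true_error = np.array([[0.002, -0.004, 0.001],
                       [0.000,  0.003, 0.000],
                       [0.001, -0.002, 0.002]])

rng = np.random.default_rng(0)
joint_log = rng.uniform(-1.0, 1.0, size=(50, 3))   # logged joint configurations
# Stand-in for camera positions recovered by RGB-D SLAM (simulated here).
slam_positions = np.array([forward_kinematics(q, nominal_dh + true_error)[:3, 3]
                           for q in joint_log])

sol = least_squares(residuals, x0=np.zeros(nominal_dh.size),
                    args=(joint_log, slam_positions, nominal_dh))
print("estimated corrections to (d, a, alpha) per link:")
print(sol.x.reshape(nominal_dh.shape))

Fitting the corrections drives the residuals between the predicted and SLAM-estimated camera positions toward zero; which individual parameters can actually be identified depends on the observations available, which is consistent with the finding above that the marker-free approach can calibrate fewer parameters than the marker-based one.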

