In-Hand Manipulation and Recognition with a Humanoid Robot's Multi-Fingered Hand Using Distributed 3-Axis Tactile Sensors and Convolutional Neural Networks

Funabashi Satoshi, Waseda University

2021.08.04

Abstract

Human co-existing robots are highly anticipated in real industrial settings such as factories, kitchens, and logistics, where many people work but heavy loads and tedious tasks are unavoidable and can damage workers' health. Labor shortages caused by the declining birthrate and aging population are also a serious problem for society, so the need for robots keeps increasing. For robots to handle various tools and objects the way humans do, in applications such as nursing care and human-robot collaboration, multi-fingered hands play a more important role than two-fingered grippers: because they have more than two fingers, they can reorient an object during a manipulation task. Since multi-fingered manipulation involves a variety of contact states, such as rolling contact, slip, and finger gaiting, tactile sensors should be mounted on the surface of a hand that is soft like a human's, so that the hand can perform complex and delicate movements.

Camera-based control of multi-fingered hands is, in principle, another option for achieving dexterous manipulation. However, because objects are occluded and hard to see during multi-fingered tasks, many studies require special or large-scale camera placements, and other works attach markers to the grasped objects so that cameras can track their trajectories. Given that the processing pipeline from vision-sensor acquisition to task completion is complicated or task-dependent, these methods are not practical for industries with frequent environmental changes.

Vision alone is therefore not a promising approach to multi-fingered manipulation, and tactile sensing is essential for stable manipulation. Specifically, since the hand makes surface contact with an object, surface information must be acquired with distributed tactile sensors, and the sensors should cover not only the fingertips but also the phalanges and the palm. Implementation is itself a serious obstacle, because such sensors are usually either too expensive (e.g., 20,000 USD for a single tactile sensor) or difficult to mount on the hand's surface. Moreover, a multi-fingered hand typically touches an object from several orientations, so the fingers apply forces to it along different axes. Although tactile sensors should therefore provide 3-axis information (x-, y-, and z-axis), almost no tactile sensors suited to this situation existed.

If such ideal tactile sensors are mounted on a multi-fingered hand, the hand can acquire a large amount of tactile information usable for in-hand manipulation tasks. However, processing that information is another problem because of its nonlinear, complicated structure. Model-based methods have been used for in-hand manipulation with tactile sensing, but deriving model equations from tactile data is usually difficult because the physical relationship between applied forces and desired motions is complicated. As a result, many studies restrict their models to point contact, which makes complicated manipulation difficult. In addition, the sub-tasks of a single task each require a different model, so many models must be constructed and executed, one per sub-task.

Machine learning is another popular way to process tactile information, but the sheer volume of tactile data hinders correct processing and has so far limited results to simple behaviors such as fingertip-based manipulation.

The objective of this research is to achieve dexterous manipulation with a multi-fingered hand. The research adopts a human-mimetic configuration: the robot hands used here have soft joints and distributed tactile sensors, because the core idea is to have robots take over tasks that are normally performed by human hands. Achieving such in-hand manipulation requires both acquiring abundant tactile information and processing it so that the information useful for dexterous manipulation is extracted. In this study, I apply deep learning methods and 3-axis tactile sensors to an in-hand manipulation system for the first time, in order to exploit rich tactile information effectively.

Chapter 1 introduces the background of the thesis: the aging society, labor shortages, and the new era of Industry 4.0. Against this background, industrial and service robots are surveyed, and the expectation that such robots will substitute for human labor is explained. A history of existing robot hands is then traced, leading to the hardware developed in the Sugano laboratory. Finally, control methods are examined to clarify why current methods cannot achieve dexterous multi-fingered in-hand manipulation and what is needed to achieve it.

Chapter 2 describes how a robot hand performs in-hand manipulation with two fingers by utilizing tactile information, as a first step toward multi-fingered in-hand manipulation. Notably, this chapter presents, for the first time, a method that uses a deep neural network to compress tactile data for in-hand manipulation. Neurons that represent object information (size), or deep neural networks that compress the data with respect to size and shape, assist multi-layer perceptrons in generating successful in-hand manipulation.
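
A minimal sketch of this data-compression idea, in PyTorch: a small autoencoder squeezes a flattened tactile reading into a low-dimensional feature that a motion-generating multi-layer perceptron could consume. The input size, layer widths, and latent size below are placeholders, not the thesis's actual configuration.

```python
import torch
import torch.nn as nn

class TactileAutoencoder(nn.Module):
    """Compress one flattened tactile reading into a small feature vector."""
    def __init__(self, n_taxels=256, n_latent=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_taxels, 64), nn.ReLU(),
            nn.Linear(64, n_latent),            # compressed tactile feature
        )
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, 64), nn.ReLU(),
            nn.Linear(64, n_taxels),            # reconstruction target
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = TactileAutoencoder()
reading = torch.randn(1, 256)                   # one simulated tactile frame
recon, feature = model(reading)                 # feature would feed the motion MLP
```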

Chapter 3 presents a method in which neural networks control a low-cost hand. The hand used in Chapter 2 is highly sophisticated, which makes it hard to deploy in real settings: such hands are expensive and difficult to maintain. This chapter therefore adopts the Allegro Hand, which is commercially available at a reasonable price, although its low cost also means low controllability. The distributed 3-axis tactile sensor uSkin is installed on the fingertips of the Allegro Hand. Since handling the resulting volume of tactile information during manipulation is itself difficult, this chapter focuses on convolutional neural networks (CNNs). Overall, the chapter describes how 3-axis tactile sensors and a CNN are combined for in-hand manipulation.
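
The core pairing can be sketched as follows: each uSkin patch provides x-, y-, and z-axis force per taxel, which map naturally onto the three input channels of a small CNN. The 4x6 taxel grid, channel counts, and feature size here are illustrative assumptions, not the thesis's actual architecture.

```python
import torch
import torch.nn as nn

tactile_cnn = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 4 * 6, 16),     # spatial tactile feature for the controller
)

frame = torch.randn(1, 3, 4, 6)    # (batch, force axis, rows, cols) of one fingertip
feature = tactile_cnn(frame)       # fed to the joint-command network downstream
```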

Chapter 4 presents an approach for generating multiple motions with a single network, building on the method of Chapter 3. Specifically, task parameters corresponding to each manipulation motion are given to the network as input. Because finger motions can change drastically during multi-fingered manipulation across a variety of objects (e.g., objects of different sizes and shapes), even a single network must be able to adapt to such differences. The network ultimately generated several motions simply through changes to the parameters.
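
A hedged sketch of this conditioning scheme: a task-parameter vector (here a hypothetical one-hot over three motions) is concatenated with the tactile feature so that a single network can output different motions. All dimensions are assumptions for illustration.

```python
import torch
import torch.nn as nn

class MultiMotionNet(nn.Module):
    def __init__(self, n_feature=16, n_tasks=3, n_joints=16):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(n_feature + n_tasks, 64), nn.ReLU(),
            nn.Linear(64, n_joints),            # next joint-angle command
        )

    def forward(self, tactile_feature, task_param):
        x = torch.cat([tactile_feature, task_param], dim=-1)
        return self.head(x)

net = MultiMotionNet()
feature = torch.randn(1, 16)
rotate = torch.tensor([[1.0, 0.0, 0.0]])        # select one manipulation motion
command = net(feature, rotate)                  # changing the one-hot changes motion
```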

Chapter 5 introduces a method for adapting to a different number of fingers within the same manipulation motion. By fine-tuning from a finger whose motion is similar to another finger's, the amount of training data could be reduced, which avoids the hardware-wear problem specific to multi-fingered hands. This study also serves as a preliminary experiment toward full multi-fingered in-hand manipulation. As a result, multi-fingered manipulation using only the fingertips was achieved.
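
The transfer idea might look like the following sketch, where the early layer of a network trained on one finger is frozen and only the later layers are retrained on a small dataset from a similar finger. Which layers to freeze, and all sizes, are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Pretrained on the source finger (4 joint commands per finger assumed).
source_net = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 4),
)

# Freeze the first layer's shared features; only the later layers are updated
# from the small dataset collected with the similar target finger.
for param in source_net[0].parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in source_net.parameters() if p.requires_grad), lr=1e-3)
```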

Chapter 6 shows that 3-axis tactile information is useful for detecting grasping states that depend on the grasped object's properties. A deformation and slip detection system for a two-fingered gripper was developed, and experiments with a variety of everyday objects confirmed that the grasping states can be identified correctly. This result suggests that property-based grasping-state detection can also transfer to multi-fingered hands.
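
As a hedged illustration of why 3-axis information matters for grasp-state detection, the classic friction-cone rule below flags incipient slip when the shear-to-normal force ratio grows too large. This rule-based check is a stand-in for exposition only; the system developed in this chapter is a learned detector, and the friction coefficient and readings below are invented.

```python
import torch

def slip_flags(forces, mu=0.5):
    """forces: (N, 3) per-taxel readings (x, y shear; z normal)."""
    tangential = forces[:, :2].norm(dim=1)
    normal = forces[:, 2].clamp(min=1e-6)        # avoid division by zero
    return tangential / normal > 0.9 * mu        # True where slip is imminent

readings = torch.tensor([[0.05, 0.02, 1.0],      # firm contact
                         [0.40, 0.25, 1.0]])     # shear near the friction cone
print(slip_flags(readings))                      # tensor([False,  True])
```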

Chapter 7 covers the implementation of uSkin over the whole surface of the Allegro Hand to enable multi-fingered tasks. As a first step, the chapter shows how uSkin is mounted on the Allegro Hand and evaluates whether 3-axis tactile information is necessary. Object recognition is chosen as the evaluation task because it is useful for multi-fingered manipulation of different objects. The chapter demonstrates the importance of processing 3-axis tactile information with a CNN on a multi-fingered hand.

Chapter 8 addresses a problem specific to a multi-fingered hand: the Allegro Hand carries uSkin sensor patches at different positions, and some patches differ in size, so tactile information from all of them must be handled simultaneously. A CNN can extract useful spatial information, so the key issue is how to feed all the tactile inputs to it together in a spatially correct way. This chapter proposes a combined CNN, with one convolution layer prepared for each tactile sensor patch. The outputs of these convolution layers are combined into a single tactile feature map that follows the actual positions of the sensor patches. This geometric approach is proposed for the first time in this research.
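
One way to picture the combined CNN: each sensor patch passes through its own convolution layer, and the resulting maps are tiled into a single feature map that mirrors the patches' physical arrangement along the finger. The patch sizes (4x6 and 4x4), the padding, and the channel count are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CombinedCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fingertip_conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # 4x6 patch
        self.phalange_conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)   # 4x4 patch

    def forward(self, fingertip, phalange):
        tip = self.fingertip_conv(fingertip)         # (B, 8, 4, 6)
        mid = self.phalange_conv(phalange)           # (B, 8, 4, 4)
        # Pad the narrower map, then stack along the finger's long axis,
        # mimicking the sensors' actual positions from tip to base.
        mid = nn.functional.pad(mid, (1, 1, 0, 0))   # (B, 8, 4, 6)
        return torch.cat([tip, mid], dim=2)          # one geometric feature map

net = CombinedCNN()
fmap = net(torch.randn(1, 3, 4, 6), torch.randn(1, 3, 4, 4))
```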

Chapter 9 describes a control method for multi-fingered in-hand manipulation based on a graph convolutional network (GCN). A CNN cannot be applied to complicated tactile sensor layouts unless the tactile information is arbitrarily preprocessed to combine convolution layers, so the GCN is used instead to preserve geodesic tactile information and generate dexterous in-hand manipulation. Moreover, labels representing object properties are given to the GCN as input so that the in-hand manipulation motion changes with the manipulated object. Finally, the multi-fingered hand succeeded at in-hand manipulation of untrained objects.
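
A minimal sketch of one graph-convolution layer in the Kipf-and-Welling style that "GCN" commonly denotes: taxels become graph nodes, edges connect physically adjacent taxels, and the layer propagates features along those edges so that the skin's geodesic layout is preserved. The three-node graph and all dimensions are illustrative only.

```python
import torch

def gcn_layer(H, A, W):
    """H: node features (N, F); A: adjacency (N, N); W: weights (F, F_out)."""
    A_hat = A + torch.eye(A.size(0))                 # add self-loops
    deg = A_hat.sum(dim=1)
    D_inv_sqrt = torch.diag(deg.pow(-0.5))           # symmetric normalization
    return torch.relu(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# Three taxels in a line (0-1-2), each carrying a 3-axis force reading.
A = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
H = torch.randn(3, 3)                                # (taxel, force axis)
W = torch.randn(3, 8)
features = gcn_layer(H, A, W)                        # per-taxel features for control
```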

Chapter 10 concludes with a summary, discussion, and future work. This thesis has described in-hand manipulation and object recognition methods for multi-fingered humanoid hands, aimed at the variety of tasks humans perform at work and in daily life. The research is thus a step toward deploying multi-fingered hands in real settings, including factories and homes, to release people from tedious work. Since several steps still remain, this chapter summarizes what has been done so far and what needs to be done in the future.

The contribution of this study is that a multi-fingered robot hand not only performed difficult tasks through deep learning, but also achieved recognition and manipulation once it was shown how to build deep learning models that take the robot's structure into account. Whereas two-fingered grippers currently replace only limited tasks in the field, this result suggests that multi-fingered robot hands may be deployed in society and entrusted with more tasks in the future. The proposed methods for multi-fingered robot hands will also be useful for humanoid robots and disaster-response biomimetic robots.
