
Exploring Design Space in Augmented Reality for Multipurpose Daily Used Supernumerary Robotic Limbs

Sun Ming, Waseda University

2020.03.24

Abstract

Robotics has entered people's daily lives, and wearable robotics is one class of robotics. A wearable robot can change its shape, allowing it to provide multiple functions in different configurations. The potential of this kind of robot, especially for daily-life tasks, has not been widely investigated. We therefore introduce "ARobot", a system designed as an interaction interface that helps users control a wearable robot in daily use. We investigate usefulness, user preference, and performance across different features through three aspects of interaction: physical object manipulation, digital interaction, and human-like agent interaction.

The system consists of the following components: an augmented reality (AR) user interface, the robot itself, an AR head-mounted display, trackers, and AR-robot integration software. We apply our AR system to an example wearable robot and use it as our prototype. The user interface is designed as a unified interface so that it can handle different kinds of tasks and provide a comprehensive experience. Based on users' needs and selections, the system switches between modes suited to the current situation. Through the AR user interface, users can command the robot to manipulate physical objects. Users can also work with the robot's internal state through the AR system: they can retrieve information about the robot and adjust its settings to suit their needs. Finally, users can interact with an agent that represents the robot; because the agent has a human-like appearance, this interaction resembles interacting with another person.
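To make the mode-switching idea concrete, the sketch below shows one way an AR-robot integration layer could dispatch commands by interaction mode. This is only an illustration under stated assumptions, not the thesis's actual software: every name here (Mode, RobotLink, the JSON command shapes, the host and port) is hypothetical, and the three modes simply mirror the interaction aspects named in the abstract.

```python
# A minimal sketch of mode-based command dispatch from an AR interface to a
# wearable robot over a plain TCP socket. All identifiers are hypothetical.
import json
import socket
from enum import Enum, auto


class Mode(Enum):
    """The three interaction aspects described in the abstract."""
    PHYSICAL_MANIPULATION = auto()   # command the robot to handle objects
    DIGITAL_INTERACTION = auto()     # read or adjust the robot's settings
    AGENT_INTERACTION = auto()       # talk to the human-like agent


class RobotLink:
    """Sends newline-delimited JSON commands from the AR UI to the robot."""

    def __init__(self, host: str = "127.0.0.1", port: int = 9000) -> None:
        # Assumes a command server is listening on the robot side.
        self.sock = socket.create_connection((host, port))

    def send(self, mode: Mode, payload: dict) -> None:
        message = {"mode": mode.name, "payload": payload}
        self.sock.sendall(json.dumps(message).encode("utf-8") + b"\n")


if __name__ == "__main__":
    link = RobotLink()
    # Physical object manipulation: ask the limb to grasp a tracked object.
    link.send(Mode.PHYSICAL_MANIPULATION, {"action": "grasp", "target": "cup"})
    # Digital interaction: change one of the robot's own settings.
    link.send(Mode.DIGITAL_INTERACTION, {"setting": "arm_speed", "value": 0.5})
```

The point of the sketch is the design choice the abstract implies: a single unified interface in front, with the active mode deciding how each user command is interpreted on the robot side.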

We demonstrate the potential of an AR-controlled wearable robot for daily-life tasks within the chosen design space, showing how the AR system provides additional functionality and richer experiences while using the robot. We evaluated the implemented system in a user study with a number of participants. Based on their feedback, participants prefer having more choices and options in the interaction interface, and personal customization is an important aspect to consider. Our approach was found to be interesting and innovative. Finally, we discuss the limitations of our research and possibilities for future work, to suggest directions for future research.

