
Tactile Data Mining for Integrating the Physical Properties of Touched Objects with Human Tactile Perception (full text)

長友, 竜帆 (Keio University)

2022.03.23

Abstract

Human perception relies on the five senses: sight, hearing, touch, smell, and taste, through which we receive a wide variety of information from the outside world. Touch is the most primitive of the five, and the tactile receptors that sense it are distributed over the entire body, constituting the largest sensory organ of the human body [1]. Touch is also known to handle the second largest amount of information after vision. Haptic exploration, the act of touching and grasping objects, is therefore an indispensable behavior when humans identify and recognize objects [2].

In recent years, as computer interface technology has advanced, information-presentation techniques that use touch, which plays a major role in human information transfer, have attracted attention. Tactile information presentation takes many forms and is used around us in a variety of ways. A particularly familiar example is the virtual button on a smartphone. In devices such as smartphones, where the input interface has merged with the display as screens have grown larger, physical buttons have been removed and replaced with virtual buttons based on pseudo-haptics, which make the user perceive that a physical button has been pressed [3], [4]. Game controllers and smartphones also convey interactive feedback in response to the user's actions by vibrating the device itself [5].

More recently, with the development of virtual reality (VR) technology, research on tactile displays has expanded beyond simple tactile information presentation toward reproducing the varied tactile textures felt when touching an object [6]-[10]. If tactile displays could reproduce the feel of objects, the impact would extend well beyond immersion in VR: the hand feel of products could be transmitted in online shopping, the operability of teleoperated robots could improve, and remote treatment in telemedicine could advance, bringing innovation to a broad range of fields centered on information and communication.

Research on texture presentation with such tactile displays is, however, still at an early stage of development. A major reason is that the textures presented by the tactile displays developed so far are difficult to quantify, which makes it hard to discuss the accumulated findings in a unified way, because much of the mechanism by which humans perceive texture remains unexplained.

For vision, basis information such as RGB or CMYK is known, and virtually any color or visual stimulus can be reproduced by varying these parameters. For touch, no such basis has yet been established. Perceived hardness, roughness, wetness, and temperature are said to form the basis of tactile cognition [1], but the physical parameters corresponding to these percepts have not been clearly defined. This is a major obstacle to evaluating and reproducing touch.
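
The contrast can be illustrated with a minimal sketch (not taken from the thesis; all names below are illustrative): a color is fully specified by its RGB basis parameters, whereas a tactile impression is commonly described by four perceptual axes whose corresponding physical parameters remain undefined.

# Illustrative sketch: RGB is a complete basis for color, while the tactile
# axes below are perceptual ratings with no agreed physical recipe behind them.
from dataclasses import dataclass

@dataclass
class Color:
    r: float  # basis parameters in [0, 1]; any color can be reproduced from them
    g: float
    b: float

@dataclass
class TactileImpression:
    hardness: float   # perceptual ratings in [0, 1]; the physical quantities
    roughness: float  # that determine these values are still an open question
    wetness: float
    warmth: float

orange = Color(1.0, 0.65, 0.0)                   # reproducible on any RGB display
sponge = TactileImpression(0.2, 0.4, 0.1, 0.5)   # no established physical mapping
print(orange, sponge)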

As a data-mining approach to this problem, deep learning on big data, which has advanced rapidly in recent years, has attracted attention. Some studies have had participants touch real materials while the state of their fingers and the physical properties of the materials were measured, collecting the results into datasets; others have gone further, selecting basis information for touch from such datasets and attempting to reproduce tactile presentation from physical parameters [11]-[14]. However, the datasets produced by these studies contain only 100 to 150 samples, which is far too few for deep learning. The underlying reason is that collecting data in tactile research is difficult.
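
A minimal sketch of this setting, using synthetic data and scikit-learn (assumptions for illustration, not the methods of the cited studies): each material is a vector of measured physical features plus a subjective rating, and with only on the order of 100 to 150 samples even a standard regressor gives noisy cross-validated estimates, which is why such datasets are considered too small for deep learning.

# Hedged sketch with synthetic stand-ins for physical measurements and ratings.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 120, 8                  # on the order of existing tactile datasets
X = rng.normal(size=(n_samples, n_features))    # stand-in for physical measurements
y = 0.7 * X[:, 0] + rng.normal(scale=0.3, size=n_samples)  # stand-in for a sensory rating

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", scores)  # fold-to-fold spread stays large at this n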

In data mining, accurately collecting data for the target domain is essential [15]. In the tactile domain, the sensors that measure the important physical parameters are called tactile sensors and are used for evaluating tactile displays and for data mining [1], [2]. For stiffness (contact-pressure) sensors, which measure the perceived hardness that matters in touch, various detection methods have been proposed, including strain-gauge and piezoelectric types [16]-[24]. The stiffness distribution over the surface of a single homogeneous material is nearly uniform, but the materials used for data mining are often composites [25], so measuring the stiffness distribution becomes important. Because participants explore a material by stroking its surface, the stiffness distribution must likewise be measured by scanning across the surface. Previous work has achieved such scanning measurements with sensors based on optical waveguides, which guide light with almost no loss by total internal reflection at boundary surfaces, as in optical fibers; however, these sensors must detect very small light reflections and rely on devices such as CCD cameras, so the equipment tends to be large and the measurement environment is constrained. For these reasons, measuring the stiffness distribution of a touched object is not easy when building tactile datasets for data mining, and constructing datasets large enough for deep learning is difficult.
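
The scanning measurement described above reduces to a simple relation: at each scan position, dividing the measured contact force by the measured indentation depth gives a local stiffness estimate, and collecting these along the scan path yields a stiffness distribution. The sketch below uses synthetic readings and assumed units; it illustrates the relation only and is not a model of the waveguide sensor itself.

# Hedged sketch: local stiffness approximated as force / indentation along a scan.
import numpy as np

positions = np.linspace(0.0, 30.0, 61)             # scan path along the surface [mm]
force = 0.5 + 0.2 * np.sin(positions / 5.0)        # measured normal force [N] (synthetic)
indentation = 0.4 + 0.1 * np.cos(positions / 7.0)  # measured indentation depth [mm] (synthetic)

stiffness = force / indentation                    # local stiffness estimate [N/mm]
print("stiffness range: %.2f-%.2f N/mm" % (stiffness.min(), stiffness.max()))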

Furthermore, in data mining aimed at estimating the basis information of touch, sensory evaluation by human participants is indispensable as long as the physical parameters remain undefined [11]-[14]. But when building a dataset on the scale required for deep learning, every participant must touch each real sample and report its sensory ratings, which incurs a large cost in both time and human resources.
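
A back-of-the-envelope illustration of this cost (the numbers are assumptions, not figures from the thesis): total evaluation time grows as the product of the number of materials, the number of participants, and the time per rating.

# Hypothetical numbers chosen only to show the scaling of subject-based evaluation.
n_materials = 500          # target dataset size for deep learning (assumed)
n_subjects = 20            # raters needed for stable averaged scores (assumed)
minutes_per_rating = 2     # touch the sample and answer the questionnaire (assumed)

total_hours = n_materials * n_subjects * minutes_per_rating / 60
print(f"approx. {total_hours:.0f} person-hours of evaluation")  # about 333 hours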

For these reasons, tactile data mining requires a tactile sensor that can measure stiffness distributions simply, together with a dataset-construction method that does not depend on sensory evaluation by human participants.


References

[1] 下条誠, 前野隆司, 篠田裕之, 佐野明人, 触覚認識メカニズムと応用技術: 触覚センサ・触覚ディスプレイ. 東京: S&T出版, 2014. [Online]. Available: https://ci.nii.ac.jp/ncid/BB15737277

[2] 田中真美, “触覚・触感のメカニズムの解明とセンサシステムの開発に関する研究,” 精密工学会誌, vol. 82, no. 1, pp. 20–25, 2016, doi: 10.2493/jjspe.82.20.

[3] M. Speicher, J. Ehrlich, V. Gentile, D. Degraen, S. Sorce, and A. Krüger, “Pseudo-Haptic Controls for Mid-Air Finger-Based Menu Interaction,” in Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2019, pp. 1–6. doi: 10.1145/3290607.3312927.

[4] E. D. Verona, B. R. Brum, C. de Oliveira, S. R. R. Sanches, and C. G. Corrêa, “Pseudo-haptic Perception in Smartphones Graphical Interfaces: A Case Study,” in Virtual, Augmented and Mixed Reality, Cham, 2021, pp. 203–222.

[5] M. Orozco, J. Silva, A. El Saddik, and E. Petriu, “The Role of Haptics in Games,” 2012. doi: 10.5772/32809.

[6] M. Maisto, C. Pacchierotti, F. Chinello, G. Salvietti, A. De Luca, and D. Prattichizzo, “Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications,” IEEE Trans. Haptics, vol. 10, no. 4, pp. 511–522, Dec. 2017, doi: 10.1109/TOH.2017.2691328.

[7] M. Fukumoto and T. Sugimura, “Active Click: Tactile Feedback for Touch Panels,” in CHI ’01 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2001, pp. 121–122. doi: 10.1145/634067.634141.

[8] K. Yatani and K. N. Truong, “SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-Screen Devices,” in Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, 2009, pp. 111–120. doi: 10.1145/1622176.1622198.

[9] M. Kawazoe, Y. Kosemura, and N. Miki, “Encoding and presentation of surface textures using a mechanotactile display,” Sens. Actuators Phys., vol. 261, pp. 30–39, Jul. 2017, doi: 10.1016/j.sna.2017.03.035.

[10] M. Tezuka, N. Kitamura, K. Tanaka, and N. Miki, “Presentation of Various Tactile Sensations Using Micro-Needle Electrotactile Display,” PLOS ONE, vol. 11, no. 2, p. e0148410, Feb. 2016, doi: 10.1371/journal.pone.0148410.

[11] H. Zheng, L. Fang, M. Ji, M. Strese, Y. Özer, and E. Steinbach, “Deep Learning for Surface Material Classification Using Haptic and Visual Information,” IEEE Trans. Multimed., vol. 18, no. 12, pp. 2407–2416, Dec. 2016, doi: 10.1109/TMM.2016.2598140.

[12] H. Culbertson, J. J. López Delgado, and K. J. Kuchenbecker, “One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects,” in 2014 IEEE Haptics Symposium (HAPTICS), Feb. 2014, pp. 319–325. doi: 10.1109/HAPTICS.2014.6775475.

[13] M. Strese, J. Lee, C. Schuwerk, Q. Han, H. Kim, and E. Steinbach, “A haptic texture database for tool-mediated texture recognition and classification,” in 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings, Oct. 2014, pp. 118–123. doi: 10.1109/HAVE.2014.6954342.

[14] M. Strese, Y. Boeck, and E. Steinbach, “Content-based surface material retrieval,” in 2017 IEEE World Haptics Conference (WHC), Jun. 2017, pp. 352–357. doi: 10.1109/WHC.2017.7989927.

[15] T. Washio, “Data Mining - Development and Future -,” Proc. Symp. Chemoinformatics, vol. 2004, no. 0, p. JS-JS, 2004, doi: 10.11545/ciqs.2004.0.JS.0.

[16] S. Omata and Y. Terunuma, “New tactile sensor like the human hand and its applications,” Sens. Actuators Phys., vol. 35, no. 1, pp. 9–15, Oct. 1992, doi: 10.1016/0924-4247(92)87002-X.

[17] J. Engel, J. Chen, and C. Liu, “Development of polyimide flexible tactile sensor skin,” J. Micromechanics Microengineering, vol. 13, no. 3, pp. 359–366, Feb. 2003, doi: 10.1088/0960-1317/13/3/302.

[18] H. Lee, J. Chung, S. Chang, and E. Yoon, “Normal and Shear Force Measurement Using a Flexible Polymer Tactile Sensor With Embedded Multiple Capacitors,” J. Microelectromechanical Syst., vol. 17, no. 4, pp. 934–942, Aug. 2008, doi: 10.1109/JMEMS.2008.921727.

[19] J. G. da Silva, A. A. de Carvalho, and D. D. da Silva, “A strain gauge tactile sensor for finger-mounted applications,” IEEE Trans. Instrum. Meas., vol. 51, no. 1, pp. 18–22, Feb. 2002, doi: 10.1109/19.989890.

[20] J. Dargahi, “A piezoelectric tactile sensor with three sensing elements for robotic, endoscopic and prosthetic applications,” Sens. Actuators Phys., vol. 80, no. 1, pp. 23–30, Mar. 2000, doi: 10.1016/S0924-4247(99)00295-2.

[21] S. Teshigawara, K. Tadakuma, A. Ming, M. Ishikawa, and M. Shimojo, “High sensitivity initial slip sensor for dexterous grasp,” in 2010 IEEE International Conference on Robotics and Automation, May 2010, pp. 4867–4872. doi: 10.1109/ROBOT.2010.5509288.

[22] Y. Hotta, Y. Zhang, and N. Miki, “A Flexible Capacitive Sensor with Encapsulated Liquids as Dielectrics,” Micromachines, vol. 3, no. 1, 2012, doi: 10.3390/mi3010137.

[23] H. Ota et al., “Highly deformable liquid-state heterojunction sensors,” Nat. Commun., vol. 5, no. 1, p. 5032, Sep. 2014, doi: 10.1038/ncomms6032.

[24] A. Nakai, K. Kuwana, K. Saito, T. Dohi, A. Kumagai, and I. Shimoyama, “MEMS 6-axis force-torque sensor attached to the tip of grasping forceps for identification of tumor in thoracoscopic surgery,” in 2017 IEEE 30th International Conference on Micro Electro Mechanical Systems (MEMS), Jan. 2017, pp. 546–548. doi: 10.1109/MEMSYS.2017.7863464.

[25] M. Strese, Y. Boeck, and E. Steinbach, “Content-based surface material retrieval,” in 2017 IEEE World Haptics Conference (WHC), Munich, Germany, Jun. 2017, pp. 352–357. doi: 10.1109/WHC.2017.7989927.

[26] 電子情報通信学会,乾敏郎, 感覚・知覚・認知の基礎. オーム社, 2012, pp. xiv, 263p. [Online]. Available: https://ci.nii.ac.jp/ncid/BB08027012

[27] H. Yamaguchi, “Touch sensation- Embodied psychology for skin and heart,” Zen Nihon Shinkyu Gakkai Zasshi J. Jpn. Soc. Acupunct. Moxibustion, vol. 58, no. 5, pp. 732–741, Nov. 2008, doi: 10.3777/jjsam.58.732.

[28] E. R. Kandel et al., カンデル神経科学. メディカル・サイエンス・インターナショナル, 2014, pp. xliii, 1649p. [Online]. Available: https://ci.nii.ac.jp/ncid/BB15356477

[29] M. A. Srinivasan and K. Dandekar, “An Investigation of the Mechanics of Tactile Sense Using Two-Dimensional Models of the Primate Fingertip,” J. Biomech. Eng., vol. 118, no. 1, pp. 48–55, Feb. 1996, doi: 10.1115/1.2795945.

[30] Y. C. (Yuan-cheng) Fung, Biomechanics: mechanical properties of living tissues, 2nd ed. Springer, 2010, pp. xviii, 568 p. [Online]. Available: https://ci.nii.ac.jp/ncid/BB12588818

[31] 白土寛和,前野隆司, “「触る」ということ -ヒトとロボットの触覚-,” 表面, vol. 41, no. 5, pp. 145–152, 2003.

[32] A. P. Sripati, S. J. Bensmaia, and K. O. Johnson, “A Continuum Mechanical Model of Mechanoreceptive Afferent Responses to Indented Spatial Patterns,” J. Neurophysiol., vol. 95, no. 6, pp. 3852–3864, Jun. 2006, doi: 10.1152/jn.01240.2005.

[33] 入口克己, 川村貞夫, H.-Y. Han, “人間の手指組織の剛性解析と人工指との比較,” 日本ロボット学会誌, vol. 17, no. 8, pp. 1141–1148, Nov. 1999.

[34] T. Maeno, K. Kobayashi and N. Yamazaki, “Relationship between the Structure of Human Finger Tissue and the Location of Tactile Receptors,” JSME Int. J. Ser. C, vol. 41, no. 1, pp. 94–100, 1998, doi: 10.1299/jsmec.41.94.

[35] R. S. Johansson, “Tactile sensibility in the human hand: receptive field characteristics of mechanoreceptive units in the glabrous skin area.,” J. Physiol., vol. 281, no. 1, pp. 101–125, Aug. 1978, doi: 10.1113/jphysiol.1978.sp012411.

[36] R. S. Johansson and A. B. Vallbo, “Tactile sensibility in the human hand: relative and absolute densities of four types of mechanoreceptive units in glabrous skin.,” J. Physiol., vol. 286, no. 1, pp. 283–300, Jan. 1979, doi: 10.1113/jphysiol.1979.sp012619.

[37] A. B. Vallbo and R. Johansson, “Properties of cutaneous mechanoreceptors in the human hand related to touch sensation,” Hum. Neurobiol., vol. 3, pp. 3–14, Feb. 1984.

[38] R. S. Johansson and Å. B. Vallbo, “Tactile sensory coding in the glabrous skin of the human hand,” Trends Neurosci., vol. 6, pp. 27–32, Jan. 1983, doi: 10.1016/0166-2236(83)90011-5.

[39] 前野隆司, “ヒト指腹部と触覚受容器の構造と機能,” 日本ロボット学会誌, vol. 18, no. 6, pp. 772–775, 2000, doi: 10.7210/jrsj.18.772.

[40] O. Franzén and J. Nordmark, “Vibrotactile frequency discrimination,” Percept. Psychophys., vol. 17, no. 5, pp. 480–484, Sep. 1975, doi: 10.3758/BF03203298.

[41] R. T. Verrillo, A. J. Fraioli, and R. L. Smith, “Sensation magnitude of vibrotactile stimuli,” Percept. Psychophys., vol. 6, no. 6, pp. 366–372, Nov. 1969, doi: 10.3758/BF03212793.

[42] A. B. Vallbo and R. Johansson, “Properties of cutaneous mechanoreceptors in the human hand related to touch sensation,” Hum. Neurobiol., vol. 3, pp. 3–14, Feb. 1984.

[43] J. C. Stevens and K. K. Choo, “Spatial acuity of the body surface over the life span.,” Somatosens. Mot. Res., vol. 13, no. 2, pp. 153–166, 1996, doi: 10.3109/08990229609051403.

[44] K. O. Johnson, “Tactile Spatial Resolution. I. Two-Point Discrimination, Gap Detection, Grating Resolution, and Letter Recognition,” J. Neurophysiol., vol. 46, no. 6, pp. 1177–1191, 1981.

[45] 田中真美, 東山篤規, 宮岡徹, 谷口俊治, 佐藤愛子著, “「触覚と痛み」”,ブレーン出版, ISBN4-89242-642-3, 全328頁, 日本AEM学会誌, vol. 14, no. 2, p. 243, 2006.

[46] Xiaojuan Chen, Fei Shao, Cathy Barnes, Tom Childs, and Brian Henson, “Exploring Relationships between Touch Perception and Surface Physical Properties,” Int. J. Des. Vol 3 No 2 2009, 2009, Accessed: Jan. 01, 2009. [Online]. Available: http://www.ijdesign.org/index.php/IJDesign/article/view/596/261

[47] X. Chen, C. J. Barnes, T. H. C. Childs, B. Henson, and F. Shao, “Materials’ tactile testing and characterisation for consumer products’ affective packaging design,” Mater. Des., vol. 30, no. 10, pp. 4299–4310, Dec. 2009, doi: 10.1016/j.matdes.2009.04.021.

[48] S. Okamoto, H. Nagano, and H.-N. Ho, “Psychophysical Dimensions of Material Perception and Methods to Specify Textural Space,” 2016.

[49] D. T. Bake, S. S. Hsiao, and K. O. Johnson, “Neural Coding Mechanisms in Tactile Pattern Recognition: The Relative Contributions of Slowly and Rapidly Adapting Mechanoreceptors to Perceived Roughness,” J. Neurosci., vol. 17, no. 19, p. 7480, Oct. 1997, doi: 10.1523/JNEUROSCI.17-19-07480.1997.

[50] C. Connor, S. Hsiao, J. Phillips, and K. Johnson, “Tactile roughness: neural codes that account for psychophysical magnitude estimates,” J. Neurosci., vol. 10, no. 12, p. 3823, Dec. 1990, doi: 10.1523/JNEUROSCI.10-12-03823.1990.

[51] T. Yoshioka, B. Gibb, A. K. Dorsch, S. S. Hsiao, and K. O. Johnson, “Neural Coding Mechanisms Underlying Perceived Roughness of Finely Textured Surfaces,” J. Neurosci., vol. 21, no. 17, p. 6905, Sep. 2001, doi: 10.1523/JNEUROSCI.21-17-06905.2001.

[52] E. M. Meftah, L. Belingard, and C. E. Chapman, “Relative effects of the spatial and temporal characteristics of scanned surfaces on human perception of tactile roughness using passive touch,” Exp. Brain Res., vol. 132, no. 3, pp. 351–361, Jun. 2000, doi: 10.1007/s002210000348.

[53] S. J. Lederman, “Tactile roughness of grooved surfaces: The touching process and effects of macro- and microsurface structure,” Percept. Psychophys., vol. 16, no. 2, pp. 385–395, Mar. 1974, doi: 10.3758/BF03203958.

[54] M. Hollins and S. R. Risner, “Evidence for the duplex theory of tactile texture perception,” Percept. Psychophys., vol. 62, no. 4, pp. 695–705, Jan. 2000, doi: 10.3758/BF03206916.

[55] C. J. Cascio and K. Sathian, “Temporal cues contribute to tactile perception of roughness,” J. Neurosci. Off. J. Soc. Neurosci., vol. 21, no. 14, pp. 5289–5296, Jul. 2001, doi: 10.1523/JNEUROSCI.21-14-05289.2001.

[56] A. Bicchi, E. P. Scilingo, and D. De Rossi, “Haptic discrimination of softness in teleoperation: the role of the contact area spread rate,” IEEE Trans. Robot. Autom., vol. 16, no. 5, pp. 496–504, Oct. 2000, doi: 10.1109/70.880800.

[57] K. Fujita, “A New Softness Display Interface by Dynamic Fingertip Contact Area Control,” 5th World Multiconference Syst. Cybern. Inform. 2001, pp. 78–82, 2001.

[58] W. R. Provancher and N. D. Sylvester, “Fingerpad Skin Stretch Increases the Perception of Virtual Friction,” IEEE Trans. Haptics, vol. 2, no. 4, pp. 212–223, Dec. 2009, doi: 10.1109/TOH.2009.34.

[59] M. Konyo, H. Yamada, S. Okamoto, and S. Tadokoro, Alternative Display of Friction Represented by Tactile Stimulation without Tangential Force, vol. 5024. 2008, p. 629. doi: 10.1007/978-3-540-69057-3_79.

[60] Y. Nonomura et al., “Tactile impression and friction of water on human skin,” Colloids Surf. B Biointerfaces, vol. 69, no. 2, pp. 264–267, Mar. 2009, doi: 10.1016/j.colsurfb.2008.11.024.

[61] S. Okamoto, H. Nagano, and Y. Yamada, “Psychophysical Dimensions of Tactile Perception of Textures,” IEEE Trans. Haptics, vol. 6, no. 1, pp. 81–93, First Quarter 2013, doi: 10.1109/TOH.2012.32.

[62] T. Matsuoka, H. Kanai, H. Tsuji, T. Shinya, and T. Nishimatsu, “Predicting Texture Image of Covering Fabric for Car Seat by Physical Properties,” J. Text. Eng., vol. 54, no. 3, pp. 63–74, 2008, doi: 10.4188/jte.54.63.

[63] H. Shirado and T. Maeno, “Modeling of human texture perception for tactile displays and sensors,” in First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference, Mar. 2005, pp. 629–630. doi: 10.1109/WHC.2005.92.

[64] S. Ballesteros, J. M. Reales, L. P. de Leon, and B. Garcia, “The perception of ecological textures by touch: does the perceptual space change under bimodal visual and haptic exploration?,” in First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference, Mar. 2005, pp. 635–638. doi: 10.1109/WHC.2005.134.

[65] S. Guest, A. Mehrabyan, G. Essick, N. Phillips, A. Hopkinson, and F. McGlone, “Physics and tactile perception of fluid-covered surfaces,” J. Texture Stud., vol. 43, Feb. 2012, doi: 10.1111/j.1745-4603.2011.00318.x.

[66] S. Hirai and N. Miki, “Thermal Sensation Display with Controllable Thermal Conductivity,” in 2019 20th International Conference on Solid-State Sensors, Actuators and Microsystems & Eurosensors XXXIII (TRANSDUCERS & EUROSENSORS XXXIII), Jun. 2019, pp. 1659–1661. doi: 10.1109/TRANSDUCERS.2019.8808369.

[67] M. Matusiak, “Investigation of the thermal insulation properties of multilayer textiles,” Fibres Text. East. Eur., vol. 14, pp. 98–102, Jan. 2006.

[68] M. J. Pac, M.-A. Bueno, M. Renner, and S. El Kasmi, “Warm-Cool Feeling Relative to Tribological Properties of Fabrics,” Text. Res. J., vol. 71, no. 9, pp. 806–812, Sep. 2001, doi: 10.1177/004051750107100910.

[69] R. J. Schepers and M. Ringkamp, “Thermoreceptors and thermosensitive afferents,” Touch Temp. PainItch Pleas., vol. 34, no. 2, pp. 177–184, Feb. 2010, doi: 10.1016/j.neubiorev.2009.10.003.

[70] C. Shiota, “DATA MINING-TECHNIQUES AND APPLICATIONS,” Bull. Comput. Stat. Jpn., vol. 10, no. 2, pp. 127–144, 1998, doi: 10.20551/jscswabun.10.2_127.

[71] Y. Sugita, “On the characteristics and application of data mining(Data mining methodologies and application for libraries),” J. Inf. Sci. Technol. Assoc., vol. 60, no. 6, pp. 218–223, 2010, doi: 10.18919/jkg.60.6_218.

[72] 沼尾雅之, 清水周一, 木村雅彦, “Datamining for Causal Analysis.,” 全国大会講演論文集, vol. 第51回, no. データベース, pp. 195–196, Sep. 1995.

[73] A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” in Advances in Neural Information Processing Systems, 2012, vol. 25. [Online]. Available: https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf

[74] G. Ke et al., “LightGBM: A Highly Efficient Gradient Boosting Decision Tree,” in Advances in Neural Information Processing Systems, 2017, vol. 30. [Online]. Available: https://proceedings.neurips.cc/paper/2017/file/6449f44a102fde848669bdd9eb6b76fa-Paper.pdf

[75] R. Agrawal and R. Srikant, “Fast Algorithms for Mining Association Rules in Large Databases,” in Proceedings of the 20th International Conference on Very Large Data Bases, San Francisco, CA, USA, 1994, pp. 487–499.

[76] L. van der Maaten and G. Hinton, “Visualizing Data using t-SNE,” J. Mach. Learn. Res., vol. 9, no. 86, pp. 2579–2605, 2008.

[77] M. Nassar and W. Anbtawi, A 3D Playground for t-SNE With Explainable Classification. 2020. doi: 10.13140/RG.2.2.36358.52806.

[78] E. Tjoa and C. Guan, “A Survey on Explainable Artificial Intelligence (XAI): Toward Medical XAI,” IEEE Trans Neural Netw Learn Syst, vol. PP, Oct. 2020, doi: 10.1109/tnnls.2020.3027314.

[79] Y. J. Phua, 井上克巳, “説明可能な論理規則のグラフ埋め込みによる学習,”人工知能学会全国大会論文集, vol. JSAI2020, p. 3E1GS202-3E1GS202, 2020, doi: 10.11517/pjsai.JSAI2020.0_3E1GS202.

[80] G. Garcia, J. A. Corrales Ramon, J. Pomares, and T. Fernando, “Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain,” Sensors, vol. 9, Dec. 2009, doi: 10.3390/s91209689.

[81] J. G. da Silva, A. A. de Carvalho, and D. D. da Silva, “A strain gauge tactile sensor for finger-mounted applications,” IEEE Trans. Instrum. Meas., vol. 51, no. 1, pp. 18–22, Feb. 2002, doi: 10.1109/19.989890.

[82] J. Dargahi, “A piezoelectric tactile sensor with three sensing elements for robotic, endoscopic and prosthetic applications,” Sens. Actuators Phys., vol. 80, no. 1, pp. 23–30, Mar. 2000, doi: 10.1016/S0924-4247(99)00295-2.

[83] S. Teshigawara, K. Tadakuma, Aiguo Ming, M. Ishikawa, and M. Shimojo, “High sensitivity initial slip sensor for dexterous grasp,” in 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, May 2010, pp. 4867–4872. doi: 10.1109/ROBOT.2010.5509288.

[84] Y. Hotta, Y. Zhang, and N. Miki, “A Flexible Capacitive Sensor with Encapsulated Liquids as Dielectrics,” Micromachines, vol. 3, no. 1, pp. 137–149, Mar. 2012, doi: 10.3390/mi3010137.

[85] H. Ota et al., “Highly deformable liquid-state heterojunction sensors,” Nat. Commun., vol. 5, no. 1, Dec. 2014, doi: 10.1038/ncomms6032.

[86] M. Ohka, H. Kobayashi, J. Takata, and Y. Mitsuya, “An Experimental Optical Three-axis Tactile Sensor Featured with Hemispherical Surface,” J. Adv. Mech. Des. Syst. Manuf., vol. 2, no. 5, pp. 860–873, 2008, doi: 10.1299/jamdsm.2.860.

[87] J. Jiao, Y. Zhang, D. Wang, X. Guo, and X. Sun, “HapTex: A Database of Fabric Textures for Surface Tactile Display,” in 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, Jul. 2019, pp. 331–336. doi: 10.1109/WHC.2019.8816167.

[88] T. Nakadegawa, H. Ishizuka, and N. Miki, “Three-axis scanning force sensor with liquid metal electrodes,” Sens. Actuators Phys., vol. 264, pp. 260–267, Sep. 2017, doi: 10.1016/j.sna.2017.08.014.

[89] R. Etzi, C. Spence, M. Zampini, and A. Gallace, “When Sandpaper Is ‘Kiki’ and Satin Is ‘Bouba’: an Exploration of the Associations Between Words, Emotional States, and the Tactile Attributes of Everyday Materials,” Multisensory Res., vol. 29, no. 1–3, pp. 133–155, 2016, doi: 10.1163/22134808-00002497.

[90] M. Miyazaki, S. Hidaka, M. Imai, K. Kantartzis, H. Okada, and S. Kita, “The facilitatory role of sound symbolism in infant word learning,” in the Annual Meeting of the Cognitive Science Society, 2013, pp. 3080–3085.

[91] M. Arata, M. Imai, J. Okuda, H. Okada, and T. Matsuda, “Gesture in language: How sound symbolic words are processed in the brain,” in the Annual Meeting of the Cognitive Science Society, 2010, pp. 1374–1379.

[92] M. Sakamoto, “System to quantify the impression of sounds expressed by onomatopoeias,” Acoust. Sci. Technol., vol. 41, no. 1, pp. 229–232, Jan. 2020, doi: 10.1250/ast.41.229.

[93] M. Sakamoto, T. Tahara, and J. Watanabe, “A System to Visualize Individual Variation in Tactile Perception using Onomatopoeia Map,” Trans. Virtual Real. Soc. Jpn., vol. 21, no. 2, pp. 213–216, 2016, doi: 10.18974/tvrsj.21.2_213.

[94] 早川智彦, 松井茂, 渡邊淳司, “オノマトペを利用した触り心地の分類手法(<特集>アート&エンタテインメント2),” 日本バーチャルリアリティ学会論文誌, vol. 15, no. 3, pp. 487–490, 2010, doi: 10.18974/tvrsj.15.3_487.

[95] 星貴之, 篠田裕之, “接触力と接触面積を計測する非線形触覚素子,” 計測自動制御学会論文集, vol. 42, no. 7, pp. 727–735, Jul. 2006.

[96] P. Zhang, Q. Wan, C. Feng, and H. Wang, “All Regimes Parasitic Capacitances Extraction Using a Multi-Channel CBCM Technique,” IEEE Trans. Semicond. Manuf., vol. 30, no. 2, pp. 121–125, May 2017, doi: 10.1109/TSM.2017.2669317.

[97] L.-J. Sun et al., “Extraction of geometry-related interconnect variation based on parasitic capacitance data,” IEEE Electron Device Lett., vol. 35, no. 10, pp. 980–982, Oct. 2014, doi: 10.1109/LED.2014.2344173.

[98] C. Hebedean, C. Munteanu, A. Racasan, and C. Pacurar, “Parasitic capacitance removal with an embedded ground layer,” in Eurocon 2013, Zagreb, Croatia, Jul. 2013, pp. 1886–1891. doi: 10.1109/EUROCON.2013.6625235.

[99] M. I. Tiwana, S. J. Redmond, and N. H. Lovell, “A review of tactile sensing technologies with applications in biomedical engineering,” Sens. Actuators Phys., vol. 179, pp. 17–31, Jun. 2012, doi: 10.1016/j.sna.2012.02.051.

[100] S. McKinley et al., “A single-use haptic palpation probe for locating subcutaneous blood vessels in robot-assisted minimally invasive surgery,” Aug. 2015, pp. 1151–1158. doi: 10.1109/CoASE.2015.7294253.

[101] A. Mata, A. J. Fleischman, and S. Roy, “Characterization of Polydimethylsiloxane (PDMS) Properties for Biomedical Micro/Nanosystems,” Biomed. Microdevices, vol. 7, no. 4, pp. 281–293, Dec. 2005, doi: 10.1007/s10544-005-6070-2.

[102] D. Bodas and C. Khan-Malek, “Formation of more stable hydrophilic surfaces of PDMS by plasma and chemical treatments,” Microelectron. Eng., vol. 83, no. 4–9, pp. 1277–1279, Apr. 2006, doi: 10.1016/j.mee.2006.01.195.

[103] D. Meyerhofer, “Characteristics of resist films produced by spinning,” J. Appl. Phys., vol. 49, no. 7, pp. 3993–3997, Jul. 1978, doi: 10.1063/1.325357.

[104] Miwa S. and Ohtake Y., “Chemical Changes in Cross-linked Silicone Rubber by Ozone-water Treatments,” NIPPON GOMU KYOKAISHI, vol. 87, no. 5, pp. 161–167, 2014, doi: 10.2324/gomu.87.161.

[105] H. Wu, B. Huang, and R. N. Zare, “Construction of microfluidic chips using polydimethylsiloxane for adhesive bonding,” Lab. Chip, vol. 5, no. 12, p. 1393, 2005, doi: 10.1039/b510494g.

[106] G. Taguchi, “Quality engineering (Taguchi methods) for the development of electronic circuit technology,” IEEE Trans. Reliab., vol. 44, no. 2, pp. 225–229, Jun. 1995, doi: 10.1109/24.387375.

[107] Itagaki M., “Principle and Analytical Method of Impedance Spectroscopy,” Hyomen Kagaku, vol. 33, no. 2, pp. 64–68, 2012, doi: 10.1380/jsssj.33.64.

[108] T. Mikolov, K. Chen, G. Corrado, and J. Dean, “Efficient Estimation of Word Representations in Vector Space,” in Proceeding of the 1st International Conference on Learning Representations, Workshops Track, 2013, p. 12 pages.

[109] P. Bojanowski, E. Grave, A. Joulin, and T. Mikolov, “Enriching Word Vectors with Subword Information,” Trans. Assoc. Comput. Linguist., vol. 5, pp. 135–146, 2017, doi: 10.1162/tacl_a_00051.

[110] M. E. Peters et al., “Deep contextualized word representations,” ArXiv180205365 Cs, Mar. 2018, Accessed: Jan. 08, 2021. [Online]. Available: http://arxiv.org/abs/1802.05365

[111] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun. 2019, vol. 1, pp. 4171–4186.

[112] A. Vaswani et al., “Attention is All You Need,” in Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017, pp. 6000–6010.

[113] Z. Yang, Z. Dai, Y. Yang, J. G. Carbonell, R. Salakhutdinov, and Q. V. Le, “XLNet: Generalized Autoregressive Pretraining for Language Understanding,” CoRR, vol. abs/1906.08237, 2019, [Online]. Available: http://arxiv.org/abs/1906.08237

[114] Y. Liu et al., “RoBERTa: A Robustly Optimized BERT Pretraining Approach,” CoRR, vol. abs/1907.11692, 2019, [Online]. Available: http://arxiv.org/abs/1907.11692

[115] Z. Lan, M. Chen, S. Goodman, K. Gimpel, P. Sharma, and R. Soricut, “ALBERT: A Lite BERT for Self-supervised Learning of Language Representations,” 2020. [Online]. Available: https://openreview.net/forum?id=H1eA7AEtvS

[116] K. Clark, M.-T. Luong, Q. V. Le, and C. D. Manning, “ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators,” CoRR, vol. abs/2003.10555, 2020, [Online]. Available: https://arxiv.org/abs/2003.10555

[117] T. B. Brown et al., “Language Models are Few-Shot Learners,” CoRR, vol. abs/2005.14165, 2020, [Online]. Available: https://arxiv.org/abs/2005.14165

[118] A. Ramesh et al., “Zero-Shot Text-to-Image Generation,” CoRR, vol. abs/2102.12092, 2021, [Online]. Available: https://arxiv.org/abs/2102.12092

[119] awesome-embedding-models. [Online]. Available: https://github.com/Hironsan/awesome-embedding-models

[120] bert-japanese. [Online]. Available: https://github.com/cl-tohoku/bert-japanese
