
In-Vehicle Customized AR-HUD Design to Provide Driving Safety Information Based on User Mental Model

張, 寒 (Zhang, Han) — University of Tsukuba (筑波大学) — DOI: 10.15068/0002006315

2023.02.16

Abstract

In recent years, the continuous development of in-vehicle information systems has dramatically enriched the driving experience, while also occupying the driver's cognitive resources to varying degrees and causing driving distraction. Given this growing complexity, effectively managing information while further improving driving safety has become a critical issue for in-vehicle information systems. A new interaction method that incorporates AR (augmented reality) and HUD (head-up display) technology into in-vehicle information systems is currently gaining widespread attention: it superimposes in-vehicle information onto the real driving scene, meeting the needs of complex tasks while improving driving safety. Studying AR-HUD information architecture is therefore essential for understanding how an in-vehicle information system can handle information complexity efficiently while improving driving safety.

This study applied AR-HUD technology to the in-vehicle navigation system and designed a customized AR-HUD interface based on the driving behavior and information needs of beginner and skilled drivers. The interface design is studied to construct information-organization guidelines and visual design strategies for in-vehicle AR-HUD interfaces.

To verify the effectiveness of the proposed customized AR-HUD interface for the target groups, this study used AR-HUD-assisted driving behavior as an entry point to show that the AR-HUD interface can improve drivers' driving behavior and reduce the information-recognition load.

The main research work is structured as follows:

1. Researched and analyzed target users' driving situations and behaviors through the qualitative research methods of questionnaires and interviews. Explored users' information needs in depth and layered the information structure based on the users' mental model, to avoid displaying too much or too little information while driving.

2. Completed the information-module (HUD) construction and the AR-HUD interface's status-module (AR) design based on interface design principles. Additionally, the route and interaction designs are based on four of Japan's most common road situations. In the icon design, we tested the cognitive evaluation of the visual elements to ensure that users could rapidly understand the semantics of the HUD.

3. Analyzed the impact of the in-vehicle AR-HUD system on the driver's eye-movement behavior and responsiveness while driving, using a mix of descriptive statistics, box plots, and non-parametric tests. Additionally, we evaluated the effectiveness of the AR-HUD interface by developing a hierarchical evaluation index system based on driver driving behavior. The results show that, in terms of eye-movement behavior, the AR-HUD can significantly reduce the time drivers spend looking down at the H area, while improving beginners' attention to hazardous areas of interest when driving on urban roads. In terms of reaction time, the AR-HUD interface can significantly shorten the time participants need to perceive hazardous driving situations.
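The mental-model layering described in step 1 is commonly derived from card-sorting-style data: items that participants repeatedly group together end up in the same display layer. The sketch below is purely illustrative — the item names, participant groupings, and majority threshold are all invented, and the grouping rule is a crude stand-in for the cluster analysis a real study would use.

```python
# Hypothetical sketch of deriving information layers from card-sort data.
# All item names and groupings below are invented for illustration.
from itertools import combinations
from collections import Counter

# Each participant's grouping of candidate HUD information items.
sorts = [
    [{"speed", "speed_limit"}, {"turn_arrow", "lane_guide"}, {"hazard_alert"}],
    [{"speed", "speed_limit", "hazard_alert"}, {"turn_arrow", "lane_guide"}],
    [{"speed", "speed_limit"}, {"turn_arrow", "lane_guide", "hazard_alert"}],
]

# Count how often each pair of items lands in the same group.
co = Counter()
for participant in sorts:
    for group in participant:
        for pair in combinations(sorted(group), 2):
            co[pair] += 1

# Items paired by a majority of participants go in the same layer
# (a crude single-link grouping; real studies use cluster analysis).
threshold = len(sorts) / 2
layers = []
for a, b in [p for p, n in co.items() if n > threshold]:
    for layer in layers:
        if a in layer or b in layer:
            layer.update((a, b))
            break
    else:
        layers.append({a, b})

print(layers)  # two layers: {speed, speed_limit} and {lane_guide, turn_arrow}
```

Note that `hazard_alert` is grouped differently by every participant, so it joins no layer — exactly the kind of disagreement that mental-model analysis surfaces before deciding what the interface displays.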
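The non-parametric comparison mentioned in step 3 can be illustrated with a Mann-Whitney U test, a standard rank-based test for comparing two independent samples such as glance durations with and without the AR-HUD. The data below are hypothetical, not the study's measurements, and this implements only the U statistic, not the full significance test.

```python
# Minimal Mann-Whitney U statistic, implemented from scratch.
# The glance-duration samples are hypothetical, for illustration only.

def ranks(values):
    """Assign 1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            result[order[k]] = avg
        i = j + 1
    return result

def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b."""
    r = ranks(list(a) + list(b))
    rank_sum_a = sum(r[:len(a)])
    return rank_sum_a - len(a) * (len(a) + 1) / 2

# Hypothetical glance durations (seconds) toward the head-down H area.
baseline = [2.1, 1.8, 2.4, 2.0, 2.6]   # without AR-HUD
ar_hud   = [1.2, 0.9, 1.5, 1.1, 1.3]   # with AR-HUD

u = mann_whitney_u(ar_hud, baseline)
print(u)  # 0.0 — every AR-HUD glance is shorter than every baseline glance
```

A U of 0 is the extreme case (complete separation of the samples); in practice the statistic is compared against its null distribution, or one simply calls `scipy.stats.mannwhitneyu` to obtain a p-value.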

