リケラボ論文検索 is a paper search service for searching degree theses and faculty papers held in university repositories across Japan.


Detecting internal emotions using video cameras

KEYA DAS TILOTTOMA, Saitama University. DOI: info:doi/10.24561/00019153

2020

Abstract

Automatically recognizing human emotion is an interesting and challenging task with many applications, such as human-robot interaction and movie marketing. Emotion analysis and recognition has become an active research topic in the computer vision community, and making computers more human-like for intelligent user interfaces is one of the central challenges in Human-Computer Interaction (HCI) and Human-Robot Interaction (HRI) today. Emotion, one of the user's affective states, is recognized as one of the most important channels through which people communicate with each other. Given the importance and potential of emotion, interfaces that make effective use of the human user's emotional state are increasingly desirable in intelligent user interfaces such as human-robot interaction, and there has consequently been much work on systems that identify emotional states. Recognizing human emotion from facial expressions is a common approach, and many researchers work in this direction. However, facial expressions can sometimes be faked or hidden; that is, one's "apparent emotions" as read from, say, facial expressions may not reflect one's genuine "inner emotions". Other modalities, such as physiological responses, should therefore be investigated for detecting and recognizing a person's inner emotions. This thesis aims to build a practical system for detecting such emotions by observing physiological responses. We focus on sensing physiological changes through visual means using only conventional cameras, as this requires no specialized equipment.
As mentioned earlier, the computer vision community has made many advances in apparent emotion recognition through facial expressions. The psychophysiology community, on the other hand, has conducted several studies on detecting and recognizing internal emotions through different physiological channels. Emotion has been recognized from many physiological signs, such as heart rate changes, eye movements, eye blinks, changes in skin conductance, and changes in skin temperature. Although many different physiological signs are used, many researchers find cardiac activity useful for emotion recognition, so this thesis explores the use of cardiac activity for that purpose. However, most past studies read cardiac activity with electrocardiography (ECG). ECGs are effective but limited by the need for attached sensors and their higher cost. To realize a practical system for many real-world settings, we need a method that can sense cardiac activity without requiring any wearable attachments or devices.
Fortunately, remote photoplethysmography (PPG) algorithms have received attention in recent years. These techniques read cardiac activity, such as heart rate (HR), from conventional cameras, typically by observing small changes in skin color over time. This thesis aims to use cardiac activity sensed via remote PPG to detect and recognize emotions. Since remote PPG has been shown to work with conventional cameras, our approach has the benefit that cameras such as webcams, surveillance cameras, and cellphone cameras could be used. With the ability to sense cardiac activity without contact sensors, we present a convenient system for detecting internal emotions. As in the psychophysiology literature, we evaluate our approach by recognizing emotional reactions to emotionally stimulating videos, such as horror and comedy clips. In the first phase of our work, we showed video content to human subjects and collected HR data using an attached sensor (Fitbit) for three emotional states (normal resting, funny, and horror). We then confirmed that the average HR in the normal resting state and in the emotionally stimulated states exhibits a statistically significant difference.
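The core idea behind remote PPG, reading heart rate from small skin-color changes over time, can be sketched in a few lines. The following is a minimal illustration, not the thesis's actual pipeline: it assumes a precomputed 1-D trace of the mean green-channel intensity of a face region per frame (the helper name and the pulse band are assumptions), and recovers the dominant pulse frequency with an FFT.

```python
import numpy as np

def estimate_heart_rate(green_trace, fps, lo=0.7, hi=4.0):
    """Estimate heart rate (bpm) from a mean green-channel face trace.

    green_trace: one sample per video frame (mean green intensity of the
    face ROI). lo/hi bound plausible pulse frequencies (42-240 bpm).
    Illustrative sketch only; real rPPG pipelines add detrending,
    bandpass filtering, and motion compensation.
    """
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))         # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)      # keep the plausible pulse band
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                        # Hz -> beats per minute

# Synthetic check: a 1.2 Hz (72 bpm) pulse buried in noise.
fps = 30.0
t = np.arange(0, 20, 1.0 / fps)
rng = np.random.default_rng(0)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.2, t.size)
print(estimate_heart_rate(trace, fps))  # → 72.0
```

A 20-second window at 30 fps gives a frequency resolution of 0.05 Hz, i.e. about 3 bpm, which is why longer observation windows yield more precise HR estimates.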
The first phase showed that HR can be used to detect changes in emotional state, but did not explore recognizing which emotions were present. In the next phase, we investigated the feasibility of using cardiac pulse signals to recognize different emotional states (joy vs. fear) and how this compares with the use of facial expressions. Specifically, we used the OpenFace facial landmark tracker to estimate each subject's average facial action unit intensities over 30-second segments of both the comedy and horror conditions. In this thesis, we use a remote, video-based cardiac activity sensing technique to obtain the physiological data for identifying emotional states, and we show that emotional states can be differentiated from the remotely sensed cardiac pulse patterns alone. We conducted an experimental study on recognizing the emotions of people watching video clips: all volunteers watched the same comedy and horror clips, we recorded them on video, and we estimated their cardiac pulse signals from that footage. From the cardiac pulse signal alone, we were able to classify whether a subject was watching the comedy or the horror clip. We also compare against classification on the same task using facial action units and discuss how the two modalities compare. We further compared the HR method with various wearable sensors, such as the Wii Balance Board, a pulse oximeter, and a galvanic skin response (GSR) sensor, all of which are used to detect physiological signals.
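The segment-classification step described above can be illustrated with a small sketch. The thesis does not specify its classifier here, so the features (per-segment mean HR and HR variability), the class statistics, and the nearest-centroid rule below are all assumptions, with synthetic numbers standing in for real pulse-derived data; only the overall shape of the task (labelled 30-second segments, leave-one-out evaluation) follows the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-segment features standing in for real pulse data:
# [mean HR (bpm), HR standard deviation] over each 30-second segment.
# The class means/spreads are illustrative assumptions, not measured values.
comedy = rng.normal([78.0, 3.0], [3.0, 1.0], size=(20, 2))
horror = rng.normal([90.0, 6.0], [3.0, 1.0], size=(20, 2))

X = np.vstack([comedy, horror])
y = np.array([0] * 20 + [1] * 20)  # 0 = comedy, 1 = horror

# Leave-one-out nearest-centroid classification: hold out one segment,
# compute each class's centroid from the rest, assign the held-out
# segment to the nearer centroid.
correct = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    c0 = X[mask & (y == 0)].mean(axis=0)
    c1 = X[mask & (y == 1)].mean(axis=0)
    pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
    correct += int(pred == y[i])

accuracy = correct / len(X)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

With well-separated synthetic classes the accuracy is high by construction; the interesting empirical question in the thesis is how close the real pulse-only accuracy comes to the facial-action-unit baseline.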
In short, the two main contributions of this PhD thesis are sensing emotional changes in humans and determining the effectiveness of emotion recognition using only remotely sensed PPG data, relative to conventional facial expression analysis. To our knowledge, we are among the first to bridge the gap between computer vision and psychophysiology by presenting a promising system for the visual detection of internal emotions.
