Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/31886
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳志宏 | |
dc.contributor.author | CHI-TE SHEN | en |
dc.contributor.author | 慎基德 | zh_TW |
dc.date.accessioned | 2021-06-13T03:23:22Z | - |
dc.date.available | 2007-07-31 | |
dc.date.copyright | 2006-07-31 | |
dc.date.issued | 2006 | |
dc.date.submitted | 2006-07-28 | |
dc.identifier.citation | [1] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.
[2] D. Palomba, M. Sarlo, A. Angrilli, and A. Mini, "Cardiac responses associated with affective processing of unpleasant film stimuli," International Journal of Psychophysiology, vol. 36, no. 1, pp. 45–57, 2000.
[3] K. H. Kim, S. E. Bang, and S. R. Kim, "Emotion recognition system using short-term monitoring of physiological signals," Medical & Biological Engineering & Computing, vol. 42, pp. 419–427, 2004.
[4] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, International Affective Picture System (IAPS): Technical Manual and Affective Ratings, NIMH Center for the Study of Emotion and Attention, 1997.
[5] R. W. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: analysis of affective physiological state," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 10, pp. 1175–1191, 2001.
[6] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Applied Signal Processing, vol. 2004, no. 11, pp. 1672–1687, 2004.
[7] P. Gomez, W. A. Stahel, and B. Danuser, "Respiratory responses during affective picture viewing," Biological Psychology, vol. 67, pp. 359–373, 2004.
[8] D. Aha and D. Kibler, "Instance-based learning algorithms," Machine Learning, vol. 6, pp. 37–66, 1991.
[9] M. Malik (Chairman), "Heart rate variability: standards of measurement, physiological interpretation, and clinical use," Writing Committee of the Task Force, Department of Cardiological Sciences, St George's Hospital Medical School.
[10] H. P. Huang et al., "DSP-based controller for a multi-degree prosthetic hand," in Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1378–1383, 2000.
[11] C.-W. Hsieh, C.-W. Mao, M.-S. Young, and T.-L. Yeh, "Assessment of parasympathetic control of blood vessel by pulsation spectrum and comparison with spectral method of RR intervals," Biomedical Engineering: Applications, Basis and Communications, vol. 15, no. 1, pp. 8–16, Feb. 2003.
[12] 李建德, "A study of the relationship between hostility and autonomic nervous function," Master's thesis, Clinical Psychology Division, Institute of Behavioral Medicine, National Cheng Kung University, 2001.
[13] 謝長倭, "Design of a pulse-waveform harmonic analysis system for autonomic nervous function research," Ph.D. dissertation, Department of Electrical Engineering, National Cheng Kung University, 2003. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/31886 | - |
dc.description.abstract | In human-computer interaction, it is important to understand the user's feelings and feedback through the user's emotional expression. The aim of this study is to acquire and analyze physiological signals for emotion recognition and classification. The IAPS (International Affective Picture System) was used to elicit four classes of emotion in multiple subjects: strong positive, weak positive, strong negative, and weak negative. Pre-selected video clips were also used to elicit laughter, pleasure, disgust, and fear in multiple users and in a single user, while physiological sensors simultaneously recorded EMG, ECG, respiration, and blood-pulse signals. After normalization, signal processing, and feature extraction, 33 feature parameters were classified with a KNN (k-nearest neighbor) classifier to recognize the emotions.
The results show recognition rates of 90.87%, 95.32%, and 96.58% when the 33 features are classified by KNN for IAPS pictures with multiple users, video clips with multiple users, and video clips with a single user, respectively. When only the top 10 features ranked by information gain are used, the rates are 91.45%, 96.1%, and 97.61%. Because emotional responses in physiological signals are highly subject-specific, classifying a user's emotions with data from other users achieved only about 31.93% accuracy, whereas classification with the user's own data reached 95.47%. Individual differences therefore strongly affect recognition accuracy, and the number of users must be increased to reduce their impact. Finally, this thesis discusses the difficulties encountered in user emotion recognition and suggests directions and strategies for future research on real-time user emotion recognition systems. | zh_TW |
dc.description.abstract | It is important in human-computer interaction to understand the user's feelings and feedback from the user's emotional expression. The aim of this study is to develop an affective response recognition system based on bio-signal measurement, feature extraction, and classification. The IAPS (International Affective Picture System) is adopted to elicit users' affective responses in four classes: high valence/high arousal, high valence/low arousal, low valence/high arousal, and low valence/low arousal. Prepared video clips are also used to elicit laughing, pleasure, disgust, and fear in multiple users and in a single user. The users' physiological signals (EMG, ECG, blood pulse, and respiration) are measured and recorded simultaneously. After normalization, signal post-processing, and feature extraction, the signals are classified by a KNN classifier to identify the corresponding affective response.
The results show that the accuracies of using IAPS with multiple users, video clips with multiple users, and video clips with a single user are 90.87%, 95.32%, and 96.58%, respectively. If only the top 10 features ranked by information gain are used, the accuracies become 91.45%, 96.1%, and 97.61%. If the data of one specific user is used as the testing dataset and the other users' data as the training dataset, the accuracy drops to only 31.93%; the main reasons are the small number of users and the pronounced differences between individuals. If instead one user's own data is split into training and testing sets by n-fold cross-validation, the accuracy remains high, at about 95.47%. Because individual differences severely affect recognition accuracy, more experiments and more biophysical signals are needed to overcome them. Finally, this thesis discusses the difficulties of studying user affective response recognition and provides suggestions and strategies toward a real-time user affective response recognition system. | en |
dc.description.provenance | Made available in DSpace on 2021-06-13T03:23:22Z (GMT). No. of bitstreams: 1 ntu-95-R93921126-1.pdf: 1293169 bytes, checksum: 4deef06d968b2daaa9f199e16cdd67bb (MD5) Previous issue date: 2006 | en |
dc.description.tableofcontents | Chinese Abstract 2
Abstract 3
Acknowledgements 4
Table of Contents 5
List of Figures 7
List of Tables 8
Chapter 1 Introduction 9
1.1 Research background 9
1.2 Research topic 10
1.3 Literature review of affective response recognition 10
1.4 Thesis organization 12
Chapter 2 Overview and theoretical background of affective response recognition 13
2.1 Mechanisms of affective response 13
2.1.1 Definition of emotion 13
2.2 The autonomic nervous system 14
2.2.1 The sympathetic nervous system 14
2.2.2 The parasympathetic nervous system 14
2.3 User physiological signals and their features 15
2.3.1 Electrocardiogram 15
2.3.2 Electromyogram 17
2.3.3 Respiration 18
2.3.4 Pulse 19
2.3.5 Signal filtering 19
2.3.6 Physiological-signal features 21
Chapter 3 Methods for affective response recognition 22
3.1 System overview 22
3.2 Eliciting multiple users' affective responses with multimedia content 23
3.2.1 Multimedia content: IAPS 23
3.2.2 Multimedia content: video clips 25
3.3 Eliciting a single user's affective responses with multimedia content 25
3.4 Experimental design 26
3.4.1 Users and experimental environment 26
3.4.2 Physiological-sensor placement 26
3.4.3 Experimental procedure and questionnaire 27
Chapter 4 Classifier and feature-dimension reduction 29
4.1 Feature computation and normalization 29
4.2 Signal-feature normalization 30
4.3 Classification algorithm: k-nearest neighbor 30
4.4 Cross-validation 31
4.5 Dimension reduction: information gain 32
Chapter 5 Experimental results 35
5.1 KNN classification results for multiple users with picture stimuli 35
5.2 KNN classification results for multiple users with video stimuli 39
5.3 Classification with self-evaluation classes under video stimuli 43
5.4 Single-user affective response results 45
Chapter 6 Discussion, conclusions, and future work 49
6.1 Discussion 49
6.1.1 Discussion of experimental results 49
6.1.2 Inter-user differences in the multi-user experiments 50
6.1.3 Feature distributions versus emotions in the single-user experiment 53
6.1.4 Removal of physiological signals with similar characteristics 55
6.1.5 Model building and testing methods other than cross-validation 58
6.1.6 Respiration signals in emotion analysis 61
6.1.7 User movement 63
6.1.8 Timing of emotion onset 64
6.2 Conclusions 65
6.3 Future work 67
Appendix A Physiological-signal features and information-gain ranking 68
Appendix B Emotion experiment questionnaire 72
References 73 | |
dc.language.iso | zh-TW | |
dc.title | 以生理訊號探討多媒體環境之使用者情感反應 | zh_TW |
dc.title | Study of User’s Affective Response on Multimedia Contents Using Physiological Signal | en |
dc.type | Thesis | |
dc.date.schoolyear | 94-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 梁庚辰,吳家麟,陳中明,柳煌 | |
dc.subject.keyword | human-computer interface, emotion recognition, IAPS, physiological signals, KNN | zh_TW |
dc.subject.keyword | human-computer interface, emotion recognition, physiological signal, IAPS, KNN | en |
dc.relation.page | 74 | |
dc.rights.note | Authorized for paid access | |
dc.date.accepted | 2006-07-30 | |
dc.contributor.author-college | College of Electrical Engineering and Computer Science | zh_TW |
dc.contributor.author-dept | Graduate Institute of Electrical Engineering | zh_TW |
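The abstract describes classifying 33 physiological features with a k-nearest-neighbor (KNN) classifier after normalization. As a rough illustration only (not the thesis's actual code), a minimal KNN with column-wise z-score normalization might look like the following sketch; the two-feature samples and emotion labels are invented for demonstration:

```python
import math
from collections import Counter

def zscore_normalize(rows):
    """Column-wise z-score normalization, as applied to the feature vectors."""
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    stds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
            for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, stds)] for r in rows]

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest samples (Euclidean)."""
    neighbors = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    votes = Counter(label for _, label in neighbors[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-feature samples (e.g. mean heart rate, EMG amplitude)
X = [[60, 0.20], [62, 0.25], [90, 0.80], [95, 0.75]]
y = ["pleasure", "pleasure", "fear", "fear"]
print(knn_predict(X, y, [91, 0.70], k=3))  # -> fear
```

In a setting like the thesis's, the same vote would run over 33 normalized features per sample rather than two, with accuracy estimated by cross-validation.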
Appears in Collections: | Department of Electrical Engineering |
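The abstract also reports selecting the top 10 features by information gain before reclassifying. A minimal sketch of information gain for one continuous feature, assuming a simple median split for discretization (the discretization actually used in the thesis is not specified in this record):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """Entropy reduction from splitting one continuous feature at its median."""
    median = sorted(values)[len(values) // 2]
    splits = ([l for v, l in zip(values, labels) if v < median],
              [l for v, l in zip(values, labels) if v >= median])
    n = len(labels)
    conditional = sum(len(s) / n * entropy(s) for s in splits if s)
    return entropy(labels) - conditional

# A feature that separates two emotion classes perfectly gains the full 1 bit
gain = information_gain([1, 2, 3, 10, 11, 12], ["calm"] * 3 + ["fear"] * 3)
print(round(gain, 3))  # -> 1.0
```

Ranking all 33 features by this score and keeping the 10 largest reproduces, in spirit, the dimension-reduction step described in the abstract.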
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-95-1.pdf Restricted Access | 1.26 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.