Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43644
Full metadata record
dc.contributor.advisor: 陳志宏
dc.contributor.author: Hsuan-Kai Wang (en)
dc.contributor.author: 王炫凱 (zh_TW)
dc.date.accessioned: 2021-06-15T02:25:01Z
dc.date.available: 2009-12-31
dc.date.copyright: 2009-08-19
dc.date.issued: 2009
dc.date.submitted: 2009-08-18
dc.identifier.citation[1] R. J. Davidson, K. R. Sherer and H. H. Goldsmith, “HANDBOOK OF
AFFECTIVE SCIENCES,” OXFORD.
[2] G. E. Schwartz, P. L. Fair, P. S. Greenberg, M. J. Friedman, and G. L. Klerman,
“Facial EMG in the assessment of emotion,” Psychophysiology, vol. 11, no. 2, pp.
237, 1974.
[3] P. Ekman, R. W. Levenson, and W. V. Friesen, “Autonomic nervous system
activity distinguishes among emotions,” Science, vol. 221, no. 4616, pp. 1208–1210,
1983.
[4] J. T. Lanzetta and S. P. Orr, “Excitatory strength of expressive faces: effects of
happy and fear expressions and context on the extinction of a conditioned fear
response,” Journal of Personality and Social Psychology, vol. 50, no. 1, pp. 190–194,
1986.
[5] A. Pecchinenda and C. Smith, “Affective significance of skin conductance
activity during difficult problem-solving task,” Cognition and Emotion, vol. 10, no. 5,
pp. 481–503, 1996.
[6] J. J. Gross and R. W. Levenson, “Hiding feelings: the acute effects of inhibiting
negative and positive emotion,” Journal of Abnormal Psychology, vol. 106, no. 1, pp.
95–103, 1997.
[7] M. P. Tarvainen, A. S. Koistinen, M. Valkonen-Korhonen, J. Partanen, and P. A.
Karjalainen, “Analysis of galvanic skin responses with principal components and
clustering techniques,” IEEE Transactions on Biomedical Engineering, vol. 48,no. 10,
pp. 1071–1079, 2001.
[8] J. Scheirer, R. Fernandez, J. Klein, and R. W. Picard, “Frustrating the user on
purpose: a step toward building an affective computer,” Interacting with Computers,
vol. 14, no. 2, pp.93–118, 2002.
[9] P. J. Lang, M. M. Bradley and B. N. Cuthbert, “International Affective Picture
System (IAPS):Technical Manual and Affective Ratings,” NIMH Center for the Study
of Emotion and Attention 1997
[10] J. Kim and E. Andre, “Emotion Recognition based on Physiological Changes in
Music Listening,” IEEE Transactions on Pattern Analysis and Machine Intelligence.
2008
[11] J. A. Healey, “Wearable and Automotive Systems for Affect Recognition from
Physiology,” PhD thesis, MIT, Cambridge, MA, May 2000
[12] R. W. Picard, E. Vyzas and J. A. Healey, “Toward machine emotional
intelligence: analysis of affective physiological state,” IEEE Transactions Pattern
- 55 -
Analysis and Machine Intelligence, vol. 23, no. 10, pp. 1175–1191, 2001.
[13] A. Haag, S. Goronzy, P. Schaich and J. Williams, “Emotion Recognition Using
Bio-Sensors: First Step Towards an Automatic System,” Affective Dialogue Systems,
Tutorial and ResearchWorkshop, Kloster Irsee, Germany, June 14-16, 2004
[14] K. H. Kim, S. E. Bang and S. R. Kim, “Emotion recognition system using
short-term monitoring of physiological signals,” Med. Biol. Eng. Computer, 2004, 42,
419-427
[15] C. L. Lisetti and F. Nasoz, “Using Noninvasive Wearable Computers to
Recognize Human Emotions from Physiological Signals,” EURASIP Journal on
Applied Signal Processing 2004:11, 1672–1687
[16] E. Leon, G. Clarke, V. Callaghan and F. Sepulveda, “A user-independent
real-time emotion recognition system for software agents in domestic environments,”
Engineering Applications of Artificial Intelligence, 2007
[17] R. B. Malmo, “Activation: a neurophysiological dimension,” Psychol. Rev. 66,
367– 386, 1959
[18] 慎基德,”以生理訊號探討多媒體環境之使用者情緒反應,” 國立台灣大學電
機工程學系碩士論文, 民國九十五年
[19] H. P. Huang and C. Y. Chiang, “DSP-Based Controller for a Muti-Degree
Prosthetic Hand,” Proceedings of IEEE International Conference on Robotics and
Automation, pp.1378-1383, 2000
[20] P. Gomez, W. A. Stahel and B. Danuser, “Respiratory response during affective
picture viewing,” Biological Psychology, 2004
[21] R. L. Mandryk and M. S. Atkins, “A fuzzy physiological approach for
continuously modeling emotion during interaction with play technologies,”
International Journal of Human-Computer Studies, 2007
[22] Y. Nagai, H. D. Critchley, E. Featherstone and M. R. Trimble, “Activity in
ventromedial prefrontal cortex covaries with sympathetic skin conductance level - a
physiological account of a physiological account of a ‘‘default mode’’ of brain
function,” Neuroimage, 2004
[23] C. L. Krumhansl, “An Exploratory Study of Musical Emotions and
Psychophysiology,” Canadian J. Experimental Psychology, vol. 51, pp. 336-352,
1997.
[24] L. B. Meyer, “Emotion and Meaning in Music,” Univ. of Chicago Press, 1956.
[25] W. B. Davis and M. H. Thaut, “The Influence of Preferred Relaxing Music on
Measures of State Anxiety, Relaxation, and Physiological Responses,” J. Music
Therapy, vol. 26, no. 4, pp. 168-187, 1989.
[26] J. S. Taylor and N. Cristianini, “Kernel Methods for Pattern Analysis,”
Cambridge University Press, 2004.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43644
dc.description.abstract: Emotion recognition technology combined with portable devices such as mobile phones can provide more complete information for interpersonal communication and enrich human-computer interaction. Related physiological information can be used to build a real-time emotion recognition system for a single user. In this study, music was used to alter users' emotions, eliciting three emotional responses: gradual relaxation, pleasant (positive), and unpleasant (negative), while the users' electromyogram, respiration, pulse, and skin conductance were recorded. After filtering, segmentation, calibration, and normalization, relevant features were extracted from the collected signals. These features, combined with classifiers, were used to identify the physiological signals most useful for emotion discrimination.
In the offline results, single-user recognition accuracy reached 95.61% for relaxed vs. strongly aroused states and 91.69% for positive vs. negative states. In addition, repeated single-user experiments were used to observe skin conductance trends under different emotional states: conductance gradually decreased during relaxation and rose during strong responses, consistent with results in the literature.
In the real-time results, single-user recognition accuracy reached 94.69% for relaxed vs. strongly aroused states and 81.00% for positive vs. negative states.
Finally, this study discusses the problems encountered in real-time single-user emotion recognition from physiological signals and outlines the work needed to build such a system in the future. (zh_TW)
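The abstract's feature pipeline (filtering, segmentation, calibration, normalization) can be sketched as follows. This is a minimal illustration only: the sampling rate, window length, and per-user baseline scheme are assumptions for the example, not the thesis's actual parameters, and the filtering stage is omitted.

```python
import numpy as np

def extract_features(signal, fs=256, win_sec=5):
    """Segment a 1-D physiological signal into fixed-length windows and
    take the per-window mean as a simple feature (illustrative only)."""
    win = fs * win_sec
    n = len(signal) // win            # number of complete windows
    windows = signal[: n * win].reshape(n, win)
    return windows.mean(axis=1)

def calibrate_and_normalize(features, baseline):
    """Calibrate by subtracting a per-user resting baseline, then
    min-max normalize the corrected features into [0, 1]."""
    corrected = features - baseline
    lo, hi = corrected.min(), corrected.max()
    return (corrected - lo) / (hi - lo)
```

Per-user calibration of this kind is what lets features from different sessions or users be compared on a common scale before classification.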
dc.description.abstract: Integrating emotion recognition with portable devices such as cell phones could provide more complete information for interpersonal communication and better human-computer interaction. A real-time emotion recognition system for individual users can be implemented using related bio-information. In this research, specific music was chosen to elicit the user's emotions (relaxed, positive, and negative). Physiological signals were acquired through four biosensors: electromyogram, skin conductance, respiration, and pulse. Physiological features were obtained through feature extraction methods such as filtering, segmentation, calibration, and normalization, and were then classified using pattern recognition techniques.
The accuracy of offline analysis reached 95.61% and 91.69% for recognition of "relaxed vs. excited" and "positive vs. negative," respectively. In addition, the observed trends in users' skin conductance responses match results reported by other studies.
Furthermore, the accuracy of real-time analysis was 94.69% and 81.00% for recognition of "relaxed vs. excited" and "positive vs. negative," respectively.
Finally, the limitations of real-time emotion recognition for individual users are discussed; further work is needed to optimize the implementation of such a system. (en)
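The pattern recognition step named in the table of contents is KNN. A minimal k-nearest-neighbour sketch over normalized feature vectors might look like the following; the toy feature vectors and labels are invented for illustration and are not the thesis's data.

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training
    samples under Euclidean distance (basic KNN)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy normalized feature vectors: [EMG level, skin conductance].
train_X = np.array([[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8]])
train_y = ["relaxed", "relaxed", "excited", "excited"]
print(knn_predict(train_X, train_y, np.array([0.85, 0.95])))  # excited
```

KNN needs no training phase beyond storing labelled examples, which makes it a natural fit for a per-user system where calibration data is collected at the start of each session.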
dc.description.provenance: Made available in DSpace on 2021-06-15T02:25:01Z (GMT). No. of bitstreams: 1. ntu-98-R95921104-1.pdf: 3829692 bytes, checksum: 05e4478a3a98ab492f020681d25c5367 (MD5). Previous issue date: 2009. (en)
dc.description.tableofcontents:
Table of Contents
Oral Examination Committee Approval .................. i
Acknowledgments .................. ii
Chinese Abstract .................. iii
English Abstract .................. iv
List of Figures .................. v
List of Tables .................. vi
Chapter 1 Introduction .................. 1
1.1 Research Background .................. 1
1.2 Research Topics .................. 2
1.3 Literature Review of Emotion Recognition .................. 3
1.4 Thesis Organization .................. 7
Chapter 2 Theory of Physiological Signals .................. 8
2.1 Sympathetic and Parasympathetic Nervous Systems .................. 8
2.2 Relationship between Emotional States and Physiological Responses .................. 9
2.3 User Physiological Signals and Their Features .................. 11
2.3.1 Electromyogram .................. 11
2.3.2 Respiration .................. 13
2.3.3 Pulse .................. 14
2.3.4 Skin Conductance Response .................. 16
2.3.5 Physiological Signal Features .................. 17
Chapter 3 Methods .................. 19
3.1 Method Workflow .................. 19
3.2 Experimental Design .................. 20
3.3 Signal Filtering and Segmentation .................. 22
3.4 Feature Calibration and Normalization .................. 25
3.5 Information Cross-Comparison .................. 26
3.6 Classification Algorithm: KNN .................. 27
3.7 Feature Analysis Algorithm: Information Gain .................. 28
3.8 Real-Time Individual Emotion Analysis System .................. 30
Chapter 4 Results .................. 31
4.1 Offline Classification Results for Individuals .................. 31
4.2 Relationship between Individual Emotional States and Features .................. 38
4.3 Real-Time Individual Emotion Recognition Results .................. 41
Chapter 5 Discussion, Conclusions, and Future Work .................. 47
5.1 Discussion .................. 47
5.2 Conclusions .................. 52
5.3 Future Work .................. 52
References .................. 54
Appendix .................. 56
dc.language.iso: zh-TW
dc.subject: Human-computer interaction (zh_TW)
dc.subject: Music (zh_TW)
dc.subject: Emotion recognition (zh_TW)
dc.subject: Real-time (zh_TW)
dc.subject: Physiological signals (zh_TW)
dc.subject: Physiological signals (en)
dc.subject: Human-computer interaction (en)
dc.subject: Music (en)
dc.subject: Emotion recognition (en)
dc.subject: Real-time (en)
dc.title: 以生理訊號分析系統即時評估音樂環境之使用者情感反應 (zh_TW)
dc.title: Estimation of User's Affective Response on Music Contents Using Real-Time Analysis System of Physiological Signals (en)
dc.type: Thesis
dc.date.schoolyear: 97-2
dc.description.degree: Master
dc.contributor.oralexamcommittee: 楊泮池, 周泰立, 徐良育, 李琳山
dc.subject.keyword: Music, Emotion recognition, Real-time, Physiological signals, Human-computer interaction (zh_TW)
dc.subject.keyword: Music, Emotion recognition, Real-time, Physiological signals, Human-computer interaction (en)
dc.relation.page: 74
dc.rights.note: Paid authorization
dc.date.accepted: 2009-08-18
dc.contributor.author-college: College of Electrical Engineering and Computer Science (zh_TW)
dc.contributor.author-dept: Graduate Institute of Electrical Engineering (zh_TW)
Appears in collections: Department of Electrical Engineering

Files in this item:
File: ntu-98-1.pdf (3.74 MB, Adobe PDF) — restricted; not publicly available