NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/95325
Full metadata record
DC field: value (language)
dc.contributor.advisor: 洪一平 (zh_TW)
dc.contributor.advisor: Yi-Ping Hung (en)
dc.contributor.author: 呂靜 (zh_TW)
dc.contributor.author: Ching Lui (en)
dc.date.accessioned: 2024-09-05T16:10:40Z
dc.date.available: 2024-09-06
dc.date.copyright: 2024-09-05
dc.date.issued: 2024
dc.date.submitted: 2024-08-12
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/95325
dc.description.abstract: Emotional issues are receiving growing attention in contemporary society, and the need for positive emotional experiences is becoming increasingly pressing. As virtual reality technology continues to advance, improving users' emotional experience in virtual environments has become an important research direction. By incorporating facial expression recognition, we aim to create more natural and enjoyable interactions and thereby promote the spread of positive emotion. The goals of this research are to understand how a virtual agent can influence users' emotions through real-time facial expression recognition, and how to design virtual environments that effectively guide emotion. Two user studies were conducted, recruiting 16 and 12 participants respectively; they examined how different presentations of the virtual agent affect users' emotions, and how users' emotions change under different interaction feedback. Barkley the Cat (小貓巴克里), a local IP character created by the Taiwanese animator and director 邱立偉, served as the task character in the experiments. The results show that highly interactive and immersive interfaces effectively strengthen positive emotional contagion. The study also highlights the importance of choosing an interface consistent with the style of the environment for enhancing positive emotion, and recommends introducing more diverse text-to-speech technologies and multimodal large models in the future to improve accuracy and personalization. These findings provide developers of novel therapeutic applications, as well as the general public seeking to improve emotional and mental well-being, with experimental results and design considerations for promoting positive emotional contagion in virtual environments, further advancing personalized emotional support and interaction. (zh_TW)
dc.description.abstract: In contemporary society, the demand for emotional experiences is growing, especially as virtual reality (VR) technology advances. This study explores the integration of facial expression detection to enhance user emotional experiences in VR, aiming to understand how virtual agents can influence emotions through real-time detection and how to design environments that effectively guide emotions. Through two user studies with 16 and 12 participants, using the local IP character Barkley the Cat, the research found that highly interactive and immersive interfaces significantly enhance positive emotional contagion. The study underscores the importance of interface design that aligns with environmental style and recommends incorporating diverse text-to-speech technologies and multimodal models for improved accuracy and personalization. These findings offer valuable insights for developers and the public in promoting emotional and mental well-being in virtual environments. (en)
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-09-05T16:10:40Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2024-09-05T16:10:40Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Verification Letter from the Oral Examination Committee - Page i
Acknowledgements - Page ii
中文摘要 - Page iv
Abstract - Page v
Contents - Page vi
List of Figures - Page ix
List of Tables - Page xi
Chapter 1: Introduction - Page 1
Chapter 2: Related Works - Page 4
2.1 Facial Expression Recognition in VR - Page 4
2.2 Emotion Contagion in Virtual Environment - Page 6
2.3 The Application of Virtual Agent - Page 7
Chapter 3: Design Consideration - Page 9
3.1 Positive Emotion Induction - Page 9
3.2 Feedback on Interaction Experience - Page 10
Chapter 4: System Design and Implementation - Page 12
4.1 Real-time Facial Expression Recognition in VR - Page 13
4.2 Smile Trigger Algorithm and Avatar Bubble Animation Feedback - Page 14
4.2.1 Facial Movement Data Acquisition - Page 15
4.2.2 Calculation of Emotion Status - Page 15
4.2.3 Avatar Bubble Animation Feedback - Page 16
4.3 Virtual Agent Architecture - Page 17
4.4 Stimuli: Emotional-Inducing Narratives - Jokes - Page 19
Chapter 5: User Studies - Page 20
5.1 User Study 1 - Representation of Agent for Amusement - Page 22
5.1.1 Tasks - Page 23
5.1.2 Participants - Page 24
5.1.3 Procedure - Page 24
5.2 User Study 2 - Agent Feedback When Interacting with Agent - Page 26
5.2.1 Tasks - Page 27
5.2.2 Participants - Page 28
5.2.3 Procedure - Page 29
Chapter 6: Result - Page 31
6.1 User Study 1 - Page 31
6.1.1 SAM Questionnaire - Page 31
6.1.2 Interview - Page 32
6.2 User Study 2 - Page 34
6.2.1 SAM Questionnaire - Page 34
6.2.2 IPQ - Page 35
6.2.3 Interview - Page 40
Chapter 7: Discussion - Page 42
7.1 Sources of Positive Emotional Induction in Users - Page 42
7.2 Impact of Different Interaction Experiences on User Experience - Page 43
Chapter 8: Conclusion and Future Work - Page 44
References - Page 46
dc.language.iso: en
dc.title: 在虛擬環境中結合臉部表情偵測以創造正向情緒感染 (zh_TW)
dc.title: Creating Positive Emotion Contagion in Virtual Environment by Utilizing Facial Expression Recognition (en)
dc.type: Thesis
dc.date.schoolyear: 112-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 范丙林;許峻誠 (zh_TW)
dc.contributor.oralexamcommittee: Ping-Lin Fan;Chun-Cheng Hsu (en)
dc.subject.keyword: 情緒感染, 臉部表情偵測, 虛擬實境, 虛擬化身 (zh_TW)
dc.subject.keyword: Emotion Contagion, Facial Expression Recognition, Virtual Reality, Avatar (en)
dc.relation.page: 50
dc.identifier.doi: 10.6342/NTU202404076
dc.rights.note: 未授權 (not authorized)
dc.date.accepted: 2024-08-13
dc.contributor.author-college: 電機資訊學院
dc.contributor.author-dept: 資訊網路與多媒體研究所
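The table of contents above lists a "Smile Trigger Algorithm and Avatar Bubble Animation Feedback" component (sections 4.2.1-4.2.3), in which an emotion status is computed from real-time facial-movement data and used to drive avatar feedback. The thesis text is not openly available from this record, so the sketch below is only a hypothetical Python illustration of that general idea, assuming a face tracker that reports per-frame smile blendshape weights in [0, 1]; the names SMILE_KEYS, SmileTrigger, and the threshold values are illustrative assumptions, not the author's implementation.

```python
# Hypothetical sketch of a smile-trigger loop (not the thesis implementation).
# Assumes a face tracker that reports per-frame blendshape weights in [0, 1],
# e.g. {"mouthSmileLeft": 0.7, "mouthSmileRight": 0.65}.

SMILE_KEYS = ("mouthSmileLeft", "mouthSmileRight")  # assumed blendshape names


class SmileTrigger:
    def __init__(self, alpha=0.5, on_threshold=0.6, off_threshold=0.4):
        self.alpha = alpha                # smoothing factor (illustrative value)
        self.on_threshold = on_threshold  # score at which a smile is declared
        self.off_threshold = off_threshold
        self.score = 0.0                  # smoothed smile score
        self.active = False               # whether a smile is currently detected

    def update(self, weights):
        """Update the smoothed score from one frame of blendshape weights.

        Returns True exactly on the frame a smile is first detected, so the
        caller can trigger avatar feedback (e.g. a bubble animation) once.
        """
        raw = sum(weights.get(k, 0.0) for k in SMILE_KEYS) / len(SMILE_KEYS)
        self.score = self.alpha * raw + (1 - self.alpha) * self.score

        if not self.active and self.score >= self.on_threshold:
            self.active = True
            return True                   # rising edge: fire feedback once
        if self.active and self.score <= self.off_threshold:
            self.active = False           # hysteresis avoids flicker near one cutoff
        return False


if __name__ == "__main__":
    trigger = SmileTrigger()
    frames = [{"mouthSmileLeft": w, "mouthSmileRight": w}
              for w in (0.1, 0.3, 0.7, 0.9, 0.8, 0.2)]
    for i, frame in enumerate(frames):
        if trigger.update(frame):
            print(f"frame {i}: smile detected -> play avatar bubble animation")
```

The two-threshold hysteresis is just one way to keep the feedback from flickering when the smoothed score hovers near a single cutoff; any comparable debouncing scheme would serve the same purpose.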
Appears in Collections: 資訊網路與多媒體研究所

Files in this item:
File: ntu-112-2.pdf (currently not authorized for public access)
Size: 24.27 MB
Format: Adobe PDF