Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67911
Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 洪一平(Yi-Ping Hung) | |
| dc.contributor.author | He-Lin Luo | en |
| dc.contributor.author | 羅禾淋 | zh_TW |
| dc.date.accessioned | 2021-06-17T01:57:35Z | - |
| dc.date.available | 2020-07-21 | |
| dc.date.copyright | 2017-07-21 | |
| dc.date.issued | 2017 | |
| dc.date.submitted | 2017-07-20 | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67911 | - |
| dc.description.abstract | 情緒(Emotion)直接反應著人們當下的心情,透過情感運算科技,可使電腦解讀人們的心情狀態,並產生視覺或聽覺的回饋。在人機互動領域,情感運算(Affective Computing)技術使得電腦可如同真人一般與使用者溝通。網路科技的進步,社群網路在此虛擬世界造成多變性。人們所面對的不只是單一電腦,而是與雲端系統連結的大量使用者。許多情緒傳達的研究,探討人們的情緒對於其他人的影響力。因此,本研究探討「情緒感染(Emotional Contagion)」的理論,其中人們的情緒具有擴散至他人情緒的能力。
情緒感染(Emotional Contagion)是人們在社交互動時,傾向模仿其他人的表情、發聲、姿勢與動作,而此種模仿亦轉變他們原本的情緒。此外,情緒感染的媒介可能是特定的符號、圖像或聲音,例如在社群網站傳送貼圖或按讚的行為。此虛擬世界所帶來的多變性,也是人機互動所需要思考的議題。在本研究中,聚焦在正向情緒,並探討快樂情緒的情緒感染。本研究設計三件互動藝術裝置,包含《靜坐金字塔》、《微笑牆》、《微笑四方》。收集使用者參與裝置互動過程中的情緒狀態,以了解其情緒感染的情況。藉由分析其「感染速度」了解情緒感染的互動裝置對使用者的影響,是否能夠幫助使用者增加其主動參與作品的意願。 本研究所設計的互動藝術裝置,其影響媒介分別為「代理(Agent)」、「影像(Video)」及「視訊(Live Cam)」,透過不同的媒介來實驗情緒感染的可能性。在《靜坐金字塔》中,透過靜坐的方式,讓使用者當下的正面或負面情緒回歸正常。由此作品的實驗結果發現,情緒感染對使用者的情緒造成不同的影響。給予不同情緒刺激時,使用者的情緒感染程度,會隨著使用者已被感染的程度而有所變化。《微笑牆》設計微笑擴散的互動流程,藉由記錄使用者互動過程所產生的微笑次數,了解其微笑的情緒感染。最後,在《微笑四方》,觀察使用者之間即時的情緒感染。藉由計算使用者微笑的次數了解使用者之間的情緒感染狀態,並計算使用者之間產生微笑所需的時間,得到使用者之間的「感染速度」。於互動體驗結束後,透過問卷的方式,了解使用者於互動過程感受到微笑所造成的情緒感染狀態,體驗作品的意願,及使用者之間所感受到關係的距離。因此本研究透過互動裝置設計,以觀察及驗證情緒感染的特性。 | zh_TW |
| dc.description.abstract | Emotion directly reflects a person's current mood. With affective computing technology, computers can interpret users' moods and provide visual or auditory feedback; in the field of human-computer interaction, this technology enables computers to communicate with users as a real person would. As network technology advances, social media has brought great variety to the virtual world: people no longer face a single computer, but connect with large numbers of other users through cloud systems. Many studies of emotion conveyance examine how one person's emotions influence others. This study therefore draws on the theory of emotional contagion, which holds that an individual's emotions can spread to other people.
Emotional contagion is the tendency of people in social interaction to mimic the expressions, vocalizations, postures, and movements of others, an imitation that in turn changes their own original emotions. Emotional contagion can also be transmitted through specific symbols, images, or sounds, such as sending stickers or "liking" posts on social media. The variety introduced by this virtual world is likewise an issue that human-computer interaction must address. This work focuses on positive emotion and investigates the emotional contagion of happiness. Three interactive art installations were designed: Meditation Pyramid, Smiling Wall, and Quartic Smile. Users' emotional states were collected during interaction to understand the phenomenon of emotional contagion, and the contagion speed was analyzed to determine how installations designed around emotional contagion influence users, and whether such designs increase users' willingness to participate actively. The three installations use an agent, video, and a live cam, respectively, as their media of emotional contagion. In Meditation Pyramid, meditation guides users' current positive or negative emotional states back to a neutral baseline. Its experimental results show that emotional contagion affects users' emotions in different ways: given different emotional stimuli, the degree to which a user is further affected varies with how strongly the user has already been affected. Smiling Wall implements an interaction flow for spreading smiles, and smile-based emotional contagion was measured by counting users' smiles during interaction. Finally, Quartic Smile observes real-time emotional contagion between users: the contagion state was measured from smile counts, and the contagion speed was computed from the time users took to elicit smiles in one another.
After the interactive experience, a questionnaire was used to assess the emotional contagion users felt from smiles during the interaction, their willingness to experience the work, and the perceived interpersonal distance between users. In this way, the study observes and verifies the properties of emotional contagion through the design of interactive installations. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-17T01:57:35Z (GMT). No. of bitstreams: 1 ntu-106-D99944005-1.pdf: 4024817 bytes, checksum: d15ffcba64e15703225ba52cbaee4fd2 (MD5) Previous issue date: 2017 | en |
| dc.description.tableofcontents | TABLE OF CONTENTS
Acknowledgements i
Abstract (Chinese) iii
ABSTRACT v
TABLE OF CONTENTS vii
TABLE OF FIGURES x
TABLE OF TABLES xiii
CHAPTER 1 INTRODUCTION 14
1.1 Background and Motivation 14
1.2 Definition of Emotional Contagion 16
1.3 Outline of this Research 18
CHAPTER 2 Related Work 21
2.1 Research on Emotional Contagion 21
2.2 Emotional Contagion in Interactive Works 24
CHAPTER 3 Interactive Artwork - Meditation Pyramid 28
3.1 Concept of the Work 28
3.2 System Design 34
3.2.1 Interactive Installation Design 34
3.2.2 Display Design 36
3.3 Method of Breath Detection and Bioinformation Management 38
3.3.1 Breathing Detection and Bioinformation Management Using Depth Camera, PVB and SCR 38
3.3.2 Lotus Flower and Guided Breathing 40
3.4 Experiments and Discussion 42
CHAPTER 4 Interactive Artwork - Smiling Wall 53
4.1 Concept of the Work 53
4.2 System Design 55
4.2.1 System Flow of Smiling Wall 55
4.2.2 Steps of Emotional Contagion 57
4.3 Method of Human Attention Detection Based on Head Pose Estimation 60
4.3.1 Deformable Face Model and Shape-free Facial Representation 60
4.3.2 Model Initialization 62
4.3.3 Head Pose Tracking with Eye Constraints 63
4.3.4 Automatic Head Pose Tracking 66
4.3.5 Data Collection 68
4.3.6 Water Transition Effect 69
4.3.7 Smile Detection 70
4.4 Experiments and Discussion 71
CHAPTER 5 Interactive Artwork - Quartic Smile 89
5.1 Concept of the Work 89
5.2 System Design 92
5.3 System Flow 94
5.4 Method of Quartic Smile 96
5.4.1 Breathing LED Lighting 96
5.4.2 Virtual Window View Using Motion Detection 98
5.4.3 Video Recording Using Drone 98
5.4.4 Wormhole for Smile Connection 101
5.4.5 Smile Detection Using HOG 102
5.4.6 Interaction Design Between Users 103
5.5 Experiments and Discussion 105
5.5.1 Experiments 105
5.5.2 Joint Smiling and Response Time 109
5.5.3 Questionnaire Design 121
CHAPTER 6 Conclusion and Future Work 130
6.1 Summary 130
6.2 Future Directions 135
LIST OF REFERENCES 137 | |
| dc.language.iso | en | |
| dc.subject | 情緒感染 | zh_TW |
| dc.subject | 互動藝術 | zh_TW |
| dc.subject | 人機互動 | zh_TW |
| dc.subject | 情感運算 | zh_TW |
| dc.subject | 呼吸偵測 | zh_TW |
| dc.subject | 微笑偵測 | zh_TW |
| dc.subject | 公共顯示裝置 | zh_TW |
| dc.subject | Public display | en |
| dc.subject | Emotional contagion | en |
| dc.subject | Interactive art | en |
| dc.subject | Human-computer interaction | en |
| dc.subject | Affective computing | en |
| dc.subject | Breathing detection | en |
| dc.subject | Smile detection | en |
| dc.title | 基於情緒感染在互動藝術裝置之研究 | zh_TW |
| dc.title | A Study in Interactive Art Installation Based on Emotional Contagion | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 105-2 | |
| dc.description.degree | 博士 (doctoral) | |
| dc.contributor.oralexamcommittee | 詹景裕(Ching-Yuh Jan),張智星(Jyh-Shing Jang),陳玲鈴(Lin-Lin Chen),王照明(Chao-Ming Wang),袁廣鳴(Goang-Ming Yuan) | |
| dc.subject.keyword | 情緒感染,互動藝術,人機互動,情感運算,呼吸偵測,微笑偵測,公共顯示裝置, | zh_TW |
| dc.subject.keyword | Emotional contagion, Interactive art, Human-computer interaction, Affective computing, Breathing detection, Smile detection, Public display | en |
| dc.relation.page | 146 | |
| dc.identifier.doi | 10.6342/NTU201701757 | |
| dc.rights.note | 有償授權 (licensed for a fee) | |
| dc.date.accepted | 2017-07-21 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊網路與多媒體研究所 | zh_TW |
| Appears in Collections: | 資訊網路與多媒體研究所 | |
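The "contagion speed" described in the abstract, the time one user's smile takes to elicit a smile from another, can be sketched as follows. This is a hypothetical illustration only, not the thesis's actual implementation: it assumes smile onsets have already been detected and timestamped (e.g. by a HOG-based smile detector), and the function names, the pairing rule, and the 5-second window are invented for the example.

```python
from bisect import bisect_right

def contagion_delays(smiles_a, smiles_b, window=5.0):
    """For each smile onset of user A (sorted timestamps, in seconds),
    find user B's next smile onset; count it as contagion only if it
    occurs within `window` seconds. Returns the list of delays."""
    delays = []
    for t in smiles_a:
        i = bisect_right(smiles_b, t)  # first smile of B strictly after t
        if i < len(smiles_b) and smiles_b[i] - t <= window:
            delays.append(smiles_b[i] - t)
    return delays

def contagion_speed(delays):
    """Mean response time; a shorter mean indicates faster contagion."""
    return sum(delays) / len(delays) if delays else None
```

For example, with `smiles_a = [1.0, 10.0]` and `smiles_b = [2.5, 30.0]`, only the first smile of A is answered within the window, giving one delay of 1.5 s. The window threshold is the key design choice: too wide, and coincidental smiles are counted as contagion; too narrow, and slow responders are missed.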
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-106-1.pdf (restricted; not publicly accessible) | 3.93 MB | Adobe PDF |
All items in the system are protected by copyright, with all rights reserved, unless otherwise indicated.
