NTU Theses and Dissertations Repository › 電機資訊學院 › 電機工程學系
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67381
Full metadata record
dc.contributor.advisor: 傅立成 (Li-Chen Fu)
dc.contributor.author: Edwinn Gamborino [en]
dc.contributor.author: 愛德溫 [zh_TW]
dc.date.accessioned: 2021-06-17T01:29:58Z
dc.date.available: 2020-08-08
dc.date.copyright: 2017-08-08
dc.date.issued: 2017
dc.date.submitted: 2017-08-04
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67381
dc.description.abstract: Socially Assistive Robotics (SAR) is an emerging multidisciplinary field of study with potential applications in education, health management, and elder care. Traditionally, research in robotics has focused on topics such as motion, navigation, or robot vision. A SAR, on the other hand, may leverage tools from Artificial Intelligence and Machine Learning to endow a robot with the ability not only to interact with humans in a social context, but also to provide emotional support, in particular for vulnerable populations such as minors or the elderly.
In the past few years, a few research efforts have explored how a robot can socially engage a child from the perspectives of natural language processing and socially assistive robotics. However, none of the proposed methodologies has used emotion theory as a core component of the action-planning system. Doing so could enable the robot's action-planning module to choose the actions most likely to change a child's affective state and thereby improve his or her mood.
This work takes on the Interactive Reinforcement Learning paradigm by implementing an interactive decision-planning module for RoBoHoN, a social robotic platform that runs on Android OS, with the goal of exploring the feasibility of using a robot to engage with children. Facial features of the patient are captured and processed to determine the child's emotional reaction to a behavior performed by the robot; these emotions are then classified as affective states in a multi-dimensional model. Leveraging the expertise of a child life specialist acting as trainer, the action-planning module interactively learns which actions are most appropriate to perform when the child is in a specific affective state.
To validate the usefulness of the proposed methodology, we first conducted a pilot study with healthy elementary-school-aged children. We evaluated the impact of the robot on the participants' mood through structured questionnaires based on applied pediatric psychology. Our findings show not only that the proposed framework enables a believable and fluid social interaction between child and robot, but also that, with our methodology, the robot:
• can change the affective state of children, improving their mood.
• can establish and maintain social engagement with the child.
• can build rapport with the child, as reported by both the child participants and their parents.
Based on these results, we believe our platform has the potential to be deployed through the developed Wizard-of-Oz interface in a real environment (e.g., a classroom or hospital), with the potential of releasing a standalone version once the robot has learned the optimal way to act to improve the mood of pediatric patients visiting the hospital. [en]
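The pipeline the abstract describes (affect classification feeding an interactively trained action planner) can be sketched as a minimal Q-learning loop whose reward blends the observed affect change with a human trainer's feedback signal. This is an illustrative sketch only: the state names, behavior names, and hyperparameters below are assumptions, not the thesis's actual implementation.

```python
import random
from collections import defaultdict

# Illustrative affective states and robot behaviors (hypothetical names,
# not the thesis's actual state/action sets).
AFFECTIVE_STATES = ["distressed", "bored", "engaged", "happy"]
ROBOT_BEHAVIORS = ["tell_joke", "play_song", "ask_question", "dance"]

ALPHA = 0.1    # learning rate
GAMMA = 0.9    # discount factor
EPSILON = 0.2  # exploration rate

q = defaultdict(float)  # (state, behavior) -> estimated value

def choose_behavior(state):
    """Epsilon-greedy selection over the behavior library."""
    if random.random() < EPSILON:
        return random.choice(ROBOT_BEHAVIORS)
    return max(ROBOT_BEHAVIORS, key=lambda b: q[(state, b)])

def update(state, behavior, affect_reward, trainer_feedback, next_state):
    """Q-learning update with a hybrid reward: the observed change in the
    child's affective state plus the trainer's interactive feedback."""
    reward = affect_reward + trainer_feedback
    best_next = max(q[(next_state, b)] for b in ROBOT_BEHAVIORS)
    q[(state, behavior)] += ALPHA * (reward + GAMMA * best_next - q[(state, behavior)])
```

For example, if performing `tell_joke` while the child looked bored produced a positive affect change (reward 1.0) and the trainer approved (feedback 0.5), `update("bored", "tell_joke", 1.0, 0.5, "engaged")` nudges the planner toward that behavior in that state.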
dc.description.provenance: Made available in DSpace on 2021-06-17T01:29:58Z (GMT). No. of bitstreams: 1; ntu-106-R04921098-1.pdf: 3370956 bytes, checksum: 4e60fe401108c492cb85d239bf84219d (MD5). Previous issue date: 2017 [en]
dc.description.tableofcontents:
Acknowledgements ii
摘要 iii
Abstract iv
List of Figures ix
List of Tables xii
1 Introduction 1
1.1 Motivation 1
1.2 Research Questions and Goals 3
1.3 Contributions 4
1.4 Thesis Overview 4
2 Background and Related Works 6
2.1 Emotional Support Methods for Children 6
2.1.1 Non-Pharmacological Support Methods 6
2.1.2 Child Life Specialists 8
2.2 Social Robots for Emotional Support 8
2.2.1 Paro, The Robotic Seal 9
2.2.2 ALIZ-E Project 11
2.2.3 Huggable Project (Ver. 5) 12
2.2.4 Murphy Miserable Robot 15
2.3 Action Planners for Social Robots 16
2.3.1 Wizard-of-Oz Approaches 18
2.3.2 Machine Learning Approaches 20
3 Affect Classification and Interactive Reinforcement Learning 23
3.1 Preliminary 23
3.1.1 Automatic Emotion Detection 23
3.1.2 Interactive Reinforcement Learning 29
3.2 Proposed Algorithm Overview 35
3.3 Affective State Classification 36
3.3.1 Emotion Detection from Facial Features 37
3.3.2 State Classification 39
3.4 Hybrid Interactive Reinforcement Learning 41
3.4.1 Action Selection 42
4 Implementation of an Assistive Action-Planning Module 46
4.1 Behavior Blocks 46
4.1.1 Behavior Modeling Concepts 48
4.1.2 Development of the Behaviors Library 52
4.2 Devices Overview 54
4.2.1 System Architecture 54
4.2.2 RoBoHoN 56
4.2.3 Tablet Controller and UI 58
4.2.4 Emotion Recognition using Affectiva SDK for Android 61
5 Experiment Design 63
5.1 Pilot Study 63
5.1.1 Overview 63
5.1.2 Participants 64
5.2 Experiment Setup 67
5.3 Evaluation Metrics 72
5.3.1 Aspects of IRL 73
5.3.2 Aspects of cHRI 74
6 Results and Discussion 79
6.1 Results 79
6.1.1 State Classification 79
6.1.2 Interactive Reinforcement Learning 81
6.1.3 cHRI Performance Metrics 85
6.2 Discussion 93
6.2.1 Limitations 93
6.2.2 Future Works 96
6.3 Conclusion 98
References 100
Appendix A. Participant Consent Form 109
Appendix B. Experiment Questionnaires and Interview Items 117
Appendix C. Behavior Blocks Transcription 125
dc.language.iso: en
dc.subject: 儿童机器人交互 [zh_TW]
dc.subject: 混合互動強化學習 [zh_TW]
dc.subject: Child-Robot Interaction [en]
dc.subject: Hybrid Interactive Reinforcement Learning [en]
dc.title: 以混合互動強化學習方式輔助機器人行動計劃用於孩童情感支持 [zh_TW]
dc.title: Hybrid Interactive Reinforcement Learning based Assistive Robot Action-Planner for the Emotional Support of Children [en]
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 呂立 (Frank Lu), 岳修平 (Hsiu-Ping Yueh), 郭重顯 (Chung-Hsien Kuo), 林維真 (Weijane Lin)
dc.subject.keyword: 儿童机器人交互, 混合互動強化學習 [zh_TW]
dc.subject.keyword: Child-Robot Interaction, Hybrid Interactive Reinforcement Learning [en]
dc.relation.page: 132
dc.identifier.doi: 10.6342/NTU201702543
dc.rights.note: 有償授權 (paid-access authorization)
dc.date.accepted: 2017-08-04
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 電機工程學研究所 (Graduate Institute of Electrical Engineering) [zh_TW]
Appears in Collections: 電機工程學系 (Department of Electrical Engineering)

Files in This Item:
File | Size | Format
ntu-106-1.pdf (not authorized for public access) | 3.29 MB | Adobe PDF

