Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91359
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林維真 | zh_TW |
dc.contributor.advisor | Weijane Lin | en |
dc.contributor.author | 陳家荷 | zh_TW |
dc.contributor.author | Chia-Ho Chen | en |
dc.date.accessioned | 2024-01-12T16:11:09Z | - |
dc.date.available | 2024-01-13 | - |
dc.date.copyright | 2024-01-12 | - |
dc.date.issued | 2023 | - |
dc.date.submitted | 2023-08-07 | - |
dc.identifier.citation | Abbott, R., Orr, N., McGill, P., Whear, R., Bethel, A., Garside, R., Stein, K., & Thompson-Coon, J. (2019). How do "robopets" impact the health and well-being of residents in care homes? A systematic review of qualitative and quantitative evidence. International Journal of Older People Nursing, 14(3), 23. https://doi.org/10.1111/opn.12239
Abou Allaban, A., Wang, M. Z., & Padir, T. (2020). A systematic review of robotics research in support of in-home care for older adults [Review]. Information, 11(2), 24. https://doi.org/10.3390/info11020075 Abrams, L., & Farrell, M. T. (2011). Language processing in normal aging. In The handbook of psycholinguistic and cognitive processes: Perspectives in communication disorders. (pp. 49-73). Psychology Press. https://doi.org/10.4324/9780203848005.ch3 Alexandris, C. (2015). Signalizing and predicting turn-taking in multilingual contexts: Using data from transcribed international spoken journalistic texts in human-robot interaction. AAAI Spring Symposium - Technical Report, https://www.scopus.com/inward/record.uri?eid=2-s2.0-84987623821&partnerID=40&md5=730d78b7e873da24df239717277d4889 Andrist, S., Tan, X. Z., Gleicher, M., & Mutlu, B. (2014). Conversational gaze aversion for humanlike robots Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction, Bielefeld, Germany. https://doi.org/10.1145/2559636.2559666 Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. Cambridge U Press. Auer, P. (2018). Gaze, addressee selection and turn-taking in three-party interaction. Eye-tracking in interaction: Studies on the role of eye gaze in dialogue, 197-231. Ball, P. (1975). Listeners' responses to filled pauses in relation to floor apportionment. British Journal of Social & Clinical Psychology. Bandura, A. (1977). Self-efficacy: toward a unifying theory of behavioral change. Psychological review, 84(2), 191. Bandura, A. (1986). Social foundations of thought and action. Englewood Cliffs, NJ, 1986(23-28). Bandura, A., Freeman, W. H., & Lightsey, R. (1999). Self-efficacy: The exercise of control. In: Springer. Barata, A. N. (2019). Social robots as a complementary therapy in chronic, progressive diseases. In J. S. Sequeira (Ed.), Robotics in Healthcare: Field Examples and Challenges (Vol. 1170, pp. 95-102). Springer International Publishing Ag. 
https://doi.org/10.1007/978-3-030-24230-5_5 Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics, 1(1), 71-81. https://doi.org/10.1007/s12369-008-0001-3 Bateson, M. C. (1975). Mother‐infant exchanges: the epigenesis of conversational interaction. Annals of the New York Academy of sciences, 263(1), 101-113. Bavelas, J. B., Coates, L., & Johnson, T. (2002). Listener responses as a collaborative process: The role of gaze. Journal of Communication, 52(3), 566-580. Bemelmans, R., Gelderblom, G. J., Jonker, P., & de Witte, L. (2012). Socially assistive robots in elderly care: A systematic review into effects and effectiveness. Journal of the American Medical Directors Association, 13(2), 114-142. https://doi.org/10.1016/j.jamda.2010.10.002 Ben-David, B. M., Eidels, A., & Donkin, C. (2014). Effects of aging and distractors on detection of redundant visual targets and capacity: Do older adults integrate visual targets differently than younger adults? PLoS ONE, 9(12). https://doi.org/10.1371/journal.pone.0113551 Bernstein, I. H., Chu, P. K., Briggs, P., & Schurman, D. L. (1973). Stimulus intensity and foreperiod effects in intersensory facilitation. Quarterly Journal of Experimental Psychology, 25(2), 171-181. https://doi.org/10.1080/14640747308400336 Billig, M. (1997). The dialogic unconscious: Psychoanalysis, discursive psychology and the nature of repression. British Journal of Social Psychology, 36(2), 139-159. https://doi.org/https://doi.org/10.1111/j.2044-8309.1997.tb01124.x Bortfeld, H., Leon, S. D., Bloom, J. E., Schober, M. F., & Brennan, S. E. (2001). Disfluency Rates in Conversation: Effects of Age, Relationship, Topic, Role, and Gender. Language and Speech, 44(2), 123-147. https://doi.org/10.1177/00238309010440020101 Breazeal, C. (2003). Toward sociable robots. 
Robotics and Autonomous Systems, 42(3-4), 167-175. https://doi.org/10.1016/S0921-8890(02)00373-1 Brooks, C. J., Chan, Y. M., Anderson, A. J., & McKendrick, A. M. (2018). Audiovisual temporal perception in aging: The role of multisensory integration and age-related sensory loss. Frontiers in Human Neuroscience, 12. https://doi.org/10.3389/fnhum.2018.00192 Bucur, B., Madden, D. J., & Allen, P. A. (2005). Age-related differences in the processing of redundant visual dimensions. Psychol Aging, 20(3), 435-446. https://doi.org/10.1037/0882-7974.20.3.435 Calisgan, E., Haddadi, A., Loos, H. F. M. V. d., Alcazar, J. A., & Croft, E. A. (2012, 9-13 Sept. 2012). Identifying nonverbal cues for automated human-robot turn-taking. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France. Cassell, J. (2001). Embodied conversational agents: representation and intelligence in user interfaces. AI Magazine, 22(4), 67-67. Cassell, J., & Thorisson, K. R. (1999). The power of a nod and a glance: Envelope vs. emotional feedback in animated conversational agents. Applied Artificial Intelligence, 13(4-5), 519-538. https://doi.org/10.1080/088395199117360 Chao, C. (2012). Timing multimodal turn-taking for human-robot cooperation Proceedings of the 14th ACM international conference on Multimodal interaction, Santa Monica, California, USA. https://doi.org/10.1145/2388676.2388744 Chao, C., Lee, J., Begum, M., & Thomaz, A. (2011). Simon plays Simon says: The timing of turn-taking in an imitation game. https://doi.org/10.1109/ROMAN.2011.6005239 Chao, C., & Thomaz, A. (2010, 01/01). Turn-taking for human-robot interaction AAAI Fall Symposium: Dialog with Robots, Chen, N., Song, J., & Li, B. (2019). Providing aging adults social robots' companionship in home-based elder care. Journal of Healthcare Engineering, 2019. https://doi.org/10.1155/2019/2726837 Chen, S. C., Jones, C., & Moyle, W. (2018). 
Social robots for depression in older adults: A systematic review. Journal of Nursing Scholarship, 50(6), 612-622. https://doi.org/10.1111/jnu.12423 Cheng, P.-L. (2017). The timing of turn-taking in question-response neighboring adjacency pairs in Taiwan Mandarin. Chinese language education and research(25), 251-273. Chowdhury, S., Stepanov, E., & Riccardi, G. (2016). Predicting user satisfaction from turn-taking in spoken conversations. https://doi.org/10.21437/Interspeech.2016-859 Clark, E. V., & Lindsey, K. L. (2015). Turn-taking: a case study of early gesture and word use in answering WHERE and WHICH questions. Frontiers in Psychology, 6. https://doi.org/10.3389/fpsyg.2015.00890 Clark, H. H. (2005). Coordinating with each other in a material world. Discourse Studies, 7(4-5), 507-525. https://doi.org/10.1177/1461445605054404 Clark, H. H., & Fox Tree, J. E. (2002). Using uh and um in spontaneous speaking. Cognition, 84(1), 73-111. https://doi.org/https://doi.org/10.1016/S0010-0277(02)00017-3 Clough, S., & Duff, M. C. (2020). The role of gesture in communication and cognition: Implications for understanding and treating neurogenic communication disorders. Frontiers in Human Neuroscience, 14. https://doi.org/10.3389/fnhum.2020.00323 Communication in the real world : An introduction to communication studies. (2016). Open Textbook Library University of Minnesota Libraries Publishing. https://doi.org/https://doi.org/10.24926/8668.0401 Coulson, M. (2004). Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior, 28(2), 117-139. https://doi.org/10.1023/B:JONB.0000023655.25550.be Cuijpers, R. H., & Van Den Goor, V. J. P. (2017). Turn-taking cue delays in human-robot communication. CEUR Workshop Proceedings, Lisbon. Cummins, F. (2012). Gaze and blinking in dyadic conversation: A study in coordinated behaviour among individuals. Language and Cognitive Processes, 27(10), 1525-1549. 
https://doi.org/10.1080/01690965.2011.615220 D'Onofrio, G., Fiorini, L., Hoshino, H., Matsumori, A., Okabe, Y., Tsukamoto, M., Limosani, R., Vitanza, A., Greco, F., Greco, A., Giuliani, F., Cavallo, F., & Sancarlo, D. (2019). Assistive robots for socialization in elderly people: results pertaining to the needs of the users. Aging Clinical and Experimental Research, 31(9), 1313-1329. https://doi.org/10.1007/s40520-018-1073-z Dautenhahn, K. (1995). Getting to know each other - Artificial social intelligence for autonomous robots. Robotics and Autonomous Systems, 16(2-4), 333-356. https://doi.org/Doi 10.1016/0921-8890(95)00054-2 Dautenhahn, K., & Billard, A. (1999). Bringing up robots or—the psychology of socially intelligent robots: from theory to implementation. Proceedings of the third annual conference on Autonomous Agents, Seattle, Washington, USA. https://doi.org/10.1145/301136.301237 de Araujo, B. S., Fantinato, M., Peres, S. M., de Melo, R. C., Batistoni, S. S. T., Cachioni, M., & Hung, P. C. K. (2021). Effects of social robots on depressive symptoms in older adults: a scoping review. Library Hi Tech, 19. https://doi.org/10.1108/lht-09-2020-0244 de Kok, I., & Heylen, D. (2009). Multimodal end-of-turn prediction in multi-party meetings. Proceedings of the 2009 international conference on Multimodal interfaces, Cambridge, Massachusetts, USA. https://doi.org/10.1145/1647314.1647332 De Ruiter, J.-P., Mitterer, H., & Enfield, N. J. (2006). Projecting the end of a speaker's turn: A cognitive cornerstone of conversation. Language, 82(3), 515-535. Diederich, A., & Colonius, H. (2004). Bimodal and trimodal multisensory enhancement: Effects of stimulus onset and intensity on reaction time. Perception & Psychophysics, 66(8), 1388-1404. https://doi.org/10.3758/BF03195006 Doty, R. L., & Kamath, V. (2014). The influences of age on olfaction: a review. Front Psychol, 5, 20. https://doi.org/10.3389/fpsyg.2014.00020 Dragone, M., Duffy, B. R., & Hare, G. M. P. O. 
(2005, 13-15 Aug. 2005). Social interaction between robots, avatars & humans. ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA. Duffy, B. R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42(3), 177-190. https://doi.org/https://doi.org/10.1016/S0921-8890(02)00374-3 Duncan, S. (1972). Some signals and rules for taking speaking turns in conversations. Journal of Personality and Social Psychology, 23, 283-292. Duncan, S. (1974). On the structure of speaker–auditor interaction during speaking turns. Language in Society, 3(2), 161-180. Edlund, J., & Heldner, M. (2005). Exploring prosody in interaction control. Phonetica, 62(2-4), 215-226. https://doi.org/doi:10.1159/000090099 Eggins, S., & Slade, D. (1997). Analysing casual conversation. Cassell. Erden, M. S. (2013). Emotional postures for the humanoid-robot nao. International Journal of Social Robotics, 5(4), 441-456. https://doi.org/10.1007/s12369-013-0200-4 Fisk, A. D., Czaja, S. J., Rogers, W. A., Charness, N., & Sharit, J. (2018). Designing for older adults: Principles and creative human factors approaches (second edition ed.). CRC Press. https://books.google.com.na/books?id=uSXmGIHXyZUC Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3-4), 143-166. https://doi.org/10.1016/S0921-8890(02)00372-X Ford, C., & Thompson, S. (1996). Interactional units in conversation: Syntactic, intonational and pragmatic resources. Interaction and grammar(13), 134. Fuente, L. A., Ierardi, H., Pilling, M., & Crook, N. T. (2015). Influence of upper body pose mirroring in human-robot interaction. In A. Tapus, E. André, J.-C. Martin, F. Ferland, & M. Ammi, Social Robotics Cham. Garrod, S., & Pickering, M. J. (2015). The use of content and timing to predict turn transitions [Hypothesis and Theory]. Frontiers in Psychology, 6. 
https://doi.org/10.3389/fpsyg.2015.00751 Gasteiger, N., Loveys, K., Law, M., & Broadbent, E. (2021). Friends from the Future: A Scoping Review of Research into Robots and Computer Agents to Combat Loneliness in Older People [Review]. Clinical Interventions in Aging, 16, 941-971. https://doi.org/10.2147/cia.S282709 Ghafurian, M., Hoey, J., & Dautenhahn, K. (2021). Social robots for the care of persons with dementia: A systematic review. Acm Transactions on Human-Robot Interaction, 10(4), 31, Article 41. https://doi.org/10.1145/3469653 Gilmartin, E., Cowan, B. R., Vogel, C., & Campbell, N. (2018). Explorations in multiparty casual social talk and its relevance for social human machine dialogue. Journal on Multimodal User Interfaces, 12(4), 297-308. https://doi.org/10.1007/s12193-018-0274-2 Gilmartin, E., Wade, V., Saam, C., Campbell, N., & Vogel, C. (2018). Just talking-modelling casual conversation. https://doi.org/10.18653/v1/W18-5006 Glikson, E., & Woolley, A. W. (2020). Human trust in artificial intelligence: review of empirical research. Academy of Management Annals, 14(2), 627-660. https://doi.org/10.5465/annals.2018.0057 Godfrey, J. J., Holliman, E. C., & McDaniel, J. (1992, 23-26 March 1992). SWITCHBOARD: telephone speech corpus for research and development. ICASSP-92: 1992 IEEE International Conference on Acoustics, Speech, and Signal Processing, San Francisco, CA, USA. Gonzalez-Gonzalez, C. S., Violant-Holz, V., & Gil-Iranzo, R. M. (2021). Social robots in hospitals: A systematic review [Review]. Applied Sciences-Basel, 11(13), 23, Article 5976. https://doi.org/10.3390/app11135976 Graham, J. A., & Argyle, M. (1975). A cross-cultural study of the communication of extra-verbal meaning by gesture. International Journal of Psychology, 10(1), 57-67. https://doi.org/10.1080/00207597508247319 Gravano, A., & Hirschberg, J. (2011). Turn-taking cues in task-oriented dialogue. Computer Speech & Language, 25(3), 601-634. Guest, D., Howard, C. J., Brown, L. 
A., & Gleeson, H. (2015). Aging and the rate of visual information processing. Journal of Vision, 15(14), 10-10. https://doi.org/10.1167/15.14.10 Hall, E. T. (1966). The hidden dimension. Doubleday. Harada, C. N., Love, M. C. N., & Triebel, K. L. (2013). Normal cognitive aging. Clinics in Geriatric Medicine, 29(4), 737-+. https://doi.org/10.1016/j.cger.2013.07.002 Hart, J. W., Gleeson, B. T., Pan, M. K. X. J., Moon, A., MacLean, K. E., & Croft, E. A. (2014). Gesture, gaze, touch, and hesitation: Timing cues for collaborative work. Hasher, L., & Zacks, R. T. (1988). Working memory, comprehension, and aging: A review and a new view. In G. H. Bower (Ed.), Psychology of Learning and Motivation (Vol. 22, pp. 193-225). Academic Press. https://doi.org/https://doi.org/10.1016/S0079-7421(08)60041-9 Heerink, M., Krose, B., Evers, V., & Wielinga, B. (2008, 1-3 Aug. 2008). The influence of social presence on enjoyment and intention to use of a robot and screen agent by elderly users. RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication, Heldner, M., & Edlund, J. (2010). Pauses, gaps and overlaps in conversations. Journal of Phonetics, 38(4), 555-568. https://doi.org/https://doi.org/10.1016/j.wocn.2010.08.002 Hjalmarsson, A., & Oertel, C. (2011, 2012). Gaze direction as a back-channel inviting cue in dialogue IVA 2012 workshop on Realtime Conversational Virtual Agents, September 15th, 2012, Santa Cruz, California, Santa Cruz, CA, USA. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-109388 Holler, J. (2022). Visual bodily signals as core devices for coordinating minds in interaction. Philosophical Transactions of the Royal Society B: Biological Sciences, 377(1859), 20210094. https://doi.org/doi:10.1098/rstb.2021.0094 Holler, J., Kendrick, K. H., & Levinson, S. C. (2018). Processing language in face-to-face conversation: Questions with gestures get faster responses. Psychonomic Bulletin & Review, 25(5), 1900-1908. 
https://doi.org/10.3758/s13423-017-1363-z Huang, C.-M., & Mutlu, B. (2013). Modeling and evaluating narrative gestures for humanlike robots. https://doi.org/10.15607/RSS.2013.IX.026 Hurtado, L. C., Vinas, P. F., Zalama, E., Gomez-Garcia-Bermejo, J., Delgado, J. M., & Garcia, B. V. (2021). Development and usability validation of a social robot platform for physical and cognitive stimulation in elder care facilities [Article]. Healthcare, 9(8), 13, Article 1067. https://doi.org/10.3390/healthcare9081067 Ishii, R., Otsuka, K., Kumano, S., & Yamato, J. (2014). Analysis of respiration for prediction of" who will be next speaker and when?" in multi-party meetings. Proceedings of the 16th international conference on multimodal interaction, Ishii, R., Otsuka, K., Kumano, S., & Yamato, J. (2016). Prediction of who will be the next speaker and when using gaze behavior in multiparty meetings. ACM Transactions on Interactive Intelligent Systems (TIIS), 6(1), 1-31. Janeczko, Z., & Foster, M. E. (2022). A study on human interactions with robots based on their appearance and behaviour. ACM International Conference Proceeding Series, Johansson, M., & Skantze, G. (2015). Opportunities and obligations to take turns in collaborative multi-party human-robot interaction. https://doi.org/10.18653/v1/W15-4642 Johansson, M., Skantze, G., & Gustafson, J. (2013). Head pose patterns in multiparty human-robot team-building interactions. https://doi.org/10.1007/978-3-319-02675-6_35 Johansson, M., Skantze, G., & Gustafson, J. (2014). Comparison of human-human and human-robot turn-taking behaviour in multiparty situated interaction. UM3I 2014 - Proceedings of the 2014 ACM Workshop on Understanding and Modeling Multiparty, Multimodal Interactions, Co-located with ICMI 2014, Munich, Germany. Jokinen, K., Furukawa, H., Nishida, M., & Yamamoto, S. (2013). Gaze and turn-taking behavior in casual conversational interactions. ACM Trans. Interact. Intell. Syst., 3(2), Article 12. 
https://doi.org/10.1145/2499474.2499481 Jokinen, K., Nishida, M., & Yamamoto, S. (2010). On eye-gaze and turn-taking. Proceedings of the 2010 workshop on Eye gaze in intelligent human machine interaction, Hong Kong, China. https://doi.org/10.1145/2002333.2002352 Kadushin, A., & Kadushin, G. (1997). The social work interview: A guide for human service professionals. Columbia University Press. Kawahara, T., Iwatate, T., & Takanashi, K. (2012). Prediction of turn-taking by combining prosodic and eye-gaze information in poster conversations. Thirteenth Annual Conference of the International Speech Communication Association, Portland, Oregon, USA. Kendon, A. (1967). Some functions of gaze-direction in social interaction. Acta Psychologica, 26, 22-63. https://doi.org/https://doi.org/10.1016/0001-6918(67)90005-4 Kong, A. P.-H., Law, S.-P., Kwan, C. C.-Y., Lai, C., & Lam, V. (2015). A coding system with independent annotations of gesture forms and functions during verbal communication: Development of a database of speech and gesture (DoSaGE). Journal of Nonverbal Behavior, 39(1), 93-111. https://doi.org/10.1007/s10919-014-0200-6 Kontogiorgos, D., Pereira, A., Andersson, O., Koivisto, M., Rabal, E. G., Vartiainen, V., & Gustafson, J. (2019). The Effects of anthropomorphism and non-verbal social behaviour in virtual assistants Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents, Paris, France. https://doi.org/10.1145/3308532.3329466 Koudenburg, N., Postmes, T., & Gordijn, E. H. (2017). Beyond content of conversation:The role of conversational form in the emergence and regulation of social structure. Personality and Social Psychology Review, 21(1), 50-71. https://doi.org/10.1177/1088868315626022 Lala, D., Inoue, K., & Kawahara, T. (2019). Smooth turn-taking by a robot using an online continuous model to generate turn-taking cues 2019 International Conference on Multimodal Interaction, Suzhou, China. 
https://doi.org/10.1145/3340555.3353727 Lala, D., Milhorat, P., Inoue, K., Ishida, M., Takanashi, K., & Kawahara, T. (2017). Attentive listening system with backchanneling, response generation and flexible turn-taking. Saarbrücken, Germany. Latikka, R., Rubio-Hernandez, R., Lohan, E. S., Rantala, J., Fernandez, F. N., Laitinen, A., & Oksanen, A. (2021). Older adults' loneliness, social isolation, and physical information and communication technology in the era of ambient assisted living: A systematic literature review. Journal of Medical Internet Research, 23(12), 16, Article e28022. https://doi.org/10.2196/28022 Latikka, R., Turja, T., & Oksanen, A. (2019). Self-efficacy and acceptance of robots. Computers in Human Behavior, 93, 157-163. https://doi.org/https://doi.org/10.1016/j.chb.2018.12.017 Lee, K. M., Jung, Y., Kim, J., & Kim, S. R. (2006). Are physically embodied social agents better than disembodied social agents?: The effects of physical embodiment, tactile interaction, and people's loneliness in human–robot interaction. International Journal of Human-Computer Studies, 64(10), 962-973. https://doi.org/https://doi.org/10.1016/j.ijhcs.2006.05.002 Lee, S., & Naguib, A. M. (2020). Toward a sociable and dependable elderly care robot: Design, implementation and user study. Journal of Intelligent & Robotic Systems, 98(1), 5-17. https://doi.org/10.1007/s10846-019-01028-8 Levinson, S. C., & Torreira, F. (2015). Timing in turn-taking and its implications for processing models of language. Frontiers in Psychology, 6. https://doi.org/10.3389/fpsyg.2015.00731 Li, J., & Chignell, M. (2011). Communication of emotion in social robots through simple head and arm movements. International Journal of Social Robotics, 3(2), 125-142. https://doi.org/10.1007/s12369-010-0071-x Lim, L. L., & Kua, E.-H. (2011). Living alone, loneliness, and psychological well-being of older persons in Singapore. Current Gerontology and Geriatrics Research, 2011, 673181. 
https://doi.org/10.1155/2011/673181 Looije, R., Neerincx, M. A., & Cnossen, F. (2010). Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors [Article]. International Journal of Human-Computer Studies, 68(6), 386-397. https://doi.org/10.1016/j.ijhcs.2009.08.007 Maalouf, N., Sidaoui, A., Elhajj, I. H., & Asmar, D. (2018). Robotics in nursing: A scoping review. Journal of Nursing Scholarship, 50(6), 590-600. https://doi.org/10.1111/jnu.12424 Mackenzie, C. (2000). Adult spoken discourse: the influences of age and education. Int J Lang Commun Disord, 35(2), 269-285. https://doi.org/10.1080/136828200247188 Magyari, L., Bastiaansen, M. C., de Ruiter, J. P., & Levinson, S. C. (2014). Early anticipation lies behind the speed of response in conversation. J Cogn Neurosci, 26(11), 2530-2539. https://doi.org/10.1162/jocn_a_00673 Mavridis, N. (2015). A review of verbal and non-verbal human-robot interactive communication. Robotics and Autonomous Systems, 63, 22-35. https://doi.org/10.1016/j.robot.2014.09.031 McCarthy, M. (2010). Spoken fluency revisited. English profile journal, 1. McColl, D., & Nejat, G. (2013). Meal-time with a socially assistive robot and older adults at a long-term care facility. J. Hum.-Robot Interact., 2(1), 152–171. https://doi.org/10.5898/JHRI.2.1.McColl McFarland, D. H. (2001). Respiratory markers of conversational interaction. McNeill, D. (1992). Hand and mind: What gestures reveal about thought. University of Chicago Press. McNeill, D. (2005). Gesture and thought. University of Chicago Press. https://doi.org/10.7208/chicago/9780226514642.001.0001 Mehrabian, A., & Ferris, S. R. (1967). Inference of attitudes from nonverbal communication in two channels. Journal of Consulting Psychology, 31(3), 248-252. https://doi.org/10.1037/h0024648 Mehrabian, A., & Wiener, M. (1967). Decoding of inconsistent communications. J Pers Soc Psychol, 6(1), 109-114. 
https://doi.org/10.1037/h0024532 Mejia, C., & Kajikawa, Y. (2017). Bibliometric analysis of social robotics research: Identifying research trends and knowledgebase. Applied Sciences-Basel, 7(12). https://doi.org/10.3390/app7121316 Miller, J. (1982). Divided attention: Evidence for coactivation with redundant signals. Cognitive Psychology, 14(2), 247-279. https://doi.org/https://doi.org/10.1016/0010-0285(82)90010-X Mondada, L. (2007). Multimodal resources for turn-taking: pointing and the emergence of possible next speakers. Discourse Studies, 9(2), 194-225. https://doi.org/10.1177/1461445607075346 Morett, L. M. (2014). When hands speak louder than words: The role of gesture in the communication, encoding, and recall of words in a novel second language. Modern Language Journal, 98(3), 834-853. https://doi.org/10.1111/j.1540-4781.2014.12125.x Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [From the Field]. Ieee Robotics & Automation Magazine, 19(2), 98-100. https://doi.org/10.1109/MRA.2012.2192811 Mortensen, L., Meyer, A. S., & Humphreys, G. W. (2006). Age-related effects on speech production: A review. Language and Cognitive Processes, 21(1-3), 238-290. https://doi.org/10.1080/01690960444000278 Mutlu, B. (2011). Designing embodied cues for dialogue with robots. AI Magazine, 32(4), 17-30. https://doi.org/10.1609/aimag.v32i4.2376 Mutlu, B., Kanda, T., Forlizzi, J., Hodgins, J., & Ishiguro, H. (2012). Conversational gaze mechanisms for humanlike robots. ACM Trans. Interact. Intell. Syst., 1(2), Article 12. https://doi.org/10.1145/2070719.2070725 Mutlu, B., Shiwa, T., Kanda, T., Ishiguro, H., & Hagita, N. (2009, 11-13 March 2009). Footing in human-robot conversations: How robots might shape participant roles using gaze cues. 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), La Jolla, California, USA. Muto, Y., Takasugi, S., Yamamoto, T., & Miyake, Y. (2009, 27 Sept.-2 Oct. 2009). 
Timing control of utterance and gesture in interaction between human and humanoid robot. RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan. Niewiadomski, R., Ceccaldi, E., Huisman, G., Volpe, G., & Mancini, M. (2019). Computational commensality: From theories to computational models for social food preparation and consumption in HCI. Frontiers in Robotics and Ai, 6. https://doi.org/10.3389/frobt.2019.00119 Nishio, T., Yoshikawa, Y., Sakai, K., Iio, T., Chiba, M., Asami, T., Isoda, Y., & Ishiguro, H. (2021). The effects of physically embodied multiple conversation robots on the elderly. Front Robot AI, 8, 633045. https://doi.org/10.3389/frobt.2021.633045 Nomura, T., Suzuki, T., Kanda, T., & Kato, K. (2006a). Measurement of anxiety toward robots. The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK. Nomura, T., Suzuki, T., Kanda, T., & Kato, K. (2006b). Measurement of negative attitudes toward robots. Interaction Studies, 7(3), 437-454. https://doi.org/10.1075/is.7.3.14nom O'Connell, D. C., Kowal, S., & Kaltenbacher, E. (1990). Turn-taking: A critical analysis of the research tradition. Journal of Psycholinguistic Research, 19(6), 345-373. https://doi.org/10.1007/BF01068884 O'Suilleabhain, P. S., Gallagher, S., & Steptoe, A. (2019). Loneliness, living Alone, and all-cause mortality: The role of emotional and social loneliness in the elderly during 19 years of follow-up. Psychosomatic Medicine, 81(6), 521-526. https://doi.org/10.1097/Psy.0000000000000710 Obo, T., & Takizawa, K. (2022). Analysis of timing and effect of visual cue on turn-taking in human-robot interaction. Journal of Robotics and Mechatronics, 34(1), 55-63. https://doi.org/10.20965/jrm.2022.p0055 Oertel, C., Włodarczak, M., Edlund, J., Wagner, P., & Gustafson, J. (2012). Gaze patterns in turn-taking. 
Thirteenth annual conference of the international speech communication association, Palanica, A., Thommandram, A., & Fossat, Y. (2019). Adult verbal comprehension performance is better from human speakers than social robots, but only for easy questions. International Journal of Social Robotics, 11(2), 359-369. https://doi.org/10.1007/s12369-018-0504-5 Park, C., Kim, J., & Kang, J. H. (2017). Turn-taking intention recognition using multimodal cues in social human-robot interaction. 2017 17th International Conference on Control, Automation and Systems (ICCAS), Jeju, Korea (South). Pecune, F., Callebert, L., & Marsella, S. (2022). Designing persuasive food conversational recommender systems with nudging and socially-aware conversational strategies. Frontiers in Robotics and Ai, 8, 22, Article 733835. https://doi.org/10.3389/frobt.2021.733835 Pelikan, H. R. M., & Broth, M. (2016, 2016). Why that Nao? How humans adapt to a conventional humanoid robot in taking turns-at-talk CHI Conference on Human Factors in Computing Systems, New York, NY, USA. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-131546 Pereira, N., Gonçalves, A. P. B., Goulart, M., Tarrasconi, M. A., Kochhann, R., & Fonseca, R. P. (2019). Age-related differences in conversational discourse abilities: A comparative study. Dementia & neuropsychologia, 13(1), 53-71. https://doi.org/10.1590/1980-57642018dn13-010006 Pu, L. H., Moyle, W., Jones, C., & Todorovic, M. (2019). The effectiveness of social robots for older adults: A systematic review and meta-analysis of randomized controlled studies. Gerontologist, 59(1), E37-E51. https://doi.org/10.1093/geront/gny046 Pütten, A. R.-V. D., & Bock, N. (2018). Development and validation of the self-Efficacy in human-robot-interaction scale (SE-HRI). J. Hum.-Robot Interact., 7(3), Article 21. https://doi.org/10.1145/3139352 Rasmussen, G. (2014). Inclined to better understanding—The coordination of talk and ‘leaning forward’ in doing repair. 
Journal of Pragmatics, 65, 30-45. https://doi.org/https://doi.org/10.1016/j.pragma.2013.10.001 Riest, C., Jorschick, A. B., & de Ruiter, J. P. (2015). Anticipation in turn-taking: mechanisms and information sources. Frontiers in Psychology, 6, 89-89. https://doi.org/10.3389/fpsyg.2015.00089 Rios-Martinez, J., Spalanzani, A., & Laugier, C. (2015). From proxemics theory to socially-aware navigation: A survey. International Journal of Social Robotics, 7(2), 137-153. https://doi.org/10.1007/s12369-014-0251-1 Robinson, H., MacDonald, B., & Broadbent, E. (2014). The role of healthcare robots for older people at home: A review. International Journal of Social Robotics, 6(4), 575-591. https://doi.org/10.1007/s12369-014-0242-2 Robinson, N. L., Cottier, T. V., & Kavanagh, D. J. (2019). Psychosocial health interventions by social robots: Systematic review of randomized controlled trials. Journal of Medical Internet Research, 21(5), 20. https://doi.org/10.2196/13203 Rochet-Capellan, A., & Fuchs, S. (2014). Take a breath and take the turn: how breathing meets turns in spontaneous dialogue. Philosophical Transactions of the Royal Society B: Biological Sciences, 369(1658), 20130399. Royakkers, L., & van Est, R. (2015). A literature review on new robotics: Automation from love to war. International Journal of Social Robotics, 7(5), 549-570. https://doi.org/10.1007/s12369-015-0295-x Sacks, H., Schegloff, E. A., & Jefferson, G. (1974). A simplest systematics for the organization of turn-taking for conversation. Language, 50(4), 696-735. https://doi.org/10.2307/412243 Salem, M., Rohlfing, K., Kopp, S., & Joublin, F. (2011, 31 July-3 Aug. 2011). A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction. 20th IEEE International Symposium on Robot and Human Interactive Communication, Atlanta, Georgia, USA. Saunderson, S., & Nejat, G. (2019). How robots influence humans: A survey of nonverbal communication in social human-robot interaction. 
International Journal of Social Robotics, 11(4), 575-608. https://doi.org/10.1007/s12369-019-00523-0 Schegloff, E. A. (2000). Overlapping talk and the organization of turn-taking for conversation. Language in Society, 29(1), 1-63. https://doi.org/10.1017/S0047404500001019 Schegloff, E. A., Jefferson, G., & Sacks, H. (1977). The preference for self-correction in the organization of repair in conversation. Language, 53(2), 361-382. https://doi.org/10.2307/413107 Schegloff, E. A., & Sacks, H. (1973). Opening up closings. Semiotica, 8(4), 289-327. https://doi.org/10.1515/semi.1973.8.4.289 Schramm, W., Chaffee, S. H., & Rogers, E. M. (1997). The beginnings of communication study in America: a personal memoir. Sage. Scoglio, A. A. J., Reilly, E. D., Gorman, J. A., & Drebing, C. E. (2019). Use of social robots in mental health and well-being research: Systematic review. Journal of Medical Internet Research, 21(7), 14, Article e13322. https://doi.org/10.2196/13322 Sheridan, T. B. (2020). A review of recent research in social robotics [Review]. Current Opinion in Psychology, 36, 7-12. https://doi.org/10.1016/j.copsyc.2020.01.003 Shibata, T., & Wada, K. (2011). Robot therapy: A new approach for mental healthcare of the elderly - A mini-review [Review]. Gerontology, 57(4), 378-386. https://doi.org/10.1159/000319015 Shiwa, T., Kanda, T., Imai, M., Ishiguro, H., & Hagita, N. (2008, 12-15 March 2008). How quickly should communication robots respond? 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI), Amsterdam, The Netherlands. Sikveland, R., & Ogden, R. (2012). Holding gestures across turns: Moments to generate shared understanding. Gesture, 12. https://doi.org/10.1075/gest.12.2.03sik Skantze, G. (2016). Real-time coordination in human-robot interaction using face and voice. AI Magazine, 37(4), 19-31. https://doi.org/10.1609/aimag.v37i4.2686 Skantze, G. (2017). 
Predicting and regulating participation equality in human-robot conversations: Effects of age and gender. Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria. https://doi.org/10.1145/2909824.3020210 Skantze, G. (2021). Turn-taking in conversational systems and human-robot interaction: A review. Computer Speech & Language, 67, 101178. https://doi.org/10.1016/j.csl.2020.101178 Skantze, G., Hjalmarsson, A., & Oertel, C. (2014). Turn-taking, feedback and joint attention in situated human–robot interaction. Speech Communication, 65, 50-66. https://doi.org/10.1016/j.specom.2014.05.005 Skantze, G., Johansson, M., & Beskow, J. (2015). Exploring turn-taking cues in multi-party human-robot discussions about objects. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, Seattle, Washington, USA. https://doi.org/10.1145/2818346.2820749 Song, Y., & Yan, L. X. (2020). Trust in AI agent: A systematic review of facial anthropomorphic trustworthiness for social robot design. Sensors, 20(18), Article 5087. https://doi.org/10.3390/s20185087 Spence, C., Mancini, M., & Huisman, G. (2019). Digital commensality: Eating and drinking in the company of technology. Frontiers in Psychology, 10, Article 2252. https://doi.org/10.3389/fpsyg.2019.02252 Stivers, T., Enfield, N. J., Brown, P., Englert, C., Hayashi, M., Heinemann, T., Hoymann, G., Rossano, F., de Ruiter, J. P., Yoon, K. E., & Levinson, S. C. (2009). Universals and cultural variation in turn-taking in conversation. Proceedings of the National Academy of Sciences of the United States of America, 106(26), 10587-10592. https://doi.org/10.1073/pnas.0903616106 Streeck, J., & Hartge, U. (1992). Previews: Gestures at the transition place. In P. Auer & A. Di Luzio (Eds.), The Contextualization of Language (pp. 135-157). John Benjamins. Takahashi, M., Tanaka, H., Yamana, H., & Nakajima, T. (2017). 
Virtual co-eating: Making solitary eating experience more enjoyable. In N. Munekata, I. Kunita, & J. Hoshino (Eds.), Entertainment Computing – ICEC 2017. Springer, Cham. Tatarian, K., Stower, R., Rudaz, D., Chamoux, M., Kappas, A., & Chetouani, M. (2021). How does modality matter? Investigating the synthesis and effects of multi-modal robot behavior on social intelligence. International Journal of Social Robotics. https://doi.org/10.1007/s12369-021-00839-w Thepsoonthorn, C., Ogawa, K., & Miyake, Y. (2018). The relationship between robot's nonverbal behaviour and human's likability based on human's personality. Scientific Reports, 8. https://doi.org/10.1038/s41598-018-25314-x Thomaz, A. L., & Chao, C. (2011). Turn taking based on information flow for fluent human-robot interaction. AI Magazine, 32(4), 53-63. https://doi.org/10.1609/aimag.v32i4.2379 Todd, J. W. (1912). Reaction to multiple stimuli. The Science Press. https://doi.org/10.1037/13053-000 Torreira, F., Bögels, S., & Levinson, S. C. (2015). Breathing for answering: the time course of response planning in conversation. Frontiers in Psychology, 6. https://doi.org/10.3389/fpsyg.2015.00284 Torta, E., Van Heumen, J., Piunti, F., Romeo, L., & Cuijpers, R. (2015). Evaluation of unimodal and multimodal communication cues for attracting attention in human-robot interaction. International Journal of Social Robotics, 7(1), 89-96. https://doi.org/10.1007/s12369-014-0271-x Trevarthen, C. (1977). Descriptive analyses of infant communicative behaviour. Studies in mother-infant interaction. Tsiourti, C., Weiss, A., Wac, K., & Vincze, M. (2019). Multimodal integration of emotional signals from voice, body, and context: Effects of (in)congruence on emotion recognition and attitudes towards robots. International Journal of Social Robotics, 11(4), 555-573. https://doi.org/10.1007/s12369-019-00524-z Tye-Murray, N. (2003). Conversational fluency of children who use cochlear implants. Ear and Hearing, 24(1), 82S-89S. 
Urakami, J., & Sutthithatip, S. (2021). Building a collaborative relationship between human and robot through verbal and non-verbal interaction. Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, Boulder, CO, USA. https://doi.org/10.1145/3434074.3447171 van de Rijt, L. P. H., Roye, A., Mylanus, E. A. M., van Opstal, A. J., & van Wanrooij, M. M. (2019). The principle of inverse effectiveness in audiovisual speech perception [Original Research]. Frontiers in Human Neuroscience, 13. https://doi.org/10.3389/fnhum.2019.00335 Van Dijk, E. T., Torta, E., & Cuijpers, R. H. (2013). Effects of eye contact and iconic gestures on message retention in human-robot interaction. International Journal of Social Robotics, 5(4), 491-501. https://doi.org/10.1007/s12369-013-0214-y Van Schendel, J., & Cuijpers, R. (2015). Turn-yielding cues in robot-human conversation. Velichkovsky, B. M. (1995). Communicating attention: Gaze position transfer in cooperative problem solving. Pragmatics & Cognition, 3(2), 199-223. Ventola, E. (1979). The structure of casual conversation in English. Journal of Pragmatics, 3(3), 267-298. https://doi.org/10.1016/0378-2166(79)90034-1 Verhaeghen, P. (2013). The elements of cognitive aging: Conclusions. In The elements of cognitive aging: Meta-analyses of age-related differences in processing speed and their consequences (pp. 300-310). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780195368697.003.0010 Vogan, A. A., Alnajjar, F., Gochoo, M., & Khalid, S. (2020). Robots, AI, and cognitive training in an era of mass age-related cognitive decline: A systematic review [Review]. IEEE Access, 8, 18284-18304. https://doi.org/10.1109/access.2020.2966819 Walters, M. L., Dautenhahn, K., Boekhorst, R. t., Kheng Lee, K., Kaouri, C., Woods, S., Nehaniv, C., Lee, D., & Werry, I. (2005, 13-15 Aug. 2005). 
The influence of subjects' personality traits on personal spatial zones in a human-robot interaction experiment. ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA. Wang, Y.-T., Nip, I. S., Green, J. R., Kent, R. D., Kent, J. F., & Ullman, C. (2012). Accuracy of perceptual and acoustic methods for the detection of inspiratory loci in spontaneous speech. Behavior Research Methods, 44(4), 1121-1128. Ward, N. G. (2019). Prosodic patterns in English conversation. Cambridge University Press. https://doi.org/10.1017/9781316848265 Weerakoon, D., Subbaraju, V., Karumpulli, N., Tran, T., Xu, Q., Tan, U.-X., Lim, J. H., & Misra, A. (2020). Gesture enhanced comprehension of ambiguous human-to-robot instructions. Proceedings of the 2020 International Conference on Multimodal Interaction, Virtual Event, Netherlands. https://doi.org/10.1145/3382507.3418863 Weiß, C. (2018). When gaze-selected next speakers do not take the turn. Journal of Pragmatics, 133, 28-44. Wittenburg, P., Brugman, H., Russel, A., Klassmann, A., & Sloetjes, H. (2006). ELAN: a professional framework for multimodality research. LREC 2006, Genoa, Italy. Włodarczak, M., & Heldner, M. (2016, September 8–12, 2016). Respiratory belts and whistles: A preliminary study of breathing acoustics for turn-taking. Interspeech 2016, San Francisco, USA. Wu, H. L., Yu, Z., Wang, X. J., & Zhang, Q. F. (2020). Language processing in normal aging: Contributions of information-universal and information-specific factors. Acta Psychologica Sinica, 52(5), 541-561. https://doi.org/10.3724/Sp.J.1041.2020.00541 Wu, Y.-H., Fassert, C., & Rigaud, A.-S. (2012). Designing robots for the elderly: Appearance issue and beyond. Archives of Gerontology and Geriatrics, 54(1), 121-126. https://doi.org/10.1016/j.archger.2011.02.003 Yngve, V. H. (1970). On getting a word in edgewise. Chicago Linguistics Society, 6th Meeting, 1970, 567-578. 
https://cir.nii.ac.jp/crid/1572824500867512320 Yokozuka, T., Miyamoto, H., Kasai, M., Miyake, Y., & Nozawa, T. (2021). The relationship between turn-taking, vocal pitch synchrony, and rapport in creative problem-solving communication. Speech Communication, 129, 33-40. https://doi.org/10.1016/j.specom.2021.03.001 Yuan, F. P., Klavon, E., Liu, Z. M., Lopez, R. P., & Zhao, X. P. (2021). A systematic review of robotic rehabilitation for cognitive training. Frontiers in Robotics and AI, 8, 24, Article 605715. https://doi.org/10.3389/frobt.2021.605715 Zafrani, O., & Nimrod, G. (2019). Towards a holistic approach to studying human-robot interaction in later life [Review]. Gerontologist, 59(1), E26-E36. https://doi.org/10.1093/geront/gny077 Żarkowski, M. (2019). Multi-party turn-taking in repeated human–robot interactions: An interdisciplinary evaluation. International Journal of Social Robotics, 11(5), 693-707. https://doi.org/10.1007/s12369-019-00603-1 Zellers, M., House, D., & Alexanderson, S. (2016). Prosody and hand gesture at turn boundaries in Swedish. Proc. Speech Prosody, Boston, USA. Zima, E., Weiß, C., & Brône, G. (2019). Gaze and overlap resolution in triadic interactions. Journal of Pragmatics, 140, 49-69. Zollick, J. C., Rossle, S., Kluy, L., Kuhlmey, A., & Bluher, S. (2021). Potentials and challenges of social robots in relationships with older people: a rapid review of current debates. Zeitschrift für Gerontologie und Geriatrie, 7. https://doi.org/10.1007/s00391-021-01932-5 | - |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91359 | - |
dc.description.abstract | 因應全球高齡化社會與隨之而來的照護人力缺口,越來越多研究開始探討如何將社會機器人應用於高齡照護以提升其生活福祉。為了讓機器人擁有更好的對話技巧,經常被應用在人類對話中的非口語話輪輪轉線索如何在高齡者與社會機器人的休閒對話中被使用是本研究的主要研究目標。
為回應本研究的研究問題:高齡者是否會注意並使用非口語話輪輪轉線索以達到順暢的對話?不同的非口語話輪輪轉線索組合如何影響高齡者與機器人的對話順暢度及對話經驗?本研究採取實驗研究以探討非口語話輪輪轉線索與其不同組合對於非口語話輪輪轉線索的注意、對話順暢程度及人機互動經驗的影響。實驗參與者須與機器人NAO在下午茶情境中進行休閒對話。本研究使用實驗對話行為與自陳問卷來測量上述變項。一共47位六十歲(含)以上的高齡者參與實驗。研究結果顯示高齡參與者能夠感知非口語話輪輪轉線索並與機器人完成順暢的對話。雖然不同的非口語話輪輪轉線索組合並不影響主觀對話經驗的評估,客觀行為的話輪輪轉時間顯示使用視線與手勢可以顯著提升高齡者與機器人的對話順暢度。 總結而言,在設計目的為與高齡者進行休閒對話的機器人時,應該考慮納入非口語話輪輪轉線索,因為它可以協助高齡者辨識機器人話輪的結束,並且順暢地接話。此外,本研究也指出,對於高齡者與機器人的對話互動而言,使用視線與手勢而非全身是更為合適且有效率的。 | zh_TW |
dc.description.abstract | As the older population keeps growing worldwide and the gap in the caregiving workforce widens accordingly, how social robots can be utilized to support elders’ well-being has become a prominent research issue. To equip robots with better conversational skills, this study examined how non-verbal turn-taking cues, which are essential in human conversation, could be utilized by older adults in casual conversations with social robots.
An experimental study was conducted to test the effects of non-verbal cues, and of their different combinations, on the noticeability of the cues, the fluency of conversation, and the HRI experience. The participants were asked to have a casual conversation with the robot NAO in a teatime setting. Behavior logs and self-report questionnaires were used to measure the variables. A total of 47 older adults participated. The results revealed that the participants perceived the non-verbal turn-taking cues and achieved fluent conversations with the robot. While the subjective evaluations showed no effect of the different cue combinations, the floor transfer offset indicated that using gaze and arm gestures particularly facilitated the older users in taking turns. In all, we suggest including non-verbal turn-taking cues when designing robots intended for casual conversations with older people, as these cues help older adults identify the end of the robot’s turn and transfer the floor smoothly. In addition, gaze and arm gestures, rather than whole-body movement, were found to be valuable and sufficient for the interaction. | en |
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-01-12T16:11:09Z No. of bitstreams: 0 | en |
dc.description.provenance | Made available in DSpace on 2024-01-12T16:11:09Z (GMT). No. of bitstreams: 0 | en |
dc.description.tableofcontents | 謝辭 i
摘要 ii
Abstract iii
1. Introduction 1
1.1. Background 1
1.2. Research Questions and Objectives 5
2. Literature Review 6
2.1. Social Robots’ Application for Older Adults 6
2.1.1. Older adults' characteristics and need 6
2.1.2. Social robots’ definition and development 8
2.1.3. Social robots’ application for older adults 11
2.2. Turn-taking Mechanism in Conversation 20
2.2.1. Communication, conversation, and turn-taking 20
2.2.2. Turn-taking cues in human-human conversation 28
2.2.3. Turn-taking cues in human-robot conversation 38
3. Methods 50
3.1. Participants 51
3.2. Research Design & Procedure 52
3.3. Instruments 57
3.3.1. Participants’ information collection 57
3.3.2. Robot partner (E1) 58
3.3.3. Fluency of conversation 64
3.4. Data Analysis 69
3.4.1. Annotation and measuring the timing 69
3.4.2. Statistical analyses 71
4. Results and Discussion 72
4.1. Participants 72
4.2. Attention to Non-verbal Cues 74
4.3. Fluency of Conversation 76
4.3.1. Floor transfer offset (FTO) 76
4.3.2. Total number of turn-transition (TTT) 85
4.3.3. Perception of conversational fluency 87
4.4. Human-Robot Interaction Experience 91
4.4.1. Perception of non-verbal cues 93
4.4.2. Comprehension and engagement 94
4.4.3. Anxiety 95
4.4.4. Social presence 96
4.4.5. Overall feelings and future intention 97
4.5. Summary 97
5. Conclusion 99
5.1. Findings of The Study 99
5.1.1. The older adults are able to use the non-verbal turn-taking cues provided by the social robot and achieve a fluent conversation 100
5.1.2. The adding of gesture as non-verbal cue increases the fluency of conversation 101
5.1.3. Different combinations of non-verbal turn-taking cues significantly affect older user’s conversational experience 102
5.2. Implications & Future Research 103
6. Reference 106 | - |
dc.language.iso | en | - |
dc.title | 非口語話輪輪轉線索應用於高齡者與機器人休閒對話情境之研究 | zh_TW |
dc.title | How Non-verbal Turn-taking Cues Affect Elderly Communication with a Social Robot in a Casual Conversation Setting | en |
dc.type | Thesis | - |
dc.date.schoolyear | 111-2 | - |
dc.description.degree | 碩士 | - |
dc.contributor.coadvisor | 岳修平 | zh_TW |
dc.contributor.coadvisor | Hsiu-Ping Yueh | en |
dc.contributor.oralexamcommittee | 傅立成;陳姿伶 | zh_TW |
dc.contributor.oralexamcommittee | Li-Chen Fu;Tzy-Ling Chen | en |
dc.subject.keyword | 高齡者,人機互動,社會機器人,話輪輪轉,非口語線索, | zh_TW |
dc.subject.keyword | Elderly users,Human-robot interaction,Social robots,Turn-taking,Non-verbal cues, | en |
dc.relation.page | 123 | - |
dc.identifier.doi | 10.6342/NTU202303016 | - |
dc.rights.note | 同意授權(全球公開) | - |
dc.date.accepted | 2023-08-09 | - |
dc.contributor.author-college | 文學院 | - |
dc.contributor.author-dept | 圖書資訊學系 | - |
dc.date.embargo-lift | 2028-08-04 | - |
Appears in Collections: | 圖書資訊學系
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-111-2.pdf (available online after 2028-08-04) | 1.79 MB | Adobe PDF |
All items in the system are protected by copyright, with all rights reserved, unless otherwise indicated.