Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101760

Full metadata record
| DC 欄位 | 值 | 語言 |
|---|---|---|
| dc.contributor.advisor | 鄭龍磻 | zh_TW |
| dc.contributor.advisor | Lung-Pan Cheng | en |
| dc.contributor.author | 梁舜勛 | zh_TW |
| dc.contributor.author | Shun-Hsun Liang | en |
| dc.date.accessioned | 2026-03-04T16:21:07Z | - |
| dc.date.available | 2026-03-05 | - |
| dc.date.copyright | 2026-03-04 | - |
| dc.date.issued | 2026 | - |
| dc.date.submitted | 2026-02-23 | - |
| dc.identifier.citation | [1] R. Arora, R. H. Kazi, D. M. Kaufman, W. Li, and K. Singh. MagicalHands: Mid-air hand gestures for animating in VR. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pages 463–477, 2019.
[2] K. B. Aspiranti and D. M. Hulac. Using fidget spinners to improve on-task classroom behavior for students with ADHD. Behavior Analysis in Practice, 15(2):454–465, 2022.
[3] L. Biel. Fidget toys or focus tools. Autism File, 74:12–13, 2017.
[4] E. D. Chase, T. Gerstenberg, and S. Follmer. Realism of visual, auditory, and haptic cues in phenomenal causality. In 2023 IEEE World Haptics Conference (WHC), pages 306–312. IEEE, 2023.
[5] S. B. da Câmara, R. Agrawal, and K. Isbister. Identifying children's fidget object preferences: Toward exploring the impacts of fidgeting and fidget-friendly tangibles. In Proceedings of the 2018 Designing Interactive Systems Conference, pages 301–311, 2018.
[6] M. D. Dogan, E. J. Gonzalez, K. Ahuja, R. Du, A. Colaço, J. Lee, M. Gonzalez-Franco, and D. Kim. Augmented object intelligence with XR-Objects. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, pages 1–15, 2024.
[7] P. J. Drew, A. T. Winder, and Q. Zhang. Twitches, blinks, and fidgets: Important generators of ongoing neural activity. The Neuroscientist, 25(4):298–313, 2019.
[8] K. Isbister. Fidget toys aren't just hype. The Conversation, 2017.
[9] S. M. Jaswal, A. K. De Bleser, and T. C. Handy. Misokinesia is a sensitivity to seeing others fidget that is prevalent in the general population. Scientific Reports, 11(1):17204, 2021.
[10] C. Ji and K. Isbister. AR Fidget: Augmented reality experiences that support emotion regulation through fidgeting. In CHI Conference on Human Factors in Computing Systems Extended Abstracts, pages 1–4, 2022.
[11] M. Karlesky and K. Isbister. Designing for the physical margins of digital workspaces: Fidget widgets in support of productivity and creativity. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction, pages 13–20, 2014.
[12] M. Karlesky and K. Isbister. Understanding fidget widgets: Exploring the design space of embodied self-regulation. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction, pages 1–10, 2016.
[13] L. H. Kim, V. Domova, Y. Yao, and P. Rajabi. SwarmFidget: Exploring programmable actuated fidgeting with swarm robots. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pages 1–15, 2023.
[14] S. L. Kriescher, D. M. Hulac, A. M. Ryan, and B. L. King. Evaluating the evidence for fidget toys in the classroom. Intervention in School and Clinic, 59(1):66–69, 2023.
[15] J. Li, Q. Yang, K. Xu, Y. Zhang, and C. Xu. EchoSight: Streamlining bidirectional virtual-physical interaction with in-situ optical tethering. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, pages 1–18, 2025.
[16] R.-H. Liang, B. Yu, M. Xue, J. Hu, and L. M. Feijs. BioFidget: Biofeedback for respiration training using an augmented fidget spinner. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pages 1–12, 2018.
[17] D. Lindlbauer, A. M. Feit, and O. Hilliges. Context-aware online adaptation of mixed reality interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pages 147–160, 2019.
[18] K. Monteiro, R. Vatsal, N. Chulpongsatorn, A. Parnami, and R. Suzuki. Teachable Reality: Prototyping tangible augmented reality with everyday objects by leveraging interactive machine teaching. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1–15, 2023.
[19] R. Nguyen, C. Gouin-Vallerand, and M. Amiri. Hand interaction designs in mixed and augmented reality head mounted display: A scoping review and classification. Frontiers in Virtual Reality, 4:1171230, 2023.
[20] R. Nyqvist. Fidgeting for creativity. 2016.
[21] K. Perrykkad and J. Hohwy. Fidgeting as self-evidencing: A predictive processing account of non-goal-directed action. New Ideas in Psychology, 56:100750, 2020.
[22] J. Persia. Examining the impacts of subtle fidget jewelry on anxiety, stress, and attention. 2023.
[23] O. Ricciardi, P. Maggi, and F. D. Nocera. Boredom makes me 'nervous': Fidgeting as a strategy for contrasting the lack of variety. International Journal of Human Factors and Ergonomics, 6(3):195–207, 2019.
[24] S. H. Ross, N. Sullivan, and J. A. Yoon. Virtual fidgets: Opportunities and design principles for bringing fidgeting to online learning. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1–6, 2023.
[25] A. Sayara, E. L. Chen, C. Nguyen, R. Xiao, and D. Yoon. GestureCanvas: A programming by demonstration system for prototyping compound freehand interaction in VR. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pages 1–17, 2023.
[26] J. Shi, Z. Guo, S. Cheng, Y. Liu, M. Zhang, and Z. Xiao. 3D object recognition based on point cloud geometry construction and embeddable attention. In Image and Graphics: 12th International Conference, ICIG 2023, Nanjing, China, September 22–24, 2023, Proceedings, Part III, pages 235–246, Berlin, Heidelberg, 2023. Springer-Verlag.
[27] S. Stalvey and H. Brasell. Using stress balls to focus the attention of sixth-grade learners. Journal of At-Risk Issues, 12(2):7–16, 2006.
[28] T. Wang, X. Qian, F. He, X. Hu, Y. Cao, and K. Ramani. GesturAR: An authoring system for creating freehand interactive augmented reality applications. In The 34th Annual ACM Symposium on User Interface Software and Technology, pages 552–567, 2021.
[29] D. Weimer and S. K. Ganapathy. A synthetic visual environment with hand gesturing and voice input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '89, pages 235–240, New York, NY, USA, 1989. Association for Computing Machinery.
[30] J. O. Wobbrock, M. R. Morris, and A. D. Wilson. User-defined gestures for surface computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 1083–1092, 2009.
[31] N. Zhou, Y. Sun, S. Devleminck, and L. Geurts. Squeezable interface for emotion regulation in work environments. In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, pages 1–7, 2024. | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101760 | - |
| dc.description.abstract | Fidgeting 泛指人類為了自我調節而進行的低強度、重複性的動作。雖然傳統上這種行為根植於觸覺體驗(如擠壓或輕敲),但近期的研究開始探索透過視覺介面來豐富這類行為,並擴展其表達潛力。然而,現有的方法鮮少將「Fidget 動作」與「回饋」之間的映射關係正式化,且往往依賴於特定的工具,這與日常自然的動作有所脫節。為了彌補這一缺口,我們開發了一套混合實境創作系統,旨在以日常的 Fidgeting 為基礎,進行視覺化的設計。我們展示了該介面的工作流程,並詳細介紹了從前置研究中提取出的基礎可供性映射。研究結果展示了 10 款由使用者創作的視覺設計及其相應的映射策略。 | zh_TW |
| dc.description.abstract | Fidgeting involves low-effort, often repetitive hand movements used for self-regulation. While traditionally grounded in haptic experiences such as squeezing or tapping, recent research explores visual media to enrich fidgeting and extend its expressive potential. However, these approaches rarely formalize gesture–feedback mappings and often rely on specialized tools that diverge from natural, everyday gestures. To address this gap, we developed a mixed reality (MR) authoring system that enables visual fidget design grounded in everyday fidgeting behaviors. We demonstrate the workflow of our interface and walk through the underlying affordance mapping extracted from our formative study. Our results show 10 user-created visual fidget designs along with their mapping strategies. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2026-03-04T16:21:07Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2026-03-04T16:21:07Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | 摘要 iii
Abstract v
Contents vii
List of Figures ix
Chapter 1 Introduction 1
Chapter 2 Related Work 5
2.1 Fidgeting 5
2.2 Enrich Fidget Experience through Visual Effects 6
2.3 Design Exploration through Mixed Reality 7
Chapter 3 Formative Study 9
3.1 Participants 10
3.2 Apparatus 11
3.3 Task 11
3.4 Procedure and Measures 12
3.5 Analysis 14
3.6 Results 14
3.7 Findings 17
Chapter 4 Authoring System 19
4.1 Affordance Mapping 22
4.1.1 Motional Affordance 22
4.1.2 Causal Temporality 25
4.2 Animation Procedure 25
4.3 Development 26
4.3.1 Gesture Registration Pipeline 26
4.3.2 Inference Module 27
4.3.3 Runtime Adaptation 28
Chapter 5 Validation Study 29
5.1 Participants 29
5.2 Task and Procedure 29
5.3 Findings 30
Chapter 6 Discussion 33
6.1 Visual Effect as a Catalyst for Fidgeting 33
6.2 Enhancing Self-Awareness and Reducing Disruptive Habits 34
6.3 Emotion Regulation 35
Chapter 7 Limitations & Future Work 37
7.1 System Constraints 37
7.2 Experiment Design 38
Chapter 8 Conclusion 41
References 43
Appendix A — Codebook 49 | - |
| dc.language.iso | en | - |
| dc.subject | 混合實境 | - |
| dc.subject | Fidgeting | - |
| dc.subject | 具身互動 | - |
| dc.subject | 創作工具 | - |
| dc.subject | Mixed Reality | - |
| dc.subject | Fidgeting | - |
| dc.subject | Embodied Interaction | - |
| dc.subject | Authoring Tool | - |
| dc.title | 視覺 Fidget 互動:範例、設計框架與混合實境中的創作工具 | zh_TW |
| dc.title | Visual Fidget Interactions: Examples, Design Frameworks, and Authoring Tools in Mixed Reality | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 114-1 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | Lawrence Kim;楊興棟;Masahiko Inami | zh_TW |
| dc.contributor.oralexamcommittee | Lawrence Kim;Xing-Dong Yang;Masahiko Inami | en |
| dc.subject.keyword | 混合實境,Fidgeting,具身互動,創作工具 | zh_TW |
| dc.subject.keyword | Mixed Reality,Fidgeting,Embodied Interaction,Authoring Tool | en |
| dc.relation.page | 49 | - |
| dc.identifier.doi | 10.6342/NTU202600777 | - |
| dc.rights.note | Authorization granted (open access worldwide) | - |
| dc.date.accepted | 2026-02-24 | - |
| dc.contributor.author-college | 電機資訊學院 | - |
| dc.contributor.author-dept | 資訊網路與多媒體研究所 | - |
| dc.date.embargo-lift | 2026-03-05 | - |
| Appears in Collections: | 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia) | |

Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-114-1.pdf | 1.24 MB | Adobe PDF | View/Open |

All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
