Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101760

Full metadata record

DC Field | Value | Language
dc.contributor.advisor: 鄭龍磻 [zh_TW]
dc.contributor.advisor: Lung-Pan Cheng [en]
dc.contributor.author: 梁舜勛 [zh_TW]
dc.contributor.author: Shun-Hsun Liang [en]
dc.date.accessioned: 2026-03-04T16:21:07Z
dc.date.available: 2026-03-05
dc.date.copyright: 2026-03-04
dc.date.issued: 2026
dc.date.submitted: 2026-02-23
dc.identifier.citation:
[1] R. Arora, R. H. Kazi, D. M. Kaufman, W. Li, and K. Singh. MagicalHands: Mid-air hand gestures for animating in VR. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pages 463–477, 2019.
[2] K. B. Aspiranti and D. M. Hulac. Using fidget spinners to improve on-task classroom behavior for students with ADHD. Behavior Analysis in Practice, 15(2):454–465, 2022.
[3] L. Biel. Fidget toys or focus tools. Autism File, 74:12–13, 2017.
[4] E. D. Chase, T. Gerstenberg, and S. Follmer. Realism of visual, auditory, and haptic cues in phenomenal causality. In 2023 IEEE World Haptics Conference (WHC), pages 306–312. IEEE, 2023.
[5] S. B. da Câmara, R. Agrawal, and K. Isbister. Identifying children's fidget object preferences: Toward exploring the impacts of fidgeting and fidget-friendly tangibles. In Proceedings of the 2018 Designing Interactive Systems Conference, pages 301–311, 2018.
[6] M. D. Dogan, E. J. Gonzalez, K. Ahuja, R. Du, A. Colaço, J. Lee, M. Gonzalez-Franco, and D. Kim. Augmented object intelligence with XR-Objects. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, pages 1–15, 2024.
[7] P. J. Drew, A. T. Winder, and Q. Zhang. Twitches, blinks, and fidgets: Important generators of ongoing neural activity. The Neuroscientist, 25(4):298–313, 2019.
[8] K. Isbister. Fidget toys aren't just hype. The Conversation, 2017.
[9] S. M. Jaswal, A. K. De Bleser, and T. C. Handy. Misokinesia is a sensitivity to seeing others fidget that is prevalent in the general population. Scientific Reports, 11(1):17204, 2021.
[10] C. Ji and K. Isbister. AR Fidget: Augmented reality experiences that support emotion regulation through fidgeting. In CHI Conference on Human Factors in Computing Systems Extended Abstracts, pages 1–4, 2022.
[11] M. Karlesky and K. Isbister. Designing for the physical margins of digital workspaces: Fidget widgets in support of productivity and creativity. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction, pages 13–20, 2014.
[12] M. Karlesky and K. Isbister. Understanding fidget widgets: Exploring the design space of embodied self-regulation. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction, pages 1–10, 2016.
[13] L. H. Kim, V. Domova, Y. Yao, and P. Rajabi. SwarmFidget: Exploring programmable actuated fidgeting with swarm robots. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pages 1–15, 2023.
[14] S. L. Kriescher, D. M. Hulac, A. M. Ryan, and B. L. King. Evaluating the evidence for fidget toys in the classroom. Intervention in School and Clinic, 59(1):66–69, 2023.
[15] J. Li, Q. Yang, K. Xu, Y. Zhang, and C. Xu. EchoSight: Streamlining bidirectional virtual-physical interaction with in-situ optical tethering. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, pages 1–18, 2025.
[16] R.-H. Liang, B. Yu, M. Xue, J. Hu, and L. M. Feijs. BioFidget: Biofeedback for respiration training using an augmented fidget spinner. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pages 1–12, 2018.
[17] D. Lindlbauer, A. M. Feit, and O. Hilliges. Context-aware online adaptation of mixed reality interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pages 147–160, 2019.
[18] K. Monteiro, R. Vatsal, N. Chulpongsatorn, A. Parnami, and R. Suzuki. Teachable Reality: Prototyping tangible augmented reality with everyday objects by leveraging interactive machine teaching. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1–15, 2023.
[19] R. Nguyen, C. Gouin-Vallerand, and M. Amiri. Hand interaction designs in mixed and augmented reality head mounted display: A scoping review and classification. Frontiers in Virtual Reality, 4:1171230, 2023.
[20] R. Nyqvist. Fidgeting for creativity. 2016.
[21] K. Perrykkad and J. Hohwy. Fidgeting as self-evidencing: A predictive processing account of non-goal-directed action. New Ideas in Psychology, 56:100750, 2020.
[22] J. Persia. Examining the impacts of subtle fidget jewelry on anxiety, stress, and attention. 2023.
[23] O. Ricciardi, P. Maggi, and F. D. Nocera. Boredom makes me 'nervous': Fidgeting as a strategy for contrasting the lack of variety. International Journal of Human Factors and Ergonomics, 6(3):195–207, 2019.
[24] S. H. Ross, N. Sullivan, and J. A. Yoon. Virtual fidgets: Opportunities and design principles for bringing fidgeting to online learning. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1–6, 2023.
[25] A. Sayara, E. L. Chen, C. Nguyen, R. Xiao, and D. Yoon. GestureCanvas: A programming by demonstration system for prototyping compound freehand interaction in VR. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pages 1–17, 2023.
[26] J. Shi, Z. Guo, S. Cheng, Y. Liu, M. Zhang, and Z. Xiao. 3D object recognition based on point cloud geometry construction and embeddable attention. In Image and Graphics: 12th International Conference, ICIG 2023, Nanjing, China, September 22–24, 2023, Proceedings, Part III, pages 235–246, Berlin, Heidelberg, 2023. Springer-Verlag.
[27] S. Stalvey and H. Brasell. Using stress balls to focus the attention of sixth-grade learners. Journal of At-Risk Issues, 12(2):7–16, 2006.
[28] T. Wang, X. Qian, F. He, X. Hu, Y. Cao, and K. Ramani. GesturAR: An authoring system for creating freehand interactive augmented reality applications. In The 34th Annual ACM Symposium on User Interface Software and Technology, pages 552–567, 2021.
[29] D. Weimer and S. K. Ganapathy. A synthetic visual environment with hand gesturing and voice input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '89, pages 235–240, New York, NY, USA, 1989. Association for Computing Machinery.
[30] J. O. Wobbrock, M. R. Morris, and A. D. Wilson. User-defined gestures for surface computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 1083–1092, 2009.
[31] N. Zhou, Y. Sun, S. Devleminck, and L. Geurts. Squeezable interface for emotion regulation in work environments. In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, pages 1–7, 2024.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101760
dc.description.abstract [zh_TW, translated]: Fidgeting broadly refers to low-intensity, repetitive movements that people perform for self-regulation. While such behavior is traditionally rooted in tactile experience (e.g., squeezing or tapping), recent research has begun to explore enriching it through visual interfaces and extending its expressive potential. However, existing approaches rarely formalize the mapping between fidget gestures and feedback, and often depend on specialized tools that diverge from natural, everyday movements. To bridge this gap, we developed a mixed reality authoring system for visual fidget design grounded in everyday fidgeting. We demonstrate the interface's workflow and detail the underlying affordance mappings extracted from our formative study. The results present 10 user-created visual designs and their corresponding mapping strategies.
dc.description.abstract [en]: Fidgeting involves low-effort, often repetitive hand movements used for self-regulation. While traditionally grounded in haptic experiences such as squeezing or tapping, recent research explores visual media to enrich fidgeting and extend its expressive potential. However, these approaches rarely formalize gesture–feedback mappings and often rely on specialized tools that diverge from natural, everyday gestures. To address this gap, we developed a mixed reality (MR) authoring system that enables visual fidget design grounded in everyday fidgeting behaviors. We demonstrate the workflow of our interface and walk through the underlying affordance mapping extracted from our formative study. Our results show 10 user-created visual fidget designs with mapping strategies.
dc.description.provenance [en]: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2026-03-04T16:21:07Z. No. of bitstreams: 0
dc.description.provenance [en]: Made available in DSpace on 2026-03-04T16:21:07Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
摘要 (Abstract in Chinese) iii
Abstract v
Contents vii
List of Figures ix
Chapter 1 Introduction 1
Chapter 2 Related Work 5
2.1 Fidgeting 5
2.2 Enriching the Fidget Experience through Visual Effects 6
2.3 Design Exploration through Mixed Reality 7
Chapter 3 Formative Study 9
3.1 Participants 10
3.2 Apparatus 11
3.3 Task 11
3.4 Procedure and Measures 12
3.5 Analysis 14
3.6 Results 14
3.7 Findings 17
Chapter 4 Authoring System 19
4.1 Affordance Mapping 22
4.1.1 Motional Affordance 22
4.1.2 Causal Temporality 25
4.2 Animation Procedure 25
4.3 Development 26
4.3.1 Gesture Registration Pipeline 26
4.3.2 Inference Module 27
4.3.3 Runtime Adaptation 28
Chapter 5 Validation Study 29
5.1 Participants 29
5.2 Task and Procedure 29
5.3 Findings 30
Chapter 6 Discussion 33
6.1 Visual Effect as a Catalyst for Fidgeting 33
6.2 Enhancing Self-Awareness and Reducing Disruptive Habits 34
6.3 Emotion Regulation 35
Chapter 7 Limitations & Future Work 37
7.1 System Constraints 37
7.2 Experiment Design 38
Chapter 8 Conclusion 41
References 43
Appendix A — Codebook 49
dc.language.iso: en
dc.subject: 混合實境 (Mixed Reality) [zh_TW]
dc.subject: Fidgeting [zh_TW]
dc.subject: 具身互動 (Embodied Interaction) [zh_TW]
dc.subject: 創作工具 (Authoring Tool) [zh_TW]
dc.subject: Mixed Reality [en]
dc.subject: Fidgeting [en]
dc.subject: Embodied Interaction [en]
dc.subject: Authoring Tool [en]
dc.title [zh_TW]: 視覺 Fidget 互動:範例、設計框架與混合實境中的創作工具
dc.title [en]: Visual Fidget Interactions: Examples, Design Frameworks, and Authoring Tools in Mixed Reality
dc.type: Thesis
dc.date.schoolyear: 114-1
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee [zh_TW]: Lawrence Kim; 楊興棟; Masahiko Inami
dc.contributor.oralexamcommittee [en]: Lawrence Kim; Xing-Dong Yang; Masahiko Inami
dc.subject.keyword [zh_TW]: 混合實境, Fidgeting, 具身互動, 創作工具
dc.subject.keyword [en]: Mixed Reality, Fidgeting, Embodied Interaction, Authoring Tool
dc.relation.page: 49
dc.identifier.doi: 10.6342/NTU202600777
dc.rights.note: 同意授權 (authorization granted, open access worldwide)
dc.date.accepted: 2026-02-24
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept: 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia)
dc.date.embargo-lift: 2026-03-05
Appears in collections: 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia)

Files in this item:
File | Size | Format
ntu-114-1.pdf | 1.24 MB | Adobe PDF


Unless a specific copyright statement is indicated, all items in the system are protected by copyright, with all rights reserved.

Contact information:
No. 1, Sec. 4, Roosevelt Rd., Da'an Dist., Taipei 10617, Taiwan (R.O.C.)
Tel: (02)33662353
Email: ntuetds@ntu.edu.tw
© NTU Library All Rights Reserved