Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91363
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 傅楸善 | zh_TW |
dc.contributor.advisor | Chiou-Shann Fuh | en |
dc.contributor.author | 呂英弘 | zh_TW |
dc.contributor.author | Ying-Hong Lu | en |
dc.date.accessioned | 2024-01-25T16:10:24Z | - |
dc.date.available | 2024-01-26 | - |
dc.date.copyright | 2024-01-25 | - |
dc.date.issued | 2022 | - |
dc.date.submitted | 2022-01-01 | - |
dc.identifier.citation | [1] N. Ahmad, R. A. R. Ghazilla, N. M. Khairi, and V. Kasi, “Reviews on Various Inertial Measurement Unit (IMU) Sensor Applications,” International Journal of Signal Processing Systems, Vol. 1, No. 2, pp. 256-262, 2013. [2] R. Aigner, D. Wigdor, H. Benko, M. Haller, D. Lindbauer, A. Ion, ... and J. T. K. V. Koh, “Understanding Mid-Air Hand Gestures: A Study of Human Preferences in Usage of Gesture Types for HCI,” Microsoft Research TechReport MSR-TR-2012-111, 2, 30, 2012. [3] R. E. Bailey, J. J. Arthur III, and S. P. Williams, “Latency Requirements for Head-Worn Display S/EVS Applications,” SPIE Enhanced and Synthetic Vision, Vol. 5424, pp. 98-109, 2004. [4] T. Bailey and H. Durrant-Whyte, “Simultaneous Localization and Mapping (SLAM): Part II,” IEEE Robotics and Automation Magazine, Vol. 13, No. 3, pp. 108-117, 2006. [5] S. Beauregard, “A Helmet-Mounted Pedestrian Dead Reckoning System,” Proceedings of International Forum on Applied Wearable Computing, Bremen, Germany, pp. 1-11, 2006. [6] S. Beauregard and H. Haas, “Pedestrian Dead Reckoning: A Basis for Personal Positioning,” Proceedings of Workshop on Positioning, Navigation and Communication, Hannover, Germany, pp. 27-35, 2006. [7] J. D. N. Dionisio, W. G. Burns III, and R. Gilbert, “3D Virtual Worlds and the Metaverse: Current Status and Future Possibilities,” ACM Computing Surveys, Vol. 45, No. 3, pp. 1-38, 2013. [8] Epson, “Moverio BT-30C,” https://www.epson.com.tw/, 2022. [9] Fun2, “FunZoo,” https://fun2studio.com/game_FunZoo.html, 2022. [10] L. C. Hale, “Principles and Techniques for Designing Precision Machines,” Ph. D. Thesis, Department of Mechanical Engineering, MIT, 1999. [11] K. Holmqvist, M. Nyström, R. Andersson, R. Dewhurst, H. Jarodzka, and J. Van de Weijer, Eye Tracking: A Comprehensive Guide to Methods and Measures, Oxford University Press, Oxford, UK, 2011. 
[12] Jorjin, “Jorjin J-Reality J7EF Plus,” https://www.jorjin.com/products/ar-smart-glasses/j-reality/j7ef-plus/, 2022. [13] K. Kiyokawa, “Trends and Vision of Head Mounted Display in Augmented Reality,” Proceedings of International Symposium on Ubiquitous Virtual Reality, Daejeon, Korea, pp. 14-17, 2012. [14] K. Kiyokawa, M. Billinghurst, B. Campbell, and E. Woods, “An Occlusion Capable Optical See-through Head Mount Display for Supporting Co-located Collaboration,” Proceedings of IEEE and ACM International Symposium on Mixed and Augmented Reality, Tokyo, Japan, pp. 133-141, 2003. [15] M. Lambooij, M. Fortuin, I. Heynderickx, and W. IJsselsteijn, “Visual Discomfort and Visual Fatigue of Stereoscopic Displays: A Review,” Journal of Imaging Science and Technology, Vol. 53, No. 3, pp. 30201-1-14, 2009. [16] L. H. Lee and P. Hui, “Interaction Methods for Smart Glasses: A Survey,” IEEE Access, Vol. 6, pp. 28712-28732, 2018. [17] H. Leppäkoski, J. Collin, and J. Takala, “Pedestrian Navigation Based on Inertial Sensors, Indoor Map, and WLAN Signals,” Journal of Signal Processing Systems, Vol. 71, No. 3, pp. 287-296, 2013. [18] S. Li, C. Xu, and M. Xie, “A Robust O(n) Solution to the Perspective-N-Point Problem,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 34, No. 7, pp. 1444-1450, 2012. [19] S. Liu, D. Cheng, and H. Hua, “An Optical See-through Head Mounted Display with Addressable Focal Planes,” Proceedings of IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, pp. 33-42, 2008. [20] Microsoft, “Microsoft HoloLens 2,” https://www.microsoft.com/en-us/hololens/hardware, 2022. [21] P. Milgram and F. Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information and Systems, Vol. 77, No. 12, pp. 1321-1329, 1994. [22] Peaksel Games, “Snake Game,” https://play.google.com/store/apps/details?id=com.snakegame.freesnakegames, 2022. [23] F. Remondino and D. Stoppa, Eds., TOF Range-Imaging Cameras, Vol. 
68121, Springer, Heidelberg, Germany, 2013. [24] J. L. Schonberger and J. M. Frahm, “Structure-from-Motion Revisited,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, pp. 4104-4113, 2016. [25] ST, “ST VL53L5CX Datasheet,” https://www.st.com/resource/en/datasheet/vl53l5cx.pdf, 2022. [26] A. R. Vidal, H. Rebecq, T. Horstschaefer, and D. Scaramuzza, “Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios,” IEEE Robotics and Automation Letters, Vol. 3, No. 2, pp. 994-1001, 2018. [27] Y. Wu, F. Tang, and H. Li, “Image-Based Camera Localization: An Overview,” Visual Computing for Industry, Biomedicine, and Art, Vol. 1, No. 1, pp. 1-13, 2018. | - |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91363 | - |
dc.description.abstract | 擴增實境(Augmented Reality, AR)上的人機互動方式有許多值得探索的部分。空中手勢(Mid-Air Hand Gesture)的整合是當中特別重要的題目,因為在每一個文化中,手勢都是重要的溝通工具。 在這篇論文中,我們提出呂手勢(LuGesture),包含三個準確、快速、舒適、省電與低成本的手勢組。這些手勢使用8x8像素的低解析度飛時(Time-of-Flight)深度相機做辨認。手勢辨識的演算法運算量低,可以被執行在智慧型眼鏡的微控制器(Micro-Controller Unit)上。 我們也探索了幾個不同的使用者介面來搭配我們的手勢組。我們所選擇的使用者介面都是依據相對應的手勢組之特性。在我們的實驗中,我們得到了很好的結果。 我們的演算法和使用者介面都成功整合在佐臻的J7EF Plus智慧型眼鏡上。整合的方法和流程都有詳述在論文中。 | zh_TW |
dc.description.abstract | There is much to explore in human-computer interaction for Augmented Reality (AR). Integration of mid-air hand gestures is an especially important topic, since gestures facilitate communication in every culture. In this thesis, we propose three mid-air hand gesture sets (LuGesture) for AR that are accurate, fast, comfortable, power-efficient, and low-cost. The gestures are designed to be recognized with an 8×8-pixel low-resolution Time-of-Flight (ToF) depth camera (ST VL53L5CX). The recognition algorithms are computationally light enough to run on a Micro-Controller Unit (MCU: ST STM32F401CE) on a pair of smart glasses. We also explore several User Interfaces (UI) to accompany LuGesture. Each UI was chosen to match the characteristics of the corresponding gesture set, and all showed excellent results in our experiments. Our algorithms and UIs have been packaged and successfully tested on Jorjin J7EF Plus smart glasses. The data pipeline is also described in this thesis. | en |
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-01-25T16:10:24Z No. of bitstreams: 0 | en |
dc.description.provenance | Made available in DSpace on 2024-01-25T16:10:24Z (GMT). No. of bitstreams: 0 | en |
dc.description.tableofcontents | 誌謝 i 中文摘要 iii ABSTRACT iv CONTENTS v LIST OF FIGURES viii Chapter 1 Introduction 1 1.1 The Concept of Augmented Reality and Metaverse 1 1.2 The Hardware of Smart Glasses 4 1.2.1 Display 4 1.2.2 Movement Tracking 6 1.2.3 Computing 12 1.3 Interacting with Smart Glasses 15 1.4 Mid-Air Hand Gestures 17 1.5 Our Hardware Configuration 20 1.6 The Time-of-Flight Sensor 21 1.7 Thesis Organization 24 Chapter 2 Problem Description 25 2.1 The Low-Resolution Depth Sensor 25 2.2 Gesture Recognition 27 Chapter 3 Methodology 29 3.1 LuGesture 1: Swipe + Push/Pull 29 3.1.1 Swipe 29 3.1.2 Push and Pull 36 3.1.3 Precedence 37 3.2 LuGesture 2: Hand Speed 38 3.3 LuGesture 3: Hand Tracking 41 Chapter 4 Experiment Results 45 4.1 LuGesture 1: Swipe + Push/Pull 45 4.2 LuGesture 2: Hand Speed 48 4.3 LuGesture 3: Hand Tracking 49 Chapter 5 Integration 51 5.1 System Overview 51 5.2 Data Flow 52 5.3 UI 56 5.4 Packaging and Distribution 58 Chapter 6 Conclusion and Future Works 60 References 62 | - |
dc.language.iso | en | - |
dc.title | 在智慧型眼鏡上用飛時相機做手勢辨認 | zh_TW |
dc.title | Hand Gesture Recognition with Time-of-Flight Camera for Smart Glasses | en |
dc.type | Thesis | - |
dc.date.schoolyear | 110-2 | - |
dc.description.degree | Master's | - |
dc.contributor.oralexamcommittee | 王獻章;邱立誠 | zh_TW |
dc.contributor.oralexamcommittee | ;; | en |
dc.subject.keyword | 擴增實境,空中手勢,呂手勢,智慧型眼鏡,飛時相機,手勢辨識,人機互動 | zh_TW |
dc.subject.keyword | Augmented Reality, Mid-air Hand Gesture, LuGesture, Smart Glasses, Time-of-Flight Camera, Gesture Recognition, Human Computer Interaction | en |
dc.relation.page | 65 | - |
dc.identifier.doi | 10.6342/NTU202200665 | - |
dc.rights.note | Authorized for release (restricted to campus access) | - |
dc.date.accepted | 2022-03-31 | - |
dc.contributor.author-college | College of Electrical Engineering and Computer Science | - |
dc.contributor.author-dept | Graduate Institute of Biomedical Electronics and Bioinformatics | - |
dc.date.embargo-lift | 2027-03-27 | - |
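The abstract above describes recognizing mid-air swipe gestures from 8×8-pixel ToF depth frames with algorithms light enough for an MCU. The thesis's actual algorithms are not reproduced in this record; the following is a minimal, hypothetical sketch (the function names, the 400 mm hand threshold, and the centroid-displacement heuristic are all assumptions, not the author's method) of how a swipe might be classified from such low-resolution depth data:

```python
# Hypothetical sketch only: track the hand centroid across 8x8 depth frames
# (depths in mm) and classify a left/right swipe from horizontal displacement.

NEAR_MM = 400  # assumed threshold: pixels closer than this count as "hand"

def hand_centroid(frame):
    """Return the (col, row) centroid of near pixels, or None if no hand."""
    pts = [(c, r) for r, row in enumerate(frame)
                  for c, d in enumerate(row) if d < NEAR_MM]
    if not pts:
        return None
    n = len(pts)
    return (sum(c for c, _ in pts) / n, sum(r for _, r in pts) / n)

def classify_swipe(frames, min_dx=3.0):
    """Classify a frame sequence as 'swipe_right', 'swipe_left', or None,
    based on total horizontal centroid motion in pixel units."""
    xs = [c[0] for c in (hand_centroid(f) for f in frames) if c is not None]
    if len(xs) < 2:
        return None
    dx = xs[-1] - xs[0]
    if dx >= min_dx:
        return "swipe_right"
    if dx <= -min_dx:
        return "swipe_left"
    return None

# Synthetic example: a 3x3 hand blob moving from the left edge to the right.
def frame_with_blob(col):
    f = [[2000] * 8 for _ in range(8)]  # background ~2 m away
    for r in range(3, 6):
        for c in range(max(0, col - 1), min(8, col + 2)):
            f[r][c] = 300  # hand ~30 cm away
    return f

frames = [frame_with_blob(c) for c in range(1, 7)]
print(classify_swipe(frames))  # prints: swipe_right
```

The point this illustrates is the design constraint named in the abstract: with only 64 depth pixels per frame, a per-frame centroid plus a displacement test needs only a few hundred arithmetic operations, which is plausibly within reach of a small MCU such as the STM32F401CE.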
Appears in collections: | Graduate Institute of Biomedical Electronics and Bioinformatics |
Files in this item:
File | Size | Format | |
---|---|---|---|
ntu-110-2.pdf (currently not authorized for public access) | 2.54 MB | Adobe PDF | View/Open |
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.