Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88677
Full metadata record
DC Field: Value [Language]
dc.contributor.advisor: 傅楸善 [zh_TW]
dc.contributor.advisor: Chiou-Shann Fuh [en]
dc.contributor.author: 鍾承鎧 [zh_TW]
dc.contributor.author: Cheng-Kai Chung [en]
dc.date.accessioned: 2023-08-15T17:19:57Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-15
dc.date.issued: 2023
dc.date.submitted: 2023-08-04
dc.identifier.citation:
References
[1] P. Milgram and F. Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information and Systems, Vol. 77, No. 12, pp. 1321-1329, 1994.
[2] L. H. Lee and P. Hui, “Interaction Methods for Smart Glasses: A Survey,” IEEE Access, Vol. 6, pp. 28712-28732, 2018.
[3] R. Aigner, D. Wigdor, H. Benko, M. Haller, D. Lindbauer, A. Ion, S. D. Zhao, and J. T. K. V. Koh, “Understanding Mid-Air Hand Gestures: A Study of Human Preferences in Usage of Gesture Types for HCI,” Microsoft Research Technical Report MSR-TR-2012-111, 2012.
[4] F. Remondino and D. Stoppa, Eds., TOF Range-Imaging Cameras, Springer, Heidelberg, Germany, 2013.
[5] P. Padmanabhan, C. Zhang, and E. Charbon, “Modeling and Analysis of a Direct Time-of-Flight Sensor Architecture for LiDAR Applications,” Sensors, Vol. 19, No. 5464, pp. 1-27, 2019.
[6] M. S. Keel et al., “A VGA Indirect Time-of-Flight CMOS Image Sensor with 4-Tap 7-μm Global-Shutter Pixel and Fixed-Pattern Phase Noise Self-Compensation,” IEEE Journal of Solid-State Circuits, Vol. 55, pp. 889-897, 2020.
[7] P. A. Rauschnabel, A. Brem, and Y. Ro, “Augmented Reality Smart Glasses: Definition, Conceptual Insights, and Managerial Importance,” Unpublished Working Paper, The University of Michigan-Dearborn, College of Business, https://www.researchgate.net/publication/279942768_Augmented_Reality_Smart_Glasses_Definition_Conceptual_Insights_and_Managerial_Importance, 2015.
[8] I. E. Sutherland, “Head-Mounted Three Dimensional Display,” Proceedings of the AFIPS Fall Joint Computer Conference, San Francisco, CA, Vol. 33, pp. 757-764, 1968.
[9] Google, “Glass Enterprise Edition 2,” https://support.google.com/glass-enterprise/customer/answer/9220200?hl=en&ref_topic=9235678, 2023.
[10] D. Kim and Y. Choi, “Applications of Smart Glasses in Applied Sciences: A Systematic Review,” Applied Sciences, Vol. 11, No. 4956, pp. 1-21, 2021.
[11] Microsoft, “Microsoft HoloLens 2,” https://www.microsoft.com/zh-tw/hololens/hardware, 2023.
[12] Epson, “Epson Moverio BT-350,” https://www.epson.com.tw/c/Moverio-BT-350/p/V11H837054, 2023.
[13] J. P. Wachs, M. Kölsch, H. Stern, and Y. Edan, “Vision-Based Hand-Gesture Applications,” Communications of the ACM, Vol. 54, pp. 60-71, 2011.
[14] J. Suarez and R. R. Murphy, “Hand Gesture Recognition with Depth Images: A Review,” Proceedings of IEEE RO-MAN: International Symposium on Robot and Human Interactive Communication, Paris, France, pp. 411-417, doi: 10.1109/ROMAN.2012.6343787, 2012.
[15] M. Oudah, A. Al-Naji, and J. Chahl, “Hand Gesture Recognition Based on Computer Vision: A Review of Techniques,” Journal of Imaging, Vol. 6, No. 8, Article 73, pp. 1-29, 2020.
[16] W. Daolei and K. B. Lim, “Obtaining Depth Map from Segment-Based Stereo Matching Using Graph Cuts,” Journal of Visual Communication and Image Representation, Vol. 22, No. 4, pp. 325-331, 2011.
[17] Y. H. Lu, “LuGesture: Hand Gesture Recognition with Time-of-Flight Camera for Smart Glasses,” Master's Thesis, Department of Computer Science and Information Engineering, National Taiwan University, 2022.
[18] Jorjin, “Jorjin J-Reality J7EF Plus,” https://www.jorjin.com/products/ar-smart-glasses/j-reality/j7ef-plus/, 2023.
[19] R. Szeliski, Computer Vision: Algorithms and Applications, Springer, London, 2021.
[20] C. Walker, “Stereo Vision Basics,” http://chriswalkertechblog.blogspot.com/2014/03/stereo-vision-basics.html, 2023.
[21] L. Li, “Time-of-Flight Camera – An Introduction,” Texas Instruments Technical White Paper, 2014.
[22] J. Geng, “Structured-Light 3D Surface Imaging: A Tutorial,” Advances in Optics and Photonics, Vol. 3, No. 2, pp. 128-160, 2011.
[23] P. Zanuttigh, G. Marin, C. D. Mutto, F. Dominio, L. Minto, and G. M. Cortelazzo, Time-of-Flight and Structured Light Depth Cameras: Technology and Applications, Springer, Berlin, 2016.
[24] Sony, “Sony Xperia 1 II,” https://www.sony.com.tw/zh/electronics/smartphones/xperia-1m2, 2023.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88677
dc.description.abstract [zh_TW]:
本論文提出一個名為「鍾手勢」的演算法,希望藉由搭載著8x8像素的低解析度飛時深度相機的智慧型眼鏡,來進行手勢辨識,其中共包含了六個精準、快速、舒適的動作,旨在提高使用者的操作體驗。
我們首先回顧了手勢辨認技術的相關知識和現有相關研究,並探索三種主要的深度預測技術:立體視覺、結構光和 ToF 相機。接著,我們針對智慧型眼鏡的特性和使用情境,提出了適合的手勢辨認應用場景和手勢操作設計,並開發了相對應的軟體系統。最後,我們進行了實驗評估,包括手勢辨認的精準度、穩定度和使用者的滿意度。實驗結果顯示,本研究所提出的手勢辨認系統具有良好的準確度和穩定度,且能有效提升智慧型眼鏡的使用體驗。
我們的演算法開發及應用皆是在佐臻的 J7EF Plus 智慧型眼鏡上,詳細的方法和流程會在論文中加以說明。
dc.description.abstract [en]:
In this thesis, we propose ChungGesture, which uses a low-resolution, 8×8-pixel Time-of-Flight (ToF) camera to perform gesture recognition on smart glasses. The algorithm comprises six precise, fast, and comfortable gestures designed to enhance the user experience.
We first review the relevant background and existing research on gesture recognition technology, and explore three main depth-estimation techniques: stereo vision, structured light, and ToF cameras. Next, we propose suitable application scenarios and gesture designs based on the characteristics and usage context of smart glasses, and develop a corresponding software system. Finally, we conduct experimental evaluations of gesture recognition accuracy, stability, and user satisfaction. The results show that our gesture recognition system achieves good accuracy and stability and can effectively improve the user experience of smart glasses.
Our algorithm is developed and deployed on the Jorjin J7EF Plus smart glasses; the detailed methods and procedures are explained in this thesis.
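The abstract above describes recognizing swipe-style gestures from very low-resolution (8×8-pixel) depth frames. As an illustration only — this is not the thesis's actual ChungGesture algorithm, and the depth threshold and displacement values below are assumptions — a minimal sketch of foreground masking, palm-centroid tracking, and swipe classification on such frames might look like this:

```python
import numpy as np

BACKGROUND_MM = 400  # assumed cutoff: pixels farther than this are background


def palm_centroid(frame):
    """Return the (row, col) centroid of foreground pixels in an 8x8
    depth frame (depths in millimetres), or None if no hand is present."""
    fg = frame < BACKGROUND_MM          # foreground mask: close-range pixels
    if not fg.any():
        return None
    rows, cols = np.nonzero(fg)
    return rows.mean(), cols.mean()


def classify_swipe(centroids, min_shift=3.0):
    """Classify a horizontal swipe from a sequence of per-frame centroids
    by net column displacement; returns 'left', 'right', or None."""
    cs = [c for c in centroids if c is not None]
    if len(cs) < 2:
        return None
    shift = cs[-1][1] - cs[0][1]        # net column movement over the window
    if shift >= min_shift:
        return "right"
    if shift <= -min_shift:
        return "left"
    return None
```

A push/pull gesture could be sketched the same way by tracking the mean foreground depth over time instead of the column coordinate.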
dc.description.provenance [en]: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T17:19:57Z. No. of bitstreams: 0
dc.description.provenance [en]: Made available in DSpace on 2023-08-15T17:19:57Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
誌謝 (Acknowledgements) i
中文摘要 (Chinese Abstract) ii
ABSTRACT iii
CONTENTS iv
LIST OF FIGURES vii
LIST OF TABLES x
Chapter 1 Introduction 1
1.1 The Concept of Smart Glasses 1
1.1.1 Hardware 2
1.1.2 Virtual Reality (VR) and Augmented Reality (AR) 3
1.1.3 Interacting with Smart Glasses 7
1.2 Hand Gestures 9
1.3 Thesis Organization 11
Chapter 2 Related Works 12
2.1 History of Smart Glasses 12
2.2 The Method of Depth Estimation 16
2.2.1 Stereo Vision 17
2.2.2 Structured Light 18
2.2.3 Time-of-Flight Camera 20
2.3 Hand Gestures Recognition 23
Chapter 3 Background 26
3.1 Our Device Configuration 26
3.2 Sensor Decision 29
3.3 Hardware Limitation 30
3.3.1 The Low-Resolution Depth Sensor 30
3.3.2 Limited Field of View (FoV) 31
3.3.3 Sensor Mobility Constraint 31
3.4 Existing Techniques and Limitations 31
3.4.1 LuGesture Method 31
3.4.2 Limitations of LuGesture Method 37
Chapter 4 Methodology 40
4.1 Overview 40
4.2 Data Collection 41
4.3 Noise Reduction 42
4.3.1 Remove Background 42
4.3.2 Remove Twinkle 44
4.4 Hand Area Detection 45
4.4.1 Left-Right Hand Discrimination 45
4.4.2 Cropping for Left-Right Hand Differentiation 46
4.4.3 Calculating Hand Palm Centroid 48
4.4.4 Smoothing 52
4.5 Gesture Recognition 53
4.5.1 Swipe 54
4.5.2 Push and Pull 56
4.6 Our System 57
Chapter 5 Experimental Results 59
5.1 Experimental Setting 59
5.2 Precision Result 59
5.3 Stability Result 61
5.4 Gesture Recognition Accuracy Result 64
Chapter 6 Conclusion and Future Works 66
References 67
dc.language.iso: en
dc.subject: 擴增實境眼鏡 [zh_TW]
dc.subject: 手勢辨識 [zh_TW]
dc.subject: 鍾手勢 [zh_TW]
dc.subject: 飛時相機 [zh_TW]
dc.subject: Time-Of-Flight Camera [en]
dc.subject: ChungGesture [en]
dc.subject: Gesture Recognition [en]
dc.subject: Augmented Reality (AR) Glasses [en]
dc.title: 鍾手勢:手勢辨認使用飛時相機應用於智慧型眼鏡 [zh_TW]
dc.title: ChungGesture: Hand Gesture Recognition with Time-of-Flight Camera for Smart Glasses [en]
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 王獻章;方瓊瑤 [zh_TW]
dc.contributor.oralexamcommittee: Hsien-Chang Wang;Chiung-Yao Fang [en]
dc.subject.keyword: 鍾手勢,飛時相機,手勢辨識,擴增實境眼鏡 [zh_TW]
dc.subject.keyword: ChungGesture,Time-Of-Flight Camera,Gesture Recognition,Augmented Reality (AR) Glasses [en]
dc.relation.page: 70
dc.identifier.doi: 10.6342/NTU202302797
dc.rights.note: 同意授權(全球公開)
dc.date.accepted: 2023-08-08
dc.contributor.author-college: 電機資訊學院
dc.contributor.author-dept: 資訊工程學系
Appears in Collections: 資訊工程學系

Files in This Item:
File | Size | Format
ntu-111-2.pdf | 2.65 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
