Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/49628
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳彥仰(Mike Y. Chen) | |
dc.contributor.author | Jhe-Wei Lin | en |
dc.contributor.author | 林哲緯 | zh_TW |
dc.date.accessioned | 2021-06-15T11:38:34Z | - |
dc.date.available | 2016-08-26 | |
dc.date.copyright | 2016-08-26 | |
dc.date.issued | 2016 | |
dc.date.submitted | 2016-08-15 | |
dc.identifier.citation | [1] 5DT Data Gloves. http://www.5dt.com/?page_id=34, 2015.
[2] Thalmic Labs. Myo. https://www.thalmic.com/en/myo/, 2015.
[3] G. C. Burdea and P. Coiffet. Virtual Reality Technology. John Wiley & Sons, Inc., New York, NY, USA, 2nd edition, 2003.
[4] C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2:27:1–27:27, 2011. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[5] A. Dementyev and J. A. Paradiso. WristFlex: Low-power gesture input with wrist-worn pressure sensors. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST '14, pages 161–166, New York, NY, USA, 2014. ACM.
[6] T. Deyle, S. Palinko, E. Poole, and T. Starner. Hambone: A bio-acoustic gesture interface. In Wearable Computers, 2007 11th IEEE International Symposium on, pages 3–10, Oct 2007.
[7] L. Dipietro, A. Sabatini, and P. Dario. A survey of glove-based systems and their applications. Systems, Man, and Cybernetics, Part C: Applications and Reviews, IEEE Transactions on, 38(4):461–482, July 2008.
[8] R. Fukui, M. Watanabe, T. Gyota, M. Shimosaka, and T. Sato. Hand shape classification with a wrist contour sensor: Development of a prototype device. In Proceedings of the 13th International Conference on Ubiquitous Computing, UbiComp '11, pages 311–314, New York, NY, USA, 2011. ACM.
[9] N. Kevin, S. Ranganath, and D. Ghosh. Trajectory modeling in gesture recognition using CyberGloves and magnetic trackers. In TENCON 2004. 2004 IEEE Region 10 Conference, volume A, pages 571–574 Vol. 1, Nov 2004.
[10] D. Kim, O. Hilliges, S. Izadi, A. D. Butler, J. Chen, I. Oikonomidis, and P. Olivier. Digits: Freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST '12, pages 167–176, New York, NY, USA, 2012. ACM.
[11] Y. S. Kim, B. S. Soh, and S.-G. Lee. A new wearable input device: Scurry. Industrial Electronics, IEEE Transactions on, 52(6):1490–1499, Dec 2005.
[12] G. Kulaksiz and R. Gozil. The effect of hand preference on hand anthropometric measurements in healthy individuals. Annals of Anatomy - Anatomischer Anzeiger, 184(3):257–265, 2002.
[13] J. J. LaViola, Jr. A survey of hand posture and gesture recognition techniques and technology. Technical report, Providence, RI, USA, 1999.
[14] P. Mistry and P. Maes. SixthSense: A wearable gestural interface. In ACM SIGGRAPH ASIA 2009 Sketches, SIGGRAPH ASIA '09, pages 11:1–11:1, New York, NY, USA, 2009. ACM.
[15] J. Perng, B. Fisher, S. Hollar, and K. Pister. Acceleration sensing glove (ASG). In Wearable Computers, 1999. Digest of Papers. The Third International Symposium on, pages 178–180, Oct 1999.
[16] J. Rekimoto. GestureWrist and GesturePad: Unobtrusive wearable interaction devices. In Proceedings of the 5th IEEE International Symposium on Wearable Computers, ISWC '01, pages 21–, Washington, DC, USA, 2001. IEEE Computer Society.
[17] T. S. Saponas, D. S. Tan, D. Morris, R. Balakrishnan, J. Turner, and J. A. Landay. Enabling always-available input with muscle-computer interfaces. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST '09, pages 167–176, New York, NY, USA, 2009. ACM.
[18] K. Tsukada and M. Yasumura. Ubi-Finger: Gesture input device for mobile use. In Ubicomp 2001 Informal Companion Proceedings, page 11. Citeseer, 2001.
[19] D. Way and J. Paradiso. A usability user study concerning free-hand microgesture and wrist-worn sensors. In Wearable and Implantable Body Sensor Networks (BSN), 2014 11th International Conference on, pages 138–142, June 2014. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/49628 | - |
dc.description.abstract | 在這篇論文裡,我們探索透過手背來作為感測手勢的來源,其干擾遠低於以手套形式作為感測手勢的方法,並提供比手腕和手肘作為來源更為精準的辨識。我們的裝置利用應變規陣列黏貼於手背表面,並利用機器學習相關技術來辨識多樣的手勢。
為了更加了解手勢辨識之正確度以及手背上不同感測位置的影響,我們安排共計十位使用者實施使用者研究。實驗結果顯示:藉由將感測器所讀取之數值圖像化後,跨使用者的結果差異極大,而相同的使用者則相似。對於跨使用者做出 16 個手勢的系統而言,系統辨識之正確度落於 27.4%;而對於單一使用者而言,同樣的系統則能達到 95.8% 的正確率。另一項實驗結果則顯示,以橫列為基礎的排列方式,感測陣列最適合黏附於坐落在 MCP(手指與手掌間的關節)至手腕頂端之間 1/8 到 1/4 的位置。 | zh_TW |
dc.description.abstract | In this paper, we explore using the back of hands for sensing hand gestures, which interferes less than glove-based approaches and provides better recognition than sensing at wrists and forearms. Our prototype, BackHand, uses an array of strain gauge sensors affixed to the back of hands, and applies machine learning techniques to recognize a variety of hand gestures.
We conducted a user study with 10 participants to better understand gesture recognition accuracy and the effects of sensing locations. Results showed that sensor reading patterns differ significantly across users, but are consistent for the same user. The leave-one-user-out accuracy is low at an average of 27.4%, but reaches 95.8% average accuracy for 16 popular hand gestures when personalized for each participant. The most promising location spans the 1/8–1/4 area between the metacarpophalangeal joints (MCP, the knuckles between the hand and fingers) and the head of ulna (tip of the wrist). | en
dc.description.provenance | Made available in DSpace on 2021-06-15T11:38:34Z (GMT). No. of bitstreams: 1 ntu-105-R02944056-1.pdf: 8284245 bytes, checksum: bb06376dae5771f0b19701e7c822cdf4 (MD5) Previous issue date: 2016 | en |
dc.description.tableofcontents | 誌謝 i
Abstract ii
摘要 iii
1 Introduction 1
2 Related Work 4
2.1 Finger-based Hand Gesture Interfaces 4
2.2 Wrist-based Hand Gesture Interfaces 5
2.3 Arm-based Hand Gesture Interfaces 6
3 BackHand Prototype Design 7
3.1 Hardware 7
3.2 Machine Learning 10
4 Evaluation: Most Promising Location 11
4.1 Study Design 11
4.1.1 Gesture Set 12
4.1.2 Procedure 12
4.2 Results 14
4.2.1 Heat Map Visualization 14
4.2.2 Accuracy Rates 16
5 Discussion 18
5.1 Confusion Matrix 18
5.2 Signal Source Comparison 19
5.3 Limitations 20
6 Applications 21
6.1 Musical Performance 21
6.2 Smartwatch Control 22
7 Conclusion 23
Bibliography 25 | |
dc.language.iso | en | |
dc.title | 手背: 透過手背之手勢辨識研究 | zh_TW |
dc.title | BackHand: Sensing Hand Gesture via Back of the Hand | en
dc.type | Thesis | |
dc.date.schoolyear | 104-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 唐玄輝(Hsien-Hui Tang),余能豪(Neng-Hao Yu) | |
dc.subject.keyword | 手勢辨識,穿戴式介面,手背,手勢介面,應變規,機器學習,手勢互動 | zh_TW |
dc.subject.keyword | Gesture recognition, wearable interface, back of the hand, hand gesture interface, strain gauge, machine learning, gestural interaction | en |
dc.relation.page | 27 | |
dc.identifier.doi | 10.6342/NTU201602594 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2016-08-16 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊網路與多媒體研究所 | zh_TW |
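
The abstracts above describe BackHand's recognition pipeline only at a high level: strain-gauge readings taken from the back of the hand are fed to machine-learning classifiers (the bibliography cites LIBSVM [4]), and accuracy is reported both per participant ("personalized") and leave-one-user-out. The Python sketch below is a minimal illustration of how such a two-way evaluation could be wired up; it is not the thesis's code. The feature matrix `X`, gesture labels `y`, per-sample user ids `users`, the RBF kernel, and the hyperparameters are assumptions made for this example (scikit-learn's `SVC` is a wrapper around LIBSVM).

```python
# Illustrative sketch only -- not the BackHand thesis code.
# Assumes numpy arrays: X (n_samples x n_strain_gauges), y (gesture labels),
# users (one user id per sample). All names and hyperparameters are hypothetical.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def make_classifier():
    # Standardize each strain-gauge channel, then classify with an RBF-kernel SVM.
    return make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))


def personalized_accuracy(X, y, seed=0):
    # Train and test within a single user's own recordings (per-participant model).
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, random_state=seed, stratify=y)
    clf = make_classifier().fit(X_tr, y_tr)
    return clf.score(X_te, y_te)


def leave_one_user_out_accuracy(X, y, users):
    # Train on all users except one, test on the held-out user; average over users.
    scores = []
    for tr, te in LeaveOneGroupOut().split(X, y, groups=users):
        clf = make_classifier().fit(X[tr], y[tr])
        scores.append(clf.score(X[te], y[te]))
    return float(np.mean(scores))
```

Under this kind of split, a personalized model scoring far higher than the leave-one-user-out model would mirror the pattern the abstract reports (95.8% versus 27.4% average accuracy for 16 gestures).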
Appears in Collections: | 資訊網路與多媒體研究所
Files in this item:
File | Size | Format | |
---|---|---|---|
ntu-105-1.pdf (currently not authorized for public access) | 8.09 MB | Adobe PDF |
Unless otherwise indicated by their own copyright terms, all items in this repository are protected by copyright, with all rights reserved.