Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57310

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 陳炳宇 | |
| dc.contributor.author | Chi-Chiang Huang | en |
| dc.contributor.author | 黃志強 | zh_TW |
| dc.date.accessioned | 2021-06-16T06:41:12Z | - |
| dc.date.available | 2019-08-05 | |
| dc.date.copyright | 2014-08-05 | |
| dc.date.issued | 2014 | |
| dc.date.submitted | 2014-07-29 | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57310 | - |
| dc.description.abstract | Motion-sensing control on current smart TVs is mostly cursor-based: users move an on-screen cursor with their palm and issue selection commands with gestures. Many prior studies have explored letting users directly point at on-screen targets with their bare hands, rather than steering a cursor, to make motion-sensing control more efficient. However, previously proposed remote direct-pointing techniques require special hardware setups to sense user motion accurately, which makes them difficult to deploy in ordinary environments. In this thesis we present FingerShot, a perspective-based remote direct-pointing technique that requires only a single RGBD camera to let users directly point at targets on a remote display. The proposed real-time sensing algorithm tracks the user's upper-body skeleton, eyes, and fingertip, and computes the intersection of the eye-to-fingertip ray with the screen, i.e., the position the user is aiming at. The system also supports bi-manual operation and multiple users. User study results show that, from five positions at least 1.6 m away from the display, users can select 11 cm-wide on-screen targets when aiming with a single eye (the non-dominant eye closed), and 12.3 cm-wide targets when both eyes are open. | zh_TW |
| dc.description.abstract | Several research studies have attempted to enable remote direct-pointing to increase the effectiveness of freehand remote control compared with controlling an indirect cursor. Nonetheless, because previous remote direct-pointing solutions usually required specific hardware setups for reliable sensing, deploying such methods in real environments is difficult. In this paper, we present FingerShot, a perspective-based absolute pointing method that requires only one RGBD camera to enable users to directly point at a remote display. The proposed real-time sensing algorithm defines the remote direct-pointing position on the display by reliably tracking users' upper bodies, eyes, and fingertips, and it supports bi-manual and multi-user interaction (a minimal geometric sketch of this eye-to-fingertip intersection is given below the metadata table). User study results show that, using our remote direct-pointing technique from five different positions at least 1.6 m away from the display, users can effectively select gridded on-screen targets 11 cm wide with their non-dominant eye closed, and 12.3 cm wide with both eyes open. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-16T06:41:12Z (GMT). No. of bitstreams: 1 ntu-103-R01922069-1.pdf: 8031349 bytes, checksum: 803ab90b20f7ec9bfe7322c8ebafa0f9 (MD5) Previous issue date: 2014 | en |
| dc.description.tableofcontents | Thesis certification i; Acknowledgements ii; Abstract (Chinese) iii; Abstract iv; 1 Introduction 1; 1.1 Motivation 1; 1.2 FingerShot: Remote Direct-Pointing 3; 2 Related Work 5; 2.1 Device-Based Remote-Pointing 5; 2.2 Freehand Remote-Pointing 8; 2.3 Perspective-Based Remote-Pointing 9; 3 System Design 11; 3.1 Design Challenges 11; 3.1.1 Pilot Study 11; 3.1.2 Results and Discussion 12; 3.2 Design and Implementation 13; 3.2.1 Occlusion-Free Eye Tracking 13; 3.2.2 Real-Time Fingertip Tracking and Rectifying 14; 3.2.3 Implementing Perspective-Based Direct-Pointing 17; 4 Evaluation 19; 4.1 Study on Speed 19; 4.1.1 Apparatus 19; 4.1.2 Experiment Design 20; 4.1.3 Results 24; 4.2 Study on Accuracy 24; 4.2.1 Apparatus 24; 4.2.2 Experiment Design 25; 4.2.3 Results 30; 5 Possible Applications and Generalization 40; 5.1 Rapid Land-on Selection 40; 5.1.1 Video Player 40; 5.1.2 Calculator 41; 5.1.3 Combining with Sliding Widgets 41; 5.2 Precise Pointing with Absolute+Relative Cursor 42; 5.3 Possible Generalization 44; 6 Conclusion and Future Work 45; 6.1 Conclusion 45; 6.2 Future Work 45; A Implementation Details 47; A.1 Coordinate Space Transformation 47; A.2 System Calibration 48; Bibliography 51 | |
| dc.language.iso | en | |
| dc.subject | RGBD camera | zh_TW |
| dc.subject | Perspective | zh_TW |
| dc.subject | Remote | zh_TW |
| dc.subject | Motion-sensing control | zh_TW |
| dc.subject | Direct-pointing | zh_TW |
| dc.subject | RGBD camera | en |
| dc.subject | Direct-pointing | en |
| dc.subject | Remote | en |
| dc.subject | Perspective | en |
| dc.title | Perspective-Based Remote Direct-Pointing Using a Single RGBD Camera | zh_TW |
| dc.title | FingerShot: Perspective-Based Remote Direct-Pointing Using One RGBD Camera | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 102-2 | |
| dc.description.degree | Master | |
| dc.contributor.oralexamcommittee | 陳彥仰,余能豪,詹力韋 | |
| dc.subject.keyword | Perspective, Remote, Motion-sensing control, Direct-pointing, RGBD camera | zh_TW |
| dc.subject.keyword | Perspective, Remote, Direct-pointing, RGBD camera | en |
| dc.relation.page | 53 | |
| dc.rights.note | Licensed for a fee | |
| dc.date.accepted | 2014-07-30 | |
| dc.contributor.author-college | College of Electrical Engineering and Computer Science | zh_TW |
| dc.contributor.author-dept | Graduate Institute of Computer Science and Information Engineering | zh_TW |
| Appears in collections: | Department of Computer Science and Information Engineering | |
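The abstracts above describe the pointed-at position as the intersection of the ray from the user's eye through the fingertip with the display. The following is a minimal, hypothetical sketch of that geometry only, not the thesis implementation: it assumes 3D eye and fingertip positions already expressed in a display-aligned coordinate frame (screen lying on the plane z = 0, units in metres), and the function name and sample coordinates are made up for illustration.

```python
# Minimal sketch of perspective-based pointing: the selected point is where the
# eye->fingertip ray meets the display plane (assumed here to be z = 0).
import numpy as np

def eye_finger_intersection(eye, fingertip):
    """Return the (x, y) point where the eye->fingertip ray hits the z = 0 plane,
    or None if the ray is parallel to the display or points away from it."""
    eye = np.asarray(eye, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    direction = fingertip - eye
    if abs(direction[2]) < 1e-9:      # ray (nearly) parallel to the display plane
        return None
    t = -eye[2] / direction[2]        # solve eye.z + t * direction.z = 0
    if t <= 0:                        # intersection lies behind the user
        return None
    hit = eye + t * direction
    return float(hit[0]), float(hit[1])

# Hypothetical example: eye 1.8 m in front of the screen, fingertip ~0.5 m closer.
print(eye_finger_intersection(eye=(0.10, 1.50, 1.8), fingertip=(0.05, 1.40, 1.3)))
```

In practice, as the appendix sections on coordinate space transformation and system calibration suggest, the RGBD camera's measurements would first have to be transformed into such a display-aligned frame before this intersection can be computed.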
Files in this item:
| File | Size | Format |
|---|---|---|
| ntu-103-1.pdf (restricted access) | 7.84 MB | Adobe PDF |
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
