NTU Theses and Dissertations Repository › 電機資訊學院 › 資訊工程學系
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57310

Full metadata record (DC field: value [language]):
dc.contributor.advisor: 陳炳宇
dc.contributor.author: Chi-Chiang Huang [en]
dc.contributor.author: 黃志強 [zh_TW]
dc.date.accessioned: 2021-06-16T06:41:12Z
dc.date.available: 2019-08-05
dc.date.copyright: 2014-08-05
dc.date.issued: 2014
dc.date.submitted: 2014-07-29
dc.identifier.citation:
[1] Microsoft Kinect for Windows. http://www.microsoft.com/en-us/kinectforwindows/.
[2] OpenCV on CUDA. http://opencv.org/platforms/cuda.html.
[3] A. Banerjee, J. Burstyn, A. Girouard, and R. Vertegaal. MultiPoint: Comparing laser and manual pointing as remote input in large display interactions. IJHCS, 70(10):690–702, 2012.
[4] S. Boring, D. Baur, A. Butz, S. Gustafson, and P. Baudisch. Touch Projector: Mobile interaction through video. In Proc. CHI '10, pages 2287–2296, 2010.
[5] X. Cao and R. Balakrishnan. VisionWand: Interaction techniques for large displays using a passive wand tracked in 3D. In Proc. UIST '03, pages 173–182, 2003.
[6] K. Cheng and M. Takatsuka. Initial evaluation of a bare-hand interaction technique for large displays using a webcam. In Proc. EICS '09, pages 291–296, 2009.
[7] G. Farnebäck. Two-frame motion estimation based on polynomial expansion. In Proc. SCIA '03, pages 363–370. Springer-Verlag, Berlin, Heidelberg, 2003.
[8] N. Gillian. Gesture Recognition Toolkit. http://www.nickgillian.com/software/grt.
[9] T. Grossman and R. Balakrishnan. The bubble cursor: Enhancing target acquisition by dynamic resizing of the cursor's activation area. In Proc. CHI '05, pages 281–290, 2005.
[10] R. Jota, M. A. Nacenta, J. A. Jorge, S. Carpendale, and S. Greenberg. A comparison of ray pointing techniques for very large displays. In Proc. GI '10, pages 269–276, 2010.
[11] J. H. Lee and S.-H. Bae. Binocular cursor: Enabling selection on transparent displays troubled by binocular parallax. In Proc. CHI '13, pages 3169–3172, 2013.
[12] D. C. McCallum and P. Irani. ARC-Pad: Absolute+relative cursor positioning for large displays with a mobile touchscreen. In Proc. UIST '09, pages 153–156, 2009.
[13] T. Moscovich. Contact area interaction with sliding widgets. In Proc. UIST '09, pages 13–22, 2009.
[14] B. A. Myers, R. Bhatnagar, J. Nichols, C. H. Peck, D. Kong, R. Miller, and A. C. Long. Interacting at a distance: Measuring the performance of laser pointers and other devices. In Proc. CHI '02, pages 33–40, 2002.
[15] M. Nancel, O. Chapuis, E. Pietriga, X.-D. Yang, P. P. Irani, and M. Beaudouin-Lafon. High-precision pointing on large wall displays using small handheld devices. In Proc. CHI '13, pages 831–840, 2013.
[16] J. S. Pierce, A. S. Forsberg, M. J. Conway, S. Hong, R. C. Zeleznik, and M. R. Mine. Image plane interaction techniques in 3D immersive environments. In Proc. I3D '97, pages 39–43, 1997.
[17] G. Shoemaker, A. Tang, and K. S. Booth. Shadow Reaching: A new perspective on interaction for large displays. In Proc. UIST '07, pages 53–56, 2007.
[18] D. Vogel and R. Balakrishnan. Distant freehand pointing and clicking on very large, high resolution displays. In Proc. UIST '05, pages 33–42, 2005.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57310
dc.description.abstract: Motion-based control on today's smart TVs is mostly cursor-based: users move an on-screen cursor with their palm and issue selection commands through gestures. Many previous studies have explored selecting on-screen targets directly with the bare hand, rather than by steering a cursor, to make motion-based control more efficient. However, the remote direct-pointing techniques proposed so far require special hardware setups to sense user motion accurately, which makes them difficult to deploy in ordinary environments. In this thesis we present FingerShot, a perspective-based remote direct-pointing technique that lets users point directly at on-screen targets from a distance using only a single RGBD camera. Our real-time sensing algorithm tracks the user's upper-body skeleton, eyes, and fingertip, and computes the intersection of the eye-to-fingertip ray with the screen, i.e., the position the user is aiming at. The system also supports bimanual operation and multiple simultaneous users. Our user study shows that, from five positions at least 1.6 m away from the screen, users can select on-screen targets 11 cm wide when aiming with one eye (non-dominant eye closed) and 12.3 cm wide with both eyes open. [zh_TW]
dc.description.abstract: Several studies have attempted to enable remote direct pointing, which makes freehand remote control more effective than steering an indirect cursor. However, because previous remote direct-pointing solutions usually required specific hardware setups for reliable sensing, deploying them in real environments is difficult. In this paper, we present FingerShot, a perspective-based absolute pointing method that requires only one RGBD camera to let users point directly at a remote display. The proposed real-time sensing algorithm determines the pointed-at position on the display by reliably tracking users' upper bodies, eyes, and fingertips, and it supports bimanual and multi-user interaction. User-study results show that, from five positions at least 1.6 m away from the display, users can effectively select gridded on-screen targets 11 cm wide with their non-dominant eye closed, and 12.3 cm wide with both eyes open, using our remote direct-pointing technique. [en]
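Geometrically, the perspective-based pointing the abstracts describe reduces to intersecting the eye-to-fingertip ray with the display plane. Below is a minimal sketch of that computation, assuming the display is modeled as a plane given by a point and a normal in the same 3-D frame as the tracked eye and fingertip; the function and parameter names are illustrative, not taken from the thesis.

```python
def direct_pointing_target(eye, fingertip, screen_point, screen_normal):
    """Intersect the eye->fingertip ray with the display plane.

    All arguments are 3-D coordinates (x, y, z) in one common frame.
    Returns the 3-D intersection point, or None if the ray is parallel
    to the plane or the plane lies behind the user.
    """
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    direction = sub(fingertip, eye)           # ray direction: eye toward fingertip
    denom = dot(screen_normal, direction)
    if abs(denom) < 1e-9:
        return None                           # ray parallel to the display plane
    t = dot(screen_normal, sub(screen_point, eye)) / denom
    if t < 0:
        return None                           # display is behind the user
    return tuple(e + t * d for e, d in zip(eye, direction))
```

Here `t` scales the eye-to-fingertip vector so the ray reaches the plane; converting the returned 3-D point into 2-D display pixels would additionally need the coordinate-space transformation and calibration the thesis covers in Appendix A.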
dc.description.provenance: Made available in DSpace on 2021-06-16T06:41:12Z (GMT). No. of bitstreams: 1. ntu-103-R01922069-1.pdf: 8031349 bytes, checksum: 803ab90b20f7ec9bfe7322c8ebafa0f9 (MD5). Previous issue date: 2014 [en]
dc.description.tableofcontents:
口試委員會審定書 (Thesis Committee Certification) i
誌謝 (Acknowledgements) ii
摘要 (Chinese Abstract) iii
Abstract iv
1 Introduction 1
1.1 Motivation 1
1.2 FingerShot: Remote Direct-Pointing 3
2 Related Work 5
2.1 Device-Based Remote-Pointing 5
2.2 Freehand Remote-Pointing 8
2.3 Perspective-Based Remote-Pointing 9
3 System Design 11
3.1 Design Challenges 11
3.1.1 Pilot Study 11
3.1.2 Results and Discussion 12
3.2 Design and Implementation 13
3.2.1 Occlusion-Free Eye Tracking 13
3.2.2 Real-Time Fingertip Tracking and Rectifying 14
3.2.3 Implementing Perspective-Based Direct-Pointing 17
4 Evaluation 19
4.1 Study on Speed 19
4.1.1 Apparatus 19
4.1.2 Experiment Design 20
4.1.3 Results 24
4.2 Study on Accuracy 24
4.2.1 Apparatus 24
4.2.2 Experiment Design 25
4.2.3 Results 30
5 Possible Applications and Generalization 40
5.1 Rapid Land-on Selection 40
5.1.1 Video Player 40
5.1.2 Calculator 41
5.1.3 Combining with Sliding Widgets 41
5.2 Precise Pointing with Absolute+Relative Cursor 42
5.3 Possible Generalization 44
6 Conclusion and Future Work 45
6.1 Conclusion 45
6.2 Future Work 45
A Implementation Details 47
A.1 Coordinate Space Transformation 47
A.2 System Calibration 48
Bibliography 51
dc.language.iso: en
dc.subject: 彩色深度攝影機 (RGBD camera) [zh_TW]
dc.subject: 視角 (perspective) [zh_TW]
dc.subject: 遠距 (remote) [zh_TW]
dc.subject: 體感操控 (motion-based control) [zh_TW]
dc.subject: 直接選取 (direct pointing) [zh_TW]
dc.subject: RGBD camera [en]
dc.subject: Direct-pointing [en]
dc.subject: Remote [en]
dc.subject: Perspective [en]
dc.title: 以單台彩色深度攝影機實現基於視角之遠距直接選取技術 [zh_TW]
dc.title: FingerShot: Perspective-Based Remote Direct-Pointing Using One RGBD Camera [en]
dc.type: Thesis
dc.date.schoolyear: 102-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 陳彥仰, 余能豪, 詹力韋
dc.subject.keyword: 視角, 遠距, 體感操控, 直接選取, 彩色深度攝影機 [zh_TW]
dc.subject.keyword: Perspective, Remote, Direct-pointing, RGBD camera [en]
dc.relation.page: 53
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2014-07-30
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
ntu-103-1.pdf — 7.84 MB, Adobe PDF — restricted (not authorized for public access)


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated in their individual license terms.
