NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74483
Full metadata record
DC Field | Value | Language
dc.contributor.advisor: 陳彥仰 (Mike Y. Chen)
dc.contributor.author: Leon Yulun Hsu (en)
dc.contributor.author: 徐御倫 (zh_TW)
dc.date.accessioned: 2021-06-17T08:38:22Z
dc.date.available: 2019-08-13
dc.date.copyright: 2019-08-13
dc.date.issued: 2019
dc.date.submitted: 2019-08-08
dc.identifier.citation:
[1] R. Bates and H. Istance. Why are eye mice unpopular? A detailed comparison of head and eye controlled assistive technology pointing devices. Universal Access in the Information Society, 2, 2003.
[2] E. Bizzi, R. E. Kalil, and V. Tagliasco. Eye-head coordination in monkeys: Evidence for centrally patterned organization. Science, 173:452–454, 1971.
[3] H. Drewes and A. Schmidt. Interacting with the computer using gaze gestures. In Proceedings of the 11th IFIP TC 13 International Conference on Human-Computer Interaction - Volume Part II, INTERACT '07, pages 475–488, Berlin, Heidelberg, 2007. Springer-Verlag.
[4] E. G. Freedman. Coordination of the eyes and head during visual orienting. Experimental Brain Research, 190(4):369, 2008.
[5] S.-T. Graupner and S. Pannasch. Continuous gaze cursor feedback in various tasks: Influence on eye movement behavior, task performance and subjective distraction. In C. Stephanidis, editor, HCI International 2014 - Posters' Extended Abstracts, pages 323–329, Cham, 2014. Springer International Publishing.
[6] M. Gresty. Coordination of head and eye movements to fixate continuous and intermittent targets. Vision Research, 14(6):395–403, 1974.
[7] D. Guitton, A. Bergeron, W. Y. Choi, and S. Matsuo. On the feedback control of orienting gaze shifts made with eye and head movements. 142:55–68, 2003.
[8] D. Guitton and M. Volle. Gaze control in humans: Eye-head coordination during orienting movements to targets within and beyond the oculomotor range. Journal of Neurophysiology, 58(3):427–459, 1987.
[9] R. L. Huston. The Measure of Man and Woman: Human Factors in Design, Alvin R. Tilley, Henry Dreyfuss Associates, 1993, 96 pages, $60.00, New York: Whitney Library of Design, Watson-Guptill, ISBN 0-8230-3031-8. Ergonomics in Design, 2(2):37–39, 1994.
[10] H. Istance, A. Hyrskykari, L. Immonen, S. Mansikkamaa, and S. Vickers. Designing gaze gestures for gaming: An investigation of performance. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, ETRA '10, pages 323–330, New York, NY, USA, 2010. ACM.
[11] R. J. Jacob and K. S. Karn. Commentary on Section 4 - Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. In J. Hyönä, R. Radach, and H. Deubel, editors, The Mind's Eye, pages 573–605. North-Holland, Amsterdam, 2003.
[12] R. J. K. Jacob. The use of eye movements in human-computer interaction techniques: What you look at is what you get. ACM Trans. Inf. Syst., 9(2):152–169, 1991.
[13] S. Jalaliniya, D. Mardanbeigi, T. Pederson, and D. W. Hansen. Head and eye movement as pointing modalities for eyewear computers. In 2014 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops, pages 50–53, 2014.
[14] M. Kumar, A. Paepcke, and T. Winograd. EyePoint: Practical pointing and selection using gaze and keyboard. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '07, pages 421–430, New York, NY, USA, 2007. ACM.
[15] M. Kytö, B. Ens, T. Piumsomboon, G. A. Lee, and M. Billinghurst. Pinpointing: Precise head- and eye-based target selection for augmented reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI '18, pages 81:1–81:14, New York, NY, USA, 2018. ACM.
[16] M. Land and B. Tatler. Looking and Acting: Vision and Eye Movements in Natural Behaviour. Pages 1–288, 2012.
[17] C. J. Lin, S.-H. Ho, and Y.-J. Chen. An investigation of pointing postures in a 3D stereoscopic environment. Applied Ergonomics, 48:154–163, 2015.
[18] P. Morasso, E. Bizzi, and J. Dichgans. Adjustment of saccade characteristics during head movements. Experimental Brain Research, 16(5):492–500, 1973.
[19] M. E. Mott, S. Williams, J. O. Wobbrock, and M. R. Morris. Improving dwell-based gaze typing with dynamic, cascading dwell times. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI '17, pages 2558–2570, New York, NY, USA, 2017. ACM.
[20] K. Pfeuffer, J. Alexander, M. K. Chong, Y. Zhang, and H. Gellersen. Gaze-shifting: Direct-indirect input with pen and touch modulated by gaze. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, UIST '15, pages 373–383, New York, NY, USA, 2015. ACM.
[21] T. Piumsomboon, G. Lee, R. W. Lindeman, and M. Billinghurst. Exploring natural eye-gaze-based interaction for immersive virtual reality. In 2017 IEEE Symposium on 3D User Interfaces (3DUI), pages 36–39, 2017.
[22] V. Rajanna and J. P. Hansen. Gaze typing in virtual reality: Impact of keyboard design, selection method, and motion. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, ETRA '18, pages 15:1–15:10, New York, NY, USA, 2018. ACM.
[23] S. Stellmach and R. Dachselt. Look & touch: Gaze-supported target acquisition. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12, pages 2981–2990, New York, NY, USA, 2012. ACM.
[24] G. Thibodeau and K. Patton. Anatomy and Physiology, Third Edition. Mosby, St. Louis, 1996.
[25] D. Tweed, B. Glenn, and T. Vilis. Eye-head coordination during large gaze shifts. Journal of Neurophysiology, 73(2):766–779, 1995. PMID: 7760133.
[26] E. Velloso, M. Wirth, C. Weichel, A. Esteves, and H. Gellersen. AmbiGaze: Direct control of ambient devices by gaze. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems, DIS '16, pages 812–817, New York, NY, USA, 2016. ACM.
[27] J.-L. Vercher and G. Gauthier. Eye-head movement coordination: Vestibulo-ocular reflex suppression with head-fixed target fixation. Journal of Vestibular Research: Equilibrium & Orientation, 1:161–170, 1991.
[28] M. Vidal, A. Bulling, and H. Gellersen. Pursuits: Spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp '13, pages 439–448, New York, NY, USA, 2013. ACM.
[29] O. Špakov, P. Isokoski, and P. Majaranta. Look and lean: Accurate head-assisted eye pointing. In Proceedings of the Symposium on Eye Tracking Research and Applications, ETRA '14, pages 35–42, New York, NY, USA, 2014. ACM.
[30] O. Špakov and P. Majaranta. Enhanced gaze interaction using simple head gestures. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, UbiComp '12, pages 705–710, New York, NY, USA, 2012. ACM.
[31] C. Ware and H. H. Mikaelian. An evaluation of an eye tracker as a device for computer input. In Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface, CHI '87, pages 183–188, New York, NY, USA, 1987. ACM.
[32] H. Zangemeister and L. Stark. Types of gaze movement: Variable interactions of eye and head movements. Experimental Neurology, 77:563–577, 1982.
[33] S. Zhai, C. Morimoto, and S. Ihde. Manual and gaze input cascaded (MAGIC) pointing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '99, pages 246–253, New York, NY, USA, 1999. ACM.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74483
dc.description.abstract: 隨著VR頭戴式顯示器的快速發展,研究人員一直在探索可能的交互技術,使用戶能夠輕鬆地與虛擬環境進行交互。研究強調頭部和眼睛的使用,因為我們本能地與我們視線中的物體相互作用。當觀察目標時,頭部和眼睛協調旋轉,以使目標保持在人的中心視野內;然而,它們旋轉的幅度和速度尚未量化。之前的工作表明,頭部指向雖然比眼睛凝視更慢,但更準確,而凝視指向更快但不太準確。更好地理解頭部運動與凝視將使我們能夠結合這兩種指向技術來實現快速且更準確的指向。這項工作首先研究了VR環境中目標獲取期間的頭眼協調。然後,我們進一步利用頭眼協調的本質,實現了一個結合眼睛注視的速度和頭部旋轉的精度的演算法。我們接著在速度與準確性方面比較了Gaze+Head與頭部和眼睛指向的性能。結果表明,我們的技術快速、輕鬆且高度準確,因此比傳統的頭部和眼睛注視技術更受青睞。 (zh_TW)
dc.description.abstract: With the rapid advancement of VR head-mounted displays, researchers have been exploring interaction techniques that allow users to easily interact with virtual environments. Research has emphasized the use of the head and eyes, since we instinctively interact with objects in our line of sight. When looking at a target, both the head and the eyes rotate in coordination to keep the target within the human's central vision; however, how much and how fast they rotate have not been quantified. Prior work has shown that head pointing is more accurate though slower than eye gazing, while gaze pointing is faster but less accurate. A better understanding of head movement versus gaze enables us to combine these two pointing techniques to achieve fast and more accurate pointing. This work first investigated head-eye coordination during target acquisition in a VR environment. We then exploited the nature of head-eye coordination and implemented an algorithm utilizing the speed of eye gaze and the precision of head rotation. Finally, we compared the performance of Gaze+Head against head pointing and eye pointing in terms of speed and accuracy. The results demonstrate that our technique is fast, effortless, and highly accurate, and thus a much more favorable pointing method than the traditional head and eye gazing techniques. (en)
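The abstract describes combining the speed of gaze pointing with the precision of head rotation, with head refinement triggered implicitly. As a rough illustration of that general idea only (the class name, gain, and trigger threshold below are invented assumptions, not the thesis's actual implementation), a pointer might follow the gaze for coarse acquisition until a deliberate head rotation implicitly switches it into a fine-adjustment phase:

```python
class GazeHeadPointer:
    """Sketch of coarse gaze pointing with implicitly triggered head refinement.

    Hypothetical parameters: `head_gain` scales head motion for fine control,
    `trigger_deg` is the head-rotation magnitude that starts refinement.
    """

    def __init__(self, head_gain=0.5, trigger_deg=0.5):
        self.head_gain = head_gain
        self.trigger_deg = trigger_deg
        self.cursor = (0.0, 0.0)
        self.refining = False

    def update(self, gaze, head_delta):
        """Advance one frame.

        gaze:       current gaze direction (x, y) in degrees
        head_delta: head rotation since the last frame (x, y) in degrees
        Returns the cursor position in degrees.
        """
        dx, dy = head_delta
        if not self.refining:
            # Coarse phase: the cursor jumps with the fast but noisy gaze point.
            self.cursor = gaze
            # A deliberate head rotation implicitly triggers refinement.
            if (dx * dx + dy * dy) ** 0.5 > self.trigger_deg:
                self.refining = True
        else:
            # Fine phase: head rotation nudges the cursor with reduced gain,
            # exploiting the precision of head movement.
            x, y = self.cursor
            self.cursor = (x + self.head_gain * dx, y + self.head_gain * dy)
        return self.cursor
```

For example, a frame with only gaze motion moves the cursor directly to the gaze point, whereas once the head rotates past the threshold, subsequent head motion adjusts the cursor in small, scaled steps.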
dc.description.provenance: Made available in DSpace on 2021-06-17T08:38:22Z (GMT). No. of bitstreams: 1; ntu-108-R06922138-1.pdf: 1017989 bytes, checksum: 2405e4ca12b34b1ee85aa02d01ddad05 (MD5). Previous issue date: 2019. (en)
dc.description.tableofcontents:
摘要 i
Abstract ii
1 Introduction 1
2 Related Work 4
2.1 Eye and Head Behavior 4
2.2 Eye Gaze 5
2.3 Gaze and Head Interaction 5
3 Design Consideration 7
4 User Behaviour Study 9
4.1 Participants 9
4.2 Apparatus 9
4.3 Procedure 10
4.4 Design 11
5 Result 12
6 Implementation 14
7 Comparative Study 16
7.1 Participants 17
7.2 Procedure 17
7.3 Design 17
8 Result 19
8.1 Average Time 19
8.2 Error 20
8.3 Questionnaires 20
8.4 Post Interview 21
9 Discussion 23
10 Future work and Conclusion 25
Bibliography 26
dc.language.iso: zh-TW
dc.subject: 互動科技 (zh_TW)
dc.subject: 虛擬實境 (zh_TW)
dc.subject: Interactive technology (en)
dc.subject: Virtual reality (en)
dc.title: 凝視指向與頭細化微調指向操作 (zh_TW)
dc.title: Gaze+Head: Gaze Pointing with Implicitly Trigger Head Refinement (en)
dc.type: Thesis
dc.date.schoolyear: 107-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 鄭龍磻 (Lung-Pan Cheng), 黃大源 (Da-yuan Huang), 詹力韋 (Liwei Chan)
dc.subject.keyword: 互動科技, 虛擬實境 (zh_TW)
dc.subject.keyword: Interactive technology, Virtual reality (en)
dc.relation.page: 30
dc.identifier.doi: 10.6342/NTU201901984
dc.rights.note: 有償授權
dc.date.accepted: 2019-08-08
dc.contributor.author-college: 電機資訊學院 (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (zh_TW)
Appears in Collections: 資訊工程學系

Files in This Item:
File | Size | Format
ntu-108-1.pdf (restricted; not publicly accessible) | 994.13 kB | Adobe PDF

All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.