Please use this Handle URI to cite this document:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69442

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 陳炳宇 | |
| dc.contributor.author | Wei Wei | en |
| dc.contributor.author | 魏瑋 | zh_TW |
| dc.date.accessioned | 2021-06-17T03:15:49Z | - |
| dc.date.available | 2021-07-18 | |
| dc.date.copyright | 2018-07-18 | |
| dc.date.issued | 2018 | |
| dc.date.submitted | 2018-07-05 | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69442 | - |
| dc.description.abstract | This thesis proposes an automatic eye-tracker recalibration mechanism built on top of a gaze-selection scheme, allowing the operator to be recalibrated without interruption. The method updates the system's regression model with the gaze data generated while the user operates the eye tracker, so that calibration quality stays good during long-term gaze interaction. The underlying gaze-selection method takes the form of "multiple confirmation": a two-stage selection procedure prevents accidentally triggering unintended targets. The user must dwell on the object to be selected and, in the second stage, move the gaze onto a confirmation target to complete a selection. We use that confirmation target as the region for collecting gaze data, so the automatic recalibration is triggered every time the user performs the confirmation step. Besides the idea of blending recalibration into ordinary operation, this work also partitions the eye tracker's regression model into sectors according to the data-collection regions and converts it into a weighted regression model to balance the user's gaze deviation across different positions. We implemented a proof-of-concept interface for participants and analyzed how users work once the automatic recalibration system is added. | zh_TW |
| dc.description.abstract | In this thesis, we present a novel approach to recalibrating head-mounted eye-tracking systems at runtime without interrupting the user's working tasks. By gathering gaze data points from those tasks (e.g., selecting a target), we update the regression model that maps the device's measurements to on-screen gaze points.
This approach eliminates the need for an explicit recalibration process: while the stability of the regression model is maintained, the user can continue his/her tasks without interruption even when the mapping quality drops. Our method is built on a known dwell-based gaze selection method, 'Multiple Confirm', which performs a two-step validation when selecting targets: the user dwells on the intended target and then on a separate confirmation target to complete the selection. We modify the confirmation target and collect the gaze data near it, using these samples to replace outdated data in the calibration. A robust model is then developed to update the regression model's sectors in sequential order. To evaluate the proposed method, we compared it against using the eye tracker without runtime recalibration. The results indicate that the accuracy bias of the model can be kept within a certain degree of visual angle even over long-term use. (An illustrative code sketch of this update loop appears after the metadata table below.) | en |
| dc.description.provenance | Made available in DSpace on 2021-06-17T03:15:49Z (GMT). No. of bitstreams: 1 ntu-107-R05725019-1.pdf: 3086136 bytes, checksum: 2a9f6c910f53465c4935f72d89482dc9 (MD5) Previous issue date: 2018 | en |
| dc.description.tableofcontents | Chapter 1 - Introduction p.1
Chapter 2 - Related Work p.5
Chapter 3 - Overview p.11
Chapter 4 - Design Space p.18
Chapter 5 - Methodologies p.26
Chapter 6 - Evaluation p.30
Chapter 7 - Discussion p.38
Chapter 8 - Conclusion p.41
Bibliography p.43 | |
| dc.language.iso | zh-TW | |
| dc.subject | Recalibration | zh_TW |
| dc.subject | gaze input | zh_TW |
| dc.subject | Recalibration | en |
| dc.subject | gaze input | en |
| dc.title | An Automatic Recalibration System for Head-Mounted Eye Trackers Based on Gaze Selection | zh_TW |
| dc.title | Recalibrating Head-Mounted Eye-Trackers in Gaze Selection | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 106-2 | |
| dc.description.degree | Master | |
| dc.contributor.oralexamcommittee | 余能豪, 張永儒, 黃大源, 詹力韋 | |
| dc.subject.keyword | Recalibration, gaze input | zh_TW |
| dc.subject.keyword | Recalibration, gaze input | en |
| dc.relation.page | 46 | |
| dc.identifier.doi | 10.6342/NTU201801322 | |
| dc.rights.note | Paid authorization | |
| dc.date.accepted | 2018-07-06 | |
| dc.contributor.author-college | College of Management | zh_TW |
| dc.contributor.author-dept | Graduate Institute of Information Management | zh_TW |
| Appears in Collections: | Department of Information Management |
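The abstracts above outline the core mechanism: gaze samples collected while the user dwells on the confirmation target of a two-step 'Multiple Confirm' selection refresh a regression model that is partitioned into screen sectors and weighted to balance accuracy across regions. The thesis record contains no source code, so the following Python sketch is only an illustration of that idea; the class and method names (`SectorRecalibrator`, `add_confirmation_samples`), the 3x3 sector grid, the buffer size, and the per-sector affine model form are all assumptions, not the author's implementation.

```python
# Hypothetical sketch of runtime recalibration driven by confirmation dwells.
# Nothing here comes from the thesis; names, grid size, and model form are assumed.
import numpy as np

GRID = (3, 3)      # assumed sector layout over the screen
MAX_SAMPLES = 50   # assumed per-sector sample buffer size


class SectorRecalibrator:
    """One weighted affine correction per screen sector, refreshed with gaze
    samples gathered while the user dwells on a confirmation target."""

    def __init__(self, screen_w, screen_h):
        self.w, self.h = screen_w, screen_h
        # Per-sector buffers of (raw_gaze_xy, target_xy, weight) samples.
        self.samples = {s: [] for s in np.ndindex(GRID)}
        # Per-sector 3x3 affine correction, initialised to identity.
        self.models = {s: np.eye(3) for s in np.ndindex(GRID)}

    def _sector(self, x, y):
        return (min(int(x / self.w * GRID[0]), GRID[0] - 1),
                min(int(y / self.h * GRID[1]), GRID[1] - 1))

    def add_confirmation_samples(self, raw_gaze, target_xy, weight=1.0):
        """Call when a confirmation dwell completes: the target's known
        position serves as ground truth for the raw gaze samples near it."""
        s = self._sector(*target_xy)
        buf = self.samples[s]
        buf.extend((tuple(g), tuple(target_xy), weight) for g in raw_gaze)
        del buf[:-MAX_SAMPLES]   # discard the most outdated samples first
        self._refit(s)

    def _refit(self, s):
        buf = self.samples[s]
        if len(buf) < 6:         # need a few points for a stable fit
            return
        X = np.array([[gx, gy, 1.0] for (gx, gy), _, _ in buf])
        Y = np.array([[tx, ty, 1.0] for _, (tx, ty), _ in buf])
        sw = np.sqrt([w for _, _, w in buf])[:, None]
        # Weighted least squares: minimise sum_i w_i * ||x_i M - y_i||^2.
        self.models[s], *_ = np.linalg.lstsq(X * sw, Y * sw, rcond=None)

    def correct(self, x, y):
        """Map a raw gaze point through its sector's current correction."""
        cx, cy, _ = np.array([x, y, 1.0]) @ self.models[self._sector(x, y)]
        return cx, cy
```

In this sketch, each confirmation dwell replaces the oldest samples in one sector and refits only that sector, mirroring the abstract's sequential, sector-by-sector update; a real system would also need outlier rejection and a sensible fallback while a sector still has too few samples.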
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-107-1.pdf (not authorized for public access) | 3.01 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
