Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/825
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳永耀 | |
dc.contributor.author | Zheng-Hong Ma | en |
dc.contributor.author | 馬政宏 | zh_TW |
dc.date.accessioned | 2021-05-11T05:07:55Z | - |
dc.date.available | 2020-02-15 | |
dc.date.available | 2021-05-11T05:07:55Z | - |
dc.date.copyright | 2019-02-15 | |
dc.date.issued | 2019 | |
dc.date.submitted | 2019-02-13 | |
dc.identifier.citation | REFERENCES
[1] S. P. Liversedge and J. M. Findlay, “Saccadic eye movements and cognition,” Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6–14, 2000.
[2] R. J. K. Jacob, “The use of eye movements in human computer interaction techniques: What you look at is what you get,” ACM Trans. Inf. Syst., vol. 9, no. 3, pp. 152–169, 1991.
[3] G. Underwood, Cognitive Processes in Eye Guidance. Oxford Univ. Press, 2005.
[4] S. P. Liversedge and J. M. Findlay, “Saccadic eye movements and cognition,” Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6–14, 2000.
[5] M. F. Mason, B. M. Hood, and C. N. Macrae, “Look into my eyes: Gaze direction and person memory,” Memory, vol. 12, pp. 637–643, 2004.
[6] R. J. K. Jacob, “The use of eye movements in human computer interaction techniques: What you look at is what you get,” ACM Trans. Inf. Syst., vol. 9, no. 3, pp. 152–169, 1991.
[7] S. Zhai, C. H. Morimoto, and S. Ihde, “Manual and gaze input cascaded (MAGIC) pointing,” in Proc. ACM SIGCHI Conf. Human Factors Comput. Syst., 1999, pp. 246–253.
[8] Z. Zhu and Q. Ji, “Eye and gaze tracking for interactive graphic display,” Mach. Vis. Appl., vol. 15, no. 3, pp. 139–148, 2004.
[9] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 2nd ed. Prentice-Hall, 2002.
[10] J. Merchant, R. Morrissette, and J. Porterfield, “Remote measurements of eye direction allowing subject motion over one cubic foot of space,” IEEE Trans. Biomed. Eng., vol. 21, no. 4, pp. 309–317, Jul. 1974.
[11] N. Ramanauskas, “Calibration of video-oculographical eye-tracking system,” Elektronika ir Elektrotechnika, vol. 72, pp. 65–68, 2015.
[12] S. Rattarom, N. Aunsri, and S. Uttama, “Interpolation based polynomial regression for eye gazing estimation: A comparative study,” in Proc. IEEE ECTI-CON, 2015.
[13] J. Coughlan, A. Yuille, C. English, and D. Snow, “Efficient deformable template detection and localization without user initialization,” Comput. Vis. Image Understand., vol. 78, no. 3, pp. 303–319, 2000.
[14] C. H. Morimoto and M. Mimica, “Eye gaze tracking techniques for interactive applications,” Comput. Vis. Image Understand., Special Issue on Eye Detection and Tracking, vol. 98, no. 1, pp. 4–24, 2005.
[15] D.-C. Cho and W.-Y. Kim, “Long-range gaze tracking system for large movements,” IEEE Trans. Biomed. Eng., vol. 60, no. 12, Dec. 2013.
[16] P. Viola and M. J. Jones, “Robust real-time face detection,” Int. J. Comput. Vis., vol. 57, pp. 137–154, 2004.
[17] Z. Zhu and Q. Ji, “Novel eye gaze tracking techniques under natural head movement,” IEEE Trans. Biomed. Eng., vol. 54, no. 12, Dec. 2007.
[18] D. W. Hansen and Q. Ji, “In the eye of the beholder: A survey of models for eyes and gaze,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 32, no. 3, Mar. 2010.
[19] R. H. S. Carpenter, Movements of the Eyes. Pion Limited, 1988.
[20] E. D. Guestrin and M. Eizenman, “General theory of remote gaze estimation using the pupil center and corneal reflections,” IEEE Trans. Biomed. Eng., vol. 53, no. 6, pp. 1124–1133, Jun. 2006.
[21] T. Ohno, N. Mukawa, and A. Yoshikawa, “FreeGaze: A gaze tracking system for everyday gaze interaction,” in Proc. Symp. Eye Tracking Res. Appl. (ETRA), 2002, pp. 125–132.
[22] A. Kar and P. Corcoran, “A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms,” IEEE Access, 2017, doi: 10.1109/ACCESS.2017.2735633.
[23] A. Yarbus, Eye Movements and Vision. Plenum Press, New York, 1967.
[24] S. Milekic, “The more you look the more you get: Intention-based interface using gaze-tracking,” in Museums and the Web 2002: Selected Papers from an Int. Conf., Archives and Museum Informatics, 2002, pp. 1–27.
[25] K. Hyoki, M. Shigeta, N. Tsuno, Y. Kawamuro, and T. Kinoshita, “Quantitative electro-oculography and electroencephalography as indexes of alertness,” Electroencephalogr. Clin. Neurophysiol., vol. 106, pp. 213–219, 1998.
[26] A. E. Kaufman, A. Bandopadhay, and B. D. Shaviv, “An eye tracking computer user interface,” in Proc. IEEE Symp. Research Frontiers in Virtual Reality, 1993, pp. 120–121.
[27] D. A. Robinson, “A method of measuring eye movement using a scleral search coil in a magnetic field,” IEEE Trans. Bio-Med. Electron., vol. 10, pp. 137–145, 1963.
[28] A. Villanueva, R. Cabeza, and S. Porta, “Eye tracking: Pupil orientation geometrical modeling,” Image Vis. Comput., vol. 24, no. 7, pp. 663–679, Jul. 2006.
[29] A. Villanueva, R. Cabeza, and S. Porta, “Gaze tracking system model based on physical parameters,” Int. J. Pattern Recognit. Artif. Intell., 2007.
[30] T. Ohno, N. Mukawa, and A. Yoshikawa, “FreeGaze: A gaze tracking system for everyday gaze interaction,” in Proc. Eye Tracking Research Applications Symp., 2002, pp. 125–132.
[31] S.-W. Shih, Y.-T. Wu, and J. Liu, “A calibration-free gaze tracking technique,” in Proc. 15th Int. Conf. Pattern Recognition, 2000, pp. 201–204.
[32] S.-W. Shih and J. Liu, “A novel approach to 3-D gaze tracking using stereo cameras,” IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 34, no. 1, Feb. 2004.
[33] C. Hennessey, B. Noureddin, and P. Lawrence, “A single camera eye-gaze tracking system with free head motion,” in Proc. ETRA, San Diego, CA, Mar. 2006.
[34] D. A. Goss and R. W. West, Introduction to the Optics of the Eye. Butterworth-Heinemann, 2002.
[35] C. Ma, K.-A. Choi, B.-D. Choi, and S.-J. Ko, “Robust remote gaze estimation method based on multiple geometric transforms,” Opt. Eng., vol. 54, no. 8, 083103, Aug. 2015.
[36] D. H. Yoo and M. J. Chung, “A novel non-intrusive eye gaze estimation using cross-ratio under large head motion,” Comput. Vis. Image Understand., vol. 98, no. 1, pp. 25–51, Apr. 2005.
[37] I. F. Ince and J. W. Kim, “A 2D eye gaze estimation system with low resolution webcam images,” EURASIP J. Adv. Signal Process., vol. 2011, no. 1, p. 40, Dec. 2011.
[38] W. Wang, Y. Huang, and R. Zhang, “Driver gaze tracker using deformable template matching,” in Proc. IEEE Int. Conf. Veh. Electron. Safety, Jul. 2011, pp. 244–247.
[39] D. Beymer and M. Flickner, “Eye gaze tracking using an active stereo head,” in Proc. Int. Conf. Computer Vision and Pattern Recognition, 2003, pp. 451–458.
[40] T. Ohno and N. Mukawa, “A free-head, simple calibration, gaze tracking system that enables gaze-based interaction,” in Proc. Symp. Eye Tracking Res. Appl., 2004, pp. 115–122.
[41] C. A. Hennessey and P. D. Lawrence, “Improving the accuracy and reliability of remote system-calibration-free eye-gaze tracking,” IEEE Trans. Biomed. Eng., vol. 56, no. 7, pp. 1891–1900, Jul. 2009.
[42] C. Hennessey and J. Fiset, “Long range eye tracking: Bringing eye tracking into the living room,” in Proc. Symp. Eye Tracking Res. Appl., 2012, pp. 249–252.
[43] D. C. Cho, W. S. Yap, H. K. Lee, I. J. Lee, and W. Y. Kim, “Long range eye gaze tracking system for a large screen,” IEEE Trans. Consum. Electron., vol. 58, no. 4, pp. 1119–1128, Nov. 2012. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/handle/123456789/825 | - |
dc.description.abstract | 眼動追蹤技術在未來的人機互動以及虛擬實境等領域裡都有很大的應用潛力。雖然眼對追蹤有長久的研究歷史,時至今日,仍有許多限制。大部分眼動追蹤針對大約70 cm 左右的操作距離,並且可能不允許頭部自由運動。對比於靜態坐姿的追蹤,動態頭部的長距離追蹤則需要更多改善。本研究旨在開發一定程度頭部運動下的長距離眼動追蹤系統。本研究提出四種方法且針對其中三種進行實驗。從實驗結果得知,本研究確實有效減少內插式眼動追蹤法的頭部運動誤差,其中平均誤差均小於1°,符合精準型的凝視點偵測,且測試頭部有效運動範圍為60×16×40以上,操作距離可達2米,整體表現相對於其他研究還算相當不錯。 | zh_TW |
dc.description.abstract | Gaze estimation is a promising technique for future Human-Computer Interaction (HCI) and virtual reality. Despite the long history of gaze estimation research, today's algorithms still face many limitations: many approaches are valid only at an operating distance of around 70 cm and may not allow free head motion. This is undesirable for dynamic, long-range gaze estimation under head motion, as opposed to stationary, seated gaze estimation. This work aims to develop a long-range gaze estimation system that tolerates a certain degree of head motion. Four methods are proposed, and experiments are reported for three of them. The experiments show that this work effectively reduces the head-motion error of interpolation-based gaze estimation: the average errors are all below 1°, which qualifies as high-accuracy gaze estimation; the tested effective head-motion range exceeds 60×16×40, and the operating distance reaches 2 m. The overall performance compares favorably with other studies. | en |
dc.description.provenance | Made available in DSpace on 2021-05-11T05:07:55Z (GMT). No. of bitstreams: 1 ntu-108-R05921002-1.pdf: 7382697 bytes, checksum: ce8bc9cd5c6f93b6a58b461f2e3454cc (MD5) Previous issue date: 2019 | en |
dc.description.tableofcontents | Thesis Committee Certification
Acknowledgements i
Chinese Abstract ii
ABSTRACT iii
CONTENTS iv
LIST OF FIGURES viii
LIST OF TABLES xvi
Chapter 1 Introduction 1
1.1 Motivation and Problem Definition 2
1.2 Previous Work of Gaze Estimation System 3
1.3 Proposed Approach 4
1.4 Thesis Overview 5
Chapter 2 Study on Gaze Estimation 7
2.1 Eye Structure and Movement Type 7
2.2 Intrusive Gaze Estimation 10
2.3 Non-intrusive Gaze Estimation 11
2.3.1 Interpolation Based Method 12
2.3.2 3D Model Based Method 21
2.3.3 Other Methods 29
Chapter 3 Fundamentals of 2D Interpolation Based Method 32
3.1 Feature Extraction 32
3.1.1 Pupil Center Extraction 33
3.1.2 Corneal Reflection Extraction 35
3.2 Calibration Procedure 36
3.3 Mapping Function 37
3.3.1 Second Order Polynomial 38
3.3.2 Trigonometric Mapping Function 40
3.4 Adaptive Feature Extraction 46
3.4.1 Adaptive Pupil Threshold 47
3.4.2 Noise Reduction 50
3.4.3 Adaptive Corneal Reflections Threshold 52
Chapter 4 Proposed Head Motion Error Compensation Algorithms 54
4.1 Geometry Restoration Method 55
4.1.1 PCCR Transfer Relation 57
4.1.2 Retrieve Mapping Function 62
4.1.3 Assessment for Geometry Restoration Method 67
4.2 Advanced Geometry Restoration Method 70
4.2.1 Revisit PCCR Transfer Process 71
4.2.2 Image Plane Rotation 74
4.2.3 Restore a Rotated Virtual Calibration System 76
4.2.4 Invalid Mapping Region Examination 80
4.2.5 Error Amplification Rate 83
4.3 Distance Normalization PCCR Method 85
4.3.1 PCCR Variation for X-Y Plane Head Motion 85
4.3.2 PCCR Variation for Z Direction Head Motion 87
4.3.3 Empirical Data Justification 88
4.3.4 PCCR Vector with Distance Normalization 92
4.3.5 Compensation for General Head Motion 95
4.3.6 Assessment for Distance Normalization PCCR Method 101
4.4 Area Coordinate Mapping Method 102
4.4.1 Area Coordinate 102
4.4.2 Area Coordinate Defined Feature 104
4.4.3 Model Discussion 106
4.4.4 Assessment for Area Mapping Method 108
Chapter 5 Gaze Estimation Experiment 111
5.1 Experiment Setting 112
5.2 Calibration Procedure 114
5.3 Results of Gaze Estimation 118
5.3.1 Area Coordinate Mapping (ACM) Result 118
5.3.2 Advanced Geometry Restoration (AGR) Result 124
5.3.3 Distance Normalized PCCR (DNP) Result 130
5.4 Discussion 136
5.5 Comparison 138
Chapter 6 Conclusions and Future Work 140
REFERENCES 142
APPENDIX 148 | |
dc.language.iso | en | |
dc.title | 新型自由頭部運動之長距離眼動追蹤演算法 | zh_TW |
dc.title | Novel Algorithms for Long Distance Free Head Motion Gaze Estimation System | en |
dc.date.schoolyear | 107-1 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 何明志,顏家鈺,林文澧 | |
dc.subject.keyword | 遠距離眼動追蹤,內插式眼動追蹤法,頭部運動誤差,瞳孔偵測,映射函數, | zh_TW |
dc.subject.keyword | Remote eye gaze estimation, Interpolation based gaze estimation, Head motion error, Pupil detection, Mapping function | en |
dc.relation.page | 194 | |
dc.identifier.doi | 10.6342/NTU201900345 | |
dc.rights.note | Authorized (open access worldwide) | |
dc.date.accepted | 2019-02-13 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 電機工程學研究所 | zh_TW |
Appears in Collections: | Department of Electrical Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-108-1.pdf | 7.21 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
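The interpolation-based approach named in the abstract (and the "Second Order Polynomial" mapping function listed in the table of contents, Section 3.3.1) maps a pupil-center/corneal-reflection (PCCR) vector to a screen coordinate through a polynomial fitted during a calibration procedure. Below is a minimal Python sketch of that idea on synthetic data; the grid values, coefficients, and helper names are hypothetical illustrations, not the thesis's actual implementation or data.

```python
# Hypothetical sketch: second-order polynomial PCCR-to-screen mapping,
# fitted by least squares from a 9-point calibration grid.
# All numbers below are synthetic; they do not come from the thesis.

def features(vx, vy):
    """Second-order polynomial terms of a PCCR vector (vx, vy)."""
    return [1.0, vx, vy, vx * vy, vx * vx, vy * vy]

def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(n):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit(pccr, screen):
    """Least-squares fit (normal equations), one coefficient set per axis."""
    X = [features(vx, vy) for vx, vy in pccr]
    n = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    return [solve(xtx, [sum(r[i] * s[axis] for r, s in zip(X, screen))
                        for i in range(n)]) for axis in range(2)]

def estimate(coeffs, vx, vy):
    """Map a PCCR vector to a screen point with the fitted coefficients."""
    f = features(vx, vy)
    return tuple(sum(c * t for c, t in zip(w, f)) for w in coeffs)

# Synthetic ground-truth mapping and a 3x3 calibration grid of PCCR vectors.
wx, wy = [100, 800, 0, 50, 0, 0], [300, 0, 600, 0, 30, 0]
grid = [(vx, vy) for vy in (-0.5, 0.0, 0.5) for vx in (-0.5, 0.0, 0.5)]
targets = [(sum(c * t for c, t in zip(wx, features(*p))),
            sum(c * t for c, t in zip(wy, features(*p)))) for p in grid]
coeffs = fit(grid, targets)
gx, gy = estimate(coeffs, 0.2, -0.3)  # gaze point for an unseen PCCR vector
```

Six coefficients per axis make a 9-point calibration an overdetermined fit, which is why 9-point (or denser) grids are common for second-order mappings; it is also why a purely interpolation-based mapping degrades under head motion, the error the thesis's compensation methods target.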