Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/20733
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 簡韶逸(Shao-Yi Chien) | |
dc.contributor.author | Liang Fang | en |
dc.contributor.author | 方亮 | zh_TW |
dc.date.accessioned | 2021-06-08T03:00:52Z | - |
dc.date.copyright | 2017-07-27 | |
dc.date.issued | 2017 | |
dc.date.submitted | 2017-07-24 | |
dc.identifier.citation | [1] D. A. Robinson, “A method of measuring eye movement using a scleral search coil in a magnetic field,” IEEE Transactions on Bio-medical Electronics, vol. 10, no. 4, pp. 137–145, Oct. 1963.
[2] A. Bulling, J. A. Ward, H. Gellersen, and G. Tröster, “Eye movement analysis for activity recognition using electrooculography,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 4, pp. 741–753, Apr. 2011.
[3] D. W. Hansen and Q. Ji, “In the eye of the beholder: A survey of models for eyes and gaze,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478–500, Mar. 2010.
[4] L. Świrski, A. Bulling, and N. Dodgson, “Robust real-time pupil tracking in highly off-axis images,” in Proceedings of the Symposium on Eye Tracking Research and Applications. ACM, 2012, pp. 173–176.
[5] E. D. Guestrin and M. Eizenman, “General theory of remote gaze estimation using the pupil center and corneal reflections,” IEEE Transactions on Biomedical Engineering, vol. 53, no. 6, pp. 1124–1133, Jun. 2006.
[6] L. Świrski and N. Dodgson, “A fully-automatic, temporal approach to single camera, glint-free 3D eye model fitting,” in Proc. PETMEI, 2013.
[7] K.-H. Tan, D. J. Kriegman, and N. Ahuja, “Appearance-based eye gaze estimation,” in Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision (WACV 2002), 2002, pp. 191–195.
[8] F. Lu, Y. Sugano, T. Okabe, and Y. Sato, “Adaptive linear regression for appearance-based gaze estimation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 36, no. 10, pp. 2033–2046, Oct. 2014.
[9] B. C. Chen, P. C. Wu, and S. Y. Chien, “Real-time eye localization, blink detection, and gaze estimation system without infrared illumination,” in Proceedings of the 2015 IEEE International Conference on Image Processing (ICIP), Sept. 2015, pp. 715–719.
[10] D. Li and D. J. Parkhurst, “Starburst: A robust algorithm for video-based eye tracking,” Elsevier Science, vol. 6, 2005.
[11] W. Fuhl, M. Tonsen, A. Bulling, and E. Kasneci, “Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art,” Machine Vision and Applications, vol. 27, no. 8, pp. 1275–1288, 2016.
[12] M. Tonsen, X. Zhang, Y. Sugano, and A. Bulling, “Labelled pupils in the wild: A dataset for studying pupil detection in unconstrained environments,” in Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. ACM, 2016, pp. 139–142.
[13] W. Fuhl, T. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci, “ExCuSe: Robust pupil detection in real-world scenarios,” in Proceedings of the International Conference on Computer Analysis of Images and Patterns. Springer, 2015, pp. 39–51.
[14] W. Fuhl, T. C. Santini, T. Kübler, and E. Kasneci, “ElSe: Ellipse selection for robust pupil detection in real-world environments,” in Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. ACM, 2016, pp. 123–130.
[15] D. Kim and G. Han, “A 200 μs processing time smart image sensor for an eye tracker using pixel-level analog image processing,” IEEE Journal of Solid-State Circuits, vol. 44, no. 9, pp. 2581–2590, Sept. 2009.
[16] K. Bong, I. Hong, G. Kim, and H. J. Yoo, “A 0.5-degree error 10 mW CMOS image sensor-based gaze estimation processor with logarithmic processing,” in Proceedings of the 2015 Symposium on VLSI Circuits, Jun. 2015, pp. C46–C47.
[17] I. Kåsa, “A circle fitting procedure and its error analysis,” IEEE Transactions on Instrumentation and Measurement, vol. IM-25, no. 1, pp. 8–14, Mar. 1976.
[18] Z. R. Cherif, A. Nait-Ali, J. F. Motsch, and M. O. Krebs, “An adaptive calibration of an infrared light device used for gaze tracking,” in Proceedings of the 19th IEEE Instrumentation and Measurement Technology Conference (IMTC/2002), vol. 2, May 2002, pp. 1029–1033.
[19] D. Reddy, M. R, and S. Bhat, “Canny edge detection using Verilog,” International Journal of Engineering Sciences and Research Technology, vol. 3, no. 6, pp. 256–260, 2014. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/20733 | - |
dc.description.abstract | 在過去四十餘年間,人們為了獲得可靠的、便宜的並且易於架設的眼動儀投入了許多的精力。這項技術被應用於商業研究、消費者分析以及視覺探究,並且,它也被視為一種理想的人機交互介面。目前大多數的商用眼動儀使用角膜反射的技術去估計使用者的注視點,這種方法非常精準並且容易架設,但是,需要使用多顆LED限制了它的設計,並且也耗費許多功率。
在這篇論文中,我們開發了一個使用單顆LED和低功耗眼部攝影機的眼動追蹤系統來延長它的使用時間、拓展它的使用場景。在系統中,我們使用暗瞳檢測及多項式映射實現視線估計。我們在眼部影像上進行初步區域估計、Canny邊緣檢測及優化和半徑擬合來獲得眼睛的姿態。我們所提出的演算法結合了大多數近期成果的優勢並且將其改進使其適應硬體實作的需求。除此之外,這個系統的硬體架構也被開發並且在FPGA平台上進行了實現。這個系統達到了高精度、低功耗和實時的表現。 | zh_TW |
dc.description.abstract | Significant effort has been devoted to robust, cheap, and easy-to-set-up eye trackers over the past 40 years. The technique is applied in commercial research, consumer analysis, and vision research, and is regarded as an ideal user-computer interface. Most current commercial eye trackers use corneal reflection to estimate the user's point of regard, which is accurate and easy to set up. However, the use of multiple LEDs constrains the design and consumes considerable power.
In this thesis, a system with only one illumination LED and a low-cost eye camera is developed to extend the operating time and broaden the usage scenarios of eye trackers. The system uses dark-pupil detection and polynomial-mapping gaze estimation. We perform initial region estimation, Canny edge detection with refinement, and radius fitting on the eye image to obtain the exact pose of the eye. The proposed algorithm combines the advantages of the most recent works and modifies them to meet the requirements of hardware implementation. Moreover, the hardware architecture of the system is developed and implemented on an FPGA platform. The system achieves high accuracy, low cost, and real-time performance. | en |
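The feature-based pipeline the abstract describes (pupil edge detection, radius fitting, and polynomial-mapping gaze estimation) can be sketched as follows. This is an illustrative Python sketch, not the thesis's hardware implementation; the circle fit shown is the linear least-squares (Kåsa-style) formulation, and every function name, polynomial order, and parameter here is an assumption for illustration.

```python
# Hypothetical sketch of a feature-based eye-tracking pipeline:
# fit a circle to detected pupil edge points by linear least squares,
# then map the pupil center to a screen gaze point with a second-order
# polynomial regression learned from calibration samples.
import numpy as np

def fit_circle(points):
    """Least-squares circle fit: x^2 + y^2 = 2ax + 2by + c, r^2 = c + a^2 + b^2."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)

def poly_features(p):
    """Second-order polynomial terms of a pupil center (x, y)."""
    x, y = p
    return np.array([1.0, x, y, x * y, x ** 2, y ** 2])

def calibrate(pupil_centers, screen_points):
    """Fit the polynomial mapping from pupil centers to calibration targets."""
    F = np.array([poly_features(p) for p in pupil_centers])
    W, *_ = np.linalg.lstsq(F, np.array(screen_points), rcond=None)
    return W  # one column of coefficients per screen coordinate

def estimate_gaze(W, pupil_center):
    """Map a detected pupil center to an estimated point of regard."""
    return poly_features(pupil_center) @ W
```

In a real system the edge points fed to `fit_circle` would come from the Canny-plus-refinement stage, and `calibrate` would be run once per user with a small grid of on-screen targets.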
dc.description.provenance | Made available in DSpace on 2021-06-08T03:00:52Z (GMT). No. of bitstreams: 1 ntu-106-R04943156-1.pdf: 7394821 bytes, checksum: b0b5808b792741114e17207e8d0d5e5e (MD5) Previous issue date: 2017 | en |
dc.description.tableofcontents | Acknowledgments i
Abstract iii
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Problem Definition 2
1.3 Contribution 3
1.4 Organization of the Thesis 3
Chapter 2 Related Work 5
2.1 Eye Tracking Review 5
2.1.1 Search Coil 5
2.1.2 Electro-oculography (EOG) 6
2.1.3 Image-based Approach 7
2.2 Gaze Estimation Review 8
2.2.1 Model-Based 8
2.2.2 Appearance-Based 10
2.2.3 Feature-Based 11
2.3 Related Pupil Detection Works 12
2.3.1 Software Approaches 12
2.3.2 Hardware Approaches 14
Chapter 3 Proposed Algorithm 15
3.1 Overview 15
3.2 Initial Region Estimation 16
3.3 Canny Edge Detection with Image Opening 17
3.4 Edge Refine 18
3.4.1 Morphologic Simplification 18
3.4.2 Two-Pass Connected Components Labelling 19
3.5 Radius Selection and Fitting 21
3.6 Gaze Mapping 23
Chapter 4 Hardware Implementation 25
4.1 Overview 25
4.2 Top Module 26
4.3 Downsample and Initial Region Estimation 27
4.4 Canny Edge Detection with Opening Operation 29
4.5 Edge Simplification Operation 31
4.6 Two-Pass Connected Components Labelling 32
4.7 Radius Fitting 33
4.8 Verification on FPGA 35
Chapter 5 Evaluation and Discussion 39
5.1 Dataset 39
5.2 Results 40
5.2.1 Software Benchmark 40
5.2.2 Hardware Benchmark 42
5.2.3 System Accuracy 43
5.3 Discussion 47
5.3.1 Data Arrangement 47
5.3.2 Fitting Algorithm 48
5.3.3 Estimation Delay 49
Chapter 6 Conclusion 51
Bibliography 53 | |
dc.language.iso | en | |
dc.title | 使用紅外照明的可靠眼動追蹤演算法及FPGA實現 | zh_TW |
dc.title | Algorithm and FPGA Implementation of Robust Eye Tracking with IR Illumination | en |
dc.type | Thesis | |
dc.date.schoolyear | 105-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 陳宏銘,葉素玲 | |
dc.subject.keyword | 眼動追蹤,瞳孔偵測,視線估計,FPGA實作,基於特徵的 | zh_TW |
dc.subject.keyword | eye-tracking, pupil detection, gaze estimation, FPGA implementation, feature-based | en |
dc.relation.page | 56 | |
dc.identifier.doi | 10.6342/NTU201700937 | |
dc.rights.note | 未授權 | |
dc.date.accepted | 2017-07-25 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 電子工程學研究所 | zh_TW |
Appears in Collections: | 電子工程學研究所 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-106-1.pdf Currently not authorized for public access | 7.22 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.