NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/77880
Full metadata record (DC field: value [language])
dc.contributor.advisor: 簡韶逸 (Shao-Yi Chien)
dc.contributor.author: Po-Jung Chiu [en]
dc.contributor.author: 邱柏榕 [zh_TW]
dc.date.accessioned: 2021-07-11T14:36:35Z
dc.date.available: 2022-08-31
dc.date.copyright: 2017-08-31
dc.date.issued: 2017
dc.date.submitted: 2017-08-16
dc.identifier.citation:
[1] F. Lu, Y. Sugano, T. Okabe, and Y. Sato, “Adaptive linear regression for appearance-based gaze estimation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 36, no. 10, pp. 2033–2046, 2014.
[2] D. W. Hansen and Q. Ji, “In the eye of the beholder: A survey of models for eyes and gaze,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478–500, 2010.
[3] E. D. Guestrin and M. Eizenman, “General theory of remote gaze estimation using the pupil center and corneal reflections,” IEEE Transactions on Biomedical Engineering, vol. 53, no. 6, pp. 1124–1133, 2006.
[4] D. Li, D. Winfield, and D. J. Parkhurst, “Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches,” in Proceedings of the 2005 IEEE International Conference on Computer Vision and Pattern Recognition - Workshops, vol. 3, 2005, p. 79.
[5] B. C. Chen, P. C. Wu, and S. Y. Chien, “Real-time eye localization, blink detection, and gaze estimation system without infrared illumination,” in Proceedings of the 2015 IEEE International Conference on Image Processing, 2015, pp. 715–719.
[6] J. Li and S. Li, “Two-phase approach—calibration and iris contour estimation for gaze tracking of head-mounted eye camera,” in Proceedings of the 2016 IEEE International Conference on Image Processing, 2016, pp. 3136–3140.
[7] S. Baluja and D. Pomerleau, “Non-intrusive gaze tracking using artificial neural networks,” Tech. Rep., 1994.
[8] K.-H. Tan, D. J. Kriegman, and N. Ahuja, “Appearance-based eye gaze estimation,” in Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002, pp. 191–195.
[9] S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000.
[10] F. Lu, Y. Sugano, T. Okabe, and Y. Sato, “Inferring human gaze from appearance via adaptive linear regression,” in Proceedings of the 2011 International Conference on Computer Vision, 2011, pp. 153–160.
[11] Y. Sugano, Y. Matsushita, Y. Sato, and H. Koike, “An incremental learning method for unconstrained gaze estimation,” in Proceedings of the 10th European Conference on Computer Vision: Part III, 2008, pp. 656–667.
[12] Y. Sugano, Y. Matsushita, and Y. Sato, “Appearance-based gaze estimation using visual saliency,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 2, pp. 329–341, 2013.
[13] Y. Sugano, Y. Matsushita, and Y. Sato, “Learning-by-synthesis for appearance-based 3D gaze estimation,” in Proceedings of the 2014 IEEE International Conference on Computer Vision and Pattern Recognition, 2014, pp. 1821–1828.
[14] T. Schneider, B. Schauerte, and R. Stiefelhagen, “Manifold alignment for person independent appearance-based gaze estimation,” in Proceedings of the 22nd International Conference on Pattern Recognition, 2014, pp. 1167–1172.
[15] X. Zhang, Y. Sugano, M. Fritz, and A. Bulling, “Appearance-based gaze estimation in the wild,” in Proceedings of the 2015 IEEE International Conference on Computer Vision and Pattern Recognition, 2015, pp. 4511–4520.
[16] K. Krafka, A. Khosla, P. Kellnhofer, H. Kannan, S. Bhandarkar, W. Matusik, and A. Torralba, “Eye tracking for everyone,” in Proceedings of the 2016 IEEE International Conference on Computer Vision and Pattern Recognition, 2016, pp. 2176–2184.
[17] B. A. Smith, Q. Yin, S. K. Feiner, and S. K. Nayar, “Gaze locking: passive eye contact detection for human-object interaction,” in Proceedings of the 26th annual ACM symposium on User interface software and technology, 2013, pp. 271–280.
[18] K. A. F. Mora, F. Monay, and J.-M. Odobez, “Eyediap: A database for the development and evaluation of gaze estimation algorithms from rgb and rgb-d cameras,” in Proceedings of the 2014 Symposium on Eye Tracking Research and Applications, 2014, pp. 255–258.
[19] E. Wood, T. Baltrušaitis, L.-P. Morency, P. Robinson, and A. Bulling, “Learning an appearance-based gaze estimator from one million synthesised images,” in Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, 2016, pp. 131–138.
[20] C. D. McMurrough, V. Metsis, J. Rich, and F. Makedon, “An eye tracking dataset for point of gaze detection,” in Proceedings of the 2012 Symposium on Eye Tracking Research and Applications, 2012, pp. 305–308.
[21] U. Weidenbacher, G. Layher, P. M. Strauss, and H. Neumann, “A comprehensive head pose and gaze database,” in Proceedings of the 3rd IET International Conference on Intelligent Environments, 2007, pp. 455–458.
[22] Q. Huang, A. Veeraraghavan, and A. Sabharwal, “TabletGaze: Unconstrained appearance-based gaze estimation in mobile tablets,” arXiv preprint arXiv:1508.01244, 2015.
[23] H. Murase and S. K. Nayar, “Visual learning and recognition of 3-d objects from appearance,” International Journal of Computer Vision, vol. 14, no. 1, pp. 5–24, 1995.
[24] P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman, “Eigenfaces vs. fisherfaces: Recognition using class specific linear projection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 711–720, 1997.
[25] A. S. Georghiades, P. N. Belhumeur, and D. J. Kriegman, “From few to many: illumination cone models for face recognition under variable lighting and pose,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 6, pp. 643–660, 2001.
[26] P. Einarsson, C.-F. Chabert, A. Jones, W.-C. Ma, B. Lamond, T. Hawkins, M. Bolas, S. Sylwan, and P. Debevec, “Relighting human locomotion with flowed reflectance fields,” in Proceedings of the 17th Eurographics Conference on Rendering Techniques, 2006, pp. 183–194.
[27] P. E. Debevec and J. Malik, “Recovering high dynamic range radiance maps from photographs,” in Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, 1997, pp. 369–378.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/77880
dc.description.abstract: Eye trackers have traditionally served as auxiliary tools in scientific and medical research and in studies of human behavior. With the recent rise of augmented reality (AR) and virtual reality (VR), they have drawn renewed attention and are now being built into such head-mounted devices. Gaze information is highly valuable for AR and VR, and many researchers and developers are working to integrate eye trackers into AR/VR devices, producing many interesting applications.
However, the vast majority of commercial eye trackers are based on infrared techniques, which raises potential problems. First, sunlight spans the full spectrum, including the near-infrared (NIR) band these trackers use, so their accuracy degrades outdoors and they may even fail to operate. The other potential problem concerns possible harm to the eyes: because NIR light is invisible, a user of a conventional eye tracker does not see that several infrared sources surround the eye while it operates, and wearing such an infrared tracker for long periods keeps the eyeball continuously exposed to infrared radiation.
Both problems must be overcome before eye trackers can be used in AR devices intended for long-term, all-day wear. We therefore propose a gaze tracking system for head-mounted devices that needs no additional infrared source. It uses an appearance-based visible-light algorithm whose key property is insensitivity to ambient illumination, making it suitable for all-day environments. Experimental results show that the proposed feature extraction technique reduces gaze point error by 38%–68.2%, tested on two different types of test data. [zh_TW]
dc.description.abstract: Recently, eye trackers have become not only a tool for medical or human behavioral research but also a component of head-mounted devices such as VR (virtual reality) and AR (augmented reality) headsets. Eye gaze information is valuable in AR/VR environments, and many researchers and developers are devoted to developing interesting applications that use it as an additional user input.
However, traditional eye trackers have some disadvantages. First, commercial eye trackers use infrared illuminators as additional light sources, so their robustness in outdoor environments is not guaranteed, especially under harsh sunlight. Another issue is health concerns: near-infrared light cannot be seen by human eyes, but it is still absorbed by the retina, and wearing such an IR-based eye tracker for a long time may pose health risks to the eyes.
To address these issues, this thesis proposes an appearance-based gaze estimation algorithm that does not need an external light source, together with a novel eye gaze dataset featuring multi-illumination labels. The proposed algorithm is robust against illumination variations, whereas other appearance-based methods perform badly under such common usage scenarios. Experimental results show that, with the proposed illumination-robust feature extraction method, gaze estimation errors decrease by 38% and 68.2% on synthetic and real data, respectively. [en]
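To make the appearance-based formulation in the abstract concrete, the sketch below shows the general shape of such a pipeline: each eye image is flattened into a feature vector, and a regressor learned from calibration samples maps that vector to 2-D gaze coordinates. This is a minimal illustration only; the crude normalization step and the ridge regressor are stand-ins chosen for brevity, not the thesis's actual feature extraction or high-dimensional regression method.

```python
# Minimal sketch of appearance-based gaze estimation: flatten an eye image
# into a feature vector and regress 2-D gaze coordinates from it.
# Ridge regression is an illustrative stand-in, NOT the thesis's method.
import numpy as np
from sklearn.linear_model import Ridge

def eye_image_to_feature(img):
    """Normalize a grayscale eye patch and flatten it to one feature vector."""
    img = img.astype(np.float64)
    img = (img - img.mean()) / (img.std() + 1e-8)  # crude illumination normalization
    return img.ravel()

# Hypothetical calibration data: N grayscale eye patches (e.g. 36x60 pixels)
# paired with known on-screen gaze targets (x, y). Random data stands in here.
rng = np.random.default_rng(0)
train_imgs = rng.random((200, 36, 60))
train_gaze = rng.random((200, 2))

X = np.stack([eye_image_to_feature(im) for im in train_imgs])
model = Ridge(alpha=1.0).fit(X, train_gaze)

# At run time, a new eye image maps directly to a gaze estimate.
test_img = rng.random((36, 60))
print(model.predict(eye_image_to_feature(test_img)[None, :]))
```

In the thesis's setting, the same appearance-to-gaze mapping is learned from calibration samples; its contribution, per the abstract, is making the extracted features stable under illumination changes before this regression step.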
dc.description.provenance: Made available in DSpace on 2021-07-11T14:36:35Z (GMT). No. of bitstreams: 1
ntu-106-R04943039-1.pdf: 22433229 bytes, checksum: 324e881a297703a3e0f1e13358e562b1 (MD5)
Previous issue date: 2017 [en]
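Since the provenance record includes the bitstream's MD5 checksum, a downloaded copy can be verified against it with a few lines. This is a generic integrity check, and the local filename is assumed to match the bitstream name recorded above.

```python
# Verify a downloaded bitstream against the MD5 checksum recorded in the
# provenance field above. The local path is an assumption for illustration.
import hashlib

EXPECTED_MD5 = "324e881a297703a3e0f1e13358e562b1"  # from the provenance record

def md5_of(path, chunk_size=1 << 20):
    """Compute the MD5 digest of a file, reading it in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

print(md5_of("ntu-106-R04943039-1.pdf") == EXPECTED_MD5)
```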
dc.description.tableofcontents:
List of Figures ... ix
List of Tables ... xiii
Chapter 1 Introduction ... 1
1.1 Introduction of Eye Tracking and Gaze Estimation ... 1
1.2 Motivation ... 2
1.3 Contribution ... 3
Chapter 2 Related Work ... 5
2.1 Model-based Gaze Estimation ... 6
2.2 Shape-based Gaze Estimation ... 8
2.3 Appearance-based Gaze Estimation ... 8
2.4 Eye Gaze Datasets ... 10
Chapter 3 Proposed Gaze Estimation System ... 13
3.1 Design Challenge ... 13
3.1.1 Eye Camera Position Analysis ... 14
3.1.2 Illumination Analysis ... 16
3.2 Overview ... 16
3.3 Algorithm ... 17
3.3.1 Preprocessing ... 17
3.3.2 Feature Extraction ... 19
3.3.3 High Dimensional Regression ... 22
3.3.4 Illumination-Robust Feature Projection ... 26
Chapter 4 Multi-Illumination Gaze Dataset ... 33
4.1 Review of Currently Available Datasets ... 34
4.2 Proposed Dataset ... 35
4.2.1 The Apparatus ... 36
4.2.2 HDR Image Capturing and Post-processing ... 36
4.2.3 Illumination Simulation ... 38
4.3 Analysis ... 38
Chapter 5 Experiments ... 43
5.1 Experimental Setup ... 43
5.2 Results ... 46
5.2.1 Synthetic Eye Images ... 46
5.2.2 Real Eye Images ... 47
5.2.3 Comparison with Other Appearance-based Approaches ... 49
5.3 Analysis ... 52
5.3.1 LDA Dimensions ... 52
Chapter 6 Conclusion ... 55
Bibliography ... 57
Appendix A Sample Images of the Proposed Gaze Dataset ... 63
Appendix B Gaze Error Map ... 67
dc.language.iso: en
dc.subject: 眼動偵測 (eye movement detection) [zh_TW]
dc.subject: 眼動儀 (eye tracker) [zh_TW]
dc.subject: 無紅外光 (IR-free) [zh_TW]
dc.subject: 視線追蹤 (gaze tracking) [zh_TW]
dc.subject: illumination-robust [en]
dc.subject: appearance-based [en]
dc.subject: gaze estimation [en]
dc.subject: eye tracking [en]
dc.title: 適用於頭戴式穿戴裝置且不易受環境光影響基於圖像表徵之眼神追蹤演算法 [zh_TW]
dc.title: Illumination-Robust Appearance-Based Gaze Estimation on Head-Mounted Devices [en]
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 陳宏銘 (Homer H. Chen), 葉素玲 (Su-Ling Yeh)
dc.subject.keyword: 眼動儀 (eye tracker), 眼動偵測 (eye movement detection), 視線追蹤 (gaze tracking), 無紅外光 (IR-free) [zh_TW]
dc.subject.keyword: eye tracking, gaze estimation, appearance-based, illumination-robust [en]
dc.relation.page: 70
dc.identifier.doi: 10.6342/NTU201702490
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2017-08-16
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 電子工程學研究所 (Graduate Institute of Electronics Engineering) [zh_TW]
Appears in Collections: 電子工程學研究所 (Graduate Institute of Electronics Engineering)

Files in this item:
File: ntu-106-R04943039-1.pdf (not authorized for public access)
Size: 21.91 MB
Format: Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
