NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/49444
Full metadata record (DC field: value [language]):
dc.contributor.advisor: 陳永耀 (Yung-Yaw Chen)
dc.contributor.author: Zhi-Xiang Liu [en]
dc.contributor.author: 劉志詳 [zh_TW]
dc.date.accessioned: 2021-06-15T11:28:58Z
dc.date.available: 2016-08-30
dc.date.copyright: 2016-08-30
dc.date.issued: 2016
dc.date.submitted: 2016-08-16
dc.identifier.citation: [1] C. W. Cho, J. W. Lee, K. Y. Shin, E. C. Lee, K. R. Park, H. Lee, et al., 'Gaze detection by wearable eye-tracking and NIR LED-based head-tracking device based on SVR,' ETRI Journal, vol. 34, pp. 542-552, 2012.
[2] J. W. Bang, E. C. Lee, and K. R. Park, 'New computer interface combining gaze tracking and brainwave measurements,' IEEE Transactions on Consumer Electronics, vol. 57, pp. 1646-1651, 2011.
[3] C. W. Cho, J. W. Lee, E. C. Lee, and K. R. Park, 'Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras,' Optical Engineering, vol. 48, pp. 127202-127202-15, 2009.
[4] L. Piccardi, B. Noris, O. Barbey, A. Billard, G. Schiavone, F. Keller, et al., 'Wearcam: A head mounted wireless camera for monitoring gaze attention and for the diagnosis of developmental disorders in young children,' in RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human Interactive Communication, pp. 594-598, 2007.
[5] S.-W. Shih and J. Liu, 'A novel approach to 3-D gaze tracking using stereo cameras,' IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 34, pp. 234-245, 2004.
[6] E. Murphy-Chutorian, A. Doshi, and M. M. Trivedi, 'Head pose estimation for driver assistance systems: A robust algorithm and experimental evaluation,' in 2007 IEEE Intelligent Transportation Systems Conference, pp. 709-714, 2007.
[7] H. C. Lee, D. T. Luong, C. W. Cho, E. C. Lee, and K. R. Park, 'Gaze tracking system at a distance for controlling IPTV,' IEEE Transactions on Consumer Electronics, vol. 56, pp. 2577-2583, 2010.
[8] T. Poitschke, E. Bay, F. Laquai, G. Rigoll, S. Bardins, K. Bartl, et al., 'Using liquid lenses to extend the operating range of a remote gaze tracking system,' in 2009 IEEE International Conference on Systems, Man and Cybernetics (SMC 2009), pp. 1250-1254, 2009.
[9] P. Viola and M. J. Jones, 'Robust real-time face detection,' International Journal of Computer Vision, vol. 57, pp. 137-154, 2004.
[10] Y. Freund and R. E. Schapire, 'A decision-theoretic generalization of on-line learning and an application to boosting,' in European Conference on Computational Learning Theory, pp. 23-37, 1995.
[11] J. Daugman, 'How iris recognition works,' IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, pp. 21-30, 2004.
[12] R. O. Duda and P. E. Hart, 'Use of the Hough transformation to detect lines and curves in pictures,' Communications of the ACM, vol. 15, pp. 11-15, 1972.
[13] R. Szeliski, 'Computer vision: algorithms and applications,' Springer Science & Business Media, 2010.
[14] H. Yuen, J. Princen, J. Illingworth, and J. Kittler, 'Comparative study of Hough transform methods for circle finding,' Image and Vision Computing, vol. 8, pp. 71-77, 1990.
[15] D. Li, D. Winfield, and D. J. Parkhurst, 'Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches,' in 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05)-Workshops, pp. 79-79, 2005.
[16] E. Wood and A. Bulling, 'Eyetab: Model-based gaze estimation on unmodified tablet computers,' in Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 207-210, 2014.
[17] L. Świrski, A. Bulling, and N. Dodgson, 'Robust real-time pupil tracking in highly off-axis images,' in Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 173-176, 2012.
[18] P. L. Rosin, 'Analysing error of fit functions for ellipses,' Pattern Recognition Letters, vol. 17, pp. 1461-1470, 1996.
[19] B. K. Horn and B. G. Schunck, 'Determining optical flow,' Artificial Intelligence, vol. 17, pp. 185-203, 1981.
[20] C. H. Morimoto and M. R. Mimica, 'Eye gaze tracking techniques for interactive applications,' Computer Vision and Image Understanding, vol. 98, pp. 4-24, 2005.
[21] K. H. Suh, Y.-J. Kim, Y. Kim, D. Ko, and E. C. Lee, 'Monocular Eye Tracking System Using Webcam and Zoom Lens,' in Advanced Multimedia and Ubiquitous Engineering, ed: Springer, pp. 135-141, 2015.
[22] J. M. Lee, H. C. Lee, S. Y. Gwon, D. Jung, W. Pan, C. W. Cho, et al., 'A new gaze estimation method considering external light,' Sensors, vol. 15, pp. 5935-5981, 2015.
[23] D.-C. Cho and W.-Y. Kim, 'Long-range gaze tracking system for large movements,' IEEE Transactions on Biomedical Engineering, vol. 60, pp. 3432-3440, 2013.
[24] L. Sesma-Sanchez, A. Villanueva, and R. Cabeza, 'Gaze estimation interpolation methods based on binocular data,' IEEE Transactions on Biomedical Engineering, vol. 59, pp. 2235-2243, 2012.
[25] D.-C. Cho, W.-S. Yap, H. Lee, I. Lee, and W.-Y. Kim, 'Long range eye gaze tracking system for a large screen,' IEEE Transactions on Consumer Electronics, vol. 58, pp. 1119-1128, 2012.
[26] J. M. Lee, H. C. Lee, S. Y. Gwon, D. Jung, W. Pan, C. W. Cho, et al., 'A new gaze estimation method considering external light,' Sensors, vol. 15, pp. 5935-5981, 2015.
[27] Z. Zheng, J. Yang, and L. Yang, 'A robust method for eye features extraction on color image,' Pattern Recognition Letters, vol. 26, pp. 2252-2261, 2005.
[28] G. Fanelli, T. Weise, J. Gall, and L. Van Gool, 'Real time head pose estimation from consumer depth cameras,' in Joint Pattern Recognition Symposium, pp. 101-110, 2011.
[29] R. Jafari and D. Ziou, 'Eye-gaze estimation under various head positions and iris states,' Expert Systems with Applications, vol. 42, pp. 510-518, 2015.
[30] N. Ramanauskas, 'Calibration of video-oculographical eye-tracking system,' Elektronika ir Elektrotechnika, vol. 72, pp. 65-68, 2015.
[31] D. A. Robinson, 'A method of measuring eye movement using a scleral search coil in a magnetic field,' IEEE Transactions on Bio-Medical Electronics, vol. 10, pp. 137-145, 1963.
[32] A. E. Kaufman, A. Bandopadhay, and B. D. Shaviv, 'An eye tracking computer user interface,' in Proceedings of the 1993 IEEE Symposium on Research Frontiers in Virtual Reality, pp. 120-121, 1993.
[33] J. S. Babcock and J. B. Pelz, 'Building a lightweight eyetracking headgear,' in Proceedings of the 2004 symposium on Eye tracking research & applications, pp. 109-114, 2004.
[34] F. Timm and E. Barth, 'Accurate Eye Centre Localisation by Means of Gradients,' VISAPP, vol. 11, pp. 125-130, 2011.
[35] H. C. Lee, W. O. Lee, C. W. Cho, S. Y. Gwon, K. R. Park, H. Lee, et al., 'Remote gaze tracking system on a large display,' Sensors, vol. 13, pp. 13439-13463, 2013.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/49444
dc.description.abstract: In a gaze estimation system, the main goal is to estimate the coordinates of the user's gaze point in the screen coordinate system. The accuracy of gaze point estimation is strongly affected by the user's head movement and by the working distance. This thesis uses a new system configuration and a new algorithm to compensate for the error of long-distance gaze point estimation. The work is novel in two respects: first, a system configuration with multiple near-infrared light sources is used to generate multiple coordinate mapping functions that compensate for gaze point estimation error; second, an algorithm selects the best coordinate mapping function from among them.
The system operates as follows. An external camera first captures images of the user's eye, and the coordinates of the pupil center are detected in these images. Multiple external near-infrared light sources produce glints on the eye; these corneal reflection points serve as reference points, and their image coordinates are detected as well. The pupil center and each reference point in the image form a vector, and a coordinate mapping applied to this vector yields the user's gaze point on the screen. Experimental results show that the proposed method performs better than a configuration with a single infrared light source, with a mean gaze estimation error of 0.54°, a standard deviation of 0.48°, and a processing speed of 0.78 fps. [zh_TW]
dc.description.abstract: In gaze estimation, the principal purpose is to estimate the coordinates of the user's gaze point in the screen coordinate system. The accuracy of the gaze point depends on the user's head movement and on the working distance. This thesis proposes a long-distance gaze estimation system configuration and an algorithm for gaze point error compensation. Our research is novel in the following two ways: first, a configuration with multiple near-infrared illuminators generates several mapping functions for compensating gaze point error; second, an algorithm chooses the best mapping function among those several mapping functions.
The system works as follows. First, an external camera captures the user's eye images, and the coordinates of the pupil center are estimated in these images. Several near-infrared illuminators in front of the user generate corneal reflections that serve as reference points on the user's eyes. The coordinates of the corneal reflection points in the eye images are then calculated, and the vector between the pupil center and each corneal reflection point is used as the input of a mapping function from the eye image coordinate system to the screen coordinate system. After mapping, we obtain the user's gaze point on the screen coordinate system. Experimental results showed that the proposed configuration performs better than a single near-infrared illuminator configuration, with a mean gaze error of 0.54°, a standard deviation of 0.48°, and a processing speed of 0.78 fps. [en]
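To make the mapping step of the abstract concrete, the following is a minimal Python sketch of the interpolation-based pupil center corneal reflection (PCCR) idea, assuming the second-order polynomial mapping form common in interpolation-based gaze estimation (see [20] and [24]). All names are illustrative, and the residual-based selection rule at the end is only a stand-in, since the abstract does not specify the thesis's actual criterion for choosing among mapping functions.

    import numpy as np

    def poly_features(v):
        # Second-order polynomial features of a pupil-glint vector (vx, vy).
        vx, vy = v
        return np.array([1.0, vx, vy, vx * vy, vx ** 2, vy ** 2])

    def fit_mapping(vectors, screen_points):
        # Least-squares fit of one mapping function from image-space
        # pupil-glint vectors to screen coordinates; needs at least six
        # calibration targets (a standard 9-point grid works).
        A = np.array([poly_features(v) for v in vectors])        # shape (n, 6)
        coeffs, _, _, _ = np.linalg.lstsq(
            A, np.asarray(screen_points, dtype=float), rcond=None)
        return coeffs                                            # shape (6, 2)

    def apply_mapping(coeffs, v):
        # Map one pupil-glint vector to an (x, y) gaze point on the screen.
        return poly_features(v) @ coeffs

    def calibration_residual(coeffs, vectors, targets):
        # Mean Euclidean error of a fitted mapping over the calibration targets.
        pred = np.array([apply_mapping(coeffs, v) for v in vectors])
        return float(np.mean(
            np.linalg.norm(pred - np.asarray(targets, dtype=float), axis=1)))

    def best_mapping(vectors_per_glint, targets):
        # One mapping function per near-infrared illuminator:
        # vectors_per_glint[i][j] is the pupil-center-to-glint-i vector
        # observed while the user fixates calibration target j.
        mappings = [fit_mapping(vecs, targets) for vecs in vectors_per_glint]
        # Assumed placeholder criterion: smallest mean calibration residual.
        residuals = [calibration_residual(m, vecs, targets)
                     for m, vecs in zip(mappings, vectors_per_glint)]
        return mappings[int(np.argmin(residuals))]

With a standard 9-point calibration, each illuminator yields nine pupil-glint vectors and one fitted mapping; at run time the selected mapping converts the current vector directly into a screen-space gaze point.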
dc.description.provenance: Made available in DSpace on 2021-06-15T11:28:58Z (GMT). No. of bitstreams: 1. ntu-105-R03921063-1.pdf: 3393863 bytes, checksum: 4460b96791655150e4f479774e4557b8 (MD5). Previous issue date: 2016 [en]
dc.description.tableofcontents: 口試委員會審定書 (Oral Examination Committee Certification) i
誌謝 (Acknowledgements) ii
中文摘要 (Chinese Abstract) iii
ABSTRACT iv
CONTENTS v
LIST OF FIGURES viii
LIST OF TABLES xvi
Chapter 1 Introduction 1
1.1 Motivation and Problem Definition 2
1.2 Previous Work of Gaze Estimation System 4
1.3 Proposed Approach 5
1.4 Thesis Overview 7
Chapter 2 Study on Gaze Estimation 8
2.1 Intrusive Based Remote Gaze Estimation 48
2.2 Non-Intrusive Based Gaze Estimation 51
2.2.1 Interpolation Based Gaze Estimation 51
2.2.2 3D Model Based Gaze Estimation 52
Chapter 3 Gaze Estimation and Mapping Function 54
3.1 Pupil Center Extraction 54
3.2 Corneal Reflections Extraction 64
3.3 Mapping Function 70
3.4 Calibration Procedure 76
Chapter 4 Proposed System Configuration and Algorithm for Gaze Estimation 80
4.1 System Overview 81
4.2 Geometric Calculation of PCCR for Gaze Point 84
4.3 Near-Infrared Light Sources and Camera Configuration 87
4.4 Multiple Mapping Functions 93
4.5 Gaze Point Error Compensation 97
Chapter 5 Gaze Estimation Experiment 99
5.1 Experiment Setting 100
5.2 Results of Gaze Estimation 104
5.2.1 Geometric Calculation of PCCR for Gaze Estimation 105
5.2.2 Image Processing of PCCR for Gaze Estimation 115
5.3 Discussions 126
5.4 Comparison 128
Chapter 6 Conclusions and Future Work 130
REFERENCES 132
dc.language.iso: en
dc.subject: 系統配置 (system configuration) [zh_TW]
dc.subject: 遠距離凝視偵測 (remote gaze estimation) [zh_TW]
dc.subject: 瞳孔中心點偵測 (pupil center extraction) [zh_TW]
dc.subject: 角膜反射 (corneal reflection) [zh_TW]
dc.subject: 座標映射函式 (coordinate mapping function) [zh_TW]
dc.subject: corneal reflection [en]
dc.subject: system configuration [en]
dc.subject: coordinate mapping function [en]
dc.subject: remote gaze estimation [en]
dc.subject: pupil center extraction [en]
dc.title: 以補償凝視點估計誤差之新穎遠距離凝視估計系統配置及演算法 [zh_TW]
dc.title: A Novel Configuration and Algorithm of Long Distance Gaze Estimation System for Gaze Point Error Compensation [en]
dc.type: Thesis
dc.date.schoolyear: 104-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 林文澧 (Win-Li Lin), 顏家鈺 (Jia-Yush Yen), 何明志 (Ming-Chih Ho)
dc.subject.keyword: 遠距離凝視偵測, 瞳孔中心點偵測, 角膜反射, 座標映射函式, 系統配置 [zh_TW]
dc.subject.keyword: remote gaze estimation, pupil center extraction, corneal reflection, coordinate mapping function, system configuration [en]
dc.relation.page: 137
dc.identifier.doi: 10.6342/NTU201602893
dc.rights.note: 有償授權 (authorized for a fee)
dc.date.accepted: 2016-08-17
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 電機工程學研究所 (Graduate Institute of Electrical Engineering) [zh_TW]
Appears in collections: 電機工程學系 (Department of Electrical Engineering)

Files in this item:
File: ntu-105-1.pdf (restricted access: 未授權公開取用)
Size: 3.31 MB
Format: Adobe PDF

