Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43566
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 洪一平(Yi-Ping Hung) | |
dc.contributor.author | Chien-Nan Chou | en |
dc.contributor.author | 周建男 | zh_TW |
dc.date.accessioned | 2021-06-15T02:23:32Z | - |
dc.date.available | 2009-08-20 | |
dc.date.copyright | 2009-08-20 | |
dc.date.issued | 2009 | |
dc.date.submitted | 2009-08-18 | |
dc.identifier.citation | [1] J.P. Hansen, D.W. Hansen, and A.S. Johansen. Bringing Gaze-based Interaction Back to Basics. In Proceedings of Universal Access in Human-Computer Interaction, UAHCI '01, 2001.
[2] D. Hansen and J. Hansen. Review of Current Camera-based Eye Trackers. First Conference on Communication by Gaze Interaction, pp. 7-9, 2005.
[3] A.T. Duchowski. Eye Tracking Methodology: Theory and Practice. Springer, 2007.
[4] A. Pérez, M.L. Córdoba, A. García, R. Méndez, M.L. Muñoz, J.L. Pedraza, and F. Sánchez. A precise eye-gaze detection and tracking system. In Proceedings of the 11th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, Plzen, Czech Republic, 2003.
[5] G. Daunys and N. Ramanauskas. The accuracy of eye tracking using image processing. In Proceedings of the Third Nordic Conference on Human-Computer Interaction (NordiCHI '04), vol. 82, ACM, New York, NY, pp. 377-380, 2004.
[6] Z. Zhu and Q. Ji. Robust real-time eye detection and tracking under variable lighting conditions and various face orientations. Computer Vision and Image Understanding, vol. 98, no. 1, pp. 124-154, April 2005.
[7] Z. Zhu, Q. Ji, K. Fujimura, and K. Lee. Combining Kalman Filtering and Mean Shift for Real Time Eye Tracking Under Active IR Illumination. In International Conference on Pattern Recognition, Quebec, Canada, 2002.
[8] A. Yuille, P. Hallinan, and D. Cohen. Feature extraction from faces using deformable templates. International Journal of Computer Vision, vol. 8, no. 2, pp. 99-111, 1992.
[9] K.-M. Lam and H. Yan. Locating and extracting the covered eye in human face images. Pattern Recognition, vol. 29, no. 5, pp. 771-779, 1996.
[10] J. Deng and F. Lai. Region-based template deformation and masking for eye-feature extraction and description. Pattern Recognition, vol. 30, pp. 403-419, 1997.
[11] Z.-H. Zhou and X. Geng. Projection functions for eye detection. Pattern Recognition, pp. 1049-1056, 2004.
[12] J. Wu and Z.-H. Zhou. Efficient face candidates selector for face detection. Pattern Recognition, vol. 36, no. 5, pp. 1175-1186, 2003.
[13] M. Türkan, M. Pardás, and A.E. Cetin. Human eye localization using edge projection. In Computer Vision Theory and Applications, 2007.
[14] K. Peng, L. Chen, S. Ruan, and G. Kukharev. A Robust Algorithm for Eye Detection on Gray Intensity Face without Spectacles. Journal of Computer Science and Technology, vol. 5, pp. 127-132, 2005.
[15] L. Bai, L. Shen, and Y. Wang. A novel eye location algorithm based on radial symmetry transform. In International Conference on Pattern Recognition, pp. 511-514, 2006.
[16] D. Reisfeld, H. Wolfson, and Y. Yeshurun. Context-Free Attentional Operators: the Generalized Symmetry Transform. International Journal of Computer Vision, vol. 14, no. 2, pp. 119-130, 1995.
[17] R. Valenti and T. Gevers. Accurate eye center location and tracking using isophote curvature. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2008.
[18] V. Vezhnevets and A. Degtiareva. Robust and accurate eye contour extraction. In Proc. Graphicon, pp. 81-84, 2003.
[19] J. Ahlberg. A system for face localization and facial feature extraction. Technical Report LiTH-ISY-R-2172, 1999.
[20] Z. Zheng, J. Yang, and L. Yang. A robust method for eye features extraction on color image. Pattern Recognition Letters, vol. 26, pp. 2252-2261, 2005.
[21] J. Zhu and J. Yang. Subpixel eye gaze tracking. In IEEE Conference on Automatic Face and Gesture Recognition, pp. 124-129, May 2002.
[22] M. Pilu, A.W. Fitzgibbon, and R.B. Fisher. Ellipse-specific direct least-square fitting. In International Conference on Image Processing, pp. 599-602, 1996.
[23] J. Wang, E. Sung, and R. Venkateswarlu. Estimating the eye gaze from one eye. Computer Vision and Image Understanding, vol. 98, pp. 83-103, 2005.
[24] D. Li and D.J. Parkhurst. Open-source software for real-time visible-spectrum eye tracking. In Proceedings of the COGAIN Conference, pp. 18-20, 2006.
[25] D. Li, D. Winfield, and D.J. Parkhurst. Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In Proceedings of the IEEE Vision for Human-Computer Interaction Workshop at CVPR, pp. 1-8, 2005.
[26] P. Viola and M.J. Jones. Robust real-time face detection. International Journal of Computer Vision, vol. 57, no. 2, pp. 137-154, 2004.
[27] A. Al-Oayedi and A.F. Clark. An algorithm for face and facial-feature location based on gray-scale information and facial geometry. In Proceedings of the International Conference on Image Processing and Its Applications, vol. 2, pp. 625-629, 1999.
[28] H. Gu, G. Su, and C. Du. Feature points extraction from face. In Proceedings of the Conference on Image and Vision Computing, 2003.
[29] M. Castrillón, O. Déniz, C. Guerra, and M. Hernández. ENCARA2: Real-time detection of multiple faces at different resolutions in video streams. Journal of Visual Communication and Image Representation, vol. 18, no. 2, 2007.
[30] S. Sirohey and A. Rosenfeld. Eye detection in a face image using linear and nonlinear filters. Pattern Recognition, vol. 34, pp. 1367-1391, 2001.
[31] M. Lades, J.C. Vorbruggen, J. Buhmann, and J. Lange. Distortion invariant object recognition in the dynamic link architecture. IEEE Transactions on Computers, vol. 42, no. 3, pp. 570-582, 1993.
[32] V. Kyrki, J.-K. Kamarainen, and H. Kalviainen. Simple Gabor feature space for invariant object recognition. Pattern Recognition Letters, vol. 25, no. 3, pp. 311-318, 2004.
[33] M. Betke, J. Gips, and P. Fleming. The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities. IEEE Transactions on Rehabilitation Engineering, vol. 10, no. 1, pp. 1-10, 2002.
[34] B.D. Lucas and T. Kanade. An Iterative Image Registration Technique with an Application to Stereo Vision. In Proceedings of the Imaging Understanding Workshop, pp. 121-130, 1981.
[35] M. Fischler and R. Bolles. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, vol. 24, pp. 381-395, 1981.
[36] M. Chau and M. Betke. Real Time Eye Tracking and Blink Detection with USB Cameras. Boston University Computer Science Technical Report, no. 2005-12, May 2005.
[37] BioID Technology Research. The BioID Face Database. http://www.bioid.com, 2001.
[38] O. Jesorsky, K.J. Kirchberg, and R. Frischholz. Robust face detection using the Hausdorff distance. In Audio- and Video-based Biometric Person Authentication, pp. 90-95, 2001.
[39] S. Asteriadis, N. Nikolaidis, A. Hajdu, and I. Pitas. An eye detection algorithm using pixel to edge information. In International Symposium on Control, Communications and Signal Processing, 2006.
[40] P. Campadelli, R. Lanzarotti, and G. Lipori. Precise eye localization through a general-to-specific model definition. In BMVC, 2006.
[41] M. Hamouz, J. Kittler, J.-K. Kamarainen, P. Paalanen, H. Kalviainen, and J. Matas. Feature-based affine-invariant localization of faces. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1490-1495, 2005.
[42] D. Cristinacce, T. Cootes, and I. Scott. A multi-stage approach to facial feature detection. In BMVC, pp. 277-286, 2004.
[43] Y.F. Ma, X.S. Hua, L. Lu, and H.J. Zhang. A generic framework of user attention model and its application in video summarization. IEEE Transactions on Multimedia, vol. 7, no. 5, pp. 907-919, 2005.
[44] A. Hanjalic. Multimodal approach to measuring excitement in video. In Proceedings of the IEEE International Conference on Multimedia and Expo, 2003.
[45] J. Kleban, A. Sarkar, E. Moxley, S. Mangiat, S. Joshi, T. Kuo, and B.S. Manjunath. Feature fusion and redundancy pruning for rush video summarization. In Proceedings of the International Workshop on TRECVID Video Summarization, 2007.
[46] T. Mei, X.S. Hua, H.Q. Zhou, and S. Li. Modeling and mining of users' capture intention for home videos. IEEE Transactions on Multimedia, vol. 9, no. 1, pp. 66-77, Jan. 2007.
[47] R.B. Goldstein, E. Peli, S. Lerner, and G. Luo. Eye Movements While Watching a Video: Comparisons Across Viewer Groups. Vision Science Society, 2004.
[48] S.B. Steinman and R.P. Garzia. Foundations of Binocular Vision: A Clinical Perspective. McGraw-Hill Professional, ISBN 0-8385-2670-5, pp. 2-5, 2000. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43566 | - |
dc.description.abstract | 近年來人機互動模式的多元化發展,提供給使用者許多與以往不同的互動體驗。眼睛是人類接收與傳達資訊的重要感官之一,我們可以藉由觀察眼睛的變化來賦予使用者更貼近其需求的互動方式。本篇論文提出一個可即時偵測並追蹤使用者眼睛特徵點的方法,配合現有的低成本非紅外線視訊攝影機便可準確得到這些特徵點的位置,其中的關鍵技術在於眼睛位於人臉的位置變化差異性不大,因此能透過我們提出基於人臉結構的三階段式偵測法加快搜尋速度並提升穩定性。另一方面,由於系統使用一般的可見光視訊攝影機,不同於紅外光攝影機容易受到環境光的影響,室內、外皆可保有高度的準確性,進而確保此方法在現實生活中的簡易性與可行性。
同時,為了驗證系統的高效率與準確度,我們提出基於這些特徵點資訊的三種不同類型應用方式:一是計算虹膜中心點與眼角的距離變化判斷眼球的移動方式,配合臉部表情資訊可做為影片自動剪輯的參考依據。二是藉由計算使用者與視訊攝影機之間的相對位置來提供使用者類3D的顯示畫面。三是利用相同資訊藉由校正方法找出使用者注視在螢幕上的位置。 | zh_TW |
dc.description.abstract | The diversification of human-computer interaction in recent years has offered users interactive experiences quite different from conventional ones. The eyes are among the sense organs that serve as a proxy for human attention and intention, so by observing changes in the eyes we can offer users interaction that better matches their needs. This thesis proposes a low-cost system, using an off-the-shelf non-infrared webcam, that extracts and tracks eye features on the face. To obtain real-time and robust results, we propose a three-stage method that exploits the low variability of facial geometry: the eye features need to be searched only within a small eye rectangle derived from the detected face. Moreover, because our system works without IR illumination, which is easily affected by ambient light, it maintains high accuracy outdoors as well as indoors and is easy to set up in everyday settings. | en |
Meanwhile, we propose three kinds of applications built on our eye feature tracking system to validate its efficiency and potential. First, we determine eye movement by calculating the distance between the inner eye corner and the iris center; together with facial expression, this serves as a cue for automatic video summarization. Second, we provide a view-dependent display for a single user by tracking the relative position between his/her eyes and the camera. Finally, we estimate the point of gaze on the screen after calibrating the relation between these features and sample dots shown on the screen. | en |
dc.description.provenance | Made available in DSpace on 2021-06-15T02:23:32Z (GMT). No. of bitstreams: 1 ntu-98-R96922021-1.pdf: 12519485 bytes, checksum: 460ed59a73893c30e46a0be28416f6b5 (MD5) Previous issue date: 2009 | en |
dc.description.tableofcontents | 1 Introduction 1
2 Related Works 4
2.1 Active Infrared-Illumination Approaches 5
2.2 Passive Appearance-Based Approaches 5
3 Eye Tracking System 10
3.1 System Architecture 10
3.2 Face and Eye Detection 11
3.3 Eye Features Extraction 13
3.3.1 Eye Corner Location 14
3.3.2 Iris Circle Location 21
3.3.3 Blink Detection 24
4 Evaluation 26
4.1 Accuracy of Iris Center Location 26
4.2 Accuracy of Blink Detection 30
4.3 System Efficiency 32
5 Applications 33
5.1 Eye Movement Measurements for Home Video Summarization 33
5.2 Eye Tracking for View-Dependent Display 36
5.3 Eye Gaze Estimation 38
6 Conclusions and Future Works 40
6.1 Conclusions 40
6.2 Future Works 41
Bibliography 42 | |
dc.language.iso | en | |
dc.title | 階段式即時眼睛特徵擷取及其應用 | zh_TW |
dc.title | Real-time Three-stage Eye Feature Extraction and Its Applications | en |
dc.type | Thesis | |
dc.date.schoolyear | 97-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 莊永裕,徐繼聖,江政杰 | |
dc.subject.keyword | 眼睛特徵擷取,人機互動, | zh_TW |
dc.subject.keyword | Eye Feature Extraction, Human-Computer Interaction | en |
dc.relation.page | 47 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2009-08-18 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
Appears in Collections: | Department of Computer Science and Information Engineering
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-98-1.pdf (currently not authorized for public access) | 12.23 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
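The abstract above describes two concrete computations: a three-stage search that derives a small eye rectangle from the detected face box, exploiting the low variability of facial geometry, and an eye-movement cue taken from the displacement of the iris center relative to the stable inner eye corner. A minimal Python sketch of both ideas follows; the ratios, threshold, and function names are illustrative assumptions, not the thesis's actual values.

```python
def eye_search_rect(face_x, face_y, face_w, face_h, left=True):
    """Derive an eye search rectangle (x, y, w, h) from a face bounding box.

    Because eye position varies little relative to the face, features
    only need to be searched in a small region of the upper face. The
    offsets below are assumed anthropometric ratios, not the thesis's
    calibrated values.
    """
    rx = face_x + (0.15 if left else 0.55) * face_w  # horizontal offset per eye
    ry = face_y + 0.25 * face_h                      # eyes lie in the upper face
    return (rx, ry, 0.30 * face_w, 0.20 * face_h)


def eye_movement(iris_center, inner_corner, eye_width, threshold=0.15):
    """Classify horizontal eye movement from two tracked 2D points.

    The iris-center displacement is normalized by eye width so the cue
    is scale-invariant; the threshold value is an assumption.
    """
    dx = (iris_center[0] - inner_corner[0]) / eye_width
    if dx > threshold:
        return "right"
    if dx < -threshold:
        return "left"
    return "center"
```

For example, with a 100x100 face box the left-eye search region covers only a small patch of the upper face, and an iris displaced by a quarter of the eye width from the inner corner is classified as a lateral movement.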