NTU Theses and Dissertations Repository

Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97434

Full metadata record (DC field: value [language])
dc.contributor.advisor: 簡韶逸 [zh_TW]
dc.contributor.advisor: Shao-Yi Chien [en]
dc.contributor.author: 王辰淯 [zh_TW]
dc.contributor.author: Chen-Yu Wang [en]
dc.date.accessioned: 2025-06-18T16:07:03Z
dc.date.available: 2025-06-19
dc.date.copyright: 2025-06-18
dc.date.issued: 2025
dc.date.submitted: 2025-05-26
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97434
dc.description.abstract: 事件相機(Event Camera)憑藉其高時間解析度、低延遲、寬動態範圍及低功耗等特性,特別適合需要高響應性與強健性的眼動追蹤應用。相較於傳統幀式相機(Frame-based Camera)易受運動模糊與冗餘數據採集之限制,事件相機僅在場景亮度變化時輸出非同步數據,此特性使其能以極低延遲精準捕捉快速眼球運動。
本論文提出一套即時混合式眼動追蹤系統,結合外觀式與特徵式方法,並建構於事件驅動框架之上。透過充分發揮事件相機的優勢,所提出之系統可實現準確且穩定的注視點估計,適用於 AR/VR 以及其他互動式應用環境。此系統的一項重要貢獻為在追蹤階段引入了信心機制(confidence mechanism),進一步提升了混合方法在可靠性與精準度上的表現。相較於傳統的深度學習方法,本系統展現出更高的即時性與響應能力,透過混合初始化的分割策略結合輕量化的匹配模組與連續追蹤模組,成功省略了頻繁重新初始化的需求。實驗結果顯示,本研究所提出的架構在準確率與即時性方面均優於目前的先進方法,突顯了混合式與事件驅動策略在推進眼動追蹤技術,尤其是在沉浸式應用(如 AR/VR)中所具備的潛力與價值。 [zh_TW]
dc.description.abstract: Event cameras, with their high temporal resolution, low latency, wide dynamic range, and low power consumption, are particularly well-suited for eye tracking applications that demand responsiveness and robustness under challenging conditions. Unlike conventional frame-based cameras, which suffer from motion blur and redundant data capture, event cameras output asynchronous data only when brightness changes occur in the scene, making them ideal for capturing rapid eye movements with minimal delay.
This thesis presents a real-time hybrid gaze tracking system that integrates appearance-based and feature-based methods within an event-driven framework. By leveraging the unique advantages of event cameras, the proposed system enables accurate and robust online gaze estimation suitable for AR/VR and other interactive environments. A key contribution is the introduction of a confidence mechanism in the tracking stage, which improves reliability and precision over existing hybrid approaches. Compared to deep learning-based methods, the proposed system achieves higher accuracy and lower inference latency. Furthermore, experimental results show that the proposed framework outperforms the state-of-the-art method in both accuracy and responsiveness, enabled by a hybrid-initialized segmentation strategy with lightweight matching and continuous tracking, eliminating the need for frequent reinitialization. Additionally, the low power consumption of event-based processing supports deployment on resource-constrained or wearable platforms. This work highlights the promise of hybrid and event-driven techniques in advancing gaze tracking, particularly for immersive applications such as AR/VR. [en]
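To make the pipeline described in the abstract concrete, here is a minimal sketch of how an appearance-based initialization, a lightweight feature-based per-batch update, and a confidence-gated fallback can fit together. The function names, the 30-pixel gating window, the 0.5 confidence threshold, and the toy placeholder logic are illustrative assumptions, not the implementation used in the thesis.

```python
"""Minimal sketch (not the thesis implementation) of a hybrid, event-driven
tracking loop: appearance-based initialization, feature-based per-batch
updates, and a confidence check that triggers re-initialization."""
from dataclasses import dataclass
from statistics import mean

@dataclass
class PupilState:
    x: float
    y: float
    confidence: float  # 0.0 (lost) .. 1.0 (certain)

CONF_THRESHOLD = 0.5  # assumed re-initialization threshold

def appearance_init() -> PupilState:
    # Stand-in for an appearance-based segmentation of a full eye image.
    return PupilState(x=160.0, y=120.0, confidence=1.0)

def feature_update(events, prior: PupilState) -> PupilState:
    # Stand-in for a feature-based fit (e.g. a circle/ellipse fit on
    # pupil-edge events): average the event coordinates near the prior
    # estimate and derive a crude density-based confidence score.
    near = [(ex, ey) for ex, ey in events
            if abs(ex - prior.x) < 30 and abs(ey - prior.y) < 30]
    if not near:
        return PupilState(prior.x, prior.y, confidence=0.0)
    cx = mean(p[0] for p in near)
    cy = mean(p[1] for p in near)
    conf = min(1.0, len(near) / 20.0)
    return PupilState(cx, cy, conf)

def track(event_batches):
    state = appearance_init()                 # hybrid initialization
    for events in event_batches:              # batched asynchronous event stream
        cand = feature_update(events, state)
        # Confidence gate: keep the cheap feature-based estimate while it is
        # reliable; fall back to the appearance-based step otherwise.
        state = cand if cand.confidence >= CONF_THRESHOLD else appearance_init()
        yield state

# Toy usage: a pupil drifting to the right, ~25 synthetic events per batch.
batches = [[(162.0 + t + 0.1 * i, 121.0 + 0.1 * i) for i in range(25)]
           for t in range(5)]
for s in track(batches):
    print(f"pupil ~ ({s.x:.1f}, {s.y:.1f}), confidence = {s.confidence:.2f}")
```

The point of the confidence gate is that the costly appearance-based step runs only when the cheap per-batch update becomes unreliable, which is what lets a hybrid tracker avoid frequent reinitialization.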
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-06-18T16:07:03Z. No. of bitstreams: 0 [en]
dc.description.provenance: Made available in DSpace on 2025-06-18T16:07:03Z (GMT). No. of bitstreams: 0 [en]
dc.description.tableofcontents:
Master's Thesis Acceptance Certificate
Acknowledgement
Chinese Abstract
Abstract
Contents
List of Figures
List of Tables
1 Introduction
1.1 Introduction of Event Camera
1.2 Gaze Tracking
1.3 Challenges and Contribution
1.4 Thesis Organization
2 Related Work
2.1 Near-eye Gaze Tracking
2.2 Frame-based Near-eye Gaze Tracking
2.3 Event-based Near-eye Gaze Tracking
3 Proposed Method
3.1 Overview of Event-based Gaze Tracking System
3.2 Event Representation
3.3 Initial Pupil Localization (Appearance-Based)
3.4 Event-Based Pupil Tracking (Feature-Based)
3.4.1 Candidates Selection
3.4.2 Confidence Mechanism
3.4.3 Hough Transform
3.5 Event-based Pupil Matching (Feature-Based)
3.6 Gaze Estimation
4 Experiments
4.1 Datasets
4.2 Implementation Details
4.3 Evaluation
4.3.1 Evaluation Metrics
4.3.2 Comparison with the State-of-the-Art Hybrid Method
4.3.3 Comparison with Learning-based Methods
4.4 Optimal Configuration: Event Count = 70
4.5 System Efficiency
4.6 Ablation Study
5 Conclusion
6 Future Work
Reference
dc.language.iso: en
dc.subject: 眼球追蹤 [zh_TW]
dc.subject: 近眼凝視追蹤 [zh_TW]
dc.subject: 事件相機 [zh_TW]
dc.subject: Near-eye Gaze Tracking [en]
dc.subject: Event Camera [en]
dc.subject: Eye Tracking [en]
dc.title: 即時事件驅動的混合式外觀與特徵型凝視追蹤方法 [zh_TW]
dc.title: Real-Time Hybrid Appearance-Based and Feature-Based Event-Driven Gaze Tracking [en]
dc.type: Thesis
dc.date.schoolyear: 113-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 曹昱;陳駿丞;陳祝嵩;塗偉志 [zh_TW]
dc.contributor.oralexamcommittee: Yu Tsao;Jun-Cheng Chen;Chu-Song Chen;Wei-Chih Tu [en]
dc.subject.keyword: 事件相機,眼球追蹤,近眼凝視追蹤 [zh_TW]
dc.subject.keyword: Event Camera,Eye Tracking,Near-eye Gaze Tracking [en]
dc.relation.page: 70
dc.identifier.doi: 10.6342/NTU202500985
dc.rights.note: 未授權 (not authorized for public access)
dc.date.accepted: 2025-05-26
dc.contributor.author-college: 重點科技研究學院
dc.contributor.author-dept: 積體電路設計與自動化學位學程
dc.date.embargo-lift: N/A
Appears in Collections: 積體電路設計與自動化學位學程

Files in This Item:
File: ntu-113-2.pdf
Size/Format: 32.88 MB, Adobe PDF
Access: 未授權公開取用 (not authorized for public access)


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
