Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97434
Title: Real-Time Hybrid Appearance-Based and Feature-Based Event-Driven Gaze Tracking
Authors: 王辰淯
Chen-Yu Wang
Advisor: Shao-Yi Chien (簡韶逸)
Keyword: Event Camera, Eye Tracking, Near-eye Gaze Tracking
Publication Year: 2025
Degree: Master's
Abstract: Event cameras, with their high temporal resolution, low latency, wide dynamic range, and low power consumption, are particularly well suited to eye-tracking applications that demand responsiveness and robustness under challenging conditions. Unlike conventional frame-based cameras, which suffer from motion blur and redundant data capture, event cameras output asynchronous data only when scene brightness changes, making them ideal for capturing rapid eye movements with minimal delay.
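As background for the claim that event cameras emit asynchronous data only on brightness changes, here is a minimal Python sketch of the standard event representation: events as (x, y, timestamp, polarity) tuples accumulated into a fixed-window event frame. This is generic illustration, not code from the thesis; the names Event and accumulate_events are assumptions.

from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class Event:
    """A single asynchronous brightness-change event."""
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp (e.g. microseconds)
    polarity: int  # +1 for a brightness increase, -1 for a decrease


def accumulate_events(events: List[Event], height: int, width: int) -> np.ndarray:
    """Accumulate one time slice of events into a signed 2-D event frame.

    Only pixels that fired carry non-zero values, which is why event data
    stays sparse compared with dense frame-based video.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for ev in events:
        frame[ev.y, ev.x] += ev.polarity
    return frame

A tracker can then process such frames at an arbitrary rate, or consume the raw event stream directly for even lower latency.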
This thesis presents a real-time hybrid gaze tracking system that integrates appearance-based and feature-based methods within an event-driven framework. By leveraging the unique advantages of event cameras, the proposed system enables accurate and robust online gaze estimation suitable for AR/VR and other interactive environments. A key contribution is the introduction of a confidence mechanism in the tracking stage, which improves reliability and precision over existing hybrid approaches. Compared with deep-learning-based methods, the proposed system achieves higher accuracy and lower inference latency. Experimental results show that the proposed framework outperforms state-of-the-art methods in both accuracy and responsiveness, enabled by a hybrid-initialized segmentation strategy combined with lightweight matching and continuous tracking modules that eliminate the need for frequent reinitialization. In addition, the low power consumption of event-based processing supports deployment on resource-constrained or wearable platforms. This work highlights the promise of hybrid, event-driven techniques for advancing gaze tracking, particularly in immersive applications such as AR/VR.
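To make the pipeline described above more concrete (appearance-based segmentation for initialization, lightweight matching and continuous tracking for updates, and a confidence gate that triggers re-segmentation only when tracking degrades), the following minimal sketch may help. The function names, stub bodies, and the 0.5 threshold are assumptions made for illustration; none of them come from the thesis.

import random
from typing import Iterable, Iterator, Tuple

CONF_THRESHOLD = 0.5  # hypothetical gate; the actual threshold is not published here


def segment_pupil(events) -> Tuple[float, float]:
    # Stub for the appearance-based segmentation stage: returns a pupil-center estimate.
    return (0.0, 0.0)  # placeholder coordinates


def track(state: Tuple[float, float], events) -> Tuple[Tuple[float, float], float]:
    # Stub for the lightweight matching / continuous tracking modules.
    # A real tracker would update the state from the events and score its own fit.
    confidence = random.random()  # placeholder confidence in [0, 1)
    return state, confidence


def hybrid_gaze_loop(event_slices: Iterable) -> Iterator[Tuple[float, float]]:
    """Confidence-gated hybrid loop: segment once to initialize, then track
    continuously, re-running segmentation only when confidence collapses."""
    slices = iter(event_slices)
    state = segment_pupil(next(slices))        # appearance-based initialization
    for events in slices:
        state, conf = track(state, events)     # feature-based continuous update
        if conf < CONF_THRESHOLD:
            state = segment_pupil(events)      # rare appearance-based recovery
        yield state

Gating re-initialization on confidence rather than on a fixed schedule is what keeps the per-update cost dominated by the lightweight tracker.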
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97434
DOI: 10.6342/NTU202500985
Fulltext Rights: Not authorized (未授權)
Embargo Lift Date: N/A
Appears in Collections: Graduate Program of Integrated Circuit Design and Automation (積體電路設計與自動化學位學程)

Files in This Item:
File: ntu-113-2.pdf (Restricted Access)
Size: 32.88 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
