Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50560

Full metadata record
| DC 欄位 | 值 | 語言 |
|---|---|---|
| dc.contributor.advisor | 歐陽明(Ming Ouhyoung) | |
| dc.contributor.author | Tzu-Chieh Yu | en |
| dc.contributor.author | 游子杰 | zh_TW |
| dc.date.accessioned | 2021-06-15T12:46:12Z | - |
| dc.date.available | 2018-10-26 | |
| dc.date.copyright | 2016-10-26 | |
| dc.date.issued | 2016 | |
| dc.date.submitted | 2016-07-25 | |
| dc.identifier.citation | [1] David J. Coombs. Real-time gaze holding in binocular robot vision. Technical report, Rochester, NY, USA, 1992.
[2] William Steptoe, Simon Julier, and Anthony Steed. Presence and discernability in conventional and non-photorealistic immersive augmented reality. In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium on, pages 213–218. IEEE, 2014.
[3] Yu-Hsuan Huang, Tzu-Chieh Yu, Pei-Hsuan Tsai, Yu-Xiang Wang, Wan-Ling Yang, and Ming Ouhyoung. Scope+: a stereoscopic video see-through augmented reality microscope. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, pages 33–34. ACM, 2015.
[4] Andrei Sherstyuk, Arindam Dey, Christian Sandor, et al. Dynamic eye convergence for head-mounted displays improves user performance in virtual environments. In Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pages 23–30. ACM, 2012.
[5] Kurtis P. Keller, Henry Fuchs, et al. Simulation-based design and rapid prototyping of a parallax-free, orthoscopic video see-through head-mounted display. In Proceedings of the 4th IEEE/ACM International Symposium on Mixed and Augmented Reality, pages 28–31. IEEE Computer Society, 2005.
[6] Andrei State, Jeremy Ackerman, Gentaro Hirota, Joohi Lee, and Henry Fuchs. Dynamic virtual convergence for video see-through head-mounted displays: maintaining maximum stereo overlap throughout a close-range work space. In Augmented Reality, 2001. Proceedings. IEEE and ACM International Symposium on, pages 137–146. IEEE, 2001.
[7] Thomas J. Olson and David J. Coombs. Real-time vergence control for binocular robots. International Journal of Computer Vision, 7(1):67–89, 1991.
[8] C. Brown. Gaze controls cooperating through prediction. Image Vision Comput., 8(1):10–17, February 1990.
[9] Aladdin M. Ariyaeeinia. The design and performance of stereoscopic television systems. In 1989 Advances in Intelligent Robotics Systems Conference, pages 362–370. International Society for Optics and Photonics, 1990.
[10] A. M. Ariyaeeinia. Analysis and design of stereoscopic television systems. Signal Processing: Image Communication, 13(3):201–208, 1998.
[11] Hirokazu Kato and Mark Billinghurst. Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Augmented Reality, 1999 (IWAR'99). Proceedings. 2nd IEEE and ACM International Workshop on, pages 85–94. IEEE, 1999.
[12] Nassir Navab, Benedicte Bascle, Mirko Appel, and Echeyde Cubillo. Scene augmentation via the fusion of industrial drawings and uncalibrated images with a view to marker-less calibration. In Augmented Reality, 1999 (IWAR'99). Proceedings. 2nd IEEE and ACM International Workshop on, pages 125–133. IEEE, 1999.
[13] Ronald Azuma, Yohan Baillot, Reinhold Behringer, Steven Feiner, Simon Julier, and Blair MacIntyre. Recent advances in augmented reality. Computer Graphics and Applications, IEEE, 21(6):34–47, 2001.
[14] Hirokazu Yamanoue. The differences between toed-in camera configurations and parallel camera configurations in shooting stereoscopic images. In Multimedia and Expo, 2006 IEEE International Conference on, pages 1701–1704. IEEE, 2006.
[15] Hoonjong Kang, Namho Hur, Seunghyun Lee, and Hiroshi Yoshikawa. Horizontal parallax distortion in toed-in camera with wide-angle lens for mobile device. Optics Communications, 281(6):1430–1437, 2008.
[16] Namho Lee, Jaekyoung Moon, and Soon-Yong Park. Microscopic stereo camera with simultaneous vergence and focus control. In Proc. Intl. Tech. Conf. Circuits/Systems, Computers and Communications (ITC-CSCC), pages 1637–1640, 2008.
[17] Huan Deng, Qiong-Hua Wang, Da-Hai Li, and Ai-Hong Wang. Virtual toed-in camera method to eliminate parallax distortions of stereoscopic images for stereoscopic displays. Journal of the Society for Information Display, 18(3):193–198, 2010.
[18] Yiwen Wang and Bertram E. Shi. Improved binocular vergence control via a neural network that maximizes an internally defined reward. Autonomous Mental Development, IEEE Transactions on, 3(3):247–256, 2011.
[19] Peter Venero, Allen Rowe, and James Boyer. Using augmented reality to help maintain persistent stare of a moving target in an urban environment. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, volume 56, pages 2575–2579. SAGE Publications, 2012.
[20] Khushal Khairnar, Kamleshwar Khairnar, Sanketkumar Mane, and Rahul Chaudhari. Furniture layout application based on marker detection and using augmented reality. 2015.
[21] Masayuki Kanbara, Takashi Okuma, Haruo Takemura, and Naokazu Yokoya. A stereoscopic video see-through augmented reality system based on real-time vision-based registration. In Virtual Reality, 2000. Proceedings. IEEE, pages 255–262. IEEE, 2000. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50560 | - |
| dc.description.abstract | This thesis implements Gaze+, a video see-through augmented reality device that can correctly display the position, scale, and orientation of virtual objects at close range (less than 30 cm). Our head-mounted display prototype simulates two kinds of eye movement: vergence control and gaze holding. Because of the way human eyes work, we rely on eye movements to fixate on an object, and this fixation gives human vision correct depth perception. Gaze+ implements a mechanical structure that lets its cameras simulate binocular eye movement, so that the cameras' lines of sight can converge on a single point.
We recruited ten participants and had them compare two binocular stereoscopic head-mounted displays, one with simulated eye movement and one without. The results show that the device with simulated binocular eye movement let participants form correct stereoscopic vision more easily when viewing at close range. Gaze+ is the first video see-through augmented reality head-mounted display that simulates eye movement, which allows close-range augmented reality applications to be displayed correctly. We also built a circuit-board assembly application: virtual electronic components are displayed on a circuit board to guide novices through its assembly. Because this is a close-range augmented reality application, Gaze+'s eye-movement simulation keeps the stereo cameras' lines of sight on the circuit board, so users viewing the board through the head-mounted display have correct binocular stereoscopic vision. Gaze+'s vergence control system uses AX-18A high-resolution servo motors at the core of its eye-rotation mechanism, simulating eye rotation with precise, fine-grained motion, and uses an object-tracking algorithm to simulate both eyes fixating on a single point. The gaze holding system is adapted from the kind of two-axis gimbal used on aerial drones: the gimbal keeps a camera's line of sight pointing in a fixed direction, which is exactly the gaze-stabilization behavior required, and the recent popularity of drones makes it an accessible core component that lowers the difficulty of implementing Gaze+. | zh_TW |
| dc.description.abstract | This thesis presents Gaze+, a video see-through Augmented Reality (AR) device that can display close-range stereoscopic Mixed Reality, showing real and virtual objects up close (about 30 cm from the camera) in their correct positions, scales, and orientations. Our head-mounted display (HMD) prototype offers vergence control and gaze holding. Because humans have two eyes placed side by side, both eyes must rotate simultaneously to converge properly on an object of interest; this convergence is what allows humans to see clearly and perceive depth correctly. We built a stereo camera device that simulates this motion of the human eyes, so that both cameras are directed towards the same world point.
We conducted a user study with 10 participants, asking them to compare two kinds of stereo-camera AR HMDs, one with and one without vergence control. Results show that participants' eyes converge better and more accurately when observing with Gaze+. Gaze+ is the first video see-through AR HMD that provides correct optical vergence for close-range Mixed Reality applications. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-15T12:46:12Z (GMT). No. of bitstreams: 1 ntu-105-R03922157-1.pdf: 31308087 bytes, checksum: 72868308227bb0d9ee8cd811897e9d31 (MD5) Previous issue date: 2016 | en |
| dc.description.tableofcontents | Acknowledgements
Chinese Abstract
Abstract
Contents
List of Figures
1 Introduction
1.1 Motivation
1.2 Gaze+
1.3 Contribution
2 Related Works
2.1 Augmented Reality
2.2 Binocular Vision Systems
2.3 Stereoscopic Video See-Through Augmented Reality
3 Architecture
3.1 Vergence Control System
3.1.1 The Nodal Point
3.2 Gaze Stabilization System
3.3 Augmented Reality
3.3.1 Camera Calibration
3.3.2 Feature Tracking (Vuforia)
3.3.3 Virtual Camera Control
4 Evaluation
4.0.1 Experimental Settings
4.0.2 Result
5 Discussion
5.1 Discussion
6 Applications
6.1 Applications
7 Limitation and Future Work
7.1 Differences Between Human Eyes and Gaze+
8 Conclusion
8.1 Conclusion
Bibliography | |
| dc.language.iso | en | |
| dc.subject | 擴增實境 (Augmented Reality) | zh_TW |
| dc.subject | 影像穿透式 (Video See-Through) | zh_TW |
| dc.subject | 頭戴式顯示器 (Head-Mounted Display) | zh_TW |
| dc.subject | 眼球運動 (eye movement) | zh_TW |
| dc.subject | 三軸穩定器 (3-axis stabilizer) | zh_TW |
| dc.subject | Video See-Through | en |
| dc.subject | Gaze holding | en |
| dc.subject | Gaze stabilization | en |
| dc.subject | Augmented Reality | en |
| dc.subject | Head-Mounted Display | en |
| dc.subject | Vergence Control | en |
| dc.title | 為近距離應用設計之眼球凝視運動之影像穿透式擴增實境裝置 | zh_TW |
| dc.title | Gaze+ : A Stereoscopic Video See-Through Augmented Reality Device with Vergence Control and Gaze Stabilization for Near-Field Applications | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 104-2 | |
| dc.description.degree | 碩士 (Master's) | |
| dc.contributor.oralexamcommittee | 傅楸善 (Chiou-Shann Fuh), 葉正聖 | |
| dc.subject.keyword | 擴增實境 (Augmented Reality), 影像穿透式 (Video See-Through), 頭戴式顯示器 (Head-Mounted Display), 眼球運動 (eye movement), 三軸穩定器 (3-axis stabilizer) | zh_TW |
| dc.subject.keyword | Augmented Reality, Head-Mounted Display, Video See-Through, Vergence Control, Gaze holding, Gaze stabilization | en |
| dc.relation.page | 29 | |
| dc.identifier.doi | 10.6342/NTU201601092 | |
| dc.rights.note | 有償授權 (paid authorization) | |
| dc.date.accepted | 2016-07-25 | |
| dc.contributor.author-college | 電機資訊學院 (College of Electrical Engineering and Computer Science) | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) | zh_TW |
| Appears in Collections: | 資訊工程學系 (Department of Computer Science and Information Engineering) | |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-105-1.pdf (Restricted Access) | 30.57 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
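The vergence control described in the abstracts rotates both cameras so their optical axes converge on the object being viewed. As a rough illustration of the geometry involved (a hypothetical sketch, not the thesis implementation; the 6.5 cm baseline and 30 cm distance below are only example figures, with 30 cm being the close-range distance the abstract mentions), the symmetric toe-in angle each camera needs can be computed like this:

```python
import math

def vergence_angles(baseline_m, target_dist_m):
    """Return (per-camera toe-in, total vergence) in radians for a
    symmetric rig: each camera sits baseline_m/2 off the midline and
    must rotate inward to aim at a point target_dist_m straight ahead."""
    half_baseline = baseline_m / 2.0
    theta = math.atan2(half_baseline, target_dist_m)  # inward rotation per camera
    return theta, 2.0 * theta

# Example: ~6.5 cm inter-camera baseline, target 30 cm away.
per_cam, total = vergence_angles(0.065, 0.30)
print(math.degrees(per_cam))  # ≈ 6.18 degrees of toe-in per camera
print(math.degrees(total))    # ≈ 12.37 degrees of total vergence
```

The toe-in angle grows quickly as the target gets closer, which is why a fixed parallel-camera rig loses stereo overlap at close range and a device like Gaze+ actively re-aims the cameras instead.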
