Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/4930

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 洪一平 | |
| dc.contributor.author | Yen-Chi E | en |
| dc.contributor.author | 鄂彥齊 | zh_TW |
| dc.date.accessioned | 2021-05-14T17:50:54Z | - |
| dc.date.available | 2018-08-21 | |
| dc.date.available | 2021-05-14T17:50:54Z | - |
| dc.date.copyright | 2015-08-21 | |
| dc.date.issued | 2015 | |
| dc.date.submitted | 2015-08-20 | |
| dc.identifier.citation | [1] Andrew J. Davison. Real-time simultaneous localisation and mapping with a single camera. In Computer Vision, 2003. Proceedings. Ninth IEEE International Conference on, pages 1403–1410. IEEE, 2003. [2] Davide Scaramuzza and Friedrich Fraundorfer. Visual odometry [tutorial]. Robotics & Automation Magazine, IEEE, 18(4):80–92, 2011. [3] David Nistér, Oleg Naroditsky, and James Bergen. Visual odometry. In Computer Vision and Pattern Recognition, 2004. CVPR 2004. Proceedings of the 2004 IEEE Computer Society Conference on, volume 1, pages I–652. IEEE, 2004. [4] Laurent Kneip, Margarita Chli, and Roland Siegwart. Robust real-time visual odometry with a single camera and an IMU. In BMVC, pages 1–11, 2011. [5] Andreas Geiger, Julius Ziegler, and Christoph Stiller. StereoScan: Dense 3D reconstruction in real-time. In Intelligent Vehicles Symposium (IV), 2011 IEEE, pages 963–968. IEEE, 2011. [6] Korbinian Schmid and Heiko Hirschmüller. Stereo vision and IMU based real-time ego-motion and depth image computation on a handheld device. In Robotics and Automation (ICRA), 2013 IEEE International Conference on, pages 4671–4678. IEEE, 2013. [7] Zheng Fang and Yu Zhang. Experimental evaluation of RGB-D visual odometry methods. International Journal of Advanced Robotic Systems, 12, 2015. [8] Albert S. Huang, Abraham Bachrach, Peter Henry, Michael Krainin, Daniel Maturana, Dieter Fox, and Nicholas Roy. Visual odometry and mapping for autonomous flight using an RGB-D camera. In International Symposium on Robotics Research (ISRR), pages 1–16, 2011. [9] Christian Kerl, Jürgen Sturm, and Daniel Cremers. Robust odometry estimation for RGB-D cameras. In Robotics and Automation (ICRA), 2013 IEEE International Conference on, pages 3748–3754. IEEE, 2013. [10] François Pomerleau, Francis Colas, Roland Siegwart, and Stéphane Magnenat. Comparing ICP variants on real-world data sets. Autonomous Robots, 34(3):133–148, 2013. [11] Graeme Jones. Accurate and computationally-inexpensive recovery of ego-motion using optical flow and range flow with extended temporal support. In Proceedings of the British Machine Vision Conference 2013, pages 75–1, 2013. [12] Henrik Andreasson and Todor Stoyanov. Real time registration of RGB-D data using local visual features and 3D-NDT registration. In SPME Workshop at Int. Conf. on Robotics and Automation (ICRA), 2012. [13] Ivan Dryanovski, Roberto G. Valenti, and Jizhong Xiao. Fast visual odometry and mapping from RGB-D data. In Robotics and Automation (ICRA), 2013 IEEE International Conference on, pages 2305–2310. IEEE, 2013. [14] Ji Zhang, Michael Kaess, and Sanjiv Singh. Real-time depth enhanced monocular odometry. In Intelligent Robots and Systems (IROS 2014), 2014 IEEE/RSJ International Conference on, pages 4973–4980. IEEE, 2014. [15] Herbert Bay, Tinne Tuytelaars, and Luc Van Gool. SURF: Speeded up robust features. In Computer Vision–ECCV 2006, pages 404–417. Springer, 2006. [16] Robert M. Haralick, Hyonam Joo, Chung-Nan Lee, Xinhua Zhuang, Vinay G. Vaidya, and Man Bae Kim. Pose estimation from corresponding point data. Systems, Man and Cybernetics, IEEE Transactions on, 19(6):1426–1446, 1989. [17] Martin A. Fischler and Robert C. Bolles. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381–395, 1981. [18] Heiko Hirschmüller, Peter R. Innocent, and Jon M. Garibaldi. Fast, unconstrained camera motion estimation from stereo without tracking and robust statistics. In Control, Automation, Robotics and Vision, 2002. ICARCV 2002. 7th International Conference on, volume 2, pages 1099–1104. IEEE, 2002. [19] Annett Stelzer, Heiko Hirschmüller, and Martin Görner. Stereo-vision-based navigation of a six-legged walking robot in unknown rough terrain. The International Journal of Robotics Research, 31(4):381–402, 2012. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/4930 | - |
| dc.description.abstract | 自我運動估測在機器人控制及自動化上有相當廣泛的應用。正確的區域自我運動估測可以幫助機器人了解、感知周遭環境,並建構出走過的路徑。在這篇論文裡,我們提出了一個結合基於關鍵影格的視覺里程計及慣性資料的自我運動估測系統。系統硬體包括擷取影像的彩色深度攝影機和取得慣性資料的慣性測量單元。兩張連續影像間的攝影機運動經由視覺特徵的對應關係來進行計算。剛體限制可以有效地將初始對應點裡的異常對應點去除。此外,我們估測運動的過程中利用隨機抽樣一致性算法來處理剩餘異常對應點的影響。這些方式都能讓我們確保在進行攝影機運動估算時所用的對應點幾乎都是正確的對應。我們進行了各種實驗來證明演算法的穩固性和正確性,以及正確地處理真實場景的能力。 | zh_TW |
| dc.description.abstract | Ego-motion estimation has a wide variety of applications in robot control and automation. Accurate local ego-motion estimation helps an autonomous robot perceive its surrounding environment and recover the trajectory it has traversed. In this thesis, we present a system that estimates ego-motion by fusing key-frame-based visual odometry with inertial measurements. The hardware of the system includes an RGB-D camera for capturing color and depth images and an Inertial Measurement Unit (IMU) for acquiring inertial measurements. The camera motion between two consecutive images is estimated by finding correspondences of visual features. Rigidity constraints are used to efficiently remove outliers from the set of initial correspondences. Moreover, we apply random sample consensus (RANSAC) to handle the effect of the remaining outliers in the motion estimation step. These strategies ensure that the correspondences used in motion estimation are almost all inliers. Several experiments with different kinds of camera movements demonstrate the robustness and accuracy of the ego-motion estimation algorithm and the ability of our system to handle real-scene data correctly. | en |
| dc.description.provenance | Made available in DSpace on 2021-05-14T17:50:54Z (GMT). No. of bitstreams: 1 ntu-104-R02944016-1.pdf: 15244509 bytes, checksum: b16d17cf2062dd34ee4ab9088cab8d86 (MD5) Previous issue date: 2015 | en |
| dc.description.tableofcontents | Thesis Committee Certification i; Acknowledgements ii; Chinese Abstract iii; Abstract iv; Contents v; List of Figures vii; List of Tables ix; 1 Introduction 1; 1.1 Ego-motion Estimation 1; 1.2 Hardware Sensors 2; 1.3 Organization of the Thesis 3; 2 Related work 4; 2.1 Single Camera and IMU 4; 2.2 Stereo and IMU 5; 2.3 RGB-D visual odometry 6; 3 Methodology 8; 3.1 Feature Based Visual Odometry with Spatial Constraints 8; 3.2 Feature Correspondences 10; 3.2.1 Initial Correspondences 10; 3.2.2 2-D to 3-D correspondences 11; 3.3 Outlier Detection 13; 3.3.1 Relative Distance Constraint 13; 3.3.2 Rotation Angle Constraint 14; 3.3.3 Consistent Subset of Correspondences 15; 3.4 Motion Estimation 18; 3.5 Fusion with Inertial Measurement Unit 19; 3.6 Key Frame Selection 20; 4 Experimental Results 22; 4.1 Experimental Setup 22; 4.2 Effect of Outlier Detection 23; 4.3 Translatory Motion Experiments 24; 4.4 Comparison of visual results and fusion results 25; 4.5 Long-term Round Trip Experiment 26; 5 Conclusion and Future Work 28; Bibliography 30 | |
| dc.language.iso | en | |
| dc.subject | 慣性測量單元 | zh_TW |
| dc.subject | 自我運動 | zh_TW |
| dc.subject | 運動估測 | zh_TW |
| dc.subject | 視覺里程計 | zh_TW |
| dc.subject | 彩色深度攝影機 | zh_TW |
| dc.subject | Inertial measurement unit | en |
| dc.subject | Ego-motion | en |
| dc.subject | Motion estimation | en |
| dc.subject | Visual odometry | en |
| dc.subject | RGB-D camera | en |
| dc.title | 基於彩色深度攝影機及慣性感應器之自我運動估測 | zh_TW |
| dc.title | Ego-motion Estimation Based on RGB-D Camera and Inertial Sensor | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 103-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.oralexamcommittee | 莊仁輝,賴尚宏,陳祝嵩,蔡玉寶 | |
| dc.subject.keyword | 自我運動,運動估測,視覺里程計,彩色深度攝影機,慣性測量單元 | zh_TW |
| dc.subject.keyword | Ego-motion,Motion estimation,Visual odometry,RGB-D camera,Inertial measurement unit | en |
| dc.relation.page | 32 | |
| dc.rights.note | 同意授權(全球公開) | |
| dc.date.accepted | 2015-08-20 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊網路與多媒體研究所 | zh_TW |
| Appears in Collections: | 資訊網路與多媒體研究所 | |
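The English abstract above outlines the core pipeline: feature correspondences between consecutive RGB-D frames, outlier rejection, and RANSAC-based rigid motion estimation. As an illustration only (not the thesis's actual implementation), the sketch below shows that final step with NumPy: a least-squares rigid fit (SVD-based, à la Kabsch) inside a RANSAC loop over 3-point minimal samples. All function names, the threshold, and the iteration count are assumptions for the sketch.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid motion (Kabsch/SVD): find R, t with Q ~ R @ P + t.

    P, Q are (N, 3) arrays of corresponding 3-D points.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def ransac_motion(P, Q, iters=200, thresh=0.05, rng=None):
    """RANSAC over 3-point minimal samples; returns (R, t, inlier_mask)."""
    if rng is None:
        rng = np.random.default_rng(0)
    n, best_mask = len(P), None
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)       # minimal sample
        R, t = rigid_transform(P[idx], Q[idx])
        resid = np.linalg.norm((P @ R.T + t) - Q, axis=1)
        mask = resid < thresh                             # score by inlier count
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    R, t = rigid_transform(P[best_mask], Q[best_mask])    # refit on all inliers
    return R, t, best_mask
```

In a full system, `P` and `Q` would come from matched visual features back-projected with the depth image; the thesis additionally prunes correspondences with rigidity constraints before this step, which shrinks the outlier ratio RANSAC has to cope with.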
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-104-1.pdf | 14.89 MB | Adobe PDF | View/Open |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
