Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69556
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 洪一平 | |
dc.contributor.author | Yan-Bin Song | en |
dc.contributor.author | 宋焱檳 | zh_TW |
dc.date.accessioned | 2021-06-17T03:19:11Z | - |
dc.date.available | 2028-06-26 | |
dc.date.copyright | 2018-06-29 | |
dc.date.issued | 2018 | |
dc.date.submitted | 2018-06-26 | |
dc.identifier.citation | [1] Engel, Jakob, Jürgen Sturm, and Daniel Cremers. 'Camera-based navigation of a low-cost quadrocopter.' Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on. IEEE, 2012.
[2] Concha, Alejo, et al. 'Visual-inertial direct SLAM.' Robotics and Automation (ICRA), 2016 IEEE International Conference on. IEEE, 2016.
[3] Wikipedia, 'Infrared,' https://en.wikipedia.org/wiki/Infrared
[4] Klein, Georg, and David Murray. 'Parallel tracking and mapping for small AR workspaces.' Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on. IEEE, 2007.
[5] Engel, Jakob, Thomas Schöps, and Daniel Cremers. 'LSD-SLAM: Large-scale direct monocular SLAM.' European Conference on Computer Vision. Springer, Cham, 2014.
[6] Mur-Artal, Raul, Jose Maria Martinez Montiel, and Juan D. Tardos. 'ORB-SLAM: a versatile and accurate monocular SLAM system.' IEEE Transactions on Robotics 31.5 (2015): 1147-1163.
[7] Caruso, David, Jakob Engel, and Daniel Cremers. 'Large-scale direct SLAM for omnidirectional cameras.' Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on. IEEE, 2015.
[8] Urban, Steffen, and Stefan Hinz. 'MultiCol-SLAM: A Modular Real-Time Multi-Camera SLAM System.' arXiv preprint arXiv:1610.07336 (2016).
[9] Trnovszký, Tibor, and Róbert Hudec. 'Local Feature Extraction in the Near Infra-Red Domain for Wildlife Mammals Tracking Purpose.' Civil and Environmental Engineering 12.1 (2016): 34-41.
[10] Ricaurte, Pablo, et al. 'Feature point descriptors: Infrared and visible spectra.' Sensors 14.2 (2014): 3690-3701.
[11] Zheng, Yali, et al. 'Visual search based indoor localization in low light via RGB-D camera.' World Academy of Science, Engineering and Technology, International Journal of Computer, Electrical, Automation, Control and Information Engineering 11.3 (2017): 349-352.
[12] Chen, Long, et al. 'RGB-T SLAM: A flexible SLAM framework by combining appearance and thermal information.' Robotics and Automation (ICRA), 2017 IEEE International Conference on. IEEE, 2017.
[13] Li, Fan, et al. 'Line Features of Detail Enhancement Thermal Infrared Image for SLAM.' MATEC Web of Conferences. Vol. 139. EDP Sciences, 2017.
[14] Chen, Kuan-Wen, et al. 'Vision-Based Positioning for Internet-of-Vehicles.' IEEE Transactions on Intelligent Transportation Systems 18.2 (2017): 364-376.
[15] Shen, Tian-Yi, et al. 'Sensor Fusion and Complementary Ego-Positioning of Drone.' CVGIP. 2017.
[16] Vicon, Vicon Bonita, https://www.vicon.com/products/camerasystems/bonita
[17] Scaramuzza, Davide, Agostino Martinelli, and Roland Siegwart. 'A toolbox for easily calibrating omnidirectional cameras.' Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on. IEEE, 2006.
[18] Zhang, Zhengyou. 'A flexible new technique for camera calibration.' IEEE Transactions on Pattern Analysis and Machine Intelligence 22.11 (2000): 1330-1334.
[19] Raspberry Pi, Raspberry Pi Model B+, https://www.raspberrypi.org/products/raspberry-pi-3-model-b/
[20] Waveshare, RPi IR-CUT Camera, https://www.waveshare.com/wiki/RPi_IR-CUT_Camera
[21] Urban, Steffen, Jens Leitloff, and Stefan Hinz. 'Improved wide-angle, fisheye and omnidirectional camera calibration.' ISPRS Journal of Photogrammetry and Remote Sensing 108 (2015): 72-79.
[22] Li, Bo, et al. 'A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern.' Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on. IEEE, 2013.
[23] Gálvez-López, Dorian, and Juan D. Tardos. 'Bags of binary words for fast place recognition in image sequences.' IEEE Transactions on Robotics 28.5 (2012): 1188-1197. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69556 | - |
dc.description.abstract | 即時定位與地圖構建(SLAM)是一種用來解決機器人自我定位問題的通用方法。目前主流的SLAM方法主要分為直接法和基於特徵點的方法。直接法對於亮度資訊比較敏感,基於特徵點的方法對於亮度變化有一定的容忍度。在夜間環境下,正常相機的信息變得很難識別。因此,我們提出了使用近紅外攝影機進行夜間機器人的自我定位。通過近紅外線(NIR)得到的影像會因為距離紅外線燈的遠近而出現不同的光照強度的變化,所以不適合用直接法進行SLAM。我們使用了對光照有一定容忍性的基於特徵點的方法進行夜間自我定位。該方法將結合多台廣角紅外相機以及近紅外線進行自我定位,不僅可以解決低亮度條件下特徵點的捕捉問題,還可以獲得準確的定位結果。使得機器人在室內夜間低亮度的情況下仍然可以繼續進行自我定位,從而更好地適應複雜多變的情況。 | zh_TW |
dc.description.abstract | Simultaneous Localization and Mapping (SLAM) is a generic method for solving the robot ego-positioning problem. Popular SLAM methods are mainly divided into direct methods and feature-based methods. Direct methods are sensitive to brightness information, while feature-based methods have been shown to be more tolerant of brightness changes. In nighttime environments, the information captured by a normal camera becomes difficult to identify, so we propose using near-infrared (NIR) cameras for nighttime robot ego-positioning. Images captured under NIR illumination vary in intensity with their distance from the infrared light source, so direct methods are not suitable for SLAM in this setting. Instead, we use a feature-based method, which is tolerant of illumination changes, for ego-positioning. Our method combines multiple wide-angle NIR cameras, which not only solves the problem of capturing feature points under low-light conditions but also yields more accurate ego-positioning results. This allows the robot to perform SLAM even at night in low-illumination indoor scenarios, making the system more robust in challenging situations. | en |
dc.description.provenance | Made available in DSpace on 2021-06-17T03:19:11Z (GMT). No. of bitstreams: 1 ntu-107-R05944043-1.pdf: 3359030 bytes, checksum: a9097600e0f8d8fd74adc58737405e24 (MD5) Previous issue date: 2018 | en |
dc.description.tableofcontents | Oral Examination Committee Certification I
Acknowledgements II
Chinese Abstract III
ABSTRACT IV
CONTENTS V
LIST OF FIGURES VII
LIST OF TABLES IX
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Infrared Radiation Light Category 2
1.3 Ego-positioning Methods 3
Chapter 2 Related Work 5
2.1 Visual SLAM 5
2.2 IR Feature Detection 6
2.3 Visual SLAM in IR Domain 7
Chapter 3 Multi-wide-angle IR Camera SLAM 9
3.1 ORB-SLAM 9
3.2 Framework Overview 10
3.3 MultiCol-SLAM 12
3.3.1 Multi-Keyframe (MKF) 12
3.3.2 The MultiCol Model 13
3.3.3 Map Points Fusion 14
Chapter 4 Multiple Camera Calibration 15
4.1 Camera Intrinsics 15
4.1.1 Pinhole Camera Model 15
4.1.2 Omnidirectional Camera Model 16
4.2 Camera Extrinsics 17
4.3 Result Combination 19
4.4 Summary 20
Chapter 5 Experiment 21
5.1 Experimental Device 21
5.2 Experimental Scenario 22
5.3 Experiment Result 23
5.3.1 Different FOV IR Cameras 23
5.3.2 Multi-wide-angle IR Camera System in Normal Illumination 25
5.3.3 Multi-wide-angle IR Camera System in Low Illumination 28
5.4 Summary 30
Chapter 6 Conclusion 31
REFERENCE 32 | |
dc.language.iso | en | |
dc.title | 基於近紅外線廣角相機的自我定位演算法之研究 | zh_TW |
dc.title | Research on Near-infrared-based Ego-positioning Algorithm with Wide-angle Camera | en |
dc.type | Thesis | |
dc.date.schoolyear | 106-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 陳祝嵩,陳冠文,李明穗,陳嘉平 | |
dc.subject.keyword | 近紅外線,夜間,即時定位與地圖構建,紅外相機,多廣角紅外相機 | zh_TW |
dc.subject.keyword | near-infrared, nighttime, simultaneous localization and mapping, infrared camera, multi-wide-angle infrared camera | en |
dc.relation.page | 34 | |
dc.identifier.doi | 10.6342/NTU201800978 | |
dc.rights.note | 有償授權 (authorized with compensation) | |
dc.date.accepted | 2018-06-27 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊網路與多媒體研究所 | zh_TW |
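The abstract argues that feature-based methods tolerate the uneven illumination of NIR lighting better than direct methods. That tolerance comes largely from binary descriptors such as ORB's rBRIEF, which encode a patch only as the outcomes of pairwise intensity comparisons. The following is a minimal stdlib-only sketch, with hypothetical pixel values and sampling pairs rather than the thesis code, showing that such a descriptor is unchanged by an affine brightness change like moving closer to or farther from an IR lamp:

```python
# Binary descriptors (e.g. ORB's rBRIEF) encode a patch as one bit per
# sampling pair: does pixel i appear brighter than pixel j? Any brightness
# change that preserves intensity ordering (positive gain plus offset)
# therefore leaves every descriptor bit intact.

def binary_descriptor(patch, pairs):
    """Return one bit per sampling pair: 1 if intensity at i exceeds j."""
    return [1 if patch[i] > patch[j] else 0 for i, j in pairs]

# Hypothetical 8-pixel patch and sampling pairs (real ORB uses 256 learned
# pairs over a 31x31 patch; this tiny version is for illustration only).
patch = [52, 180, 95, 20, 140, 77, 210, 33]
pairs = [(0, 1), (2, 3), (4, 5), (6, 7), (1, 2), (3, 4)]

# Simulated affine brightness changes: nearer to / farther from the IR lamp.
bright = [int(1.6 * v + 12) for v in patch]
dark = [int(0.4 * v + 3) for v in patch]

# All three patches yield the identical bit string.
assert binary_descriptor(bright, pairs) == binary_descriptor(patch, pairs)
assert binary_descriptor(dark, pairs) == binary_descriptor(patch, pairs)
```

Because a positive gain plus offset preserves the ordering of intensities, every comparison bit survives, which is one reason ORB features remain matchable as the distance to the infrared light source varies, whereas direct methods that compare raw intensities across frames do not.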
Appears in Collections: | Graduate Institute of Networking and Multimedia |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-107-1.pdf (currently not authorized for public access) | 3.28 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.