Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/52024
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 傅立成 | |
dc.contributor.author | Tung-Yen Wu | en |
dc.contributor.author | 吳東諺 | zh_TW |
dc.date.accessioned | 2021-06-15T14:03:44Z | - |
dc.date.available | 2018-08-28 | |
dc.date.copyright | 2015-08-28 | |
dc.date.issued | 2015 | |
dc.date.submitted | 2015-08-20 | |
dc.identifier.citation | [1] K. R. Beevers, “Mapping with limited sensing,” Ph.D. dissertation, Rensselaer Polytechnic Institute, 2007.
[2] S. Thrun and J. Leonard, “Simultaneous localization and mapping,” in Springer Handbook of Robotics, 2008, pp. 871–889.
[3] S. Thrun et al., “Robotic mapping: A survey,” Exploring Artificial Intelligence in the New Millennium, pp. 1–35, 2002.
[4] J. Aulinas, Y. R. Petillot, J. Salvi, and X. Lladó, “The SLAM problem: a survey,” in CCIA, 2008, pp. 363–371.
[5] J. Fuentes-Pacheco, J. Ruiz-Ascencio, and J. M. Rendón-Mancha, “Visual simultaneous localization and mapping: a survey,” Artificial Intelligence Review, vol. 43, no. 1, pp. 55–81, 2015.
[6] K. Beevers and W. Huang, “SLAM with sparse sensing,” in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA), May 2006, pp. 2285–2290.
[7] K. R. Beevers and W. H. Huang, “An embedded implementation of SLAM with sparse sensing,” 2008.
[8] T. N. Yap Jr. and C. R. Shelton, “SLAM in large indoor environments with low-cost, noisy, and sparse sonars,” in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA), 2009, pp. 1395–1401.
[9] Aldebaran, “NAOqi documentation 2.3,” Aldebaran website, 2015. [Online]. Available: http://doc.aldebaran.com/
[10] E. Rosten and T. Drummond, “Fusing points and lines for high performance tracking,” in Proc. IEEE Int. Conf. on Computer Vision (ICCV), vol. 2, 2005, pp. 1508–1515.
[11] E. Rosten, R. Porter, and T. Drummond, “Faster and better: A machine learning approach to corner detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 1, pp. 105–119, 2010.
[12] E. Ramsden, Hall-Effect Sensors: Theory and Application. Newnes, 2011.
[13] A. Doucet, S. Godsill, and C. Andrieu, “On sequential Monte Carlo sampling methods for Bayesian filtering,” Statistics and Computing, vol. 10, no. 3, pp. 197–208, 2000.
[14] A. Doucet, N. de Freitas, K. Murphy, and S. Russell, “Rao-Blackwellised particle filtering for dynamic Bayesian networks,” in Proc. 16th Conf. on Uncertainty in Artificial Intelligence (UAI), 2000, pp. 176–183.
[15] R. E. Kalman, “A new approach to linear filtering and prediction problems,” Journal of Basic Engineering, vol. 82, no. 1, pp. 35–45, 1960.
[16] J. S. Simonoff, Smoothing Methods in Statistics. Springer, 1998.
[17] V. J. Hodge and J. Austin, “A survey of outlier detection methodologies,” Artificial Intelligence Review, vol. 22, no. 2, pp. 85–126, 2004.
[18] V. Chandola, A. Banerjee, and V. Kumar, “Outlier detection: A survey,” ACM Computing Surveys, 2007.
[19] C.-C. Wang, C. Thorpe, S. Thrun, M. Hebert, and H. Durrant-Whyte, “Simultaneous localization, mapping and moving object tracking,” The International Journal of Robotics Research, vol. 26, no. 9, pp. 889–916, 2007.
[20] Z. Zhang, “Iterative point matching for registration of free-form curves,” 1992.
[21] J. L. Martínez, J. González, J. Morales, A. Mandow, and A. J. García-Cerezo, “Mobile robot motion estimation by 2D scan matching with genetic and iterative closest point algorithms,” Journal of Field Robotics, vol. 23, no. 1, pp. 21–34, 2006.
[22] A. E. Johnson, “Spin-images: a representation for 3-D surface matching,” 1997.
[23] P. J. Besl and N. D. McKay, “A method for registration of 3-D shapes,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 239–256, 1992.
[24] J. Feldmar and N. Ayache, “Rigid, affine and locally affine registration of free-form surfaces,” International Journal of Computer Vision, vol. 18, no. 2, pp. 99–119, 1996.
[25] MRPT, “MRPT 1.2.1,” Mobile Robot Programming Toolkit, 2015. [Online]. Available: http://www.mrpt.org/ | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/52024 | - |
dc.description.abstract | 本論文提出一個在有人的室內環境中,基於感知能力受限機器人所提供的彩色與深度照相機以及短程雷射,進行環境地圖建置與即時定位的方法。由於感知能力受限機器人的移動會受周圍環境影響,無法產生可重複使用的環境參考地圖;而身為室內服務型機器人,它必須具備在環境中定位與移動的能力。然而由於機器人本身的設計,它並不具備一般常用於定位的雷射測距儀來做長距離、密集且精準的資料蒐集,取而代之的是稀疏、短距離且容易被干擾的資料。因此我們需要其他感測裝置輔助,也就是位於頭部的深度與彩色照相機,以彌補短程雷射對環境偵測的不足。在本論文中,我們會先遠端取得短程雷射所偵測到的距離資料,進行疊代最近點演算法以修正機器人里程計的誤差,同時以深度照相機所取得的三維資料描述所感測到的環境,彌補遠程雷射的不足。 | zh_TW |
dc.description.abstract | This thesis proposes a simultaneous localization and mapping (SLAM) framework based on data from the color camera, depth camera, and short-range infrared range finder of a limited sensing robot in an indoor environment shared with humans. As an indoor social robot, it must be able to localize and navigate in such an environment. However, by design, the limited sensing robot lacks the conventional laser range finder commonly used in SLAM to obtain dense, long-range, and precise measurements. Instead, it estimates distance with a short-range infrared finder, whose samples are sparse, short-range, and more easily disturbed than those of a long-range laser. To compensate for the limited information from the short-range infrared finder, we use the other sensors on the robot, an RGB camera and a depth camera, to support perception of the environment. In this work, we first apply the Iterative Closest Point (ICP) algorithm to the range estimates from the short-range infrared finder to correct the error of the robot's odometry. The 3D data from the depth camera are then used to build a map describing the physical layout of the environment. While the robot builds the reference map, non-static objects may be present in the environment, so the 3D data belonging to humans must be filtered out. Since the limited sensing robot is a social robot and cannot avoid interacting with humans, it should locate not only itself but also the humans in the environment. | en |
dc.description.provenance | Made available in DSpace on 2021-06-15T14:03:44Z (GMT). No. of bitstreams: 1 ntu-104-R02922043-1.pdf: 7636971 bytes, checksum: 1b9c9fcfd821d477a1a54da46322088e (MD5) Previous issue date: 2015 | en |
dc.description.tableofcontents | Thesis Certification by Oral Examination Committee i
Acknowledgements ii
Abstract (Chinese) iv
Abstract v
Contents vii
List of Figures xi
List of Tables xiii
1 Introduction 1
1.1 Background 2
1.2 Motivation 3
1.3 Challenge 3
1.4 Objective 5
1.5 Related Work 6
1.6 System Overview 8
1.6.1 Hardware System 8
1.6.2 Software System 10
1.7 Contribution 11
1.8 Thesis Organization 11
2 Preliminaries 13
2.1 Environment Observation of Robot 13
2.1.1 Bayesian Filter 13
2.1.2 Kalman Filter 17
2.1.3 Particle Filter 19
2.1.4 Noise Data 22
2.2 Uncertainty of Robot 24
2.2.1 Compounding 24
2.2.2 Inverse Relationship 25
2.2.3 Iterative Closest Point 26
2.3 Simultaneous Localization and Mapping 26
2.4 Tracking of Humans 28
2.4.1 Movement Detection 28
2.4.2 Face Detection 28
3 Data Fusion under Limited Sensing for SLAM 30
3.1 Sensors Overview for Mapping 31
3.1.1 Short Range Finder 31
3.1.2 Depth Camera 32
3.2 Assumptions 32
3.3 System Architecture of Data Fusion under Limited Sensing for SLAM 33
3.4 Perception Module 34
3.4.1 Sample Data Extension 35
3.4.2 Sample Data Alignment 36
3.4.3 Data Fusion 37
3.4.4 Observation Preprocess 38
3.5 Robot Pose Estimation Module 39
3.5.1 MRE Odometry 39
3.5.2 Odometry Adjustment 40
4 Robot Localization in the Presence of Human Crowds 48
4.1 Sensor for Localization 48
4.1.1 Color Camera 49
4.2 System Architecture of Localization 49
4.3 Human Tracking Module 50
4.4 Human Filter in Perception 50
4.5 Particle Filter Localization 51
5 Experiment 55
5.1 Experimental Setting and Environment 55
5.1.1 Platform 55
5.1.2 Experimental Environment 56
5.2 Limitation 56
5.3 SLAM Result with Data Extension 57
5.3.1 Hallway 58
5.3.2 Room 62
5.3.3 Big Room 63
6 Conclusion 65
References 67 | |
dc.language.iso | en | |
dc.title | 機器人在感測範圍受限情形下完成室內環境定位與製圖 | zh_TW |
dc.title | Simultaneous Localization and Mapping of a Robot with Limited Sensing in Indoor Environment | en |
dc.type | Thesis | |
dc.date.schoolyear | 103-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 黃正民,簡忠漢,林沛群,郭重顯 | |
dc.subject.keyword | 定位,地圖,有限範圍偵測, | zh_TW |
dc.subject.keyword | Mapping,Localization,Limited Sensing, | en |
dc.relation.page | 69 | |
dc.rights.note | Authorized for a fee | |
dc.date.accepted | 2015-08-20 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
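The abstract above describes correcting the robot's odometry by running the Iterative Closest Point (ICP) algorithm on range samples before fusing depth-camera data into the map. As a rough illustration only (the thesis's actual implementation and parameters are not given in this record), a minimal 2-D point-to-point ICP with an SVD-based rigid alignment step might look like the sketch below; the function name, brute-force nearest-neighbour matching, and iteration count are all assumptions:

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Align point set `src` (N,2) to `dst` (M,2).

    Returns (R, t) such that src @ R.T + t approximates dst.
    Matching is brute force, which is acceptable for the sparse
    scans a short-range finder produces.
    """
    R = np.eye(2)
    t = np.zeros(2)
    cur = src.astype(float).copy()
    for _ in range(iters):
        # 1. match each transformed source point to its nearest target point
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[np.argmin(d, axis=1)]
        # 2. closed-form rigid alignment (Kabsch / SVD) of the matched pairs
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
        R_step = Vt.T @ D @ U.T
        t_step = mu_d - R_step @ mu_s
        # 3. apply the incremental transform and accumulate it
        cur = cur @ R_step.T + t_step
        R = R_step @ R
        t = R_step @ t + t_step
    return R, t
```

In a SLAM front end of this kind, the recovered (R, t) would be composed with the raw odometry estimate to reduce drift before the depth-camera points are inserted into the map.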
Appears in Collections: | Department of Computer Science and Information Engineering |
Files in This Item:
File | Size | Format |
---|---|---|
ntu-104-1.pdf (currently not authorized for public access) | 7.46 MB | Adobe PDF |
All items in this system are protected by copyright, with all rights reserved, unless otherwise indicated.