
NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56450
Full metadata record
dc.contributor.advisor: 葉仲基 (Chung-Kee Yeh)
dc.contributor.author: Chun-Yen Tai [en]
dc.contributor.author: 戴君諺 [zh_TW]
dc.date.accessioned: 2021-06-16T05:29:12Z
dc.date.available: 2020-08-03
dc.date.copyright: 2020-08-03
dc.date.issued: 2020
dc.date.submitted: 2020-07-27
dc.identifier.citation: Arulampalam, M. S., S. Maskell, N. Gordon and T. Clapp. 2002. A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking. IEEE Transactions on Signal Processing 50(2):174-188.
Arun, K. S., T. S. Huang and S. D. Blostein. 1987. Least-Squares Fitting of Two 3-D Point Sets. IEEE Transactions on Pattern Analysis and Machine Intelligence 9(5):698-700.
Besl, P. J. and N. D. McKay. 1992. Method for registration of 3-D shapes. In “Sensor fusion IV: control paradigms and data structures”, 586-606. International Society for Optics and Photonics.
Cadena, C., L. Carlone, H. Carrillo, Y. Latif, D. Scaramuzza, J. Neira, I. Reid and J. J. Leonard. 2016. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Transactions on Robotics 32(6):1309-1332.
Chau, C. K., I. W. Ho, E. R. Magsino, C. M. Tseng and K. Jia. 2017. Efficient Information Dissemination of 3D Point Cloud Mapping Data for Autonomous Vehicles. Hong Kong: The Hong Kong Polytechnic University and Masdar Institute.
Choras, R. S. 2007. Image Feature Extraction Techniques and Their Applications for CBIR and Biometrics Systems. International Journal of Biology and Biomedical Engineering 1(1):6-16.
Elfes, A. 1989. Using Occupancy Grids for Mobile Robot Perception and Navigation. Computer 22(6):46-57.
Grisetti, G., R. Kummerle, C. Stachniss and W. Burgard. 2010. A Tutorial on Graph-Based SLAM. IEEE Intelligent Transportation Systems Magazine 2(4):31-43.
Girshick, R., J. Donahue, T. Darrell and J. Malik. 2014. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. Available at: https://arxiv.org/abs/1311.2524. Accessed 10 October 2019.
Girshick, R. 2015. Fast R-CNN. Available at: https://arxiv.org/abs/1504.08083. Accessed 10 October 2019.
Jamiruddin, R., A. O. Sari, J. Shabbir and T. Anwer. 2018. RGB-Depth SLAM Review. Available at: http://arxiv.org/abs/1805.07696. Accessed 20 May 2018.
Kalman, R. E. 1960. A New Approach to Linear Filtering and Prediction Problems. Journal of Basic Engineering 82(1):35-45.
LeCun, Y., Y. Bengio and G. Hinton. 2015. Deep Learning. Nature 521(7553):436-444.
Lowe, D. G. 2004. Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision 60(2):91-110.
MathWorks. 2020. What Is Camera Calibration. Available at: https://www.mathworks.com/help/vision/ug/camera-calibration.html. Accessed 19 June 2020.
Muja, M. and D. G. Lowe. 2009. Fast Approximate Nearest Neighbors with Automatic Algorithm Configuration. In “Proceedings of the International Conference on Computer Vision Theory”, 331-340. Lisboa, Portugal.
Nistér, D., O. Naroditsky and J. Bergen. 2004. Visual Odometry. In “IEEE Computer Society Conference on Computer Vision and Pattern Recognition”, 1-1. Washington, DC.
Redmon, J., S. Divvala, R. Girshick and A. Farhadi. 2016. You Only Look Once: Unified, Real-Time Object Detection. Available at: https://arxiv.org/abs/1506.02640. Accessed 10 October 2019.
Ren, S., K. He, R. Girshick and J. Sun. 2017. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence 39(6):1137-1149.
ROBOTIS. 2020. e-Manual. Available at: https://emanual.robotis.com/. Accessed 19 June 2020.
Rublee, E., V. Rabaud, K. Konolige and G. Bradski. 2011. ORB: An Efficient Alternative to SIFT or SURF. In “2011 International Conference on Computer Vision”, 2564-2571. Barcelona.
Saxena, A., M. Prasad, A. Gupta, N. Bharill, O. P. Patel, A. Tiwari, E. M. Joo, W. P. Ding and C. T. Lin. 2017. A Review of Clustering Techniques and Developments. Neurocomputing 267:664-681.
Scaramuzza, D. and F. Fraundorfer. 2011. Visual Odometry [Tutorial]. IEEE Robotics Automation Magazine 18(4):80-92.
Smith, L. N. 2017. Cyclical Learning Rates for Training Neural Networks. In “2017 IEEE Winter Conference on Applications of Computer Vision (WACV)”, 464-472. Santa Rosa, CA.
Smith, S. L., P. J. Kindermans, C. Ying and Q. V. Le. 2017. Don't Decay the Learning Rate, Increase the Batch Size. Available at: https://arxiv.org/abs/1711.00489. Accessed 14 June 2020.
Stereolabs. 2020. ZED Mini. Available at: https://www.stereolabs.com/zed-mini/. Accessed 15 June 2020.
Thang, T. M. and J. Kim. 2011. The Anomaly Detection by Using Dbscan Clustering with Multiple Parameters. In “2011 International Conference on Information Science and Applications”, 1-5. Jeju Island.
Thrun, S., W. Burgard and D. Fox. 2005. Probabilistic Robotics. 1st ed., 245-265. Cambridge: The MIT Press.
TurtleBot. 2020. Open Source. Available at: https://www.turtlebot.com/opensource/. Accessed 19 June 2020.
Vázquez-Arellano, M., H. W. Griepentrog, D. Reiser and D. S. Paraforos. 2016. 3-D Imaging Systems for Agricultural Applications—A Review. Sensors 16(5):618.
Zhao, Z., P. Zheng, S. Xu and X. Wu. 2019. Object Detection With Deep Learning: A Review. IEEE Transactions on Neural Networks and Learning Systems 30(11):3212-3232.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56450
dc.description.abstract: 要讓載具在設施栽培時執行高複雜度的移動動作需要使用到室內定位的技術,而即時定位與地圖建構(SLAM)因為其不需要在環境中搭建訊號發射器或軌道,藉由感測器就能將載具定位到特定位置的特性,所以現在於室內定位領域中被廣泛的應用。但傳統SLAM的地圖只限於分辨特定位置是否有存在障礙物,無法提供更多的環境資訊,因此本論文提出藉由深度相機和深度學習模型捕捉環境資訊並映射到地圖的對應位置上的方法,使其可以分辨出作物和其他障礙物差別。實現方法:用SLAM建構出設施地圖,於建構地圖的同時將相機擷取到的影像輸入YOLO V3目標偵測模型中判斷作物相對於載具的分布位置;再經由座標轉換函式將作物分布位置映射到地圖的對應位置上;最後用分群演算法區分同種作物間的差異。實驗結果證實可以用SLAM正確的建構出設施地圖,並將YOLO V3偵測到的作物的真實位置映射到地圖上。於定位誤差的分析中發現相機旋轉角速度會影響誤差,藉由挑選適當的相機旋轉角速度能夠將定位誤差控制在公分等級。此研究建立的環境資訊地圖能夠使載具根據此地圖規劃出高複雜度的移動動作,例如前往指定的作物位置或讓載具在不會撞到作物的前提下盡量靠近作物等,以提高設施作物管理以及摘採、灌溉等工作的效率。 [zh_TW]
dc.description.abstract: In order to perform highly complex movements during cultivation in a facility, a vehicle requires indoor positioning technology. Simultaneous localization and mapping (SLAM), which can localize the vehicle using only onboard sensors, does not require signal transmitters or tracks to be installed in the environment, so it is now widely used for indoor positioning. However, traditional SLAM maps only indicate whether an obstacle exists at a given location and cannot provide richer environmental information. This thesis therefore proposes a method that captures environmental information with a depth camera and a deep learning model and maps it onto the corresponding positions of the map, so that crops can be distinguished from other obstacles. The implementation uses SLAM to build a facility map while the images captured by the camera are fed into a YOLO V3 object detection model to determine the positions of crops relative to the vehicle; these positions are then mapped onto the corresponding map coordinates through a coordinate transformation function; finally, a clustering algorithm distinguishes individual crops of the same species. The experimental results confirm that the facility map can be constructed correctly with SLAM and that the real positions of the crops detected by YOLO V3 are mapped onto it correctly. The error analysis shows that the camera's rotational angular velocity affects the positioning error; by selecting an appropriate angular velocity, the error can be kept at the centimeter level.
The map with environmental information created in this thesis enables the vehicle to plan highly complex movements, such as traveling to a designated crop or approaching a crop as closely as possible without hitting it, thereby improving the efficiency of crop management, picking, irrigation, and other tasks in facilities. [en]
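The pipeline described in the abstract (project each detected crop into the map frame using the vehicle pose and the depth measurement, then cluster nearby detections) can be sketched as follows. This is a minimal illustration under assumed conventions, not the thesis's actual implementation: the function names, the planar (x, y, heading) pose model, and the greedy distance-threshold grouping (a simplified stand-in for the clustering step) are all assumptions.

```python
import math

def crop_to_map(vehicle_pose, depth, bearing):
    """Project a detection from the camera frame into map coordinates.

    vehicle_pose: (x, y, theta) of the vehicle in the map frame (m, m, rad),
                  with the camera assumed to face along the vehicle heading.
    depth:        distance to the crop reported by the depth camera (m).
    bearing:      horizontal angle of the crop relative to the camera axis (rad).
    """
    x, y, theta = vehicle_pose
    # Rotate the camera-frame offset by the vehicle heading, then translate.
    return (x + depth * math.cos(theta + bearing),
            y + depth * math.sin(theta + bearing))

def group_by_distance(points, eps=0.3):
    """Greedy grouping: a point joins the first cluster containing a
    neighbor within eps metres, otherwise it starts a new cluster.
    (A simplification; it does not merge clusters bridged by a later point.)"""
    clusters = []
    for p in points:
        target = next((c for c in clusters
                       if any(math.dist(p, q) <= eps for q in c)), None)
        if target is None:
            clusters.append([p])
        else:
            target.append(p)
    return clusters

# Example: vehicle at (1, 2) facing +y; a crop detected 2 m straight ahead
# lands near (1, 4) in the map frame.
pt = crop_to_map((1.0, 2.0, math.pi / 2), depth=2.0, bearing=0.0)
clusters = group_by_distance([(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)], eps=0.3)
```

Two detections 0.1 m apart fall into one cluster while the distant point forms its own, mirroring how repeated detections of the same crop would be merged into a single map entry.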
dc.description.provenance: Made available in DSpace on 2021-06-16T05:29:12Z (GMT). No. of bitstreams: 1
U0001-2607202022224600.pdf: 2487390 bytes, checksum: 3a17d47f3b55ba1afa123f89570560ec (MD5)
Previous issue date: 2020 [en]
dc.description.tableofcontents:
Acknowledgements i
Abstract (Chinese) ii
Abstract iii
Table of Contents v
List of Figures viii
List of Tables x
Chapter 1 Introduction 1
1.1 Preface 1
1.2 Objectives 1
Chapter 2 Literature Review 3
2.1 Visual SLAM 3
2.1.1 RGB-D Imaging Systems 3
2.1.2 Visual Odometry 4
2.1.2.1 Feature Extraction 4
2.1.2.2 Feature Matching 5
2.1.2.3 Motion Estimation 5
2.1.3 State Estimation Optimization 7
2.1.4 Types of Constructed Maps 7
2.1.4.1 Point Cloud Maps 8
2.1.4.2 Occupancy Grid Maps 8
2.2 Object Detection 8
2.2.1 Deep Learning 8
2.2.2 Convolutional Neural Networks 9
2.2.3 Deep Learning Models for Object Detection 9
2.2.3.1 R-CNN 10
2.2.3.2 YOLO 10
Chapter 3 Materials and Methods 11
3.1 Experimental Equipment 11
3.1.1 Software Overview 11
3.1.1.1 Gazebo 13
3.1.1.2 ROS 13
3.1.1.3 OpenCV 13
3.1.1.4 YOLO V3 13
3.1.2 Hardware Overview 15
3.1.2.1 Depth Camera 16
3.1.2.2 Vehicle 16
3.2 Experimental Environment 17
3.3 Experimental Methods 18
3.3.1 Camera Calibration 19
3.3.2 SLAM Map Construction 20
3.3.3 YOLO V3 Object Detection 20
3.3.4 Coordinate Transformation 21
3.3.5 Clustering 22
Chapter 4 Results and Discussion 26
4.1 SLAM Map Construction Results 26
4.2 Position Localization 27
4.3 Clustering Results 28
4.4 Error Analysis 28
4.4.1 Error Calculation Method 29
4.4.2 Angle Between Camera and Target 30
4.4.3 Camera Rotation Angular Velocity 31
4.4.4 Error Comparison 32
4.5 Test Results in an Actual Facility 34
Chapter 5 Conclusions and Recommendations 37
5.1 Conclusions 37
5.2 Recommendations 38
References 39
dc.language.iso: zh-TW
dc.title: 設施內帶有環境資訊地圖之建立 [zh_TW]
dc.title: A Map with Environmental Information Established in Facilities [en]
dc.type: Thesis
dc.date.schoolyear: 108-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 黃振康 (Chen-Kang Huang), 吳剛智 (Gang-Jhy Wu)
dc.subject.keyword: 環境資訊地圖, 即時定位與地圖建構, 深度相機, 目標偵測 [zh_TW]
dc.subject.keyword: Environmental information map, Simultaneous localization and mapping, Depth camera, Object detection [en]
dc.relation.page: 42
dc.identifier.doi: 10.6342/NTU202001877
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2020-07-27
dc.contributor.author-college: 生物資源暨農學院 (College of Bioresources and Agriculture) [zh_TW]
dc.contributor.author-dept: 生物機電工程學系 (Department of Biomechatronics Engineering) [zh_TW]
Appears in Collections: 生物機電工程學系 (Department of Biomechatronics Engineering)

Files in This Item:
U0001-2607202022224600.pdf, 2.43 MB, Adobe PDF (currently not publicly accessible)

