Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89071
Full metadata record
DC field: value [language]
dc.contributor.advisor: 林達德 [zh_TW]
dc.contributor.advisor: Ta-Te Lin [en]
dc.contributor.author: 汪軍諺 [zh_TW]
dc.contributor.author: Jun-Yan Wang [en]
dc.date.accessioned: 2023-08-16T17:00:33Z
dc.date.available: 2023-11-10
dc.date.copyright: 2023-08-16
dc.date.issued: 2023
dc.date.submitted: 2023-08-08
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89071
dc.description.abstract: 目前全世界存在大量規模龐大的溫室,而溫室的作物生長狀況監測為溫室管理的一大焦點。傳統的人工巡查監測方式耗時且需要投入大量人力資源,使種植者無法快速、即時地了解溫室當前的整體情況。然而,自動化的無人機導航監控系統能夠解決此難題,基於純視覺定位的無人機不需安裝昂貴的傳感器,僅需搭載RGB相機即可執行導航任務。視覺定位無人機自主導航系統成為低成本的溫室自動監測之核心,從技術上改變了智慧農業的樣貌。本研究的無人機自主巡航系統,使用了加入ArUco Marker的ORB-SLAM2,稱為Enhanced ORB-SLAM2。ArUco Marker是一種標誌,具有特定的幾何模式,通常被用作計算機視覺中的參考點,能夠幫助系統進行位置定位和追蹤。ORB-SLAM2則是一種視覺定位與地圖構建演算法,能夠使用相機實現同時定位和地圖構建。實驗驗證了Enhanced ORB-SLAM2在環境存在陰影特徵的定位結果優於原始的ORB-SLAM2。此無人機導航系統可於溫室中自動執行各式飛行任務,且飛行軌跡均方根誤差範圍在30公分以下。此外,地圖校正使用仿射轉換算法,可以使地圖的MapAruco與人工量測的ArUco Marker位置完全貼合。果實偵測使用YOLOv4深度學習模型訓練,果實偵測模型之mAP達到0.96,DeepSORT基於此果實偵測模型運行果實追蹤任務。將DeepSORT的追蹤結果經過三步驟的資料清理後,使假果實實驗中的ID switch數量由平均5.83顆,下降至0顆,達到準確追蹤之目標。溫室果實定位算法基於清理後的果實追蹤結果,並使用三角測量算法計算果實位置,計算求得之果實位置再分別使用地圖校正以及迭代ArUco Marker的校正常數做位置校正,校正後假果實位置的均方根誤差由2.758公尺下降至0.223公尺,此果實追蹤與定位算法也已驗證可應用於真實果實之追蹤與定位。 [zh_TW]
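The map-calibration step summarized in the abstract aligns the ArUco marker positions recovered in the SLAM map with hand-measured marker positions. The thesis does not reproduce its code here; the following is a minimal sketch of one standard closed-form similarity-transform estimator (Umeyama's method), not the author's implementation, and the function name `umeyama_alignment` is an assumption for illustration.

```python
import numpy as np

def umeyama_alignment(src, dst):
    """Closed-form similarity transform (scale s, rotation R, translation t)
    minimizing sum ||s * R @ src_i + t - dst_i||^2 over matched 3D points.
    src, dst: (N, 3) arrays, e.g. SLAM-map marker positions vs. hand-measured
    marker positions. Illustrative sketch, not the thesis's implementation."""
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / len(src)          # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                        # force a proper rotation, no reflection
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)   # total variance of source points
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t
```

Given at least three non-collinear marker correspondences, the recovered (s, R, t) can be applied to every map point to express the whole SLAM map in the greenhouse's measured coordinate frame.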
dc.description.abstract: Large-scale greenhouses are common worldwide, and monitoring crop growth within them is a major focus of greenhouse management. Traditional manual inspection is time-consuming and labor-intensive, preventing growers from quickly obtaining an up-to-date picture of the whole greenhouse. An automated unmanned aerial vehicle (UAV) navigation and monitoring system can address this challenge: a UAV that localizes purely by vision needs no expensive sensors and can perform navigation tasks with only an RGB camera. In this study, an autonomous UAV navigation system was developed using Enhanced ORB-SLAM2, a version of ORB-SLAM2 that incorporates ArUco markers. ArUco markers are specific geometric patterns used as reference points in computer vision to aid localization and tracking; ORB-SLAM2 is a visual SLAM algorithm that achieves simultaneous localization and mapping with a camera. Experiments demonstrated that Enhanced ORB-SLAM2 outperforms the original ORB-SLAM2 in environments containing shadow features. The UAV navigation system can autonomously perform various flight missions within a greenhouse, with a flight-trajectory root mean square error below 30 centimeters. Additionally, map calibration using a similarity transformation aligns the MapAruco with manually measured ArUco marker positions. Fruit detection employs a YOLOv4 deep learning model, achieving an mAP of 0.96, and DeepSORT uses this detector for fruit tracking. After a three-step data-cleaning procedure on DeepSORT's tracking results, the average ID-switch count in artificial-fruit experiments decreased from 5.83 to 0, achieving accurate tracking. The greenhouse fruit-localization algorithm triangulates fruit positions from the cleaned tracking results and then corrects them using the map calibration and an iteratively estimated ArUco marker calibration constant; the root mean square error of artificial-fruit positions decreased from 2.758 meters to 0.223 meters after correction. The fruit tracking and localization algorithm was also validated on real fruit. [en]
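The fruit-localization step described above computes each fruit's 3D position by triangulating observation rays from multiple camera poses along the flight path. As a minimal illustration of the idea (this is the classic two-ray midpoint method, not necessarily the thesis's exact formulation; the function and variable names are assumptions):

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Triangulate one fruit seen from two camera poses.
    o1, o2: ray origins (camera centers); d1, d2: ray directions toward the fruit.
    Returns the midpoint of the shortest segment between the two rays,
    or None when the rays are (nearly) parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = o2 - o1
    b = d1 @ d2
    denom = 1.0 - b * b
    if denom < 1e-12:                        # parallel rays: depth is unobservable
        return None
    t1 = (d1 @ r - b * (d2 @ r)) / denom     # parameter of closest point on ray 1
    t2 = (b * (d1 @ r) - d2 @ r) / denom     # parameter of closest point on ray 2
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2      # closest points on each ray
    return 0.5 * (p1 + p2)
```

With noisy detections, averaging midpoints over many pose pairs (or solving a single least-squares problem over all rays) gives a more stable estimate; the triangulated positions would then be corrected by the map calibration, as the abstract describes.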
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-16T17:00:33Z. No. of bitstreams: 0 [en]
dc.description.provenance: Made available in DSpace on 2023-08-16T17:00:33Z (GMT). No. of bitstreams: 0 [en]
dc.description.tableofcontents:
Acknowledgements i
Abstract (Chinese) ii
Abstract iii
List of Figures vii
List of Tables x
Chapter 1 Introduction 1
1.1 Background 1
1.2 Research Objectives 3
Chapter 2 Literature Review 4
2.1 UAVs and Smart Greenhouses 4
2.2 Indoor and Outdoor Applications of UAVs 5
2.2.1 Outdoor UAV Applications and Development 5
2.2.2 Indoor UAV Applications and Development 5
2.2.3 Challenges and Case Studies of Indoor UAVs 6
2.3 Indoor UAV Navigation and SLAM 6
2.3.1 Greenhouse UAV Navigation 6
2.3.2 Simultaneous Localization and Mapping 6
2.4 Crop Phenotyping 9
2.4.1 Crop Phenotyping Techniques 9
2.4.2 Crop Yield Estimation 11
2.4.3 Crop Localization and Measurement 12
Chapter 3 Research Methods 13
3.1 Autonomous Navigation System Architecture 13
3.1.1 Hardware Architecture 13
3.1.2 Flight Control Software Architecture 16
3.2 ORB-SLAM2 17
3.2.1 Enhanced ORB-SLAM2 17
3.2.2 Visual Map Construction 17
3.2.3 Enhanced ORB-SLAM2 Map Calibration 18
3.3 Flight Control System 19
3.3.1 UAV 3D Pose Estimation 19
3.3.2 Navigation Path Setup and Waypoint-Arrival Determination 20
3.3.3 UAV Control System and Controller Design 21
3.4 Fruit Information Analysis 22
3.4.1 Fruit Tracking Algorithm 22
3.4.2 Fruit Localization Algorithm 25
3.4.3 Fruit Size Algorithm 29
3.5 Experimental Setup 30
3.5.1 Experimental Site 30
3.5.2 ArUco Marker Specifications and Fabrication 31
3.5.3 Experimental Design and Objectives 31
3.6 Program Files and Operating Procedure 34
Chapter 4 Results and Discussion 37
4.1 Enhanced ORB-SLAM2 Localization Analysis 37
4.1.1 ORB-SLAM2 Localization Performance 37
4.1.2 Enhanced ORB-SLAM2 Localization Performance 39
4.1.3 Comparison of ORB-SLAM2 and Enhanced ORB-SLAM2 Localization 40
4.2 Map Calibration 40
4.2.1 Map Calibration with ICP 40
4.2.2 Map Calibration with Affine Transformation 41
4.2.3 Localization Performance of the Calibrated Map 43
4.2.4 SLAM Mapping Visualization Results 45
4.3 Flight Control Analysis 48
4.3.1 SLAM System and Flight Trajectory Analysis 48
4.3.2 Turning PID Parameter Tuning 55
4.3.3 Comparison of Different SLAM Systems 56
4.4 Fruit Algorithm Analysis 57
4.4.1 Fruit Detection Model Performance 57
4.4.2 Artificial Fruit Tracking Results 58
4.4.3 Artificial Fruit Localization Results 63
4.4.4 Artificial Fruit Size Results 74
4.4.5 Real Fruit Tracking Results 75
4.4.6 Real Fruit Localization Results 76
4.4.7 Real Fruit Size Results 80
Chapter 5 Conclusions and Recommendations 82
5.1 Conclusions 82
5.2 Recommendations 84
References 85
dc.language.iso: zh_TW
dc.subject: 果實定位 [zh_TW]
dc.subject: 果實追蹤 [zh_TW]
dc.subject: 自主導航 [zh_TW]
dc.subject: 自動化 [zh_TW]
dc.subject: 無人機 [zh_TW]
dc.subject: ORB-SLAM2 [zh_TW]
dc.subject: unmanned aerial vehicle [en]
dc.subject: ORB-SLAM2 [en]
dc.subject: fruit tracking [en]
dc.subject: autonomous navigation [en]
dc.subject: automation [en]
dc.subject: fruit localization [en]
dc.title: 自主巡航無人機系統應用於溫室洋香瓜之定位及量測 [zh_TW]
dc.title: Application of an Autonomous Drone Navigation System on Greenhouse Muskmelon Localization and Measurement [en]
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 顏炳郎;楊江益 [zh_TW]
dc.contributor.oralexamcommittee: Ping-Lang Yen;Chiang-Yi Yang [en]
dc.subject.keyword: 自動化, 無人機, 自主導航, ORB-SLAM2, 果實追蹤, 果實定位 [zh_TW]
dc.subject.keyword: automation, unmanned aerial vehicle, autonomous navigation, ORB-SLAM2, fruit tracking, fruit localization [en]
dc.relation.page: 90
dc.identifier.doi: 10.6342/NTU202303659
dc.rights.note: 同意授權(全球公開) (authorized for worldwide open access)
dc.date.accepted: 2023-08-10
dc.contributor.author-college: 生物資源暨農學院 (College of Bioresources and Agriculture)
dc.contributor.author-dept: 生物機電工程學系 (Department of Biomechatronics Engineering)
Appears in collections: 生物機電工程學系 (Department of Biomechatronics Engineering)

Files in this item:
File | Size | Format
ntu-111-2.pdf | 8.21 MB | Adobe PDF