NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/64727
Title: Developing a Guiding and Monitoring System for Tea Plucking Machine (茶園機械之導航輔助暨植被監測影像系統之開發)
Author: Yu-Kai Lin (林愉凱)
Advisor: Shih-Fang Chen (陳世芳)
Keywords: semantic segmentation, deep learning, automatic navigation, image processing, tea plucking machine
Publication Year: 2020
Degree: Master
Abstract: Labor shortage is a critical issue in many industries, and it is especially severe in agricultural production such as the tea industry. In recent years, riding-type tea plucking machines have been imported from Japan to Taiwan as a relatively efficient solution for tea harvesting. However, operating these machines demands a high level of skill and experience: improper operation may damage the tea trees, cause mechanical failure, and degrade the quality of the harvested leaves. A real-time image-based navigation aid can mitigate these difficulties, and tea canopy images captured while the machine is in operation can be used to build a monitoring system that reports the growth status of the tea trees. In this study, semantic segmentation networks, namely fully convolutional networks (FCN-8s, FCN-16s, and FCN-32s) and ENet, were applied to recognize tea rows and detect obstacles in the field. ENet outperformed the other models, with a mean intersection over union (mIoU) of 0.734, a mean accuracy of 0.941, and a detection time of 0.176 s. Based on the ENet segmentation results, guiding lines calculated by the Hough transform produced an average angle bias of 5.92° and an average distance bias of 11.30 cm. For the monitoring system, the HSV color space was used to judge the growth status of the tea trees, and three image stitching methods, scale-invariant feature transform (SIFT), speeded-up robust features (SURF), and binary robust invariant scalable keypoints (BRISK), were compared for building panoramic tea canopy images. The three methods showed no significant differences in stitching quality, while SURF delivered the shortest matching times, 1.86 s for tea-row images and 1.17 s for end-of-tea-row images. The monitoring system presents the GPS driving path, single-location canopy images with growth status, and panoramic canopy images built by image stitching; the cosine similarities of the panoramic images were 90.50%, 95.94%, and 90.91% for tea-row, end-of-tea-row, and sparse-tea-row images, respectively. This study successfully developed a real-time guiding system that helps the operator keep the machine centered on the tea row, as well as a monitoring system for tea field management.
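
As a rough illustration of the guiding-line step described in the abstract, the sketch below extracts the dominant row line from a binary segmentation mask with OpenCV's Hough transform and converts it into a heading angle and a lateral offset. The mask format, the Canny and Hough thresholds, and the offset convention are illustrative assumptions, not the pipeline actually used in the thesis.

import numpy as np
import cv2

def guiding_line_from_mask(mask):
    """Return (angle_deg, offset_px) of the dominant tea-row line, or None."""
    edges = cv2.Canny(mask, 50, 150)              # boundary of the segmented row
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)
    if lines is None:
        return None
    rho, theta = lines[0][0]                      # strongest line, (rho, theta) form
    if abs(np.cos(theta)) < 1e-6:                 # near-horizontal line: not a row line
        return None
    h, w = mask.shape[:2]
    # Line equation: x*cos(theta) + y*sin(theta) = rho; solve for x at the image bottom.
    x_bottom = (rho - (h - 1) * np.sin(theta)) / np.cos(theta)
    angle_deg = np.degrees(theta)                 # 0 deg corresponds to a vertical line
    offset_px = x_bottom - w / 2                  # lateral offset from the image center
    return angle_deg, offset_px

In a controller, angle_deg and offset_px would play the role of the angle bias and distance bias reported in the abstract, after converting pixels to centimeters with the camera geometry.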
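The HSV-based growth-status judgment could look like the following minimal sketch. The green hue band (35 to 85 on OpenCV's 0-179 hue scale), the coverage threshold of 0.6, and the file name canopy.jpg are assumptions for illustration; the thesis's actual thresholds are not given in this record.

import numpy as np
import cv2

def green_coverage(bgr):
    """Fraction of pixels whose hue falls inside a green band."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    green = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))   # OpenCV hue range is 0-179
    return float(np.count_nonzero(green)) / green.size

img = cv2.imread("canopy.jpg")                    # hypothetical single canopy image
if img is not None:
    status = "dense" if green_coverage(img) > 0.6 else "sparse"
    print(status)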
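Finally, a minimal sketch of the feature-matching comparison behind the stitching step, together with the cosine-similarity score used to evaluate the panoramas. SIFT and BRISK use OpenCV's stock detectors; SURF is omitted here because it requires the non-free opencv-contrib build. The file names and the flattened-pixel similarity measure are assumptions.

import time
import numpy as np
import cv2

def match_time(detector, norm, img1, img2):
    """Detect, describe, and brute-force match two frames; return elapsed seconds."""
    t0 = time.perf_counter()
    _, des1 = detector.detectAndCompute(img1, None)
    _, des2 = detector.detectAndCompute(img2, None)
    cv2.BFMatcher(norm).knnMatch(des1, des2, k=2)
    return time.perf_counter() - t0

def cosine_similarity(p, q):
    """Cosine similarity of two equal-sized images, flattened to vectors."""
    u = p.ravel().astype(np.float64)
    v = q.ravel().astype(np.float64)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

a = cv2.imread("row_frame_1.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical overlapping frames
b = cv2.imread("row_frame_2.jpg", cv2.IMREAD_GRAYSCALE)
print("SIFT :", match_time(cv2.SIFT_create(), cv2.NORM_L2, a, b))
print("BRISK:", match_time(cv2.BRISK_create(), cv2.NORM_HAMMING, a, b))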
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/64727
DOI: 10.6342/NTU202000600
Full-Text Authorization: Paid access
Appears in Collections: Department of Bio-Industrial Mechatronics Engineering

Files in This Item:
File: ntu-109-1.pdf (currently not authorized for public access)
Size: 2.8 MB
Format: Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
