NTU Theses and Dissertations Repository (DSpace)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94408
Full metadata record (DC field: value, with language tag where present)
dc.contributor.advisor: 江昭皚 (zh_TW)
dc.contributor.advisor: Joe-Air Jiang (en)
dc.contributor.author: 簡嘉俊 (zh_TW)
dc.contributor.author: Chia-Chun Chien (en)
dc.date.accessioned: 2024-08-15T17:19:53Z
dc.date.available: 2024-08-16
dc.date.copyright: 2024-08-15
dc.date.issued: 2024
dc.date.submitted: 2024-08-09
dc.identifier.citation:
Álvarez-Alfageme, F., Pálinkás, Z., Bigler, F., & Romeis, J. (2012). Development of an early-tier laboratory bioassay for assessing the impact of orally-active insecticidal compounds on larvae of Coccinella septempunctata (Coleoptera: Coccinellidae). Environmental entomology, 41(6), 1687-1693.
Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., Santamaría, J., Fadhel, M. A., Al-Amidie, M., & Farhan, L. (2021). Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data, 8, 1-74.
U.S. Congress, Office of Technology Assessment. (1995). Biologically based technologies for pest control. Office of Technology Assessment.
Bakkay, M. C., Chambon, S., Rashwan, H. A., Lubat, C., & Barsotti, S. (2018). Automatic detection of individual and touching moths from trap images by combining contour‐based and region‐based segmentation. IET Computer Vision, 12(2), 138-145.
Bechar, A., & Vigneault, C. (2016). Agricultural robots for field operations: Concepts and components. Biosystems Engineering, 149, 94-111.
Diwan, T., Anirudh, G., & Tembhurne, J. V. (2023). Object detection using YOLO: Challenges, architectural successors, datasets and applications. Multimedia Tools and Applications, 82(6), 9243-9275.
dos Santos-Cividanes, T., Ramos, T. d. O., & Cividanes, F. (2016). Fertility life table of the Asian ladybug in different temperatures. Revista de la Facultad de Agronomía (La Plata), 115(2), 129-133.
Engel, G. (1968). Convolution of Line Gratings with Gaussian Blur. JOSA, 58(10), 1416-1417.
Ge, Z., Liu, S., Wang, F., Li, Z., & Sun, J. (2021). YOLOX: Exceeding YOLO series in 2021. arXiv preprint arXiv:2107.08430.
Geng, A., Hu, X., Liu, J., Mei, Z., Zhang, Z., & Yu, W. (2022). Development and Testing of Automatic Row Alignment System for Corn Harvesters. Applied Sciences, 12(12), 6221.
Gharakhani, H., Thomasson, J. A., & Lu, Y. (2023). Integration and preliminary evaluation of a robotic cotton harvester prototype. Computers and Electronics in Agriculture, 211, 107943.
Girshick, R. (2015). Fast R-CNN. Proceedings of the IEEE international conference on computer vision.
Girshick, R., Donahue, J., Darrell, T., & Malik, J. (2014). Rich feature hierarchies for accurate object detection and semantic segmentation. Proceedings of the IEEE conference on computer vision and pattern recognition.
Hafezalkotob, A., Hami-Dindar, A., Rabie, N., & Hafezalkotob, A. (2018). A decision support system for agricultural machines and equipment selection: A case study on olive harvester machines. Computers and Electronics in Agriculture, 148, 207-216.
He, K., Zhang, X., Ren, S., & Sun, J. (2015). Spatial pyramid pooling in deep convolutional networks for visual recognition. IEEE transactions on pattern analysis and machine intelligence, 37(9), 1904-1916.
Holmes, L., Upadhyay, D., & Mandjiny, S. (2016). Biological control of agriculture insect pests. European Scientific Journal, Special Edition, 228-237.
Hua, X., Li, H., Zeng, J., Han, C., Chen, T., Tang, L., & Luo, Y. (2023). A Review of Target Recognition Technology for Fruit Picking Robots: From Digital Image Processing to Deep Learning. Applied Sciences, 13(7), 4160.
Jin, X., Tang, L., Li, R., Zhao, B., Ji, J., & Ma, Y. (2022). Edge recognition and reduced transplantation loss of leafy vegetable seedlings with Intel RealSense D415 depth camera. Computers and Electronics in Agriculture, 198, 107030.
Jocher, G., Chaurasia, A., Stoken, A., Borovec, J., Kwon, Y., Michael, K., Fang, J., Yifu, Z., Wong, C., & Montes, D. (2022). ultralytics/yolov5: v7.0 - YOLOv5 SOTA realtime instance segmentation. Zenodo.
Kasinathan, T., Singaraju, D., & Uyyala, S. R. (2021). Insect classification and detection in field crops using modern machine learning techniques. Information Processing in Agriculture, 8(3), 446-457.
Khalid, S., Oqaibi, H. M., Aqib, M., & Hafeez, Y. (2023). Small pests detection in field crops using deep learning object detection. Sustainability, 15(8), 6815.
Khan, A., Sohail, A., Zahoora, U., & Qureshi, A. S. (2020). A survey of the recent architectures of deep convolutional neural networks. Artificial intelligence review, 53, 5455-5516.
Li, C., Li, L., Jiang, H., Weng, K., Geng, Y., Li, L., Ke, Z., Li, Q., Cheng, M., & Nie, W. (2022). YOLOv6: A single-stage object detection framework for industrial applications. arXiv preprint arXiv:2209.02976.
Lin, T.-Y., Dollár, P., Girshick, R., He, K., Hariharan, B., & Belongie, S. (2017). Feature pyramid networks for object detection. Proceedings of the IEEE conference on computer vision and pattern recognition.
Liu, S., Qi, L., Qin, H., Shi, J., & Jia, J. (2018). Path aggregation network for instance segmentation. Proceedings of the IEEE conference on computer vision and pattern recognition.
Lu, S., Wang, B., Wang, H., Chen, L., Linjian, M., & Zhang, X. (2019). A real-time object detection algorithm for video. Computers & Electrical Engineering, 77, 398-408.
Maes, S., Antoons, T., Grégoire, J.-C., & De Clercq, P. (2014). A semi-artificial rearing system for the specialist predatory ladybird Cryptolaemus montrouzieri. BioControl, 59, 557-564.
Martineau, M., Conte, D., Raveaux, R., Arnault, I., Munier, D., & Venturini, G. (2017). A survey on image-based insect classification. Pattern Recognition, 65, 273-284.
McCornack, B., Koch, R. L., & Ragsdale, D. (2007). A simple method for in-field sex determination of the multicolored Asian lady beetle Harmonia axyridis. Journal of Insect Science, 7(1), 10.
Mirhosseini, M. A., Hosseini, M. R., & Jalali, M. A. (2015). Effects of diet on development and reproductive fitness of two predatory coccinellids (Coleoptera: Coccinellidae). European Journal of Entomology, 112(3), 446.
Ortiz, J. C., Ruiz, A. T., Morales-Ramos, J., Thomas, M., Rojas, M., Tomberlin, J., Yi, L., Han, R., Giroud, L., & Jullien, R. (2016). Insect mass production technologies. In Insects as sustainable food ingredients (pp. 153-201). Elsevier.
Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. Proceedings of the IEEE conference on computer vision and pattern recognition.
Ren, S., He, K., Girshick, R., & Sun, J. (2015). Faster R-CNN: Towards real-time object detection with region proposal networks. Advances in neural information processing systems, 28.
Reznik, S. Y., Dolgovskaya, M. Y., & Ovchinnikov, A. N. (2015). Effect of photoperiod on adult size and weight in Harmonia axyridis (Coleoptera: Coccinellidae). European Journal of Entomology, 112(4).
Hesler, L. S., McNickle, G., Catangui, M. A., Losey, J. E., Beckendorf, E. A., Stellwag, L., Brandt, D. M., & Bartlett, P. B. (2012). Method for continuously rearing Coccinella lady beetles (Coleoptera: Coccinellidae). The Open Entomology Journal, 6(1).
Schmitz, A., & Seckler, D. (1970). Mechanized agriculture and social welfare: The case of the tomato harvester. American Journal of Agricultural Economics, 52(4), 569-577.
Serra, J. (1983). Image analysis and mathematical morphology. Academic Press, Inc.
Wang, C.-Y., Bochkovskiy, A., & Liao, H.-Y. M. (2023). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition.
Wang, S., Tan, X.-L., Guo, X.-J., & Zhang, F. (2013). Effect of temperature and photoperiod on the development, reproduction, and predation of the predatory ladybird Cheilomenes sexmaculata (Coleoptera: Coccinellidae). Journal of economic entomology, 106(6), 2621-2629.
Xiao, Y., Tian, Z., Yu, J., Zhang, Y., Liu, S., Du, S., & Lan, X. (2020). A review of object detection based on deep learning. Multimedia Tools and Applications, 79, 23729-23791.
Xie, D., Chen, L., Liu, L., Chen, L., & Wang, H. (2022). Actuators and sensors for application in agricultural robots: A review. Machines, 10(10), 913.
Xiong, Y., Ge, Y., & From, P. J. (2021). An improved obstacle separation method using deep learning for object detection and tracking in a hybrid visual control loop for fruit picking in clusters. Computers and Electronics in Agriculture, 191, 106508.
Xiong, Y., Peng, C., Grimstad, L., From, P. J., & Isler, V. (2019). Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Computers and Electronics in Agriculture, 157, 392-402.
Yu, J.-Z., Chi, H., & Chen, B.-H. (2013). Comparison of the life tables and predation rates of Harmonia dimidiata (F.)(Coleoptera: Coccinellidae) fed on Aphis gossypii Glover (Hemiptera: Aphididae) at different temperatures. Biological Control, 64(1), 1-9.
Yu, J., & Hsu, P. (2015). Longevity and fecundity of Lemnia biplagiata adult (Coleoptera: Coccinellidae) cultured in different sexual ratio and density. Plant Protection Bulletin (Taipei), 57(4), 51-82.
Zhang, B., Xie, Y., Zhou, J., Wang, K., & Zhang, Z. (2020). State-of-the-art robotic grippers, grasping and control strategies, as well as their applications in agricultural robots: A review. Computers and Electronics in Agriculture, 177, 105694.
Zhu, L.-Q., & Zhang, Z. (2010). Auto-classification of insect images based on color histogram and GLCM. 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery.
林俊耀, 李昆龍, 顏辰鳳, & 陳宏伯. (2020). 化學農藥十年減半政策推動現況 [Current implementation status of the ten-year chemical pesticide halving policy]. 符合農藥減量政策的新穎性植物保護技術研討會專刊 [Proceedings of the Symposium on Novel Plant Protection Technologies in Line with the Pesticide Reduction Policy].
段淑人. (2019). 天敵在臺灣生物防治應用之發展及願景 [Development and prospects of natural enemies in biological control in Taiwan]. 出自「有益昆蟲在友善農耕之應用研討會專輯」 [In Proceedings of the Symposium on the Application of Beneficial Insects in Eco-Friendly Farming], 11-24. 苗栗: 行政院農業委員會苗栗區農業改良場 [Miaoli: Miaoli District Agricultural Research and Extension Station, Council of Agriculture, Executive Yuan].
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94408
dc.description.abstract (zh_TW):
隨著食品安全和環保意識的提升,農藥減量政策成為全世界農業發展的重要方向,提升了生物防治技術的關注度。六條瓢蟲為鞘翅目瓢蟲科,以蚜蟲為主食,為農業害蟲防治的重要天敵昆蟲,在生物防治中有卓越表現。本研究針對六條瓢蟲(Cheilomenes sexmaculata),開發了一套自動化成蟲雌雄分離系統。然而,現有的大量瓢蟲養殖方式,無法做到雌雄分離。雌雄分離在瓢蟲的生產中具有重要意義。雌蟲的主要任務是產卵,繁衍下一代。雄蟲主要負責交配,而在高密度環境中雄蟲之間會產生競爭關係,影響繁殖效率。而通過雌雄分離,可以調整雌雄比,確保繁殖成功率也預防了競爭問題。除此之外也能防止雄蟲食卵的問題,從而提升整體生產效率。所以在無分辨雌雄的情況下難以達到量產,且分辨瓢蟲雌雄需依賴大量人力,難以有效提升生產效率。
為了解決此問題,本研究開發自動化瓢蟲成蟲雌雄分離系統。此系統的目標是解決在大規模瓢蟲飼養過程中,雌雄分離所涉及的人力和耗時問題。本研究欲透過機器視覺技術,精準的定位瓢蟲的位置並辨識其性別。此系統結合六軸移動平台與末端吸取裝置,在吸取過程中實現分類,確保瓢蟲被正確歸類。本研究建立之自動化系統可達成以下三個目標:1. 建立具備雌雄分離功能之六軸移動平台;2. 建立YOLOv8深度學習模型精準定位瓢蟲中心點位置;3. 以YOLOv8建立雌雄辨識模型。結果顯示,在精準定位方面,精準定位後的吸取成功率可達94.6%,系統所建立的雌雄辨識模型在測試集中準確度達到91.8%;Precision在雌性瓢蟲高達97.8%,雄性瓢蟲達到87.8%;Recall在雌性瓢蟲上達88.7 %,雄性瓢蟲高達98.8%;F1-score在雌性瓢蟲達93.1 %,雄性瓢蟲達92.9 %。而在實際辨識雌雄瓢蟲時可達92%的準確率,而辨識後之放置成功率高達97.8%,表示系統能夠精確且穩定的完成瓢蟲的雌雄分離工作。該自動化系統不僅減少人工操作需求,降低人力成本,還能提高生產效率,為瓢蟲大規模飼養提供了一個可行的解決方法。此系統的成功開發將促進生物防治技術的發展,為瓢蟲養殖業提供實際應用的技術支持。
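The per-class F1 scores quoted in the abstract follow from the reported precision P and recall R through F1 = 2PR/(P + R). As a quick consistency check using the rounded percentages reported for the sex-identification model:

$$
F1_{\text{female}} = \frac{2 \times 0.978 \times 0.887}{0.978 + 0.887} \approx 0.930, \qquad
F1_{\text{male}} = \frac{2 \times 0.878 \times 0.988}{0.878 + 0.988} \approx 0.930
$$

Both values agree with the reported 93.1% (female) and 92.9% (male) to within rounding of the inputs.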
dc.description.abstract (en):
With increasing awareness of food safety and environmental protection, reducing pesticide use has become an important direction for agricultural development worldwide, raising interest in biological control techniques. Cheilomenes sexmaculata, a ladybug of the family Coccinellidae (order Coleoptera), feeds primarily on aphids and is an important natural enemy in agricultural pest control, with an outstanding record in biological control. This study developed an automated gender separation system for adults of this ladybug. Current large-scale rearing methods, however, do not provide effective gender separation. Gender separation is crucial in ladybug production: females are primarily responsible for laying eggs and propagating the next generation, while males are mainly involved in mating, and in high-density environments competition among males can reduce reproductive efficiency. Separating males and females allows the sex ratio to be adjusted, ensuring reproductive success and preventing such competition; it also prevents males from eating eggs, thereby improving overall production efficiency. Without gender separation it is difficult to achieve mass production, and identifying the sex of ladybugs by hand requires substantial labor, making it hard to raise production efficiency effectively.
To address this issue, this study developed an automated gender separation system for adult ladybugs. The goal of this system is to reduce the labor- and time-intensive work of separating males and females in large-scale ladybug rearing. The system uses machine vision to accurately locate ladybugs and identify their gender, and combines a six-axis moving platform with an end effector that classifies each ladybug during the suction process, ensuring that every individual is correctly categorized. The automated system developed in this study achieves the following three objectives: 1. establish a six-axis moving platform with gender separation capability; 2. develop a YOLOv8 deep learning model to accurately locate the center point of each ladybug; 3. develop a YOLOv8 model for gender identification. The results show that the suction success rate after accurate positioning reaches 94.6%. The gender identification model achieves an accuracy of 91.8% on the test set, with a precision of 97.8% for females and 87.8% for males, a recall of 88.7% for females and 98.8% for males, and an F1-score of 93.1% for females and 92.9% for males. In practical operation, the system achieves 92% accuracy in gender identification and a placement success rate of up to 97.8%, indicating that it can accurately and reliably complete the gender separation of ladybugs. This automated system not only reduces the need for manual operation and lowers labor costs but also improves production efficiency, providing a feasible solution for large-scale ladybug rearing. Its successful development will promote the advancement of biological control technology and provide practical technical support for the ladybug rearing industry.
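The abstracts describe a YOLOv8 detector that both locates each beetle's center point and classifies its sex before the suction step. As a minimal sketch only (not the thesis implementation), the following Python snippet shows how per-detection center points and sex labels could be read out with the Ultralytics YOLO API; the weight file name and the "female"/"male" class labels are assumptions for illustration.

```python
# Minimal sketch (not the thesis code): locate ladybug center points and
# classify sex with a trained YOLOv8 detection model via the Ultralytics API.
# The weight file and the class names ("female", "male") are assumptions.
from ultralytics import YOLO

def detect_ladybugs(image_path: str, weights: str = "ladybug_sex_yolov8.pt"):
    """Return (center_x, center_y, label, confidence) for each detected beetle."""
    model = YOLO(weights)                 # load a trained detector (hypothetical weights)
    result = model(image_path)[0]         # run inference on one image
    detections = []
    for box in result.boxes:
        cx, cy, _, _ = box.xywh[0].tolist()    # xywh holds the box center directly
        label = result.names[int(box.cls[0])]  # e.g. "female" or "male" (assumed classes)
        detections.append((cx, cy, label, float(box.conf[0])))
    return detections

if __name__ == "__main__":
    for cx, cy, label, conf in detect_ladybugs("rearing_box.jpg"):
        print(f"{label} ({conf:.2f}) at pixel ({cx:.0f}, {cy:.0f})")
```

In the system described, the detected pixel center would then be mapped to six-axis platform coordinates so the suction end effector can pick up each beetle and place it according to its predicted sex; that hardware-specific mapping is not sketched here.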
dc.description.provenance (en): Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-08-15T17:19:53Z. No. of bitstreams: 0
dc.description.provenance (en): Made available in DSpace on 2024-08-15T17:19:53Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
Table of Contents i
Acknowledgements iii
Abstract (Chinese) iv
Abstract (English) vi
List of Figures viii
List of Tables xi
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Objectives 4
Chapter 2 Literature Review 6
2.1 Ladybugs 6
2.2 Ladybug Production Methods and Sex Identification 7
2.2.1 Ladybug Rearing Conditions 7
2.2.2 Traditional Rearing 9
2.2.3 Effects of Ladybug Sex Ratio 11
2.2.4 Sex Identification of Ladybugs 13
2.3 Automated Production Systems 14
2.3.1 Development of Automated Machinery 14
2.3.2 Development of Vision-Based Automated Machinery 16
2.4 Image Recognition Models 19
2.4.1 Image Processing 20
2.4.2 Convolutional Neural Networks (CNN) 22
2.4.3 Object Detection Models 23
Chapter 3 Materials and Methods 32
3.1 System Architecture 32
3.2 Material Specifications 34
3.2.1 Cheilomenes sexmaculata 35
3.2.2 Adult Rearing Box and Transfer Channel Design 37
3.2.3 Hardware Architecture of the Six-Axis Moving Platform 39
3.2.4 Electrical Control and Pneumatic System Component Layout 41
3.2.5 End-Effector Suction Device Configuration 44
3.2.6 Image Capture Setup 45
3.3 Image Processing and a YOLOv8 Deep Learning Model for Precise Localization of Ladybug Center Points 47
3.3.1 Detecting Ladybug Center Points with Image Processing Techniques 48
3.3.2 Building Ladybug Object Detection and Sex Identification Models with Deep Learning 52
3.4 Validation Experiments 59
3.4.1 Ladybug Positioning and Suction Success Rate Test on the Six-Axis Moving Platform 59
3.4.2 Model Identification Accuracy Test 60
3.4.3 Ladybug Placement Success Rate Test 60
Chapter 4 Results and Discussion 62
4.1 Vision System Recognition Results 62
4.1.1 Center-Point Detection Results for Cheilomenes sexmaculata Using Image Processing 62
4.1.2 Center-Point Localization Results Using the Deep Learning Object Detection Model 64
4.2 Suction Rate Results for the Six-Axis Moving Platform 66
4.3 Sex Identification Results for Cheilomenes sexmaculata 67
4.3.1 Automated Placement Success Rate Test for Cheilomenes sexmaculata 70
Chapter 5 Conclusions 72
References 73
dc.language.iso: zh_TW
dc.title: 瓢蟲的自動化雌雄分離系統應用於六條瓢蟲 (zh_TW)
dc.title: Automated gender separate system for ladybug, Cheilomenes sexmaculata (Coleoptera: Coccinellidae) (en)
dc.type: Thesis
dc.date.schoolyear: 112-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 周明儀;王永鐘;俞齊山 (zh_TW)
dc.contributor.oralexamcommittee: Ming-Yi Chou; Yung-Chung Wang; Chi-Shan Yu (en)
dc.subject.keyword: 機器視覺, 生物防治, 六條瓢蟲, 自動化生產系統, YOLOv8 (zh_TW)
dc.subject.keyword: Machine Vision, biological control, Cheilomenes sexmaculata, Automated Production System, YOLOv8 (en)
dc.relation.page: 77
dc.identifier.doi: 10.6342/NTU202403463
dc.rights.note: 未授權 (not authorized)
dc.date.accepted: 2024-08-12
dc.contributor.author-college: 生物資源暨農學院 (College of Bio-Resources and Agriculture)
dc.contributor.author-dept: 生物機電工程學系 (Department of Biomechatronics Engineering)
Appears in Collections: 生物機電工程學系 (Department of Biomechatronics Engineering)

Files in This Item:
File: ntu-112-2.pdf (currently not authorized for public access)
Size: 8.15 MB
Format: Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
