Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85975
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 朱元南(Yuan-Nan Chu) | |
dc.contributor.author | Kuan-Wei Liao | en |
dc.contributor.author | 廖冠瑋 | zh_TW |
dc.date.accessioned | 2023-03-19T23:31:13Z | - |
dc.date.copyright | 2022-09-30 | |
dc.date.issued | 2022 | |
dc.date.submitted | 2022-09-28 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85975 | - |
dc.description.abstract | The whiteleg shrimp (Penaeus vannamei) is currently the most important species in the global shrimp farming industry, and real-time observation of its behavior patterns helps improve feed efficiency and reduce costs. Traditionally, white shrimp are observed with lift nets (feeding trays), which is time-consuming and labor-intensive and disturbs the shrimps' activity. This study developed a white shrimp IoT monitoring system comprising real-time image observation, a white shrimp detection model, feeding-parameter analysis, and automatic feeding. A variety of digital and analog underwater imaging systems and accessories were developed; controlled by a Raspberry Pi with infrared illumination, they fit three types of culture ponds, provide real-time viewing and backup of streaming video, and capture white shrimp images in real time under six water colors, including shrimp fry and nighttime images. A white shrimp detection model was developed with deep learning: shrimp images were annotated as shrimp body and tail fan, the convolutional neural network model YOLOv4 was used to detect shrimp positions, and image processing separated the shrimp from the background to compute the body length, dropping (fecal-string) rate, and appearing number, collectively called the feeding parameters. A 24-hour continuous statistical analysis of the shrimp images was carried out every week over two and a half months to characterize shrimp growth and behavior patterns. The monitoring system was validated at three shrimp farms in Changhua, Kaohsiung, and Pingtung. The YOLOv4 model detected white shrimp with a mean average precision (mAP) of 0.76; the accuracy of the appearing number was 0.85, of the dropping rate 0.69, and of the body length 0.94. Body length changed linearly with time, with a coefficient of determination (R²) of 0.98, and the method is applicable to shrimp longer than 3 cm. The height of the imaging platform affects the chance of shrimp appearing; the appearing number drops markedly after feeding and then rises, reflecting how strongly the shrimp are drawn away from the platform by the feed, which helps in judging appetite and adjusting the feeding amount and timing. The dropping rate followed a trend similar to that of the appearing number in the first half of the period but showed no significant trend in the second half. The automatic feeder developed in this study for raceway ponds supports cloud-controlled feeding, making nighttime feeding feasible, and a fully automatic feeding experiment was completed in Pingtung. The proposed white shrimp IoT monitoring system overcomes the difficulty of observing white shrimp in pond culture, provides important information on shrimp growth and behavior patterns, and facilitates the automation of feeding control. | zh_TW |
dc.description.abstract | Penaeus vannamei is currently the most important species in the global shrimp farming industry. Real-time observation of white shrimp behavior patterns can help improve feed efficiency and reduce costs. Traditionally, white shrimp are observed with feeding trays, a method that is time-consuming, labor-intensive, and disruptive to shrimp activity. This study developed a white shrimp IoT monitoring system that includes real-time image observation, a white shrimp detection model, feeding-parameter analysis, and automatic feeding. A variety of digital and analog underwater imaging systems and accessories were developed; controlled by a Raspberry Pi and equipped with infrared lighting, they fit three types of aquaculture ponds, provide real-time viewing and backup of streaming video, and can capture white shrimp images in real time under six water colors, including shrimp fry and nighttime images. A white shrimp detection model was developed with deep learning: shrimp images were annotated as shrimp body and tail fan, the convolutional neural network model YOLOv4 was used to detect shrimp locations, and image processing was used to separate the shrimp from the background. The body length, dropping (fecal-string) rate, and appearing number of the shrimp, collectively referred to as the feeding parameters, were then computed. A 24-hour continuous statistical analysis of the shrimp images was performed every week over two and a half months to characterize shrimp growth and behavior. The monitoring system was verified at three shrimp farms in Changhua, Kaohsiung, and Pingtung. The results show that the YOLOv4 model detected white shrimp with a mean average precision (mAP) of 0.76; the accuracy of the appearing number was 0.85, of the dropping rate 0.69, and of the body length 0.94. Body length changed linearly with time, with a coefficient of determination (R²) of 0.98, and the method is suitable for white shrimp longer than 3 cm. The height of the imaging platform affects the chance of shrimp appearing: the appearing number drops markedly after feeding and then rises again, indicating how strongly the shrimp are drawn away from the platform by the feed, which helps in judging shrimp appetite and adjusting the feeding amount and timing. The dropping rate followed a trend similar to that of the appearing number in the first half of the period but showed no significant trend in the second half. The automatic feeder developed in this study for raceway ponds supports cloud-controlled feeding, making nighttime feeding feasible; a fully automatic feeding experiment was completed in Pingtung. The proposed white shrimp IoT monitoring system overcomes the bottleneck of observing white shrimp in pond culture, provides important information on white shrimp growth and behavior patterns, and facilitates the automation of feeding control. | en |
dc.description.provenance | Made available in DSpace on 2023-03-19T23:31:13Z (GMT). No. of bitstreams: 1 U0001-2709202211420000.pdf: 8780176 bytes, checksum: 3bbd3b770a72b573a60addb1e3b4d5d7 (MD5) Previous issue date: 2022 | en |
dc.description.tableofcontents | Acknowledgements i Chinese Abstract ii Abstract iv Table of Contents vi List of Figures viii List of Tables xi Chapter 1 Introduction 1 Chapter 2 Literature Review 4 2.1 Overview of shrimp farming 4 2.2 Biofloc 5 2.3 Shrimp morphology 6 2.4 Shrimp monitoring methods 7 2.5 Deep learning and image recognition 9 2.6 Applications of deep learning in aquaculture 11 Chapter 3 Materials and Methods 13 3.1 Experimental sites and planning 13 3.2 Underwater imaging system design 15 3.3 Image collection methods 22 3.4 Image calibration 25 3.5 Image annotation and training sets 26 3.6 YOLOv4 29 3.7 Image post-processing for the feeding parameters 32 3.8 Automatic feeder 34 3.9 White shrimp IoT monitoring system 36 Chapter 4 Results and Discussion 38 4.1 Comparison of the underwater imaging systems 38 4.2 Discussion of image collection results 40 4.3 Detection errors of the object detection model 44 4.4 Discussion of the feeding parameters 48 4.4.1 Appearing number 48 4.4.2 Dropping rate 54 4.4.3 Body length 58 4.5 Validation of the optimal feeding strategy 59 Chapter 5 Conclusions 64 References 65 | |
dc.language.iso | zh-TW | |
dc.title | Development of an IoT monitoring system for whiteleg shrimp growth and behavior patterns using deep learning and underwater images | zh_TW |
dc.title | Development of an IoT monitoring system for whiteleg shrimp's growth and behavior patterns using deep learning and underwater images | en |
dc.type | Thesis | |
dc.date.schoolyear | 110-2 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 郭彥甫(Yan-Fu Kuo),劉擎華(Chyng-Hwa Liou) | |
dc.subject.keyword | white shrimp, underwater image, Raspberry Pi, deep learning, YOLOv4, dropping, IoT | zh_TW |
dc.subject.keyword | white shrimp, underwater image, Raspberry Pi, deep learning, YOLOv4, dropping, IoT | en |
dc.relation.page | 70 | |
dc.identifier.doi | 10.6342/NTU202204147 | |
dc.rights.note | Authorized for open access (worldwide) | |
dc.date.accepted | 2022-09-29 | |
dc.contributor.author-college | College of Bioresources and Agriculture | zh_TW |
dc.contributor.author-dept | Department of Biomechatronics Engineering | zh_TW |
dc.date.embargo-lift | 2022-09-30 | - |
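The abstracts above describe a pipeline in which YOLOv4 detects shrimp bodies and tail fans in underwater frames, after which image processing yields three feeding parameters: appearing number, body length, and dropping rate. The thesis's actual implementation is in the PDF below; the following Python sketch only illustrates how such per-frame parameters could be derived from detection results. The `Detection` structure, the `PIXELS_PER_CM` calibration constant, and the fecal-string flags are illustrative assumptions, not the authors' code.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical detection record: one YOLOv4 bounding box per detected object.
# Classes follow the abstract's annotation scheme: "body" and "tail_fan".
@dataclass
class Detection:
    cls: str    # "body" or "tail_fan"
    x: float    # box centre x (pixels)
    y: float    # box centre y (pixels)
    w: float    # box width (pixels)
    h: float    # box height (pixels)

# Assumed calibration: pixels per centimetre at the platform height.
PIXELS_PER_CM = 18.0

def feeding_parameters(frame_detections, fecal_string_flags):
    """Compute illustrative feeding parameters for one frame.

    frame_detections: list of Detection for the frame.
    fecal_string_flags: list of bool, one per detected body, True if that
        shrimp is dragging a fecal string (e.g. from a segmentation step).
    """
    bodies = [d for d in frame_detections if d.cls == "body"]

    # Appearing number: how many shrimp bodies are visible in the frame.
    appearing_number = len(bodies)

    # Body length: longest side of each body box, converted to centimetres.
    body_lengths_cm = [max(d.w, d.h) / PIXELS_PER_CM for d in bodies]
    mean_length_cm = mean(body_lengths_cm) if body_lengths_cm else None

    # Dropping rate: fraction of visible shrimp dragging fecal strings.
    dropping_rate = (
        sum(fecal_string_flags) / appearing_number if appearing_number else 0.0
    )
    return appearing_number, mean_length_cm, dropping_rate

# Example with two detected shrimp, one of which is dragging a fecal string.
frame = [
    Detection("body", 120, 200, 150, 40),
    Detection("tail_fan", 200, 205, 30, 25),
    Detection("body", 400, 310, 130, 35),
]
print(feeding_parameters(frame, [True, False]))
```

Since the abstract reports that body length increases linearly with time (R² = 0.98), daily means of the estimated lengths could, under the same assumptions, be fitted with an ordinary least-squares line to track growth.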
Appears in Collections: | Department of Biomechatronics Engineering
Files in This Item:
File | Size | Format | |
---|---|---|---|
U0001-2709202211420000.pdf | 8.57 MB | Adobe PDF | View/Open
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated by their specific copyright terms.