Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99269
Full metadata record
dc.contributor.advisor: 劉力瑜 (zh_TW)
dc.contributor.advisor: Li-yu Daisy Liu (en)
dc.contributor.author: 李俊翰 (zh_TW)
dc.contributor.author: Chun-Han Lee (en)
dc.date.accessioned: 2025-08-21T17:03:33Z
dc.date.available: 2025-08-22
dc.date.copyright: 2025-08-21
dc.date.issued: 2025
dc.date.submitted: 2025-07-30
dc.identifier.citation: Agresti, A. 2012. Categorical data analysis. John Wiley & Sons.
Ali, M.M., N.A. Bachik, N. ‘Atirah Muhadi, T.N. Tuan Yusof, and C. Gomes. 2019. Non-destructive techniques of detecting plant diseases: A review. Physiological and Molecular Plant Pathology 108: 101426. doi: 10.1016/j.pmpp.2019.101426.
Alves, A.K.S., M.S. Araújo, S.F.S. Chaves, L.A.S. Dias, L.P. Corrêdo, et al. 2024. High throughput phenotyping in soybean breeding using RGB image vegetation indices based on drone. Sci Rep 14(1): 32055. doi: 10.1038/s41598-024-83807-4.
Attallah, O. 2023. Deep Learning-Based Model For Paddy Diseases Classification By Thermal Infrared Sensor: An Application For Precision Agriculture. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. X-1/W1-2023: 779–784. doi: 10.5194/isprs-annals-X-1-W1-2023-779-2023.
Bajocco, S., F. Ginaldi, F. Savian, D. Morelli, M. Scaglione, et al. 2022. On the Use of NDVI to Estimate LAI in Field Crops: Implementing a Conversion Equation Library. Remote Sensing 14(15): 3554. doi: 10.3390/rs14153554.
Bannari, A., D. Morin, F. Bonn, and A. Huete. 1995. A review of vegetation indices. Remote sensing reviews 13(1–2): 95–120. doi: 10.1080/02757259509532298.
Beauchêne, K., F. Leroy, A. Fournier, C. Huet, M. Bonnefoy, et al. 2019. Management and Characterization of Abiotic Stress via PhénoField®, a High-Throughput Field Phenotyping Platform. Front Plant Sci 10: 904. doi: 10.3389/fpls.2019.00904.
Behmann, J., K. Acebron, D. Emin, S. Bennertz, S. Matsubara, et al. 2018. Specim IQ: Evaluation of a New, Miniaturized Handheld Hyperspectral Camera and Its Application for Plant Phenotyping and Disease Detection. Sensors 18(2): 441. doi: 10.3390/s18020441.
Benos, L., A.C. Tagarakis, G. Dolias, R. Berruto, D. Kateris, et al. 2021. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors 21(11): 3758. doi: 10.3390/s21113758.
Bhatti, M.A., M.S. Syam, H. Chen, Y. Hu, L.W. Keung, et al. 2024. Utilizing convolutional neural networks (CNN) and U-Net architecture for precise crop and weed segmentation in agricultural imagery: A deep learning approach. Big Data Research 36: 100465. doi: 10.1016/j.bdr.2024.100465.
Bodner, G., A. Nakhforoosh, T. Arnold, and D. Leitner. 2018. Hyperspectral imaging: a novel approach for plant root phenotyping. Plant Methods 14(1): 84. doi: 10.1186/s13007-018-0352-1.
Böhler, J.E., M.E. Schaepman, and M. Kneubühler. 2018. Crop Classification in a Heterogeneous Arable Landscape Using Uncalibrated UAV Data. Remote Sensing 10(8): 1282. doi: 10.3390/rs10081282.
Borra-Serrano, I., T. De Swaef, P. Quataert, J. Aper, A. Saleem, et al. 2020. Closing the Phenotyping Gap: High Resolution UAV Time Series for Soybean Growth Analysis Provides Objective Data from Field Trials. Remote Sensing 12(10): 1644. doi: 10.3390/rs12101644.
Brugger, A., J. Behmann, S. Paulus, H.-G. Luigs, M.T. Kuska, et al. 2019. Extending Hyperspectral Imaging for Plant Phenotyping to the UV-Range. Remote Sensing 11(12): 1401. doi: 10.3390/rs11121401.
Chen, P.-C., Y.-C. Chiang, and P.-Y. Weng. 2020. Imaging using unmanned aerial vehicles for agriculture land use classification. Agriculture 10(9): 416.
Chen, T., and C. Guestrin. 2016. XGBoost: A Scalable Tree Boosting System. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, San Francisco California USA. p. 785–794
Chen, T., T. He, M. Benesty, V. Khotilovich, Y. Tang, et al. 2024. xgboost: Extreme Gradient Boosting. https://cran.r-project.org/web/packages/xgboost/index.html (accessed 6 January 2025).
Chen, Z., J. Wang, T. Wang, Z. Song, Y. Li, et al. 2021. Automated in-field leaf-level hyperspectral imaging of corn plants using a Cartesian robotic platform. Computers and Electronics in Agriculture 183: 105996. doi: 10.1016/j.compag.2021.105996.
Christians, N.E., A.J. Patton, and Q.D. Law. 2016. Fundamentals of Turfgrass Management, Fifth Edition. 1st ed. Wiley.
Colwell, F. de J., J. Souter, G.J. Bryan, L.J. Compton, N. Boonham, et al. 2021. Development and Validation of Methodology for Estimating Potato Canopy Structure for Field Crop Phenotyping and Improved Breeding. Front. Plant Sci. 12. doi: 10.3389/fpls.2021.612843.
Daponte, P., L.D. Vito, L. Glielmo, L. Iannelli, D. Liuzza, et al. 2019. A review on the use of drones for precision agriculture. IOP Conf. Ser.: Earth Environ. Sci. 275(1): 012022. doi: 10.1088/1755-1315/275/1/012022.
De Swaef, T., W.H. Maes, J. Aper, J. Baert, M. Cougnon, et al. 2021. Applying RGB- and Thermal-Based Vegetation Indices from UAVs for High-Throughput Field Phenotyping of Drought Tolerance in Forage Grasses. Remote Sensing 13(1): 147. doi: 10.3390/rs13010147.
Di Gennaro, S.F., P. Toscano, M. Gatti, S. Poni, A. Berton, et al. 2022. Spectral Comparison of UAV-Based Hyper and Multispectral Cameras for Precision Viticulture. Remote Sensing 14(3): 449. doi: 10.3390/rs14030449.
Dulhare, U.N., K. Ahmad, and K.A.B. Ahmad, editors. 2020. Machine learning and big data: concepts, algorithms, tools and applications. Wiley-Scrivener, Hoboken, NJ.
Falcioni, R., J.V.F. Gonçalves, K.M. de Oliveira, C.A. de Oliveira, A.S. Reis, et al. 2023. Chemometric Analysis for the Prediction of Biochemical Compounds in Leaves Using UV-VIS-NIR-SWIR Hyperspectroscopy. Plants 12(19): 3424. doi: 10.3390/plants12193424.
Franzius, M., M. Dunn, N. Einecke, and R. Dirnberger. 2017. Embedded Robust Visual Obstacle Detection on Autonomous Lawn Mowers. 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, Honolulu, HI, USA. p. 361–369
GEOSAT Aerospace & Technology Inc. GEOSAT GeoCCP. https://www.geosat.com.tw/TW/product-platform-geoccp.aspx.
Gill, T., S.K. Gill, D.K. Saini, Y. Chopra, J.P. de Koff, et al. 2022. A Comprehensive Review of High Throughput Phenotyping and Machine Learning for Plant Stress Phenotyping. Phenomics 2(3): 156–183. doi: 10.1007/s43657-022-00048-z.
Haralick, R.M., K. Shanmugam, and I. Dinstein. 1973. Textural Features for Image Classification. IEEE Transactions on Systems, Man, and Cybernetics SMC-3(6): 610–621. doi: 10.1109/TSMC.1973.4309314.
Hassan, S.I., M.M. Alam, U. Illahi, M.A. Al Ghamdi, S.H. Almotiri, et al. 2021. A Systematic Review on Monitoring and Advanced Control Strategies in Smart Agriculture. IEEE Access 9: 32517–32548. doi: 10.1109/ACCESS.2021.3057865.
Hassan, M.A., M. Yang, A. Rasheed, G. Yang, M. Reynolds, et al. 2019. A rapid monitoring of NDVI across the wheat growth cycle for grain yield prediction using a multi-spectral UAV platform. Plant Science 282: 95–103. doi: 10.1016/j.plantsci.2018.10.022.
Hijmans, R.J., J. van Etten, M. Sumner, J. Cheng, D. Baston, et al. 2024. raster: Geographic Data Analysis and Modeling. https://cran.r-project.org/web/packages/raster/index.html (accessed 6 January 2025).
Huang, S., Z. Chen, M. Hu, Y. Xue, L. Liao, et al. 2021. First Report of Bacterial Soft Rot Disease on Taro Caused by Dickeya fangzhongdai in China. Plant Disease 105(11): 3737. doi: 10.1094/PDIS-10-20-2225-PDN.
Ishimwe, R., K. Abutaleb, and F. Ahmed. 2014. Applications of Thermal Imaging in Agriculture—A Review. ARS 03(03): 128–140. doi: 10.4236/ars.2014.33011.
James, G., D. Witten, T. Hastie, and R. Tibshirani. 2013. An Introduction to Statistical Learning. Springer New York, New York, NY.
Jeroen Ooms. 2024. magick: Advanced Graphics and Image-Processing in R. https://CRAN.R-project.org/package=magick.
Jia, X. 2022. Field Guide to Hyperspectral / Multispectral Image Processing. 1st ed. SPIE, Bellingham.
Jin, X., P.J. Zarco-Tejada, U. Schmidhalter, M.P. Reynolds, M.J. Hawkesford, et al. 2021. High-Throughput Estimation of Crop Traits: A Review of Ground and Aerial Phenotyping Platforms. IEEE Geoscience and Remote Sensing Magazine 9(1): 200–231. doi: 10.1109/MGRS.2020.2998816.
Kim, H.-O., and J.-M. Yeom. 2014. Effect of red-edge and texture features for object-based paddy rice crop classification using RapidEye multi-spectral satellite image data. International Journal of Remote Sensing: 1–23. doi: 10.1080/01431161.2014.965285.
Kim, K.-L., B.-J. Kim, Y.-K. Lee, and J.-H. Ryu. 2019. Generation of a Large-Scale Surface Sediment Classification Map Using Unmanned Aerial Vehicle (UAV) Data: A Case Study at the Hwang-do Tidal Flat, Korea. Remote Sensing 11(3): 229. doi: 10.3390/rs11030229.
Kraus, K. 2011. Photogrammetry: geometry from images and laser scans. Walter de Gruyter.
Ku, K.-B., S. Mansoor, G.D. Han, Y.S. Chung, and T.T. Tuan. 2023. Identification of new cold tolerant Zoysia grass species using high-resolution RGB and multi-spectral imaging. Sci Rep 13: 13209. doi: 10.1038/s41598-023-40128-2.
Kuhn, M., J. Wing, S. Weston, Andre Williams, C. Keefer, et al. 2024. caret: Classification and Regression Training. https://cran.r-project.org/web/packages/caret/index.html (accessed 6 January 2025).
Kwak, G.-H., and N.-W. Park. 2019. Impact of texture information on crop classification with machine learning and UAV images. Applied Sciences 9(4): 643. doi: 10.3390/app9040643.
Lammers, B. 2020. ANN2: Artificial Neural Networks for Anomaly Detection. https://CRAN.R-project.org/package=ANN2.
Laroche-Pinel, E., K.R. Vasquez, and L. Brillante. 2024. Assessing grapevine water status in a variably irrigated vineyard with NIR/SWIR hyperspectral imaging from UAV. Precision Agric 25(5): 2356–2374. doi: 10.1007/s11119-024-10170-9.
Lee, C.-H., K.-Y. Chen, and L.D. Liu. 2024. Effect of Texture Feature Distribution on Agriculture Field Type Classification with Multitemporal UAV RGB Images. Remote Sensing 16(7): 1221. doi: 10.3390/rs16071221.
Lee, C.-H., and L.-Y.D. Liu. 2021. agrifeature: Agriculture Image Feature. https://cran.r-project.org/web/packages/agrifeature/index.html (accessed 6 January 2025).
Lewis, R.J. 2000. An Introduction to Classification and Regression Tree (CART) Analysis. Department of Emergency Medicine Harbor-UCLA Medical Center Torrance, San Francisco, California, USA
Li, C., H. Li, J. Li, Y. Lei, C. Li, et al. 2019. Using NDVI percentiles to monitor real-time crop growth. Computers and Electronics in Agriculture 162: 357–363. doi: 10.1016/j.compag.2019.04.026.
Li, F., Y. Miao, G. Feng, F. Yuan, S. Yue, et al. 2014. Improving estimation of summer maize nitrogen status with red edge-based spectral vegetation indices. Field Crops Research 157: 111–123. doi: 10.1016/j.fcr.2013.12.018.
Liu, Y., S. Zhou, H. Wu, W. Han, C. Li, et al. 2022. Joint optimization of autoencoder and Self-Supervised Classifier: Anomaly detection of strawberries using hyperspectral imaging. Computers and Electronics in Agriculture 198: 107007. doi: 10.1016/j.compag.2022.107007.
Mahlein, A.-K., M.T. Kuska, J. Behmann, G. Polder, and A. Walter. 2018. Hyperspectral Sensors and Imaging Technologies in Phytopathology: State of the Art. Annu. Rev. Phytopathol. 56(1): 535–558. doi: 10.1146/annurev-phyto-080417-050100.
Manavalan, L.P., I. Cui, K.V. Ambrose, S. Panjwani, S. DeLong, et al. 2021. Systematic approach to validate and implement digital phenotyping tool for soybean: A case study with PlantEye. The Plant Phenome Journal 4(1): e20025. doi: 10.1002/ppj2.20025.
Marín, J., L. Parra, J. Rocher, S. Sendra, J. Lloret, et al. 2018. Urban Lawn Monitoring in Smart City Environments. Journal of Sensors 2018(1): 8743179. doi: 10.1155/2018/8743179.
Mashaba-Munghemezulu, Z., G.J. Chirima, and C. Munghemezulu. 2021. Mapping Smallholder Maize Farms Using Multi-Temporal Sentinel-1 Data in Support of the Sustainable Development Goals. Remote Sensing 13(9): 1666. doi: 10.3390/rs13091666.
Mavridou, E., E. Vrochidou, G.A. Papakostas, T. Pachidis, and V.G. Kaburlasos. 2019. Machine Vision Systems in Precision Agriculture for Crop Farming. Journal of Imaging 5(12): 89. doi: 10.3390/jimaging5120089.
Mendoza-Bernal, J., A. González-Vidal, and A.F. Skarmeta. 2024. A Convolutional Neural Network approach for image-based anomaly detection in smart agriculture. Expert Systems with Applications 247: 123210. doi: 10.1016/j.eswa.2024.123210.
Meyer, D., E. Dimitriadou, K. Hornik, A. Weingessel, F. Leisch, et al. 2024. e1071: Misc Functions of the Department of Statistics, Probability Theory Group (Formerly: E1071), TU Wien. https://cran.r-project.org/web/packages/e1071/index.html (accessed 6 January 2025).
Mienye, I.D., and Y. Sun. 2022. A Survey of Ensemble Learning: Concepts, Algorithms, Applications, and Prospects. IEEE Access 10: 99129–99149. doi: 10.1109/ACCESS.2022.3207287.
Mikhail, E.M. 2001. Introduction to Modern Photogrammetry. John Wiley & Sons.
Motohka, T., K.N. Nasahara, A. Miyata, M. Mano, and S. Tsuchida. 2009. Evaluation of optical satellite remote sensing for rice paddy phenology in monsoon Asia using a continuous in situ dataset. International Journal of Remote Sensing 30(17): 4343–4357. doi: 10.1080/01431160802549369.
Nagler, P.L., Y. Inoue, E.P. Glenn, A.L. Russ, and C.S.T. Daughtry. 2003. Cellulose absorption index (CAI) to quantify mixed soil–plant litter scenes. Remote Sensing of Environment 87(2): 310–325. doi: 10.1016/j.rse.2003.06.001.
Nelsen, T.S., J. Hegarty, S. Tamagno, and M.E. Lundy. 2025. NDVI-based ideotypes as a cost-effective tool to support wheat yield stability selection under heterogeneous environments. Field Crops Research 322: 109727. doi: 10.1016/j.fcr.2024.109727.
Noble, W.S. 2006. What is a support vector machine? Nat Biotechnol 24(12): 1565–1567. doi: 10.1038/nbt1206-1565.
Olson, D., and J. Anderson. 2021. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agronomy Journal 113(2): 971–992. doi: 10.1002/agj2.20595.
Ozaki, Y., C. Huck, S. Tsuchikawa, and S.B. Engelsen, editors. 2021. Near-Infrared Spectroscopy: Theory, Spectral Analysis, Instrumentation, and Applications. Springer Singapore, Singapore.
Park, E., Y.-S. Kim, M.A. Faqeerzada, M.S. Kim, I. Baek, et al. 2023. Hyperspectral reflectance imaging for nondestructive evaluation of root rot in Korean ginseng (Panax ginseng Meyer). Front. Plant Sci. 14. doi: 10.3389/fpls.2023.1109060.
Parra, L., A. Ahmad, M. Zaragoza-Esquerdo, A. Ivars-Palomares, S. Sendra, et al. 2024. A Comprehensive Survey of Drones for Turfgrass Monitoring. Drones 8(10): 563. doi: 10.3390/drones8100563.
Pebesma, E., R. Bivand, E. Racine, M. Sumner, I. Cook, et al. 2024. sf: Simple Features for R. https://cran.r-project.org/web/packages/sf/index.html (accessed 6 January 2025).
Phang, S.K., T.H.A. Chiang, A. Happonen, and M.M.L. Chang. 2023. From Satellite to UAV-Based Remote Sensing: A Review on Precision Agriculture. IEEE Access 11: 127057–127076. doi: 10.1109/ACCESS.2023.3330886.
QGIS.org. 2018. QGIS Geographic Information System. http://qgis.org.
Qiao, L., W. Tang, D. Gao, R. Zhao, L. An, et al. 2022a. UAV-based chlorophyll content estimation by evaluating vegetation index responses under different crop coverages. Computers and Electronics in Agriculture 196: 106775. doi: 10.1016/j.compag.2022.106775.
Qin, J., O. Monje, M.R. Nugent, J.R. Finn, A.E. O’Rourke, et al. 2023. A hyperspectral plant health monitoring system for space crop production. Front. Plant Sci. 14. doi: 10.3389/fpls.2023.1133505.
Queiroz, D.M.D., D.S.M. Valente, F.D.A.D.C. Pinto, Aluízio Borém, and J.K. Schueller, editors. 2022. Digital Agriculture. Springer International Publishing, Cham.
R Core Team. 2019. R: A language and environment for statistical computing. https://www.R-project.org/.
Radočaj, D., A. Šiljeg, R. Marinović, and M. Jurišić. 2023. State of Major Vegetation Indices in Precision Agriculture Studies Indexed in Web of Science: A Review. Agriculture 13(3): 707. doi: 10.3390/agriculture13030707.
Rajbahadur, G.K., S. Wang, Y. Kamei, and A.E. Hassan. 2019. Impact of discretization noise of the dependent variable on machine learning classifiers in software engineering. IEEE Transactions on Software Engineering 47(7): 1414–1430.
Ram, B.G., P. Oduor, C. Igathinathane, K. Howatt, and X. Sun. 2024. A systematic review of hyperspectral imaging in precision agriculture: Analysis of its current state and future prospects. Computers and Electronics in Agriculture 222: 109037. doi: 10.1016/j.compag.2024.109037.
Rowlands, D.A. 2020. Physics of digital photography. IOP Publishing.
Santos, M.G., M. Nunes da Silva, M.W. Vasconcelos, and S.M.P. Carvalho. 2024. Scientific and technological advances in the development of sustainable disease management tools: a case study on kiwifruit bacterial canker. Front Plant Sci 14: 1306420. doi: 10.3389/fpls.2023.1306420.
Septiarini, A., A. Sunyoto, H. Hamdani, A.A. Kasim, F. Utaminingrum, et al. 2021. Machine vision for the maturity classification of oil palm fresh fruit bunches based on color and texture features. Scientia Horticulturae 286: 110245. doi: 10.1016/j.scienta.2021.110245.
Sergyan, S. 2008. Color histogram features based image classification in content-based image retrieval systems. 2008 6th international symposium on applied machine intelligence and informatics. IEEE. p. 221–224
Sharma, S. 1995. Applied multivariate techniques. John Wiley & Sons, Inc.
Shen, X., Y. Teng, H. Fu, Z. Wan, and X. Zhang. 2020. Crop identification using UAV image segmentation. Second Target Recognition and Artificial Intelligence Summit Forum. SPIE. p. 480–484
Shrivastava, V.K., and M.K. Pradhan. 2021. Rice plant disease classification using color features: a machine learning paradigm. J Plant Pathol 103(1): 17–26. doi: 10.1007/s42161-020-00683-3.
Sishodia, R.P., R.L. Ray, and S.K. Singh. 2020. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sensing 12(19): 3136. doi: 10.3390/rs12193136.
Skelsey, P. 2021. Forecasting Risk of Crop Disease with Anomaly Detection Algorithms. Phytopathology® 111(2): 321–332. doi: 10.1094/PHYTO-05-20-0185-R.
Song, H., S.-R. Yoon, Y.-M. Dang, J.-S. Yang, I.M. Hwang, et al. 2022. Nondestructive classification of soft rot disease in napa cabbage using hyperspectral imaging analysis. Sci Rep 12(1): 14707. doi: 10.1038/s41598-022-19169-6.
Srivastava, D., R. Wadhvani, and M. Gyanchandani. 2015. A Review: Color Feature Extraction Methods for Content Based Image Retrieval. 18(3).
Szeliski, R. 2022. Computer vision: algorithms and applications. Springer Nature.
Therneau, T., B. Atkinson, and B. Ripley. 2023. rpart: Recursive Partitioning and Regression Trees. https://cran.r-project.org/web/packages/rpart/index.html (accessed 6 January 2025).
Tian, D.P. 2013. A review on image feature extraction and representation techniques. International Journal of Multimedia and Ubiquitous Engineering 8(4): 385–396.
Tian, H., T. Wang, Y. Liu, X. Qiao, and Y. Li. 2020. Computer vision technology in agricultural automation —A review. Information Processing in Agriculture 7(1): 1–19. doi: 10.1016/j.inpa.2019.09.006.
Tomaštík, J., M. Mokroš, P. Surový, A. Grznárová, and J. Merganič. 2019. UAV RTK/PPK Method—An Optimal Solution for Mapping Inaccessible Forested Areas? Remote Sensing 11(6): 721. doi: 10.3390/rs11060721.
Vadivambal, R., and D.S. Jayas. 2011. Applications of Thermal Imaging in Agriculture and Food Industry—A Review. Food Bioprocess Technol 4(2): 186–199. doi: 10.1007/s11947-010-0333-5.
Vasconcelos, G.J.Q., G.S.R. Costa, T.V. Spina, and H. Pedrini. 2023. Low-Cost Robot for Agricultural Image Data Acquisition. Agriculture 13(2): 413. doi: 10.3390/agriculture13020413.
Waheed, T., R.B. Bonnell, S.O. Prasher, and E. Paulet. 2006. Measuring performance in precision agriculture: CART—A decision tree approach. Agricultural Water Management 84(1): 173–185. doi: 10.1016/j.agwat.2005.12.003.
Wang, T., B. Chen, Z. Zhang, H. Li, and M. Zhang. 2022. Applications of machine vision in agricultural robot navigation: A review. Computers and Electronics in Agriculture 198: 107085. doi: 10.1016/j.compag.2022.107085.
Wang, Y., G. Kootstra, Z. Yang, and H.A. Khan. 2024. UAV multispectral remote sensing for agriculture: A comparative study of radiometric correction methods under varying illumination conditions. Biosystems Engineering 248: 240–254. doi: 10.1016/j.biosystemseng.2024.11.005.
Wang, X., H. Zeng, X. Yang, J. Shu, Q. Wu, et al. 2025. Remote sensing revolutionizing agriculture: Toward a new frontier. Future Generation Computer Systems 166: 107691. doi: 10.1016/j.future.2024.107691.
Weiss, M., F. Jacob, and G. Duveiller. 2020. Remote sensing for agricultural applications: A meta-review. Remote Sensing of Environment 236: 111402. doi: 10.1016/j.rse.2019.111402.
Wei-Ta Chen, Wei-Chuan Liu, and M.-S. Chen. 2010. Adaptive color feature extraction based on image color distributions. IEEE Transactions on image processing 19(8): 2005–2016.
Wickham, H. 2016. ggplot2: elegant graphics for data analysis. Second edition. Springer international publishing, Cham.
Xiaobo, Z., Z. Jiewen, M. Hanpin, S. Jiyong, Y. Xiaopin, et al. 2010. Genetic Algorithm Interval Partial Least Squares Regression Combined Successive Projections Algorithm for Variable Selection in Near-Infrared Quantitative Analysis of Pigment in Cucumber Leaves. Appl Spectrosc 64(7): 786–794. doi: 10.1366/000370210791666246.
Xie, Q., J. Dash, W. Huang, D. Peng, Q. Qin, et al. 2018. Vegetation Indices Combining the Red and Red-Edge Spectral Information for Leaf Area Index Retrieval. IEEE J. Sel. Top. Appl. Earth Observations Remote Sensing 11(5): 1482–1493. doi: 10.1109/JSTARS.2018.2813281.
Xu, D., and X. Guo. 2015. Some Insights on Grassland Health Assessment Based on Remote Sensing. Sensors 15(2): 3070–3089. doi: 10.3390/s150203070.
Xue, J., and B. Su. 2017. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. Journal of Sensors 2017(1): 1353691. doi: 10.1155/2017/1353691.
Yang, C. 2020. Remote Sensing and Precision Agriculture Technologies for Crop Disease Detection and Management with a Practical Application Example. Engineering 6(5): 528–532. doi: 10.1016/j.eng.2019.10.015.
Yu, J., S.M. Sharpe, A.W. Schumann, and N.S. Boyd. 2019. Deep learning for image-based weed detection in turfgrass. European Journal of Agronomy 104: 78–84. doi: 10.1016/j.eja.2019.01.004.
Yuan, H., Z. Liu, Y. Cai, and B. Zhao. 2018. Research on Vegetation Information Extraction from Visible UAV Remote Sensing Images. 2018 Fifth International Workshop on Earth Observation and Remote Sensing Applications (EORSA). IEEE, Xi’an. p. 1–5
Zhang, X., B.A. Vinatzer, and S. Li. 2024. Hyperspectral imaging analysis for early detection of tomato bacterial leaf spot disease. Sci Rep 14(1): 27666. doi: 10.1038/s41598-024-78650-6.
Zhang, J., S. Xu, Y. Zhao, J. Sun, S. Xu, et al. 2023. Aerial orthoimage generation for UAV remote sensing: Review. Information Fusion 89: 91–120. doi: 10.1016/j.inffus.2022.08.007.
Zhao, G., Y. Pei, R. Yang, L. Xiang, Z. Fang, et al. 2022. A non-destructive testing method for early detection of ginseng root diseases using machine learning technologies based on leaf hyperspectral reflectance. Front. Plant Sci. 13. doi: 10.3389/fpls.2022.1031030.
Zhao, H., J. Wang, J. Guo, X. Hui, Y. Wang, et al. 2024. Detecting Water Stress in Winter Wheat Based on Multifeature Fusion from UAV Remote Sensing and Stacking Ensemble Learning Method. Remote Sensing 16(21): 4100. doi: 10.3390/rs16214100.
Zhou, J., M. Hu, A. Hu, C. Li, X. Ren, et al. 2022. Isolation and Genome Analysis of Pectobacterium colocasium sp. nov. and Pectobacterium aroidearum, Two New Pathogens of Taro. Front. Plant Sci. 13. doi: 10.3389/fpls.2022.852750.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99269
dc.description.abstract: 基於影像和光譜學的非破壞性測量技術在今日已成為獲取農業和作物相關資訊的重要工具。然而由於農業非破壞性測量應用的情境多元,不同研究問題間的空間尺度差異甚大,從基於遙測技術、橫跨多縣市的大範圍監測至單一田區,甚至近距離非破壞性的單一植株監測,皆運用非破壞性測量相關技術,而不同空間尺度會對資料的需求以及分析方法帶來差異。本研究旨在探討不同空間尺度下,非破壞性測量技術應用於農業資訊分析時所使用的分析工具,以及研究上的差異和限制。本研究提供了一套適用於不同空間尺度下影像或光譜資料的分析框架,將農業遙測的流程區分為光譜資料獲取、數據校正和前處理、特徵轉換、模型建立等步驟,並回顧與展示了每個步驟中可供選擇的工具或方法,以統整不同空間尺度下所使用的方法論。為了展示非破壞性測量技術在不同空間尺度下的應用,本研究透過三個分別在不同空間尺度下的案例,顯示非破壞性測量技術應用於農業資訊分析的成果。在大空間尺度的案例一中,本案例使用基於無人機收集的可見光影像,使用了色彩特徵,以及本案例所提出的基於灰度共生矩陣的紋理特徵 (Gray-Level Co-occurrence Matrix vector, GLCMv),並結合機器學習的 XGBoost 演算法,在嘉義與台南兩縣市進行田區地物種類的辨識。結果顯示,最佳特徵組合下的準確度達到 82%,同時相較於傳統紋理特徵,本案例所提出的 GLCMv 能顯著提升模型準確度。在中等空間尺度的案例二中,本案例以一款商用自走式割草機器人為載具,加裝可見光相機後,在多塊公園和開放式空間的草地上,於執行割草任務的同時進行影像收集,並利用植生指數 (Vegetation Index, VI) 以及 k-means 分群演算法來計算草坪上草和綠草的覆蓋率,再結合機器人的座標位置,利用覆蓋率指標繪製地圖並尋找空間中低覆蓋率的熱點區域。結果顯示,基於影像所得到的草覆蓋率和綠草覆蓋率兩項指標皆和專家的標註結果呈現相同的趨勢,案例中也建立起一套影像指標和專家標註間的轉換表。在地圖繪製的部分,不論是使用單次影像或是時序性割草任務,都能有效觀察到熱點區域的位置,且所辨識的熱點區域和人工在草地旁觀察到的結果一致;而在將影像指標轉換為專家標註後,系統能更有效地提供管理上的建議。同時該系統在用於高爾夫球場等高度管理的場域時,也能在影像指標的層級上辨識出覆蓋率較低的區域。在小空間尺度的案例三中,展示了使用手持式高光譜儀,在單株芋植株的葉面上測量反射率,以預測一個月後芋地下部的軟腐病 (Bacterial soft rot) 罹病情形。本案例將芋地下部因軟腐病造成的重量折損率,透過 0% 和 5% 兩個閾值轉換為折損與否以定義罹病情形,並在光譜特徵上使用了主成分分析 (Principal Component Analysis, PCA)、稀疏化,以及基於平均值和變異係數等統計量的特徵篩選三種不同的特徵篩選與處理技術來處理高維度的高光譜資料,利用基於深度學習的自編碼器 (Autoencoder) 建立異常檢測器以辨識罹病的異常植株,並利用特徵篩選的方式挑選出對分類結果有高影響力的關鍵波段。結果顯示,在 5% 的折損閾值下,使用 forward selection 進行特徵篩選時,本案例能在測試中達到最高 0.81 的 F1 score,有效地及早預測出罹病植株。同時本案例也篩選出 17 個重要波段,分別位於紫外光、綠光、紅光、紅光邊緣、近紅外光、短波長紅外光等範圍中,其中有 12 個波段已在前人研究中被指出與葉綠素、黃酮類化合物、蛋白質與氮含量和水分等植物體內的化合物含量有關。在不同案例間,基於不同的空間尺度以及研究目標,存在有載具、波段和資料處理與模型建立方式上的差異;但如本研究所提出的框架所示,在每個步驟中,不同尺度下的案例在欲達成的目標上是一致的,方法選擇上的差異則是由於資料的收集或形態上的不同所導致。 (zh_TW)
dc.description.abstract: Non-destructive sensing techniques based on imaging and spectroscopy have become important tools for obtaining agricultural and crop-related information. However, because agricultural non-destructive measurement is applied in diverse contexts, spatial scales differ greatly across research questions, ranging from remote-sensing-based monitoring across multiple counties, down to single fields, and even close-range sensing of individual plants, all of which rely on non-destructive measurement technologies. Different spatial scales lead to different data requirements and analytical methods. This study explores the analytical tools, as well as the research differences and limitations, associated with applying non-destructive measurement technologies to agricultural information analysis across spatial scales. It provides a framework for analyzing image or spectral data at different spatial scales, dividing the agricultural remote sensing workflow into spectral data acquisition, data correction and preprocessing, feature transformation, and model building, and it reviews and demonstrates the tools or methods available at each step in order to integrate the methodologies used across spatial scales. To demonstrate these techniques, this study presents three case studies, each at a different spatial scale. Case Study 1, at a large spatial scale, uses visible-light images collected by drones, combining color features and the proposed gray-level co-occurrence matrix vector (GLCMv) texture features with the XGBoost machine learning algorithm to classify field land-cover types in Chiayi and Tainan. The results show that accuracy reached 82% with the optimal feature combination, and that the proposed GLCMv significantly improved model accuracy compared with traditional texture features. Case Study 2, at a medium spatial scale, used a commercial autonomous lawn mower fitted with a visible-light camera to collect images while performing mowing tasks on lawns in several parks and open spaces. Vegetation indices (VIs) and the k-means clustering algorithm were used to calculate grass coverage and green-grass coverage, and, combined with the robot's coordinates, these coverage indicators were used to draw maps and locate low-coverage hotspots. The results show that both image-derived indicators, grass coverage and green-grass coverage, followed the same trends as the expert annotations, and a conversion table between the image indicators and the expert annotations was established. For map generation, hotspot locations could be identified effectively from either a single image or a time series of mowing tasks, and the identified hotspots agreed with those observed manually beside the lawns. After converting the image indicators into expert annotations, the system can provide more effective management recommendations, and when applied to highly managed environments such as golf courses, it can still identify areas with lower coverage at the image-indicator level.
Case Study 3, at a small spatial scale, used a handheld hyperspectral imager to measure leaf reflectance of individual taro plants in order to predict bacterial soft rot in the underground parts one month later. The weight loss rate of the underground parts caused by soft rot was converted into a binary disease label using two thresholds, 0% and 5%. Three feature selection and processing techniques were applied to the high-dimensional hyperspectral data: principal component analysis (PCA), sparsification, and feature selection based on statistics such as the mean and the coefficient of variation. An anomaly detector based on a deep learning autoencoder was built to identify diseased plants, and feature selection was used to identify key bands with a high impact on the classification results. The results show that, at the 5% loss threshold and with forward selection, this case achieved a maximum F1 score of 0.81 in testing, effectively predicting diseased plants at an early stage. The case also identified 17 important bands spanning the ultraviolet, green, red, red-edge, near-infrared, and short-wave infrared regions, 12 of which have been reported in previous studies to be associated with plant compounds such as chlorophyll, flavonoids, protein and nitrogen content, and water content. Across the cases, differences in spatial scale and research objectives lead to differences in platforms, spectral bands, data processing, and model building. As the proposed framework shows, however, the objective of each step is consistent across scales, and the differences in method selection arise from differences in how the data are collected and in their form. (en)
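The Case Study 1 workflow (GLCMv texture features plus XGBoost) is implemented in the cited agrifeature and xgboost R packages; the sketch below is only a minimal, self-contained illustration of the underlying idea, summarizing a gray-level patch into GLCM texture statistics and training XGBoost on synthetic patches. All data, offsets, and parameter values here are illustrative assumptions, not the thesis's code or settings.

```r
# Minimal sketch (not the thesis's agrifeature/GLCMv implementation): one GLCM per
# patch, summarized into Haralick-style statistics, then an XGBoost classifier.
library(xgboost)

glcm_features <- function(gray, levels = 8) {
  # quantize [0, 1] gray values into discrete levels
  q <- matrix(findInterval(gray, seq(0, 1, length.out = levels + 1),
                           rightmost.closed = TRUE), nrow = nrow(gray))
  P <- matrix(0, levels, levels)
  for (i in seq_len(nrow(q))) {
    for (j in seq_len(ncol(q) - 1)) {        # co-occurrence at offset (0, 1)
      P[q[i, j], q[i, j + 1]] <- P[q[i, j], q[i, j + 1]] + 1
    }
  }
  P <- P / sum(P)                            # joint probabilities
  idx <- expand.grid(i = 1:levels, j = 1:levels)
  with(idx, c(
    contrast    = sum((i - j)^2 * P),
    homogeneity = sum(P / (1 + (i - j)^2)),
    energy      = sum(P^2),
    entropy     = -sum(P[P > 0] * log(P[P > 0]))
  ))
}

set.seed(1)
# synthetic patches: class 0 = smooth gradient, class 1 = gradient plus noise
make_patch <- function(noisy) {
  base <- matrix(rep(seq(0, 1, length.out = 16), 16), 16, 16)
  if (noisy) base <- pmin(pmax(base + runif(256, -0.3, 0.3), 0), 1)
  base
}
labels <- rep(c(0, 1), each = 20)
feats  <- t(sapply(labels, function(y) glcm_features(make_patch(y == 1))))

dtrain <- xgb.DMatrix(data = feats, label = labels)
model  <- xgb.train(params = list(objective = "binary:logistic"),
                    data = dtrain, nrounds = 30, verbose = 0)
head(round(predict(model, xgb.DMatrix(feats)), 3))   # predicted class probabilities
```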
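The Case Study 2 coverage indicators follow a vegetation-index-plus-k-means pattern. The sketch below shows that pattern on a random stand-in frame, assuming the Excess Green index (ExG = 2G - R - B) as the RGB vegetation index and two clusters; the thesis's actual index, cluster count, and conversion to expert labels are not specified in this abstract.

```r
# Minimal sketch of the Case 2 pattern (vegetation index + k-means clustering);
# the "image" here is random noise standing in for a frame from the mower camera.
set.seed(42)
rgb_array <- array(runif(60 * 80 * 3), dim = c(60, 80, 3))   # H x W x 3, values in [0, 1]

r <- rgb_array[, , 1]; g <- rgb_array[, , 2]; b <- rgb_array[, , 3]
exg <- 2 * g - r - b                     # per-pixel RGB vegetation index (assumed ExG)

# k-means splits pixels into two groups; the cluster with the higher mean ExG
# is treated as "green grass" and its pixel fraction is the coverage indicator
km <- kmeans(as.vector(exg), centers = 2, nstart = 10)
green_cluster <- which.max(km$centers)
green_cover   <- mean(km$cluster == green_cluster)
round(green_cover, 3)
```

Repeating this per frame and attaching the mower's coordinates to each coverage value is what allows the low-coverage hotspot maps described in the abstract to be drawn.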
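Case Study 3 builds an autoencoder-based anomaly detector (the cited ANN2 package provides such autoencoders) and reports an F1 score. The sketch below illustrates only the reconstruct-and-threshold logic and the F1 computation, substituting a PCA "linear autoencoder" in base R and synthetic spectra; the band count, bottleneck size, and threshold are illustrative assumptions, not the thesis's settings.

```r
# Simplified stand-in for the Case 3 detector: fit a low-dimensional reconstruction
# on healthy spectra only and flag plants whose reconstruction error exceeds a cut-off.
set.seed(7)
n_band   <- 50
healthy  <- matrix(rnorm(40 * n_band, mean = 0.5, sd = 0.02), ncol = n_band)
diseased <- matrix(rnorm(10 * n_band, mean = 0.4, sd = 0.08), ncol = n_band)

pca <- prcomp(healthy, center = TRUE)      # "encoder" fitted on healthy plants only
k   <- 3                                   # bottleneck size (illustrative)
reconstruct <- function(X) {
  scores <- scale(X, center = pca$center, scale = FALSE) %*% pca$rotation[, 1:k]
  sweep(scores %*% t(pca$rotation[, 1:k]), 2, pca$center, "+")
}
recon_error <- function(X) rowMeans((X - reconstruct(X))^2)

threshold <- quantile(recon_error(healthy), 0.95)        # anomaly cut-off (illustrative)
pred  <- recon_error(rbind(healthy, diseased)) > threshold
truth <- c(rep(FALSE, 40), rep(TRUE, 10))

# F1 score, the metric reported in the abstract (0.81 at the 5% loss threshold)
tp <- sum(pred & truth); fp <- sum(pred & !truth); fn <- sum(!pred & truth)
2 * tp / (2 * tp + fp + fn)
```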
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-08-21T17:03:33Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2025-08-21T17:03:33Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents: 口試委員會審定書 I
致謝 II
中文摘要 III
英文摘要 V
目次 VII
圖次 XI
表次 XIII
1.前言 1
2.文獻回顧 4
2.1. 光譜資料處理流程 4
2.2. 光譜資料獲取 4
2.2.1. 遙測載具 4
2.2.2. 光譜波段 9
2.3. 感測器 13
2.4. 數據校正和前處理 15
2.4.1. 空間校正 15
2.4.2. 光譜校正 15
2.4.3. 正射拼接處理 16
2.4.4. 植生指數 (Vegetation Index, VI) 17
2.5. 特徵轉換 18
2.5.1. 色彩特徵 19
2.5.2. 紋理特徵 19
2.6. 模型建立 21
3.案例一:使用無人機可見光影像進行大範圍地物辨識 25
3.1. 案例介紹 25
3.2. 材料與方法 25
3.2.1. 資料取得 25
3.2.2. 工作流程 28
3.2.3. 資料前處理 29
3.2.3.1. 影像切割 29
3.2.3.2. 重採樣 29
3.2.3.3. 影像標註 30
3.2.4. 特徵轉換 31
3.2.5. 模型訓練 32
3.3. 結果與討論 34
3.3.1. 重採樣 34
3.3.2. 特徵抽取 35
3.3.3. 測試結果 37
3.3.4. 色彩特徵、GLCMv 和傳統紋理特徵的比較 40
4.案例二:使用自走機器人可見光影像進行草坪覆蓋度與健康度分析 42
4.1. 案例介紹 42
4.2. 材料與方法 43
4.2.1. 影像與座標資料取得 43
4.2.2. 資料分析流程 46
4.2.3. 資料前處理 46
4.2.3.1. 影像對齊與裁切 46
4.2.3.2. 主相機影像校正 48
4.2.4. 特徵轉換 48
4.2.4.1. 植生指數計算 49
4.2.4.2. 透視轉換 (Perspective Transformation) 49
4.2.4.3. 覆蓋度計算 50
4.2.5. 地圖繪製 51
4.3. 結果與討論 51
4.3.1. 資料取得 52
4.3.2. 資料前處理 54
4.3.3. 特徵轉換 54
4.3.4. 驗證結果 56
4.3.4.1. 單次拍攝 56
4.3.4.2. 時序報告 58
4.3.4.3. 高爾夫球場 59
5.案例三:使用手持式高光譜儀進行芋軟腐病預測 62
5.1. 案例介紹 62
5.2. 材料與方法 63
5.2.1. 資料收集 63
5.2.2. 資料分析流程 63
5.2.3. 折損率分群 64
5.2.4. 特徵篩選與處理 64
5.2.4.1. 主成分分析 (Principal Component Analysis, PCA) 65
5.2.4.2. 稀疏化 65
5.2.4.3. 波段篩選 65
5.2.5. 模型建立 66
5.3. 結果與討論 67
5.3.1. 高光譜反射率以及折損率資料 67
5.3.2. 特徵篩選與處理 71
5.3.3. 異常檢測模型 74
5.3.4. 關鍵波段 74
6.討論 77
6.1. 載具與波段選擇 78
6.2. 分析單位 78
6.3. 前處理 79
6.4. 特徵計算與處理 79
6.5. 地真資料特性 80
6.6. 模型建立 81
7.結論 83
參考文獻 84
dc.language.iso: zh_TW
dc.subject: 非破壞性測量 (zh_TW)
dc.subject: 遙測 (zh_TW)
dc.subject: 農業光譜資訊分析 (zh_TW)
dc.subject: 地物辨識 (zh_TW)
dc.subject: 紋理特徵 (zh_TW)
dc.subject: 植生指數 (zh_TW)
dc.subject: 病害預測 (zh_TW)
dc.subject: Remote Sensing (en)
dc.subject: Disease Prediction (en)
dc.subject: Vegetation Index (en)
dc.subject: Texture Features (en)
dc.subject: Land Use Type Classification (en)
dc.subject: Agricultural Spectral Information Analysis (en)
dc.subject: Non-destructive sensing (en)
dc.title: 非破壞性感測技術應用於不同空間規模之農業光譜資訊分析 (zh_TW)
dc.title: Applications of Non-destructive Sensing Technologies for Agriculture Spectrum Information Analysis under Different Spatial Scales (en)
dc.type: Thesis
dc.date.schoolyear: 113-2
dc.description.degree: 博士
dc.contributor.oralexamcommittee: 賴瑞聲;王裕文;楊傳凱;林汶鑫 (zh_TW)
dc.contributor.oralexamcommittee: Jui-Sheng Lai;Yue-Wen Wang;Chuan-Kai Yang;Wen-Shin Lin (en)
dc.subject.keyword: 非破壞性測量, 遙測, 農業光譜資訊分析, 地物辨識, 紋理特徵, 植生指數, 病害預測 (zh_TW)
dc.subject.keyword: Non-destructive sensing, Remote Sensing, Agricultural Spectral Information Analysis, Land Use Type Classification, Texture Features, Vegetation Index, Disease Prediction (en)
dc.relation.page: 93
dc.identifier.doi: 10.6342/NTU202502769
dc.rights.note: 同意授權(全球公開)
dc.date.accepted: 2025-08-01
dc.contributor.author-college: 生物資源暨農學院
dc.contributor.author-dept: 農藝學系
dc.date.embargo-lift: 2030-07-28
Appears in Collections: 農藝學系

Files in this item:
File: ntu-113-2.pdf (13.55 MB, Adobe PDF; available online after 2030-07-28)