Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98755
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 林達德 (zh_TW)
dc.contributor.advisor: Ta-Te Lin (en)
dc.contributor.author: 徐滋 (zh_TW)
dc.contributor.author: Tzu Hsu (en)
dc.date.accessioned: 2025-08-19T16:04:35Z
dc.date.available: 2025-08-20
dc.date.copyright: 2025-08-19
dc.date.issued: 2025
dc.date.submitted: 2025-08-05
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98755
dc.description.abstract (zh_TW): 本研究開發了一套自主多機無人機系統,旨在進行溫室中洋香瓜作物的生長監測。透過多機無人機的協同作業,進行了多角度的影像拍攝,並設計了三種不同的飛行路徑。利用UWB定位系統進行無人機自主飛行的精度比較,並利用收集的影像進行三維重建,對洋香瓜植物進行分析。收集的影像經過處理後,用於提取關鍵的植物表型數據,本研究主要分析了植株的高度和展幅,透過高度和展幅進一步對植物的生長進行監測,通過擬合生長曲線,並將其與實際生長數據進行比較。在無人機的飛行精度方面,平行飛行路徑的誤差範圍為7至12公分,閉環飛行路徑為5至9公分,而多高度路徑的誤差範圍為4至11公分,顯示出穩定的飛行精度。接著,我們進一步比較了在相同覆蓋面積下,多機系統與單機系統的效能。結果顯示,多機系統能夠將任務時間縮短73%,並將電池消耗降低5%。在作物三維重建方面,我們比較了三種軌跡所收集的三種不同的重建方法,分別是單面、合併以及三個高度。根據評估重建結果的指標,使用PSNR、SSIM和LPIPS三個指標進行比較,結果顯示三個高度方法在重建質量上表現最佳,PSNR為0.37,SSIM為9.48,LPIPS為0.65。在植物高度的測量上,合併方法達到了最低的MAE誤差為6.6公分,而在展幅測量方面,單面方法則達到了最低的MAE誤差為5.8公分。本研究展示了多機無人機系統在溫室作物監測中的應用潛力,還證明了不同重建方法和測量策略在提高農業監測精度和效率方面的有效性。
dc.description.abstract (en): This study developed an autonomous multi-drone system for muskmelon crop growth monitoring in a greenhouse. Through collaborative multi-drone operations, multi-angle images were captured along three designed flight paths. A UWB positioning system was used to evaluate the accuracy of autonomous UAV flight, and the collected images were used for 3D reconstruction to analyze the muskmelon plants. The images were processed to extract key phenotypic data, focusing on plant height and canopy span, and growth monitoring was performed by fitting growth curves and comparing them with actual growth data. In terms of flight accuracy, the parallel flight path had an error range of 7 to 12 cm, the closed-loop path 5 to 9 cm, and the multi-altitude path 4 to 11 cm, demonstrating stable flight precision. We also compared the performance of multi-drone and single-drone systems over the same coverage area: the multi-drone system reduced mission time by 73% and battery consumption by 5%. For crop 3D reconstruction, we compared three reconstruction methods based on images collected along the three flight paths: Single-side, Merged, and Three-height. Evaluation with PSNR, SSIM, and LPIPS showed that the Three-height method provided the best reconstruction quality, with PSNR of 0.37, SSIM of 9.48, and LPIPS of 0.65. For height measurement, the Merged method achieved the lowest MAE of 6.6 cm, and for canopy span measurement, the Single-side method achieved the lowest MAE of 5.8 cm. This study demonstrates the potential of multi-UAV systems for greenhouse crop monitoring and shows the effectiveness of different reconstruction methods and measurement strategies in improving the accuracy and efficiency of agricultural monitoring.
dc.description.provenance (en): Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-08-19T16:04:35Z. No. of bitstreams: 0
dc.description.provenance (en): Made available in DSpace on 2025-08-19T16:04:35Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents: 誌謝 i
摘要 ii
ABSTRACT iii
Table of Contents iv
List of Figures vii
List of Tables xi
CHAPTER 1 Introduction 1
1.1 General Background Information 1
1.2 Research Objectives 4
CHAPTER 2 Literature Review 5
2.1 UAV Applications in Greenhouse Environments 5
2.1.1 Development and Applications 6
2.1.2 Case Studies and Challenges 8
2.2 Single-UAV and Multi-UAV Navigation 9
2.2.1 Single-UAV Navigation Applications and Challenges 9
2.2.2 Multi-UAV Navigation Applications and Challenges 10
2.2.3 Vision-based Navigation 12
2.3 Multi-UAV Control and Communication 15
2.3.1 Centralized Architecture 15
2.3.2 Distributed Architecture 16
2.3.3 Communication and Collaboration Mechanisms 17
2.4 Plant Phenotyping 18
2.4.1 Plant Phenotyping Techniques 19
2.4.2 Two-Dimensional Image-Based Approaches 20
2.4.3 Three-Dimensional Point Cloud-Based Approaches 22
CHAPTER 3 Materials and Methods 25
3.1 Architecture of Autonomous Multi-UAV System 25
3.1.1 UAV and Edge Computing Hardware Architecture 25
3.1.2 Software Architecture 29
3.1.3 Multi-UAV Communication Architecture 30
3.2 Autonomous Multi-UAV Flight in Greenhouse 31
3.2.1 Experimental Setup in Greenhouse 31
3.2.2 Visual SLAM for Localization and Mapping 33
3.2.3 PID Controller for Waypoint Navigation 34
3.2.4 UWB-based Ground Truth and SLAM Error Evaluation 34
3.2.5 Visual SLAM Map Optimization 36
3.3 Multi-UAV Cooperative Path Planning and Analysis 38
3.3.1 Cooperative Path Planning Design 38
3.3.2 Comparison Between Single and Multi-UAV Flights 39
3.4 3D Reconstruction Methods of Plants 41
3.4.1 Structure from Motion 41
3.4.2 Gaussian Splatting 43
3.4.3 Evaluation Metrics 46
3.5 Phenotyping for Muskmelon Plant 47
3.5.1 Extraction of the Single Plant 47
3.5.2 Plant Height and Canopy Span Measurement 48
3.5.3 Muskmelon Growth Monitoring 51
CHAPTER 4 Results and Discussion 52
4.1 Evaluation of Multi-UAV Communication Performance 52
4.2 Multi-UAV Trajectory and Mapping Accuracy 54
4.2.1 Visual SLAM Map Building Result 54
4.2.2 Visual SLAM Map Optimization Result 56
4.2.3 Parallel-aisle Flight Evaluation 61
4.2.4 Closed-loop Flight Evaluation 64
4.2.5 Multi-altitude Flight Evaluation 67
4.2.6 Multi-UAV Flight-Path Performance Comparison 70
4.3 Evaluation of Multi-UAV Cooperative 71
4.4 3D Reconstruction Analysis for Plant 73
4.4.1 Sparse 3D Reconstruction with GLOMAP 73
4.4.2 Evaluation of Gaussian Splatting for 3D Plant Reconstruction 74
4.5 Phenotyping Result of Muskmelon 83
4.5.1 Plant Measurement Analysis 83
4.5.2 Muskmelon Growth Monitoring Analysis 88
CHAPTER 5 Conclusions and Suggestions 96
5.1 Conclusions 96
5.2 Suggestions 98
References 99
dc.language.iso: en
dc.subject: 多機無人機系統 (zh_TW)
dc.subject: Visual SLAM (zh_TW)
dc.subject: 三維重建 (zh_TW)
dc.subject: 高斯潑濺 (zh_TW)
dc.subject: 表型分析 (zh_TW)
dc.subject: 生長監測 (zh_TW)
dc.subject: 3D reconstruction (en)
dc.subject: Multi-UAV system (en)
dc.subject: Growth monitoring (en)
dc.subject: Phenotyping analysis (en)
dc.subject: Gaussian Splatting (en)
dc.subject: Visual SLAM (en)
dc.title: 應用於溫室環境作物監測之自主導航無人機多機協同系統 (zh_TW)
dc.title: An Autonomous Multi-UAV Cooperative Navigation System for Crop Monitoring in Greenhouse Environments (en)
dc.type: Thesis
dc.date.schoolyear: 113-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 顏炳郎;楊江益 (zh_TW)
dc.contributor.oralexamcommittee: Ping-Lang Yen;Chiang-Yi Yang (en)
dc.subject.keyword: 多機無人機系統,Visual SLAM,三維重建,高斯潑濺,表型分析,生長監測 (zh_TW)
dc.subject.keyword: Multi-UAV system,Visual SLAM,3D reconstruction,Gaussian Splatting,Phenotyping analysis,Growth monitoring (en)
dc.relation.page: 107
dc.identifier.doi: 10.6342/NTU202503732
dc.rights.note: 同意授權(全球公開)
dc.date.accepted: 2025-08-11
dc.contributor.author-college: 生物資源暨農學院
dc.contributor.author-dept: 生物機電工程學系
dc.date.embargo-lift: 2025-08-20
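The abstract above describes growth monitoring by fitting growth curves to plant height and canopy span extracted from the 3D reconstructions, with accuracy summarized as mean absolute error (MAE) against actual growth data. The thesis text itself is not reproduced in this record, so the exact growth model and measurement pipeline are unknown; the following is a minimal illustrative sketch, assuming a logistic growth model fitted with SciPy and entirely hypothetical measurement values, of how such a fit and the MAE could be computed.

```python
# Minimal illustrative sketch (not taken from the thesis): fit an assumed
# logistic growth curve to UAV-derived plant-height estimates and report the
# MAE against manual measurements, analogous to the per-trait MAE values
# quoted in the abstract.
import numpy as np
from scipy.optimize import curve_fit


def logistic(t, k, a, r):
    """Assumed logistic growth model: height(t) = k / (1 + a * exp(-r * t))."""
    return k / (1.0 + a * np.exp(-r * t))


# Hypothetical data: days after transplanting, UAV-derived heights (cm),
# and manually measured ground-truth heights (cm).
days = np.array([0, 7, 14, 21, 28, 35], dtype=float)
uav_height = np.array([12.0, 25.5, 48.0, 80.5, 110.0, 128.0])
manual_height = np.array([13.5, 27.0, 52.0, 85.0, 116.0, 134.0])

# Fit the assumed growth model to the UAV-derived estimates.
params, _ = curve_fit(logistic, days, uav_height, p0=[150.0, 10.0, 0.1], maxfev=10000)
fitted = logistic(days, *params)

# Goodness of the curve fit and MAE of the UAV estimates vs. manual measurements.
fit_rmse = float(np.sqrt(np.mean((fitted - uav_height) ** 2)))
mae = float(np.mean(np.abs(uav_height - manual_height)))
print("fitted (k, a, r):", params)
print(f"growth-curve fit RMSE: {fit_rmse:.1f} cm")
print(f"MAE vs. manual measurements: {mae:.1f} cm")
```

Whether the thesis uses a logistic, Gompertz, or another growth model is not stated in this record; the sketch only illustrates the general fit-then-compare workflow described in the abstract.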
Appears in Collections:生物機電工程學系

Files in This Item:
File: ntu-113-2.pdf (5.22 MB, Adobe PDF)