Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98597

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林達德 | zh_TW |
| dc.contributor.advisor | Ta-Te Lin | en |
| dc.contributor.author | 陳姵瑜 | zh_TW |
| dc.contributor.author | Pei-Yu Chen | en |
| dc.date.accessioned | 2025-08-18T01:01:22Z | - |
| dc.date.available | 2025-08-18 | - |
| dc.date.copyright | 2025-08-15 | - |
| dc.date.issued | 2025 | - |
| dc.date.submitted | 2025-08-05 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98597 | - |
| dc.description.abstract | 粉蝨、薊馬與蕈蚊等小型害蟲不僅是多種植物病害的傳播媒介,亦對農作物生產造成重大威脅。由於這些害蟲體積微小、外觀相似,使得自動化檢測極具挑戰。本研究針對溫室黏蟲紙害蟲檢測需求,開發一套可攜式害蟲檢測設備,能即時進行小型昆蟲檢測與計數。系統整合了一個高解析度 64 MP 影像模組、LED 側光照明組件,以及 Raspberry Pi 嵌入式邊緣運算平台,無需依賴遠端伺服器即可完成完整檢測流程。系統亦配置直觀的觸控式操作介面及雲端連線功能,以支援數據同步及長期害蟲族群分析。為提升小型害蟲檢測的精準度,本研究提出改進的目標檢測框架 CAFE-YOLO(Context-Aware Feature Enhanced YOLO),引入 Edge Boost Stem、Adaptive Downsampling 與 Context-Aware Feature Pyramid Network 三個架構模組,強化小目標特徵提取能力。CAFE-YOLO 在測試中達到 0.946 的 mAP 與 0.887 的 F1-score,並透過結構化剪枝與知識蒸餾進行模型輕量化,兼顧嵌入式裝置運算效率與檢測準確率。系統同時採用密度感知閾值調整策略(Density-Aware Thresholding, DAT),能依據局部害蟲密度動態調整檢測閾值,以提升在高密度及雜訊環境下的檢測穩健性。於四個場域進行的實地實驗顯示,本系統達成 8.08 的平均絕對誤差(MAE)、10.52 的均方根誤差(RMSE)與 5.19 的平均絕對百分比誤差(MAPE),相較於傳統檢測框架在小型及密集害蟲檢測上有顯著提升,同時在嵌入式裝置上可於 48.3 秒內完成單張影像推論。 | zh_TW |
| dc.description.abstract | Whiteflies, thrips, and fungus gnats are major agricultural pests that serve as vectors for plant diseases and pose significant threats to crop productivity. Their small size, high visual similarity, and the visually complex environment of greenhouses make automated detection particularly challenging. This study presents the development of a portable, intelligent pest detection system specifically designed for real-time monitoring of small insects on sticky traps. The system integrates a high-resolution 64 MP imaging module, a custom-engineered LED side-lighting assembly to minimize glare, and an embedded edge-computing platform that enables on-device, real-time inference without reliance on remote servers. A user-friendly touchscreen interface and seamless cloud connectivity allow for centralized data synchronization and long-term pest population analysis. To achieve high detection accuracy for small pests, this research proposes a novel object detection framework, CAFE-YOLO (Context-Aware Feature Enhanced YOLO), which introduces three architectural modules: Edge Boost Stem, Adaptive Downsampling, and Context-Aware Feature Pyramid Network. CAFE-YOLO achieves a mean average precision of 0.946 and an F1-score of 0.887. The model is further optimized through pruning and knowledge distillation to meet embedded hardware constraints while maintaining detection performance. The system also incorporates a density-aware thresholding strategy, which dynamically adjusts detection thresholds based on local pest density to improve robustness in cluttered scenes. Field experiments conducted across four greenhouse sites demonstrated that the system achieves a mean absolute error of 8.08, a root mean square error of 10.52, and a mean absolute percentage error of 5.19. Compared to conventional detection frameworks, CAFE-YOLO delivered substantially improved accuracy, especially for small and densely clustered pests, while maintaining efficient on-device processing with an inference time as low as 48.3 seconds per image. This research demonstrates an integrated, lightweight, and high-precision pest detection solution. The device enhances the accuracy and efficiency of greenhouse pest monitoring, reduces manual labor, and provides timely, data-driven decision support for integrated pest management in smart agriculture. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-08-18T01:01:22Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2025-08-18T01:01:22Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | 誌謝 i 中文摘要 ii Abstract iii Table of Contents v List of Figures viii List of Tables xi CHAPTER 1 Introduction 1 1.1 Background of the Study 1 1.2 Objectives 3 CHAPTER 2 Literature Review 5 2.1 Agricultural Impacts of Insect Pests 5 2.2 Image-Based Techniques for Insect Pest Detection 8 2.2.1 Conventional Image Processing Methods 9 2.2.2 Machine Learning-Based Approaches 10 2.2.3 Deep Learning-Based Approaches 12 2.3 Advances in Small Object Detection 14 2.3.1 Attention-Based Feature Enhancement 14 2.3.2 Multi-Scale Feature Fusion 16 2.3.3 High-Resolution Sliced Inference 17 2.4 Lightweight Model Design for Edge Applications 18 2.4.1 Integer and Mixed-Precision Quantization Techniques 19 2.4.2 Network Model Pruning 20 2.4.3 Knowledge Distillation 22 2.5 Edge-AI Integration for Agricultural Monitoring 23 CHAPTER 3 Materials and Methods 25 3.1 System Overview and Device Architecture 25 3.1.1 Device Architecture and User Interface 25 3.1.2 Operational Flow of the Pest Detection System 29 3.2 Pest Annotation and Dataset Composition 30 3.3 Image Preprocessing Workflow 34 3.4 CAFE-YOLO: Enhanced YOLO Network 36 3.4.1 Edge Boost Stem (EBStem) 37 3.4.2 Adaptive Downsampling (ADown) 39 3.4.3 Context-Aware Feature Pyramid Network (CAFPN) 40 3.5 Model Optimization for Edge Inference 42 3.5.1 Structured Model Pruning Strategy 42 3.5.2 Knowledge Distillation for Compact Models 45 3.6 Slice-Based High-Resolution Inference 49 3.7 Density-Aware Dynamic Thresholding 51 3.8 Edge Deployment and Cloud Integration 57 3.8.1 On-Device Inference Acceleration 57 3.8.2 Cloud Connectivity via AWS IoT 58 CHAPTER 4 Results and Discussion 60 4.1 Device Hardware Performance 60 4.2 Model Performance Evaluation 63 4.2.1 Comparative Experiments 64 4.2.2 Ablation Study 69 4.2.3 Edge Operator Comparison within EBStem 71 4.2.4 Attention Map Visualization 72 4.3 Model Compression Effectiveness 73 4.3.1 Pruning Strategy Analysis 74 4.3.2 Knowledge Distillation Evaluation 78 4.4 Slice-Based Inference Evaluation 81 4.5 Edge Inference Performance with AI Kit 84 4.6 Field-Tested System Performance 85 4.6.1 Performance Evaluation across Field Sites 85 4.6.2 Comparison between Portable Device and Alternative Detection 93 4.7 Website Deployment via AWS IoT 97 CHAPTER 5 Conclusions and Suggestions 99 5.1 Conclusions 99 5.2 Suggestions 101 References 103 | - |
| dc.language.iso | en | - |
| dc.subject | 攜帶式設備 | zh_TW |
| dc.subject | 害蟲檢測 | zh_TW |
| dc.subject | 深度學習 | zh_TW |
| dc.subject | 機器視覺 | zh_TW |
| dc.subject | 邊緣運算 | zh_TW |
| dc.subject | 小型物件檢測 | zh_TW |
| dc.subject | Deep learning | en |
| dc.subject | Portable device | en |
| dc.subject | Small-object detection | en |
| dc.subject | Edge computing | en |
| dc.subject | Machine vision | en |
| dc.subject | Pest detection | en |
| dc.title | 攜帶式小型害蟲檢測裝置設計與影像辨識模型優化 | zh_TW |
| dc.title | Design of a Portable Detection Device for Small Insect Pests with Optimized Image Recognition Models | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-2 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 蔡燿全;陳淑佩 | zh_TW |
| dc.contributor.oralexamcommittee | Yao-Chuan Tsai;Shu-Pei Chen | en |
| dc.subject.keyword | 攜帶式設備,害蟲檢測,深度學習,機器視覺,邊緣運算,小型物件檢測 | zh_TW |
| dc.subject.keyword | Portable device, Pest detection, Deep learning, Machine vision, Edge computing, Small-object detection | en |
| dc.relation.page | 111 | - |
| dc.identifier.doi | 10.6342/NTU202503936 | - |
| dc.rights.note | 同意授權(全球公開) | - |
| dc.date.accepted | 2025-08-11 | - |
| dc.contributor.author-college | 生物資源暨農學院 | - |
| dc.contributor.author-dept | 生物機電工程學系 | - |
| dc.date.embargo-lift | 2025-08-18 | - |
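The density-aware thresholding (DAT) strategy described in the abstract adjusts the detection threshold according to local pest density. A minimal sketch of the idea follows; the class name, parameters (`radius`, `base_thr`, `min_thr`, `k`), and the linear relaxation rule are all illustrative assumptions, not the thesis's actual implementation, which is detailed in Section 3.7 of the full text.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # box center x (pixels)
    y: float      # box center y (pixels)
    score: float  # detector confidence

def local_density(det, detections, radius=50.0):
    """Count other detections whose centers lie within `radius` pixels."""
    return sum(
        1 for d in detections
        if d is not det and (d.x - det.x) ** 2 + (d.y - det.y) ** 2 <= radius ** 2
    )

def density_aware_filter(detections, base_thr=0.5, min_thr=0.3, k=0.05):
    """Keep detections whose score clears a per-detection threshold that is
    relaxed in proportion to local crowding, floored at `min_thr`."""
    kept = []
    for det in detections:
        thr = max(min_thr, base_thr - k * local_density(det, detections))
        if det.score >= thr:
            kept.append(det)
    return kept
```

Under these assumed parameters, a cluster of borderline-confidence detections survives (the threshold drops where insects crowd together on the trap), while an equally scored isolated detection is filtered out as probable noise.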
| Appears in Collections: | 生物機電工程學系 | |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-113-2.pdf | 4.34 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
