NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101511
Full metadata record (DC field [language]: value)
dc.contributor.advisor [zh_TW]: 郭彥甫
dc.contributor.advisor [en]: Yan-Fu Kuo
dc.contributor.author [zh_TW]: 朱王文亮
dc.contributor.author [en]: Wen-Liang Chu Wang
dc.date.accessioned: 2026-02-04T16:22:17Z
dc.date.available: 2026-02-05
dc.date.copyright: 2026-02-04
dc.date.issued: 2026
dc.date.submitted: 2026-01-28
dc.identifier.citation:
Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
Damen, D., Doughty, H., Farinella, G. M., Fidler, S., Furnari, A., Kazakos, E., ... & Wray, M. (2018). Scaling egocentric vision: The epic-kitchens dataset. In Proceedings of the European conference on computer vision (ECCV) (pp. 720-736).
Farha, Y. A., & Gall, J. (2019). MS-TCN: Multi-stage temporal convolutional network for action segmentation. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition (pp. 3575-3584).
Fathi, A., Ren, X., & Rehg, J. M. (2011, June). Learning to recognize objects in egocentric activities. In CVPR 2011 (pp. 3281-3288). IEEE.
Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural computation, 9(8), 1735-1780.
Howard, A., Sandler, M., Chu, G., Chen, L. C., Chen, B., Tan, M., ... & Adam, H. (2019). Searching for mobilenetv3. In Proceedings of the IEEE/CVF international conference on computer vision (pp. 1314-1324).
Huang, Y., Liu, K., Lv, Y., Xiao, D., Liu, J., & Tan, Z. (2025). Behavior tracking and analysis of group-housed pigs based on deep OC-SORT. Computers and Electronics in Agriculture, 239, 111070.
Kuehne, H., Arslan, A., & Serre, T. (2014). The language of actions: Recovering the syntax and semantics of goal-directed human activities. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 780-787).
Lea, C., Flynn, M. D., Vidal, R., Reiter, A., & Hager, G. D. (2017). Temporal convolutional networks for action segmentation and detection. In proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 156-165).
LeCun, Y., Bottou, L., Bengio, Y., & Haffner, P. (1998). Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11), 2278-2324.
Lin, T. Y., Goyal, P., Girshick, R., He, K., & Dollár, P. (2017). Focal loss for dense object detection. In Proceedings of the IEEE international conference on computer vision (pp. 2980-2988).
Van der Maaten, L., & Hinton, G. (2008). Visualizing data using t-SNE. Journal of Machine Learning Research, 9(Nov), 2579-2605.
OECD & FAO. (2024). OECD-FAO Agricultural Outlook 2024-2033. Paris and Rome. Retrieved from https://doi.org/10.1787/4c5d2cfb-en
Oquab, M., Darcet, T., Moutakanni, T., Vo, H., Szafraniec, M., Khalidov, V., ... & Bojanowski, P. (2023). DINOv2: Learning robust visual features without supervision. arXiv preprint arXiv:2304.07193.
Pork Checkoff, United States. (2025) Life Cycle of a Market Pig. Retrieved from https://porkcheckoff.org/pork-branding/facts-statistics/life-cycle-of-a-market-pig/
Li, S., Farha, Y. A., Liu, Y., Cheng, M.-M., & Gall, J. (2020). MS-TCN++: Multi-stage temporal convolutional network for action segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(6), 6647-6658.
Tsai, Y. J., Huang, Y. C., Lin, E. C., Lai, S. C., Hong, X. C., Tsai, J., ... & Kuo, Y. F. (2024). Monitoring the lactation-related behaviors of sows and their piglets in farrowing crates using deep learning. Frontiers in Animal Science, 5, 1431285.
Taiwan Smart Agriweek, Taiwan. (2025). Summary of pig production industry in 2023. Retrieved from https://www.taiwanagriweek.com/media-detail/418/
Traulsen, I., Scheel, C., Auer, W., Burfeind, O., & Krieter, J. (2018). Using acceleration data to automatically detect the onset of farrowing in sows. Sensors, 18(1), 170.
U.S. Department of Agriculture, United States. (2024) Retrieved from https://www.fas.usda.gov/data/production/commodity/0113000
Wutke, M., Lensches, C., Hartmann, U., & Traulsen, I. (2024). Towards automatic farrowing monitoring—A Noisy Student approach for improving detection performance of newborn piglets. PloS one, 19(10), e0310818.
Yi, F., Wen, H., & Jiang, T. (2021). Asformer: Transformer for action segmentation. arXiv preprint arXiv:2110.08568.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101511
dc.description.abstract [zh_TW]: 肉豬是全球畜牧業的重要基石,亦在台灣畜牧業擁有很高的經濟價值。母豬分娩階段是商業豬舍中特別重要的時期,降低仔豬出生後的死亡率仍是持續存在的挑戰。傳統監測方法高度依賴人工觀察,不僅勞力密集、判定主觀,在現代集約化生產系統的趨勢下也愈發不符合成本效益。本研究的主要目標是利用電腦視覺與深度學習,開發一套能夠在商業分娩欄位中自動偵測個別仔豬出生事件的方法。本研究自兩座商業豬舍共21個分娩欄位收集了467部仔豬生產影片。該資料集的類別數量極端不平衡,其中3.6%的影像為「生產」,96.4%的影像則為「休息」。本研究開發了一套自動偵測母豬生產仔豬的模型(Piglet Delivery Detection Model, PDDM)。該模型採用兩階段架構,包含空間特徵提取器與時序分類器(MS-TCN++)。本研究針對MobileNetV3-Large與DINOV2-S兩種特徵提取器進行測試,並比較了預訓練(pretrain)、微調(fine-tuning)及特徵融合(fusion)等不同方法,同時採用焦點損失函數(focal loss)來解決類別不平衡問題。經微調並結合特徵融合的MobileNetV3-Large達到最佳模型表現,F1@0.1為78.18%、F1@0.25為67.27%。採用特徵融合的預訓練DINOV2-S則達到F1@0.1為70.48%、F1@0.25為53.33%的模型表現,在無法對特定領域資料進行微調的情況下,展現出一種可行的替代方案。由PDDM預測結果所計算出的分娩統計數據,包括胎數、總分娩時長、平均生產間隔及平均生產持續時間,其數值均與實際數值呈現高度一致性。本研究為精準畜牧業奠定基礎,期望透過自動量化高精度的時序分娩統計資訊,輔助豬農及早介入母豬異常分娩。
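For reference, a minimal PyTorch sketch of the two-stage design described in the abstracts: per-frame spatial features from MobileNetV3-Large feed a dilated temporal convolution classifier, trained with focal loss to counter the 3.6% / 96.4% class imbalance. The single-stage temporal network, the layer sizes, and the gamma/alpha settings below are illustrative assumptions; the thesis itself uses the multi-stage MS-TCN++ (Li et al., 2020) and does not report these hyperparameters here.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import mobilenet_v3_large, MobileNet_V3_Large_Weights

class FrameFeatureExtractor(nn.Module):
    """Per-frame 960-d features from an ImageNet-pretrained MobileNetV3-Large."""
    def __init__(self, freeze=True):
        super().__init__()
        backbone = mobilenet_v3_large(weights=MobileNet_V3_Large_Weights.DEFAULT)
        self.trunk, self.pool = backbone.features, backbone.avgpool
        if freeze:  # "pretrained" setting; set freeze=False for the fine-tuned variant
            for p in self.parameters():
                p.requires_grad = False

    def forward(self, frames):
        # frames: (T, 3, H, W) -> features: (T, 960)
        return torch.flatten(self.pool(self.trunk(frames)), 1)

class DilatedTemporalClassifier(nn.Module):
    """Single-stage stand-in for MS-TCN++: residual dilated 1-D convolutions over time."""
    def __init__(self, in_dim=960, hidden=64, layers=10, classes=2):
        super().__init__()
        self.proj = nn.Conv1d(in_dim, hidden, kernel_size=1)
        self.blocks = nn.ModuleList(
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=2 ** i, dilation=2 ** i)
            for i in range(layers)
        )
        self.head = nn.Conv1d(hidden, classes, kernel_size=1)

    def forward(self, feats):
        # feats: (T, D) -> per-frame logits: (T, classes)
        x = self.proj(feats.t().unsqueeze(0))        # (1, hidden, T)
        for conv in self.blocks:
            x = x + F.relu(conv(x))                  # dilation doubles at each layer
        return self.head(x).squeeze(0).t()

def focal_loss(logits, labels, gamma=2.0, alpha=0.75):
    """Focal loss (Lin et al., 2017): down-weights the abundant, easy 'resting' frames."""
    ce = F.cross_entropy(logits, labels, reduction="none")
    pt = torch.exp(-ce)                              # probability assigned to the true class
    weight = alpha * labels.float() + (1.0 - alpha) * (1.0 - labels.float())
    return (weight * (1.0 - pt) ** gamma * ce).mean()

# Usage sketch: 16 frames -> (16, 960) features -> (16, 2) frame-level logits.
extractor, classifier = FrameFeatureExtractor(), DilatedTemporalClassifier()
logits = classifier(extractor(torch.randn(16, 3, 224, 224)))
loss = focal_loss(logits, torch.zeros(16, dtype=torch.long))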
dc.description.abstract [en]: Pork production is a cornerstone of global livestock agriculture and holds high economic value in Taiwan's animal husbandry industry. The farrowing phase is a particularly critical period in commercial pig production, and reducing pre-weaning piglet mortality remains a persistent challenge. Traditional monitoring methods rely heavily on manual observation, which is labor-intensive, subjective, and increasingly cost-ineffective given the trend toward modern intensive production systems. The primary objective of this research was to develop an automated method for detecting individual piglet delivery events within commercial farrowing crates using computer vision and deep learning. A dataset comprising 467 piglet delivery videos from 21 farrowing crates across two commercial pig farms was collected. The dataset exhibited extreme class imbalance, with 3.6% of the frames labeled as "delivering" and 96.4% labeled as "resting." A Piglet Delivery Detection Model (PDDM) was developed using a two-stage architecture consisting of a spatial feature extractor followed by a temporal classifier (MS-TCN++). Two feature extractors were evaluated: MobileNetV3-Large and DINOV2-S. Pretrained, fine-tuned, and feature-fusion strategies were compared, with focal loss employed to address the class imbalance. The fine-tuned MobileNetV3-Large with feature fusion achieved the best performance, with an F1@0.1 of 78.18% and an F1@0.25 of 67.27%. The pretrained DINOV2-S with feature fusion achieved an F1@0.1 of 70.48% and an F1@0.25 of 53.33%, presenting a compelling alternative when domain-specific fine-tuning is not possible. Farrowing statistics derived from PDDM predictions, including litter size, total farrowing duration, mean delivery interval, and mean delivery duration, showed strong agreement with ground-truth values. This research lays a foundation for precision livestock farming, aiming to assist pig farmers in early intervention in abnormal sow farrowing through automated quantification of high-fidelity temporal farrowing statistics.
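To make the derived farrowing statistics concrete, below is a minimal sketch of how litter size, total farrowing duration, mean delivery interval, and mean delivery duration follow from a frame-level prediction sequence. Treating each run of contiguous "delivering" frames as one delivery event and assuming a 1 fps frame rate are illustrative choices, not details taken from the thesis.

from itertools import groupby

def farrowing_statistics(frame_labels, fps=1.0):
    """frame_labels: per-frame 0 (resting) / 1 (delivering) predictions in temporal order."""
    # Collapse the label sequence into [start, end) runs of "delivering" frames.
    segments, t = [], 0
    for label, run in groupby(frame_labels):
        n = len(list(run))
        if label == 1:
            segments.append((t, t + n))       # one run = one delivery event
        t += n
    if not segments:
        return {"litter_size": 0}

    starts = [s for s, _ in segments]
    durations = [(e - s) / fps for s, e in segments]
    intervals = [(b - a) / fps for a, b in zip(starts, starts[1:])]
    return {
        "litter_size": len(segments),                                       # one event per piglet
        "total_farrowing_duration_s": (segments[-1][1] - segments[0][0]) / fps,
        "mean_delivery_interval_s": sum(intervals) / len(intervals) if intervals else 0.0,
        "mean_delivery_duration_s": sum(durations) / len(durations),
    }

# Example: three predicted delivery events in a short label sequence.
print(farrowing_statistics([0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0], fps=1.0))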
dc.description.provenance [en]: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2026-02-04T16:22:17Z. No. of bitstreams: 0
dc.description.provenance [en]: Made available in DSpace on 2026-02-04T16:22:17Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
ACKNOWLEDGEMENTS i
中文摘要 ii
ABSTRACT iii
TABLE OF CONTENTS v
LIST OF FIGURES vii
LIST OF TABLES ix
CHAPTER 1. INTRODUCTION 1
1.1 Background 1
1.2 Objectives 2
1.3 Organization 3
CHAPTER 2. LITERATURE REVIEW 4
2.1 Computer vision-based Pig Farm Monitoring 4
2.2 Automated Piglet Delivery Detection Approaches 4
2.3 Temporal Action Segmentation 5
CHAPTER 3. MATERIALS AND METHODS 7
3.1 System Overview 7
3.2 Experimental Site 7
3.3 Video Acquisition 8
3.4 Data Annotation and Dataset 8
3.5 Piglet Delivery Detection Model 9
3.6 Video Feature Extraction 12
CHAPTER 4. RESULTS AND DISCUSSION 14
4.1 Performance of PDDM Using MobileNetV3-Large Features 14
4.2 Interpretability Analysis of the Feature Extractor (MobileNetV3-Large) 14
4.3 Interpretability Analysis of PDDM Using MobileNetV3-Large 15
4.4 Temporal Refinement Analysis of PDDM 18
4.5 Performance of PDDM Using DINOV2-S Features 19
4.6 Interpretability Analysis of PDDM Using DINOV2-S 20
4.7 Challenging Scenarios of Piglet Delivery Detection 21
4.8 Piglet Delivery Quantification of the Farrowing Process 23
CHAPTER 5. CONCLUSIONS 25
5.1 Summary 25
5.2 Future Work 25
REFERENCES 27
dc.language.iso: en
dc.subject: 仔豬分娩偵測
dc.subject: 時序動作分割
dc.subject: 深度學習
dc.subject: 分娩監測
dc.subject: 精準畜牧業
dc.subject: piglet delivery detection
dc.subject: temporal action segmentation
dc.subject: sow farrowing monitoring
dc.subject: precision livestock farming
dc.title [zh_TW]: 利用深度學習於商業豬舍中自動偵測和時序分割母豬生產仔豬事件
dc.title [en]: Automated Detection and Action Segmentation of Piglet Delivery Events of Sows in Commercial Pig Farms Using Deep Learning
dc.type: Thesis
dc.date.schoolyear: 114-1
dc.description.degree: 碩士
dc.contributor.oralexamcommittee [zh_TW]: 陳世芳;林恩仲
dc.contributor.oralexamcommittee [en]: Shih-Fang Chen;En-Chung Lin
dc.subject.keyword [zh_TW]: 仔豬分娩偵測,時序動作分割,深度學習,分娩監測,精準畜牧業
dc.subject.keyword [en]: piglet delivery detection, temporal action segmentation, sow farrowing monitoring, precision livestock farming
dc.relation.page: 28
dc.identifier.doi: 10.6342/NTU202600404
dc.rights.note: 同意授權(全球公開)
dc.date.accepted: 2026-01-29
dc.contributor.author-college: 生物資源暨農學院
dc.contributor.author-dept: 生物機電工程學系
dc.date.embargo-lift: 2026-02-05
Appears in Collections: 生物機電工程學系

Files in This Item:
File | Size | Format
ntu-114-1.pdf | 2.45 MB | Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
