Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99144

Full metadata record

| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 丁肇隆 | zh_TW |
| dc.contributor.advisor | Chao-Lung Ting | en |
| dc.contributor.author | 王威翔 | zh_TW |
| dc.contributor.author | Wei-Hsiang Wang | en |
| dc.date.accessioned | 2025-08-21T16:33:46Z | - |
| dc.date.available | 2025-08-22 | - |
| dc.date.copyright | 2025-08-21 | - |
| dc.date.issued | 2025 | - |
| dc.date.submitted | 2025-07-31 | - |
| dc.identifier.citation | [1] 國家發展委員會. 人口推估查詢系統. https://reurl.cc/eMq0a7.
[2] World Health Organization. (2021, April 26). Falls. https://www.who.int/news-room/fact-sheets/detail/falls
[3] Health Promotion Administration, Ministry of Health and Welfare. (2023, March 10). 65歲以上長者每6人就有1人跌倒,一半以上在室內 身體狀況與居家環境都很重要。健康九九。https://health99.hpa.gov.tw/news/19188
[4] Ramachandran, Anita, and Anupama Karuppiah. "A survey on recent advances in wearable fall detection systems." BioMed Research International 2020, 2167160, 17 pages, 2020.
[5] Kangas, Maarit, et al. "Determination of simple thresholds for accelerometry-based parameters for fall detection." 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE, 2007.
[6] Zhao, Zhongtang, et al. "FallAlarm: Smart phone based fall detecting and positioning system." Procedia Computer Science 10: 617-624, 2012.
[7] Pannurat, Natthapon, Surapa Thiemjarus, and Ekawit Nantajeewarawat. "Automatic fall monitoring: A review." Sensors 14.7: 12900-12936, 2014.
[8] Li, Yun, et al. "Acoustic fall detection using a circular microphone array." 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology. IEEE, 2010.
[9] Alwan, Majd, et al. "A smart and passive floor-vibration based fall detector for elderly." 2006 2nd International Conference on Information & Communication Technologies. Vol. 1. IEEE, 2006.
[10] Mastorakis, Georgios, and Dimitrios Makris. "Fall detection system using Kinect's infrared sensor." Journal of Real-Time Image Processing 9: 635-646, 2014.
[11] Liu, Liang, et al. "Automatic fall detection based on Doppler radar motion signature." 2011 5th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops. IEEE, 2011.
[12] S. Hu, S. Cao, N. Toosizadeh, J. Barton, M. G. Hector and M. J. Fain, "Radar-Based Fall Detection: A Survey [Survey]," IEEE Robotics & Automation Magazine, vol. 31, no. 3, pp. 170-185, Sept. 2024.
[13] Ma, Xin, et al. "Depth-based human fall detection via shape features and improved extreme learning machine." IEEE Journal of Biomedical and Health Informatics 18.6: 1915-1922, 2014.
[14] Lee, Young-Sook, and Wan-Young Chung. "Visual sensor based abnormal event detection with moving shadow removal in home healthcare applications." Sensors 12.1: 573-584, 2012.
[15] Auvinet, Edouard, et al. "Fall detection with multiple cameras: An occlusion-resistant method based on 3-D silhouette vertical distribution." IEEE Transactions on Information Technology in Biomedicine 15.2: 290-300, 2010.
[16] Nguyen, Viet Dung, et al. "An efficient camera-based surveillance for fall detection of elderly people." 2014 9th IEEE Conference on Industrial Electronics and Applications. IEEE, 2014.
[17] Feng, Pengming, et al. "Deep learning for posture analysis in fall detection." 2014 19th International Conference on Digital Signal Processing. IEEE, 2014.
[18] Debard, Glen, et al. "Camera-based fall detection on real world data." Outdoor and Large-Scale Real-World Scene Analysis: 15th International Workshop on Theoretical Foundations of Computer Vision, Dagstuhl Castle, Germany, June 26-July 1, 2011, Revised Selected Papers. Springer Berlin Heidelberg, 2012.
[19] Noor, Nadhira, and In Kyu Park. "A lightweight skeleton-based 3D-CNN for real-time fall detection and action recognition." Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 2179-2188, 2023.
[20] H. Zheng, Y. Liu, X. Wu and Y. Zhang, "Realization of elderly fall integration monitoring system based on AlphaPose and YOLOV4," 2022 Asia Conference on Algorithms, Computing and Machine Learning (CACML), Hangzhou, China, 2022, pp. 604-620.
[21] C. Vishnu, R. Datla, D. Roy, S. Babu and C. K. Mohan, "Human Fall Detection in Surveillance Videos Using Fall Motion Vector Modeling," IEEE Sensors Journal, vol. 21, no. 15, pp. 17162-17170, Aug. 2021.
[22] B.-H. Wang, J. Yu, K. Wang, X.-Y. Bao and K.-M. Mao, "Fall Detection Based on Dual-Channel Feature Integration," IEEE Access, vol. 8, pp. 103443-103453, 2020.
[23] Redmon, Joseph, et al. "You only look once: Unified, real-time object detection." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016.
[24] Redmon, Joseph, and Ali Farhadi. "YOLO9000: Better, faster, stronger." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017.
[25] Redmon, Joseph, and Ali Farhadi. "YOLOv3: An incremental improvement." arXiv preprint arXiv:1804.02767, 2018.
[26] Tommy Huang, C.-S. (2018, September 4). 深度學習-什麼是one-stage,什麼是two-stage物件偵測. https://reurl.cc/mxed91
[27] S. Ren, K. He, R. Girshick and J. Sun, "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 6, pp. 1137-1149, June 2017.
[28] Girshick, Ross, et al. "Rich feature hierarchies for accurate object detection and semantic segmentation." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2014.
[29] Girshick, Ross. "Fast R-CNN." Proceedings of the IEEE International Conference on Computer Vision, 2015.
[30] Z. Cao, G. Hidalgo, T. Simon, S.-E. Wei and Y. Sheikh, "OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 43, no. 1, pp. 172-186, Jan. 2021.
[31] H.-S. Fang et al., "AlphaPose: Whole-Body Regional Multi-Person Pose Estimation and Tracking in Real-Time," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 45, no. 6, pp. 7157-7173, June 2023.
[32] J. Wang et al., "Deep High-Resolution Representation Learning for Visual Recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 43, no. 10, pp. 3349-3364, Oct. 2021.
[33] Charfi, Imen, et al. "Optimized spatio-temporal descriptors for real-time fall detection: Comparison of support vector machine and Adaboost-based classification." Journal of Electronic Imaging 22.4: 041106, 2013.
[34] Kwolek, Bogdan, and Michal Kepski. "Human fall detection on embedded platform using depth maps and wireless accelerometer." Computer Methods and Programs in Biomedicine, vol. 117, no. 3, pp. 489-501, 2014.
[35] The Fall Detection Dataset. [Online]. Available: https://falldataset.com/
[36] The COCO Dataset. [Online]. Available: https://cocodataset.org/#explore
[37] 劉奕煊, "使用深度學習物件辨識與動作識別進行居家跌倒偵測之研究," 碩士, 工程科學及海洋工程學研究所, 國立臺灣大學, 台北市, 2021. [Online]. Available: https://hdl.handle.net/11296/cpmm47 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99144 | - |
| dc.description.abstract | 隨著臺灣人口結構中高齡占比持續攀升,跌倒儼然成為造成老人受傷與死亡的主要因素之一。面對步入高齡化的社會,老人醫療照護的需求未來必然持續增加,但人手不足一直是老人長期照護議題中迫切面臨的問題。為了減輕人力資源短缺的問題,本研究提出一套可於居家環境即時運作的影像式跌倒偵測系統。首先,本研究針對人物偵測模型之跌倒姿態進行重新訓練,辨識出目標人物後再對其進行姿態估計,提取人物關鍵點位置作為輸入特徵,並使用支援向量機(SVM)與多層全連接神經網路進行動作分類,其中全連接神經網路分別使用兩種不同方式的輸入(FCNS, FCNT)。實驗結果顯示,SVM與兩種全連接神經網路皆能辨識包含跌倒動作在內的四種日常生活動作,其平均準確率分別為96.9%、96%與95.6%,且辨識速度能達到實時。此外,本研究設計一套針對跌倒事件之判斷與通報流程,使系統能有效地區分跌倒事件,SVM、FCNS與FCNT並分別在URfall資料集上取得了100%、96.6%與100%的準確度。 | zh_TW |
| dc.description.abstract | As Taiwan's population continues to age, falls have become one of the leading causes of disability and mortality among the elderly. While the demand for geriatric medical care is steadily increasing, chronic staff shortages remain a pressing challenge in long-term care. To help alleviate this manpower shortage, this thesis proposes a vision-based fall-detection system capable of real-time operation in home environments. First, a person-detection model is re-trained with fall-specific data to localize the target individual. The detected person is then subjected to pose estimation, from which keypoint coordinates are extracted as input features. These features are classified using a Support Vector Machine (SVM) and fully connected neural networks (FCNs), with the FCNs implemented using two different input arrangements (denoted FCNS and FCNT). Experimental results demonstrate that the SVM and both fully connected networks are capable of recognizing four daily activities, including falls, with average accuracies of 96.9%, 96%, and 95.6%, respectively. All models achieved real-time inference speeds. In addition, a fall judgment and notification procedure was designed to reduce false alarms. The proposed system successfully distinguished true fall events, achieving accuracy rates of 100%, 96.6%, and 100% on the URfall dataset using SVM, FCNS, and FCNT, respectively. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-08-21T16:33:46Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2025-08-21T16:33:46Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | 致謝 i
中文摘要 ii
ABSTRACT iii
目次 iv
圖次 vii
表次 ix
Chapter 1 緒論 1
1.1 研究背景 1
1.2 研究目的 3
1.3 論文架構 3
Chapter 2 文獻回顧 4
2.1 跌倒偵測相關研究 4
2.1.1 穿戴式跌倒偵測法 4
2.1.2 環境感測跌倒偵測法 4
2.1.3 視覺感測跌倒偵測法 5
2.1.4 深度學習特徵方法 6
2.2 物件偵測 7
2.2.1 單階段(One stage)偵測方法 7
2.2.2 雙階段(Two stage)偵測方法 9
2.3 姿態估計 11
2.3.1 HRNet 11
2.3.2 HRNet與其他方法的比較 12
Chapter 3 研究方法 13
3.1 系統流程 13
3.2 跌倒資料集 14
3.2.1 Le2i fall dataset 14
3.2.2 UR fall dataset 14
3.2.3 Fall detection dataset 15
3.3 人物偵測 15
3.4 姿態估計 17
3.5 數據處理 19
3.5.1 人物追蹤 19
3.5.2 處理關鍵點缺失值 20
3.5.3 正規化 20
3.6 跌倒偵測 22
Chapter 4 實驗結果與討論 27
4.1 實驗環境 27
4.2 人物偵測模型 27
4.3 正規化對分類效果的影響 29
4.4 動作偵測模型 31
4.4.1 二種動作分類 34
4.4.2 四種動作分類 39
4.5 跌倒偵測辨識準確率測試 42
4.6 與現有研究之數據比較 46
Chapter 5 結論 48
參考文獻 49 | - |
| dc.language.iso | zh_TW | - |
| dc.subject | 影像處理 | zh_TW |
| dc.subject | 機器學習 | zh_TW |
| dc.subject | 跌倒偵測 | zh_TW |
| dc.subject | 姿態估計 | zh_TW |
| dc.subject | 物件偵測 | zh_TW |
| dc.subject | Object Detection | en |
| dc.subject | Fall Detection | en |
| dc.subject | Pose Estimation | en |
| dc.subject | Image Processing | en |
| dc.subject | Machine Learning | en |
| dc.title | 應用人體姿態估計之跌倒偵測 | zh_TW |
| dc.title | Fall Detection Based on Human Pose Estimation | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-2 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 謝傳璋;張恆華;陳昭宏;陳彥廷 | zh_TW |
| dc.contributor.oralexamcommittee | Chuan-Cheung Tse;Herng-Hua Chang;Jau-Horng Chen;Yen-Ting Chen | en |
| dc.subject.keyword | 機器學習,影像處理,物件偵測,姿態估計,跌倒偵測 | zh_TW |
| dc.subject.keyword | Machine Learning,Image Processing,Object Detection,Pose Estimation,Fall Detection | en |
| dc.relation.page | 52 | - |
| dc.identifier.doi | 10.6342/NTU202502998 | - |
| dc.rights.note | 未授權 | - |
| dc.date.accepted | 2025-08-04 | - |
| dc.contributor.author-college | 工學院 | - |
| dc.contributor.author-dept | 工程科學及海洋工程學系 | - |
| dc.date.embargo-lift | N/A | - |
Appears in Collections: 工程科學及海洋工程學系
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-113-2.pdf (restricted access) | 3.32 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
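
The abstract above describes a pipeline that extracts pose-estimation keypoints and classifies them with an SVM and fully connected networks. The following minimal sketch is purely illustrative and is not the thesis's actual implementation: it assumes COCO-style 17-keypoint (x, y) input, a simple bounding-box normalization, scikit-learn's `SVC`, and synthetic placeholder data; the function name `normalize_keypoints` and the four activity labels are assumptions made for the example.

```python
# Illustrative sketch (assumptions noted above), not the thesis's code:
# classify normalized pose keypoints with an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def normalize_keypoints(kps):
    """Normalize a (17, 2) array of keypoint coordinates to the person's
    bounding box and flatten it into a 34-dimensional feature vector.
    The exact normalization used in the thesis may differ."""
    kps = np.asarray(kps, dtype=float)
    x_min, y_min = kps.min(axis=0)
    x_max, y_max = kps.max(axis=0)
    scale = np.array([x_max - x_min, y_max - y_min])
    scale[scale == 0] = 1.0                      # guard against degenerate boxes
    return ((kps - [x_min, y_min]) / scale).ravel()

# Hypothetical training data: one feature vector per frame,
# labels 0=stand, 1=sit, 2=lie, 3=fall (four daily activities, per the abstract).
X = np.array([normalize_keypoints(np.random.rand(17, 2)) for _ in range(8)])
y = np.array([0, 1, 2, 3, 0, 1, 2, 3])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:2]))                        # predicted activity labels
```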
