Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/87937

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 賴飛羆 | zh_TW |
| dc.contributor.advisor | Feipei Lai | en |
| dc.contributor.author | 劉昌杰 | zh_TW |
| dc.contributor.author | Tom J. Liu | en |
| dc.date.accessioned | 2023-07-31T16:25:12Z | - |
| dc.date.available | 2023-11-09 | - |
| dc.date.copyright | 2023-07-31 | - |
| dc.date.issued | 2023 | - |
| dc.date.submitted | 2023-04-07 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/87937 | - |
| dc.description.abstract | 褥瘡是長期護理或醫院護理中的常見問題。褥瘡是軟組織長時間受壓引起的,會造成局部組織損傷,甚至導致嚴重感染。褥瘡可能導致預後不良、長期住院和增加醫療費用,這在老齡化社會中尤其成問題。褥瘡的治療目標需要根據其不同階段和感染程度進行治療。治療選擇包括減少風險因素、治療局部傷口以及在必要時服用抗生素。本研究主要採用深度學習的方法對褥瘡進行分析、診斷和輔助決策,為一線護理及照顧人員提供更多參考。
我們所建立的褥瘡傷口診斷以及輔助決策系統主要包含傷口分析系統以及資料庫管理系統。其中傷口分析系統包含:一、自動傷口輪廓分割,二、自動傷口大小測量,三、自動傷口診斷和治療建議。 為了建立自動傷口輪廓分割,我們訓練比較兩種不同之深度學習架構 : (1)語意分割U型全卷積神經網路(U-Net),(2)實例分割遮罩型區域卷積神經網路。 為了實現自動傷口大小測量,我們採用了雷射雷達技術(Light Detection and Ranging,雷射探測與測距)找出空間深度以及三維座標,同時合併前項訓練良好的分割法找出傷口輪廓座標進而算出傷口面積。 至於自動傷口診斷和治療建議,我們提出了根據目前最新褥瘡治療準則所改良的判斷流程,其中包含了兩個深度學習的分類任務:發紅度分類與壞死度分類。我們訓練並比較四個經典的卷積神經網路架構:AlexNet,VGG,ResNet以及Inception-ResNet-V2。 我們分析比較以上三個任務和多種不同的深度學習架構,找出最佳表現的神經網路模型,以建立完善我們的褥瘡傷口診斷以及輔助決策系統。在自動傷口分割上,U型全卷積神經網路的表現勝於遮罩型區域卷積神經網路(交聯比: 0.7773 對0.4604)。在自動傷口大小測量上,我們對傷口面積的評估平均相對誤差為26.2%。而至於發紅度分類與壞死度分類,Inception-ResNet-V2有最好的表現,兩者分別達到98.5%以及97%的準確率。 在研究測試上,褥瘡傷口診斷以及輔助決策系統有著不錯的表現,但在臨床使用端還有使用者回饋的部分,還有待前瞻型研究實測驗證,以期能真正改善臨床執業的環境,達到真正輔助一線臨床工作者的目的。 | zh_TW |
| dc.description.abstract | At nursing homes and hospitals, pressure injuries are a frequent problem. Long-term compression of soft tissue causes pressure injuries, which damage local tissue and can lead to serious infection. Pressure injuries may result in poor prognosis, prolonged hospitalization, and higher medical expenses, all of which are especially problematic in an aging society. Treatment goals depend on the stage of the injury and the degree of infection; options include reducing risk factors, treating the local wound, and administering antibiotics when necessary. This study mainly applies deep learning methods to assess pressure injuries and assist decision-making, providing additional references for first-line caregivers.
Our Pressure-Injury Assessment System (PIAS) comprises a wound analysis system and a database management system. The wound analysis system has three main components: 1. automatic wound segmentation, 2. automatic wound area measurement, and 3. automatic wound diagnosis and treatment suggestion. For automatic wound segmentation, we trained and compared two deep learning models: (1) semantic segmentation with U-Net and (2) instance segmentation with the Mask Region-based Convolutional Neural Network (Mask R-CNN). For automatic wound area measurement, we adopted LiDAR (Light Detection and Ranging) to obtain spatial depth and three-dimensional (3D) coordinates; combined with the well-trained segmentation model, we could calculate the wound area. For automatic wound diagnosis and treatment suggestion, we proposed a diagnosis-and-treatment flow chart based on the updated clinical guidelines for pressure injuries. It mainly comprises two classification tasks: erythema classification and necrotic tissue classification. We trained and compared four classic convolutional neural network (CNN) architectures: AlexNet, VGG, ResNet, and Inception-ResNet-V2. Comparing the performance of the above tasks and models, U-Net outperformed Mask R-CNN on the segmentation task (IoU: 0.7773 versus 0.4604). For automatic wound area measurement, our estimate had a mean relative error (MRE) of about 26.2%. On the erythema and necrotic tissue classification tasks, Inception-ResNet-V2 performed best, with accuracies of about 98.5% and 97%, respectively. Based on these three studies and their validations, we consider that our PIAS can provide acceptable wound assessments and treatment suggestions worth considering.
To determine whether our PIAS can be used in clinical settings, whether it can assist first-line caregivers, and whether it can enhance overall treatment and care, further prospective trials are required. | en |
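The abstract above describes measuring wound area by combining LiDAR-derived 3D contour coordinates with the segmentation output, using Heron's formula for area estimation (per the thesis table of contents). A minimal sketch of that idea — with hypothetical helper names, not the thesis code — fans triangles out from one contour vertex and sums their Heron areas; this is exact only for planar, convex contours and is an approximation otherwise:

```python
import math

def triangle_area_heron(p1, p2, p3):
    """Area of a triangle given three 3D points, via Heron's formula."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    a, b, c = dist(p1, p2), dist(p2, p3), dist(p3, p1)
    s = (a + b + c) / 2  # semi-perimeter
    # Clamp the radicand at 0 to guard against floating-point noise
    return math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))

def polygon_area_3d(contour):
    """Approximate the area enclosed by a 3D contour (list of (x, y, z)
    points) by triangulating from the first vertex and summing Heron areas."""
    total = 0.0
    for i in range(1, len(contour) - 1):
        total += triangle_area_heron(contour[0], contour[i], contour[i + 1])
    return total
```

For example, a unit square contour `[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]` yields an area of 1.0.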
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-07-31T16:25:12Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2023-07-31T16:25:12Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Contents
致謝 (Acknowledgements) i
中文摘要 (Chinese abstract) ii
Abstract iv
List of Tables viii
List of Figures ix
List of Formulas x
Chapter 1. Introduction 1
1.1 Background 1
1.2 Telemedicine 1
1.2.1 Artificial Intelligence and Computer Vision 2
1.3 Objective 2
Chapter 2. Methodology 3
2.1 The Pressure-Injury Assessment System (PIAS) 3
2.2 Automatic Wound Segmentation 4
2.2.1 Prior works on wound segmentation with traditional methods 4
2.2.2 Prior works on wound segmentation with deep learning methods 5
2.2.3 Semantic segmentation: U-Net 7
2.2.4 Instance segmentation: Mask R-CNN 9
2.2.5 Loss function 11
2.2.6 Optimization 14
2.3 Automatic Wound Area Measurement 16
2.3.1 LiDAR 16
2.3.2 Real-world coordinate conversion 17
2.3.3 Area estimation by Heron's formula 18
2.4 Automatic Diagnosis and Treatment Suggestions 20
2.4.1 Prior work on automatic diagnosis and treatment suggestion 21
2.4.2 The algorithm 21
2.4.3 The erythema classification task 23
2.4.4 The necrotic tissue classification task 24
2.4.5 Convolutional Neural Networks (CNNs) 25
2.4.6 Loss function 28
2.4.7 Optimization 29
2.5 Data Collection, Labeling and Training of Deep Learning 31
2.5.1 Data acquisition 31
2.5.2 Data labeling 32
2.5.3 Pretraining process 34
2.5.4 Statistical analysis of internal and external validations 36
2.6.1 Performance evaluation 39
Chapter 3. Results 41
3.1 The Performance of Automatic Wound Segmentation 41
3.1.1 Internal validation 41
3.1.2 External validation 42
3.2 The Performance of Automatic Wound Area Measurement 43
3.3 The Performance of Automatic Wound Diagnosis and Treatment Suggestion 45
3.3.1 Comparison of the 4 classic architectures of CNN models 45
3.3.2 The erythema classification task 46
3.3.3 The necrotic tissue classification task 46
Chapter 4. Discussion 48
4.1 Principal Results 48
4.1.1 Automatic wound segmentation 48
4.1.2 Automatic wound area measurement 51
4.1.3 Classification tasks for automatic wound diagnosis 56
4.2 Limitation 59
4.3 Strength 59
4.4 Future Work 60
Chapter 5. Conclusion 62
References 63 | - |
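The English abstract reports segmentation performance as IoU (intersection over union, e.g. 0.7773 for U-Net versus 0.4604 for Mask R-CNN). As a reference for that metric — a generic sketch, not the thesis implementation — IoU for two same-sized binary masks can be computed as:

```python
def mask_iou(pred, target):
    """Intersection over Union for two binary masks given as
    same-sized 2D sequences of 0/1 values."""
    inter = union = 0
    for row_p, row_t in zip(pred, target):
        for p, t in zip(row_p, row_t):
            inter += 1 if (p and t) else 0
            union += 1 if (p or t) else 0
    # Two empty masks agree perfectly; define their IoU as 1.0
    return inter / union if union else 1.0
```

For example, `mask_iou([[1, 1], [0, 0]], [[1, 0], [0, 0]])` gives 0.5 (one overlapping pixel out of two in the union).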
| dc.language.iso | en | - |
| dc.subject | 遮罩型區域卷積神經網路 | zh_TW |
| dc.subject | 深度學習 | zh_TW |
| dc.subject | 雷射雷達技術 | zh_TW |
| dc.subject | 實例分割 | zh_TW |
| dc.subject | 卷積神經網路 | zh_TW |
| dc.subject | 褥瘡 | zh_TW |
| dc.subject | 語義分割 | zh_TW |
| dc.subject | Pressure injury | en |
| dc.subject | Deep learning | en |
| dc.subject | Convolutional neural network | en |
| dc.subject | U-Net | en |
| dc.subject | Light Detection and Ranging | en |
| dc.subject | LiDAR | en |
| dc.subject | Mask Region-based Convolution Neural Network | en |
| dc.subject | Mask-RCNN | en |
| dc.subject | Inception-ResNet-V2 | en |
| dc.title | 利用卷積神經網路建立褥瘡傷口分析系統 | zh_TW |
| dc.title | A Pressure-Injury Assessment System Using Convolutional Neural Networks | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 111-2 | - |
| dc.description.degree | 博士 (Doctoral) | - |
| dc.contributor.oralexamcommittee | 張瑞峰;傅楸善;洪一平;李明穗;戴浩志;王水深;廖弘源;莊仁輝 | zh_TW |
| dc.contributor.oralexamcommittee | Ruey-Feng Chang;Chiou-Shann Fuh;Yi-Ping Hung;Ming-Sui Lee;Hao-Chih Tai;Shoei-Shen Wang;Mark Liao;Jen-Hui Chuang | en |
| dc.subject.keyword | 深度學習,卷積神經網路,遮罩型區域卷積神經網路,語義分割,實例分割,褥瘡,雷射雷達技術 | zh_TW |
| dc.subject.keyword | Deep learning,Convolutional neural network,U-Net,Light Detection and Ranging,LiDAR,Mask Region-based Convolution Neural Network,Mask-RCNN,Inception-ResNet-V2,Pressure injury | en |
| dc.relation.page | 68 | - |
| dc.identifier.doi | 10.6342/NTU202300713 | - |
| dc.rights.note | Authorization granted (access restricted to campus) | - |
| dc.date.accepted | 2023-04-11 | - |
| dc.contributor.author-college | College of Electrical Engineering and Computer Science | - |
| dc.contributor.author-dept | Graduate Institute of Biomedical Electronics and Bioinformatics | - |
| Appears in Collections: | Graduate Institute of Biomedical Electronics and Bioinformatics | |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-111-2.pdf (access restricted to NTU campus IP addresses; off-campus users should connect via the VPN service) | 3.82 MB | Adobe PDF | |
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
