  1. NTU Theses and Dissertations Repository
  2. College of Electrical Engineering and Computer Science
  3. Graduate Institute of Biomedical Electronics and Bioinformatics
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85745

Full metadata record (DC Field: Value [Language])
dc.contributor.advisor: 賴飛羆 (Feipei Lai)
dc.contributor.author: Che-Wei Chang [en]
dc.contributor.author: 張哲瑋 [zh_TW]
dc.date.accessioned: 2023-03-19T23:23:06Z
dc.date.copyright: 2022-07-05
dc.date.issued: 2022
dc.date.submitted: 2022-05-26
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85745
dc.description.abstract: Machine learning is used in many medical fields, such as tumor detection, genetic analysis, and drug development, yet its application to wound diagnosis remains rare. The main reason is the limitations of wound photographs: unlike other medical images, they are not standardized and cannot be fully de-identified, and at capture time neither the patient's posture nor the imaging conditions (lighting and angle) can be precisely controlled. Traditional machine learning trained on very limited datasets therefore performs well in the literature but falls short of expectations in actual use. Compared with traditional machine learning, deep learning can take in all seemingly unfavorable information and still let the model diagnose accurately across diverse environments. For example, the luminance channel is often excluded in traditional machine learning so that good results can be obtained from a small amount of training data; a model trained this way, however, sees its predictions degrade sharply under even slightly poor lighting. For deep learning, these factors can all be included in training, and the more diverse the training scenarios, the better the model's judgment in real applications. Building a deep model, however, initially requires a large amount of systematically and correctly annotated data, and correct annotation of wounds is critical yet difficult. Different kinds of wounds have different diagnostic needs. For acute burns, the most important quantities are the burned percentage of total body surface area (%TBSA) and the deep burn areas that may require grafting and debridement; for chronic wounds, the composition of different tissues matters, as does the absolute wound area. These parameters are essentially extensions of two tasks: wound segmentation and tissue classification. Wound segmentation distinguishes wound regions from normal skin, from which the absolute or relative wound area can be computed. Tissue classification distinguishes the different tissues within the wound bed and around the wound to determine healing status; granulation is one tissue type in chronic wounds, and a deep burn can likewise be regarded as a tissue type within the larger burn area. The concept therefore applies to acute burns as well as chronic ulcers. Boundary-based labeling is used for wound borders and peri-wound tissues, while region-based labeling on superpixel-segmented areas is used for tissues inside the wound. This labeling approach works for both acute burns and chronic wounds, and different deep models trained on a database built with this systematic labeling method achieve good results in both wound segmentation and tissue classification. [zh_TW]
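The abstract frames the work as two segmentation tasks whose masks are turned into wound areas. Segmentation outputs of this kind are conventionally scored with overlap metrics such as the Dice coefficient; the sketch below is illustrative only and not taken from the thesis itself.

```python
import numpy as np

def dice(pred, gt):
    """Dice coefficient between two binary masks (1.0 = perfect overlap)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy masks on a 4x4 grid: each covers 8 pixels, 4 of which overlap.
a = np.zeros((4, 4), bool); a[:2, :] = True
b = np.zeros((4, 4), bool); b[1:3, :] = True
print(dice(a, b))  # 0.5
```

The same function works for wound-versus-skin masks or per-tissue masks, since all are binary arrays of the same shape.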
dc.description.abstract: Machine learning (ML) has been applied in many medical fields, such as tumor detection, genetic decoding, and drug development, but clinical applications to wound assessment remain scarce. The main reason is the limitations of wound images: unlike other medical images, they are neither standardized nor fully de-identifiable, and they are often taken under poor conditions (lighting or angle) when patients cannot hold their positions. Previous ML studies of wound assessment were usually trained on limited datasets; although the reported performance is satisfactory, real-world applications often show the opposite. Compared with traditional ML, deep learning (DL) can ingest all unfavorable factors rather than exclude them for the sake of better results. For example, the luminance component is usually eliminated to minimize the effect of lighting; this improves validation scores but makes the model difficult to apply under actual clinical conditions. Training DL models, however, requires a large number of labeled images, so a systematic method for processing and labeling datasets is crucial. Different types of wounds require different outputs. Assessment of acute burn wounds means calculating the percentage of total body surface area (%TBSA) burned and the deep burn area requiring debridement or grafting, whereas the evaluation of chronic ulcers focuses on tissue types and the absolute size of the wound. All of these parameters are extensions of two tasks: wound segmentation and tissue classification (segmentation). Wound segmentation differentiates wound areas from normal skin, and its results are converted into wound size. Tissue classification segments the different tissues inside and around the wound: granulation in a pressure ulcer is one tissue type, and non-perfused deep burn is likewise a tissue type within the whole burn area. These two concepts can be applied to any wound, from acute burns to chronic ulcers. In this study, boundary-based labeling is used for wound edges and peri-wound tissues, and region-based labeling with superpixel-segmentation pre-processing is applied to tissues inside the wound. High-quality, standardized datasets for both acute burn wounds and chronic ulcers can be built with this approach, and several DL models trained on these datasets achieve decent results in wound segmentation and tissue classification. [en]
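The region-based labeling described above uses superpixel pre-segmentation so that an annotator assigns one tissue class per region rather than per pixel. As a minimal dependency-free sketch of that idea, a regular grid partition stands in here for a real superpixel algorithm such as SLIC (the grid, sizes, and class codes are hypothetical, not from the thesis):

```python
import numpy as np

def grid_superpixels(h, w, cell=8):
    """Partition an h x w image into square regions (a stand-in for SLIC)."""
    rows = np.arange(h) // cell
    cols = np.arange(w) // cell
    n_cols = (w + cell - 1) // cell
    return rows[:, None] * n_cols + cols[None, :]

def label_regions(segments, pixel_labels):
    """Region-based labeling: every pixel in a region receives the region's
    majority tissue class, turning noisy per-pixel marks into clean regions."""
    out = np.empty_like(pixel_labels)
    for s in np.unique(segments):
        mask = segments == s
        vals, counts = np.unique(pixel_labels[mask], return_counts=True)
        out[mask] = vals[np.argmax(counts)]
    return out

segments = grid_superpixels(16, 16, cell=8)   # 4 square regions
noisy = np.zeros((16, 16), int)
noisy[:8, :8] = 1                             # top-left region: class 1 (e.g. deep burn)
noisy[0, 0] = 0                               # one mislabeled pixel
clean = label_regions(segments, noisy)
print(clean[0, 0])  # 1: the stray pixel adopts its region's majority class
```

With a true SLIC pre-segmentation the regions follow color and texture boundaries instead of a grid, but the per-region majority-labeling step is the same.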
dc.description.provenance: Made available in DSpace on 2023-03-19T23:23:06Z (GMT). No. of bitstreams: 1; U0001-1805202214432500.pdf: 6462372 bytes, checksum: 01941b339b82f85a4315bb68762626d1 (MD5). Previous issue date: 2022 [en]
dc.description.tableofcontents:
Oral defense committee certification #
Acknowledgements i
Abstract (Chinese) ii
ABSTRACT iii
LIST OF FIGURES vii
LIST OF TABLES xiii
Chapter 1 Introduction 1
1.1 Acute burn wounds 1
1.2 Chronic ulcers 3
Chapter 2. Related Works 5
2.1 Evolution of ML in burn wounds 5
2.1.1 Burn depth classification 7
2.1.2 Burn depth segmentation 12
2.1.3 Burn wound segmentation 15
2.2 Potential challenges 16
2.3 Chronic ulcers (pressure ulcers) 17
2.3.1 Wound segmentation 18
2.3.2 Tissue classification (segmentation) 19
Chapter 3. Research Methods 23
3.1 Output of computer vision 23
3.2 Image classification 25
3.3 Labeling and processing 30
3.4 Evaluation metrics 32
3.5 Deep learning models 34
3.6 Burn wound segmentation 38
3.7 Convert to %TBSA 42
3.8 Comparison study 49
3.9 Limitations 51
4.1 Chronic ulcers (pressure ulcers) 54
4.1.1 Boundary-based labeling 55
4.1.2 Different combinations of classes 59
4.1.3 Region-based labeling (superpixel segmentation) 63
4.1.4 Deep learning models 70
4.1.5 Ulceration and Re-ep segmentation 73
4.1.6 Inside wound tissues segmentation 74
4.1.7 Automatic diagnosis of pressure ulcers 77
4.2 Acute burn wounds 82
4.2.1 Boundary-based labeling (total burn & palm) 83
4.2.2 Region-based labeling (deep burn area) 84
4.2.3 Total burn wound segmentation 86
4.2.4 Palm segmentation 88
4.2.5 Deep burn segmentation 90
4.2.6 Automatic diagnosis of acute burn wounds 92
Chapter 5. Conclusion and Future work 96
5.1 Other types of chronic ulcers 98
5.1.1 Tissues outside ulcerations 100
5.1.2 Transfer learning 102
5.2 3D measure & 3D segmentation 105
Reference 107
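The outline's Section 3.7 ("Convert to %TBSA") turns segmentation output into a burned percentage of total body surface area. One plausible calibration, sketched here under assumptions not stated in this record, uses the patient's own palm, segmented in the same photograph, as an in-image scale reference, with the palm taken as roughly 0.5% of TBSA per the hand-surface-area literature; the thesis may use a different constant or method.

```python
import numpy as np

def burn_tbsa_percent(burn_mask, palm_mask, palm_tbsa=0.5):
    """Estimate burned %TBSA from two binary masks in the same photo.

    If the palm covers `palm_tbsa` percent of total body surface area,
    the burn's %TBSA is the burn-to-palm pixel ratio times that constant.
    """
    palm_px = np.count_nonzero(palm_mask)
    if palm_px == 0:
        raise ValueError("palm mask is empty; cannot calibrate")
    return np.count_nonzero(burn_mask) / palm_px * palm_tbsa

# Toy example: a burn region four times the palm area.
burn = np.zeros((100, 100), bool); burn[:40, :10] = True   # 400 px
palm = np.zeros((100, 100), bool); palm[:10, :10] = True   # 100 px
print(round(burn_tbsa_percent(burn, palm), 2))  # 2.0
```

Because both masks come from the same photograph, camera distance and focal length cancel out of the ratio, which is what makes a palm-based reference attractive when no absolute scale is available.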
dc.language.iso: en
dc.subject: 急性燙傷 (acute burns) [zh_TW]
dc.subject: 壓瘡 (pressure ulcers) [zh_TW]
dc.subject: 慢性傷口 (chronic wounds) [zh_TW]
dc.subject: 深度學習 (deep learning) [zh_TW]
dc.subject: 傷口分割 (wound segmentation) [zh_TW]
dc.subject: 組織辨識 (tissue classification) [zh_TW]
dc.subject: 超級像素分割 (superpixel segmentation) [zh_TW]
dc.subject: Chronic ulcers [en]
dc.subject: Acute burn wounds [en]
dc.subject: Tissue classification [en]
dc.subject: Superpixel segmentation [en]
dc.subject: Deep learning [en]
dc.subject: Wound Segmentation [en]
dc.title: 深度學習用於傷口診斷:從急性燙傷到慢性傷口 [zh_TW]
dc.title: Deep Learning Assisted Wound Diagnoses: From Acute Burn Wounds to Chronic Ulcers [en]
dc.type: Thesis
dc.date.schoolyear: 110-2
dc.description.degree: Doctoral (博士)
dc.contributor.author-orcid: 0000-0002-5864-017X
dc.contributor.advisor-orcid: 賴飛羆 (0000-0001-7147-8122)
dc.contributor.oralexamcommittee: 戴浩志 (Hao-Chih Tai), 趙坤茂 (Kun-Mao Chao), 蔡坤霖 (Kun-Lin Tsai), 林永松 (Yeong-Sung Lin), 王水深 (Shoei-Shen Wang), 邱冠明 (Kuan-Ming Chiu), 李源德 (Yuan-Teh Lee)
dc.subject.keyword: 急性燙傷, 壓瘡, 慢性傷口, 深度學習, 傷口分割, 組織辨識, 超級像素分割 [zh_TW]
dc.subject.keyword: Acute burn wounds, Chronic ulcers, Deep learning, Wound Segmentation, Tissue classification, Superpixel segmentation [en]
dc.relation.page: 123
dc.identifier.doi: 10.6342/NTU202200776
dc.rights.note: Authorized (open access worldwide)
dc.date.accepted: 2022-05-27
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 生醫電子與資訊學研究所 (Graduate Institute of Biomedical Electronics and Bioinformatics) [zh_TW]
dc.date.embargo-lift: 2022-12-31
Appears in Collections: Graduate Institute of Biomedical Electronics and Bioinformatics

Files in This Item:
File | Size | Format
U0001-1805202214432500.pdf | 6.31 MB | Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
