Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/73764

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 張瑞峰 | |
| dc.contributor.author | Li Lin | en |
| dc.contributor.author | 林立 | zh_TW |
| dc.date.accessioned | 2021-06-17T08:09:42Z | - |
| dc.date.available | 2024-08-19 | |
| dc.date.copyright | 2019-08-19 | |
| dc.date.issued | 2019 | |
| dc.date.submitted | 2019-08-16 | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/73764 | - |
| dc.description.abstract | 乳癌是女性中最常見的癌症,藉由早期偵測可以降低乳癌的致死率。超音波是一種常見的早期偵測方法,放射科醫生分析超音波影像來撰寫成報告,並根據報告來決定病人是否要進行更進一步的檢查。然而對於放射科醫生來說,撰寫報告除了要具備對於乳癌方面的基本知識,也必須要有分析超音波影像的能力。同時,撰寫報告是一件枯燥且耗時的事情。本研究提出了一個自動報告產生系統來幫助醫生分析影像與完成報告。首先,我們使用了PSPNet的切割方法從超音波影像提取腫瘤區域並使用dense CRFs來進行後處理。接著我們使用深度學習模型、機器學習的分類器與集成學習的模型來進行腫瘤特徵的預測。最後,我們使用了這些預測結果來產生報告。在這個實驗中,利用了318個腫瘤來測試我們提出的方法。由實驗結果可知,由平均集成學習生成的模型在不同的腫瘤特徵分類上有最好的結果,其形狀、平行度、邊界特性、均質度、後方區域特性準確度分別為85.85% (273/318)、83.02% (264/318)、80.19% (255/318)、78.62% (250/318)、87.11% (277/318)。 | zh_TW |
| dc.description.abstract | Breast cancer is the most common cancer among women, and early detection can reduce its mortality rate. Ultrasound is often used for early detection: radiologists analyze the ultrasound images to write medical reports, which are then used to decide whether further examinations are needed. For radiologists, however, writing these reports requires domain knowledge of the breast and skill in ultrasound image analysis, and the task is tedious and time-consuming. In this study, an automatic reporting system was proposed to assist radiologists in analyzing ultrasound images and writing reports. First, the tumor region was extracted by a pyramid scene parsing network (PSPNet) segmentation model and post-processed with dense conditional random fields (CRFs). Second, a deep learning (DL) model and machine learning (ML) classifiers were applied to predict the BI-RADS lexicons of the tumor, and an ensemble method was used to combine the DL model and the ML classifiers. Finally, the predicted lexicons were used to generate the medical imaging report. A total of 318 tumors with annotated ultrasound lexicons were used to evaluate the proposed method. According to the experimental results, the ensemble method using the averaging strategy to combine the DL model and the ML classifiers achieved the best lexicon prediction performance, with accuracies for shape, orientation, margin, heterogeneity, and posterior features of 85.85% (273/318), 83.02% (264/318), 80.19% (255/318), 78.62% (250/318), and 87.11% (277/318), respectively. (An illustrative sketch of the averaging ensemble and report template appears after the metadata table below.) | en |
| dc.description.provenance | Made available in DSpace on 2021-06-17T08:09:42Z (GMT). No. of bitstreams: 1 ntu-108-R06922124-1.pdf: 1538483 bytes, checksum: 8f89dff5f2d8fec22802f6d92d374c1b (MD5) Previous issue date: 2019 | en |
| dc.description.tableofcontents | Committee Approval Certificate i; Acknowledgements ii; Chinese Abstract iii; Abstract iv; Table of Contents vi; List of Figures viii; List of Tables x; Chapter 1. Introduction 1; Chapter 2. Material 3; Chapter 3. Methods 5; 3.1 Tumor segmentation 6; 3.1.1 PSPNet 6; 3.1.2 PSPNet training details 6; 3.1.3 Dense CRFs 8; 3.2 Lexicon prediction 9; 3.2.1 BI-RADS Lexicon 9; 3.2.2 ML classifiers 15; 3.2.3 The DL model 18; 3.2.4 Ensemble method 21; 3.3 Automatic report generation 21; Chapter 4. Experiment Result 23; 4.1 Comparison of the DL model prediction with/without tumor segmentation 23; 4.2 Comparison of different classifiers 26; 4.2.1 Shape 26; 4.2.2 Orientation 29; 4.2.3 Margin 32; 4.2.4 Heterogeneity 35; 4.2.5 Posterior features 38; 4.3 Comparison of ensemble results 41; Chapter 5. Discussion and Conclusion 44; References 47 | |
| dc.language.iso | en | |
| dc.subject | 乳癌 | zh_TW |
| dc.subject | 醫學影像報告 | zh_TW |
| dc.subject | 深度學習 | zh_TW |
| dc.subject | BI-RADS 字典 | zh_TW |
| dc.subject | 超音波影像 | zh_TW |
| dc.subject | Xception | zh_TW |
| dc.subject | PSPNet | zh_TW |
| dc.subject | breast cancer | en |
| dc.subject | deep learning | en |
| dc.subject | Xception | en |
| dc.subject | PSPNet | en |
| dc.subject | ultrasound images | en |
| dc.subject | BI-RADS lexicon | en |
| dc.subject | medical imaging reports | en |
| dc.title | 基於深度學習於2-D乳房超音波報告自動產生系統 | zh_TW |
| dc.title | Automatic Reporting System for 2-D Ultrasound Images Using Deep Learning | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 107-2 | |
| dc.description.degree | 碩士 (Master's) | |
| dc.contributor.oralexamcommittee | 羅崇銘,陳鴻豪 | |
| dc.subject.keyword | 乳癌,醫學影像報告,BI-RADS 字典,超音波影像,PSPNet,Xception,深度學習, | zh_TW |
| dc.subject.keyword | breast cancer,medical imaging reports,BI-RADS lexicon,ultrasound images,PSPNet,Xception,deep learning, | en |
| dc.relation.page | 50 | |
| dc.identifier.doi | 10.6342/NTU201903872 | |
| dc.rights.note | 有償授權 (paid authorization) | |
| dc.date.accepted | 2019-08-16 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
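
The abstract describes an ensemble step that averages the class probabilities produced by the DL model and the ML classifiers for each BI-RADS lexicon, followed by a step that turns the predicted lexicons into a report. The sketch below is a minimal, hypothetical illustration of those two steps only; the lexicon value lists, function names, and sentence template are assumptions for illustration, not the thesis implementation.

```python
# Hypothetical sketch (not the authors' code): averaging ensemble over the
# DL and ML class probabilities for each BI-RADS lexicon, followed by a
# simple template-based report. Lexicon values and the sentence template
# are illustrative assumptions.
import numpy as np

# Assumed BI-RADS lexicon categories and candidate values.
LEXICONS = {
    "shape": ["oval", "round", "irregular"],
    "orientation": ["parallel", "not parallel"],
    "margin": ["circumscribed", "not circumscribed"],
    "heterogeneity": ["homogeneous", "heterogeneous"],
    "posterior_features": ["no posterior acoustic features",
                           "posterior enhancement",
                           "posterior shadowing"],
}


def ensemble_average(prob_dl: np.ndarray, prob_ml: np.ndarray) -> int:
    """Average the two class-probability vectors and return the index of
    the most probable class (the averaging strategy named in the abstract)."""
    avg = (prob_dl + prob_ml) / 2.0
    return int(np.argmax(avg))


def generate_report(lexicons: dict) -> str:
    """Fill a simple sentence template with the predicted lexicon values."""
    return (
        f"The mass is {lexicons['shape']} in shape, "
        f"{lexicons['orientation']} to the skin line, "
        f"with a {lexicons['margin']} margin, "
        f"a {lexicons['heterogeneity']} echo pattern, "
        f"and {lexicons['posterior_features']}."
    )


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    predicted = {}
    for name, values in LEXICONS.items():
        # Stand-ins for one tumor: the DL model's softmax output and an
        # ML classifier's class probabilities over the same value set.
        p_dl = rng.dirichlet(np.ones(len(values)))
        p_ml = rng.dirichlet(np.ones(len(values)))
        predicted[name] = values[ensemble_average(p_dl, p_ml)]
    print(generate_report(predicted))
```

In practice, the random stand-in probabilities would be replaced by the DL model's softmax output and an ML classifier's predicted class probabilities (e.g., scikit-learn's `predict_proba`) for the same tumor.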
| Appears in Collections: | 資訊工程學系 |
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-108-1.pdf (Restricted Access) | 1.5 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
