NTU Theses and Dissertations Repository › 電機資訊學院 › 資訊網路與多媒體研究所
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/59883
Full metadata record
DC Field: Value [Language]
dc.contributor.advisor: 張瑞峰(Ruey-Feng Chang)
dc.contributor.author: Yi-Tzun Lai [en]
dc.contributor.author: 賴以尊 [zh_TW]
dc.date.accessioned: 2021-06-16T09:43:03Z
dc.date.available: 2023-09-01
dc.date.copyright: 2020-09-17
dc.date.issued: 2020
dc.date.submitted: 2020-08-14
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/59883
dc.description.abstract: 近年來乳癌已成為女性最常見的疾病之一。透過早期的偵測、診斷和治療可以顯著地改善死亡率。在乳癌的檢測中,全自動乳房超音波系統(Automated Breast Ultrasound System, ABUS)因其可以提供完整的乳房三維影像資訊,逐漸被採用於乳癌檢測。然而,即使是經驗豐富的醫生在檢查超音波圖像時也容易受到腫瘤形狀和內部紋理的影響導致初步的病灶誤判,為了幫助解決這個問題,基於卷積神經網路(Convolutional Neural Network, CNN)開發的電腦輔助診斷系統(Computer-Aided Diagnosis system)應運而生。本研究提出一個包含三維腫瘤切割網路和三維腫瘤分類網路的電腦輔助診斷系統。首先,從原始ABUS影像中提取含有紋理資訊的腫瘤區域;接著,通過融合了殘差模塊(Residual Block)和巢狀U型網路(Nested U-Net)的腫瘤切割模型產生含有形狀資訊的腫瘤遮罩;最後,將腫瘤遮罩和腫瘤區域同時放入腫瘤分類網路,取出形狀和紋理特徵以進行腫瘤良惡性的判斷。在腫瘤分類網路中,使用了具有八度卷積以及擠壓和激發模組的匯總殘差網路瓶頸塊(Bottleneck Block from ResNeXt)作為基本結構來構建我們的模型,以獲得更準確的結果。本研究中總共使用了403顆腫瘤,其中包含199顆良性腫瘤和204顆惡性腫瘤。實驗結果顯示,所提出的系統能達到準確率88.6%、靈敏性90.6%、特異性86.9%和ROC曲線下面積0.9333的成果,這樣的表現在臨床上與擁有三年ABUS經驗的醫生相同,顯示提出的系統有足夠的能力進行腫瘤良惡性的預測。 [zh_TW]
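The abstract mentions a squeeze-and-excitation (SE) module inside the classification network's bottleneck blocks. As a rough, NumPy-only sketch of what SE channel gating does (this is not the thesis's actual 3-D implementation; the weights, channel count, and reduction ratio below are hypothetical):

```python
import numpy as np

def squeeze_excite(feat, w1, w2):
    """Channel-wise squeeze-and-excitation gate (after Hu et al.).
    feat: (C, D, H, W) feature volume; w1: (C//r, C); w2: (C, C//r)."""
    z = feat.mean(axis=(1, 2, 3))            # squeeze: global average pool -> (C,)
    s = np.maximum(w1 @ z, 0.0)              # excitation FC1 + ReLU -> (C//r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ s)))      # excitation FC2 + sigmoid gate -> (C,)
    return feat * g[:, None, None, None]     # recalibrate each channel

# Hypothetical shapes and random weights, for illustration only.
rng = np.random.default_rng(0)
C, r = 8, 2
feat = rng.standard_normal((C, 4, 4, 4))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
out = squeeze_excite(feat, w1, w2)
assert out.shape == feat.shape
```

Because the gate values lie in (0, 1), each channel is attenuated rather than amplified; the learned weights decide which channels the network emphasizes.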
dc.description.abstract: In recent years, breast cancer has become one of the most common diseases in women. Through early detection, diagnosis, and treatment, the mortality rate can be significantly reduced. In breast cancer examination, the Automated Breast Ultrasound System (ABUS) has been gradually adopted because it provides complete information by recording the whole breast as a three-dimensional (3-D) image. However, even an experienced physician may be influenced by the shape and internal texture of a tumor while reviewing ABUS images and misjudge the lesion. To address this problem, computer-aided diagnosis (CADx) systems based on convolutional neural networks (CNNs) have been developed. In this study, a CADx system consisting of a 3-D tumor segmentation model and a 3-D tumor classification model was proposed for tumor diagnosis. First, the tumor region with texture information was extracted from the original ABUS image. Then, a tumor mask containing shape information was generated by the 3-D tumor segmentation model, which fuses the residual block and U-net++. Finally, the tumor region and the corresponding tumor mask were both fed into the tumor classification model to extract shape and texture feature maps for classifying the tumor as benign or malignant. In the tumor classification model, the bottleneck block from ResNeXt, combined with octave convolution and the squeeze-and-excitation module, was used as the basic structure. A total of 403 tumors, including 199 benign and 204 malignant tumors, were used to evaluate the performance of the proposed system. The experimental results showed that the proposed system achieved 88.6% accuracy, 90.6% sensitivity, 86.9% specificity, and 0.9333 area under the ROC curve. This performance was clinically equivalent to that of a physician with three years of ABUS experience, showing that the proposed system has sufficient ability to predict whether a tumor is benign or malignant. [en]
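To make the reported metrics concrete, here is the standard confusion-matrix arithmetic behind accuracy, sensitivity, and specificity. The per-class counts below are hypothetical, chosen only to roughly match the reported percentages over the 199 benign and 204 malignant tumors; the record itself does not publish a confusion matrix.

```python
# Hypothetical confusion-matrix counts, for illustrating the formulas only.
tp, fn = 185, 19   # assumed split of the 204 malignant tumors (positives)
tn, fp = 173, 26   # assumed split of the 199 benign tumors (negatives)

sensitivity = tp / (tp + fn)                  # true-positive rate over malignant cases
specificity = tn / (tn + fp)                  # true-negative rate over benign cases
accuracy = (tp + tn) / (tp + tn + fp + fn)    # correct predictions over all 403 tumors

print(f"sens={sensitivity:.3f} spec={specificity:.3f} acc={accuracy:.3f}")
```

With these assumed counts the formulas give values close to, but not exactly, the reported figures, which is expected if the thesis averages results over cross-validation folds.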
dc.description.provenance: Made available in DSpace on 2021-06-16T09:43:03Z (GMT). No. of bitstreams: 1
U0001-1308202016175700.pdf: 2344663 bytes, checksum: 0486b97b44536a12f5d17f20e5b951ec (MD5)
Previous issue date: 2020 [en]
dc.description.tableofcontents: Thesis Committee Approval i
Acknowledgements ii
Chinese Abstract iii
Abstract v
Table of Contents vii
List of Figures viii
List of Tables ix
Chapter 1. Introduction 1
Chapter 2. Materials 5
2.1. Patient 5
2.2. ABUS Imaging 5
Chapter 3. Method 7
3.1. VOI Extraction 8
3.2. Tumor Segmentation 8
3.2.1 3-D Res-U-net++ 9
3.2.2 Loss Function 12
3.3. Tumor Classification 14
3.3.1 3-D SE-Octave-ResNeXt 14
Chapter 4. Experiment Results and Discussions 20
4.1. Experiment Environment 20
4.2. Experiment Results 20
4.2.1 Model Comparisons 21
4.2.2 Model Comparisons with Different Inputs 31
4.3. Discussions 33
Chapter 5. Conclusion 38
Reference 39
dc.language.iso: en
dc.subject: 巢狀U型網路 [zh_TW]
dc.subject: 殘差網路 [zh_TW]
dc.subject: 電腦輔助診斷 [zh_TW]
dc.subject: 三維卷積神經網路 [zh_TW]
dc.subject: 全自動乳房超音波 [zh_TW]
dc.subject: 乳癌 [zh_TW]
dc.subject: 分組卷積 [zh_TW]
dc.subject: 八度卷積 [zh_TW]
dc.subject: 擠壓和激發模組 [zh_TW]
dc.subject: automated breast ultrasound [en]
dc.subject: Breast cancer [en]
dc.subject: 3-D convolutional neural network [en]
dc.subject: computer-aided diagnosis [en]
dc.subject: residual network [en]
dc.subject: nested U-net [en]
dc.subject: octave convolution [en]
dc.subject: squeeze-and-excitation module [en]
dc.title: 3-D匯總八度卷積神經網路使用於自動乳房超音波電腦輔助腫瘤診斷 [zh_TW]
dc.title: Automated Breast Ultrasound for Computer-Aided Tumor Diagnosis Using 3-D Aggregated Octave Convolutional Neural Network [en]
dc.type: Thesis
dc.date.schoolyear: 108-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 羅崇銘(Chung-Ming Lo), 陳鴻豪(Hong-Hao Chen)
dc.subject.keyword: 乳癌, 全自動乳房超音波, 三維卷積神經網路, 電腦輔助診斷, 殘差網路, 巢狀U型網路, 分組卷積, 八度卷積, 擠壓和激發模組 [zh_TW]
dc.subject.keyword: Breast cancer, automated breast ultrasound, 3-D convolutional neural network, computer-aided diagnosis, residual network, nested U-net, octave convolution, squeeze-and-excitation module [en]
dc.relation.page: 44
dc.identifier.doi: 10.6342/NTU202003287
dc.rights.note: 有償授權
dc.date.accepted: 2020-08-14
dc.contributor.author-college: 電機資訊學院 [zh_TW]
dc.contributor.author-dept: 資訊網路與多媒體研究所 [zh_TW]
Appears in Collections: 資訊網路與多媒體研究所

Files in This Item:
File | Size | Format
U0001-1308202016175700.pdf (restricted, not authorized for public access) | 2.29 MB | Adobe PDF


All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
