Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/71465

Full metadata record (DC field: value [language])
dc.contributor.advisor: 張瑞峰
dc.contributor.author: CHIN-HUA HSU [en]
dc.contributor.author: 許晉華 [zh_TW]
dc.date.accessioned: 2021-06-17T06:01:13Z
dc.date.available: 2022-02-14
dc.date.copyright: 2019-02-14
dc.date.issued: 2018
dc.date.submitted: 2019-02-01
dc.identifier.citation: [1] L. A. Torre et al., 'Ovarian cancer statistics, 2018,' CA: a cancer journal for clinicians, 2018.
[2] A. Migowski, 'Early detection of breast cancer and the interpretation of results of survival studies,' Ciencia & saude coletiva, vol. 20, no. 4, pp. 1309-1309, 2015.
[3] M. S. Islam, N. Kaabouch, and W.-C. Hu, 'A survey of medical imaging techniques used for breast cancer detection,' in EIT, 2013, pp. 1-5.
[4] R. J. Hooley, L. M. Scoutt, and L. E. Philpotts, 'Breast ultrasonography: state of the art,' Radiology, vol. 268, no. 3, pp. 642-659, 2013.
[5] A. Vourtsis and A. Kachulis, 'The performance of 3D ABUS versus HHUS in the visualisation and BI-RADS characterisation of breast lesions in a large cohort of 1,886 women,' European radiology, vol. 28, no. 2, pp. 592-601, 2018.
[6] H. J. Shin, H. H. Kim, and J. H. Cha, 'Current status of automated breast ultrasonography,' Ultrasonography, vol. 34, no. 3, p. 165, 2015.
[7] J. H. Kim et al., 'Computer-aided detection system for masses in automated whole breast ultrasonography: development and evaluation of the effectiveness,' Ultrasonography, vol. 33, no. 2, p. 105, 2014.
[8] C.-M. Lo et al., 'Multi-dimensional tumor detection in automated whole breast ultrasound using topographic watershed,' IEEE transactions on medical imaging, vol. 33, no. 7, pp. 1503-1511, 2014.
[9] W. K. Moon, Y.-W. Shen, M. S. Bae, C.-S. Huang, J.-H. Chen, and R.-F. Chang, 'Computer-aided tumor detection based on multi-scale blob detection algorithm in automated breast ultrasound images,' IEEE transactions on medical imaging, vol. 32, no. 7, pp. 1191-1200, 2013.
[10] W. Rawat and Z. Wang, 'Deep convolutional neural networks for image classification: A comprehensive review,' Neural computation, vol. 29, no. 9, pp. 2352-2449, 2017.
[11] S. Ramachandran, J. George, S. Skaria, and V. Varun, 'Using YOLO based deep learning network for real time detection and localization of lung nodules from low dose CT scans,' in Medical Imaging 2018: Computer-Aided Diagnosis, 2018, vol. 10575, p. 105751I: International Society for Optics and Photonics.
[12] J. Long, E. Shelhamer, and T. Darrell, 'Fully convolutional networks for semantic segmentation,' in Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 3431-3440.
[13] A. Krizhevsky, I. Sutskever, and G. E. Hinton, 'ImageNet classification with deep convolutional neural networks,' in Advances in neural information processing systems, 2012, pp. 1097-1105.
[14] S. Liu and W. Deng, 'Very deep convolutional neural network based image classification using small training sample size,' in 2015 3rd IAPR Asian Conference on Pattern Recognition (ACPR), 2015, pp. 730-734.
[15] K. He, X. Zhang, S. Ren, and J. Sun, 'Deep residual learning for image recognition,' in Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 770-778.
[16] G. Huang, Z. Liu, K. Q. Weinberger, and L. van der Maaten, 'Densely connected convolutional networks,' in Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, vol. 1, no. 2, p. 3.
[17] M. H. Yap et al., 'Automated breast ultrasound lesions detection using convolutional neural networks,' IEEE journal of biomedical and health informatics, 2017.
[18] A. A. A. Setio et al., 'Pulmonary nodule detection in CT images: false positive reduction using multi-view convolutional networks,' IEEE transactions on medical imaging, vol. 35, no. 5, pp. 1160-1169, 2016.
[19] Z.-H. Zhou, Ensemble methods: foundations and algorithms. CRC press, 2012.
[20] K. U. Sharma and N. V. Thakur, 'A review and an approach for object detection in images,' Int. J. Comput. Vision Robot., vol. 7, no. 1/2, pp. 196-237, 2017.
[21] C. Szegedy et al., 'Going deeper with convolutions,' in Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 1-9.
[22] A. L. Maas, A. Y. Hannun, and A. Y. Ng, 'Rectifier nonlinearities improve neural network acoustic models,' in Proc. ICML, 2013, vol. 30, no. 1, p. 3.
[23] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, 'Dropout: A simple way to prevent neural networks from overfitting,' The Journal of Machine Learning Research, vol. 15, no. 1, pp. 1929-1958, 2014.
[24] R. K. Srivastava, K. Greff, and J. Schmidhuber, 'Training very deep networks,' in Advances in neural information processing systems, 2015, pp. 2377-2385.
[25] G. Larsson, M. Maire, and G. Shakhnarovich, 'FractalNet: Ultra-Deep Neural Networks without Residuals,' CoRR, vol. abs/1605.07648, 2016.
[26] T. Y. Lin, P. Goyal, R. Girshick, K. He, and P. Dollár, 'Focal Loss for Dense Object Detection,' in 2017 IEEE International Conference on Computer Vision (ICCV), 2017, pp. 2999-3007.
[27] D. P. Kingma and J. L. Ba, 'Adam: A method for stochastic optimization,' in Proc. of ICLR, 2015.
[28] M. D. Zeiler, 'ADADELTA: An adaptive learning rate method,' arXiv e-prints, vol. 1212, Dec. 2012.
[29] H. Li, Z. Lin, X. Shen, J. Brandt, and G. Hua, 'A convolutional neural network cascade for face detection,' in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 5325-5334.
[30] A. Kumar, J. Kim, D. Lyndon, M. Fulham, and D. Feng, 'An ensemble of fine-tuned convolutional neural networks for medical image classification,' IEEE journal of biomedical and health informatics, vol. 21, no. 1, pp. 31-40, 2017.
[31] T. G. Dietterich, 'Ensemble learning,' The handbook of brain theory and neural networks, vol. 2, pp. 110-125, 2002.
[32] W. H. Day and H. Edelsbrunner, 'Efficient algorithms for agglomerative hierarchical clustering methods,' Journal of classification, vol. 1, no. 1, pp. 7-24, 1984.
[33] S. Lloyd, 'Least squares quantization in PCM,' IEEE transactions on information theory, vol. 28, no. 2, pp. 129-137, 1982.
[34] R. Sibson, 'SLINK: an optimally efficient algorithm for the single-link cluster method,' The computer journal, vol. 16, no. 1, pp. 30-34, 1973.
[35] D. P. Chakraborty, 'Maximum likelihood analysis of free‐response receiver operating characteristic (FROC) data,' Medical physics, vol. 16, no. 4, pp. 561-568, 1989.
[36] R. N. Strickland, Image-processing techniques for tumor detection. Boca Raton: CRC Press, 2002.
[37] W. Liu et al., 'SSD: Single shot multibox detector,' in European conference on computer vision, 2016, pp. 21-37: Springer.
[38] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, 'You only look once: Unified, real-time object detection,' in Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 779-788.
[39] K. He, G. Gkioxari, P. Dollár, and R. Girshick, 'Mask R-CNN,' in 2017 IEEE International Conference on Computer Vision (ICCV), 2017, pp. 2980-2988.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/71465
dc.description.abstract: Breast cancer is one of the most common cancers in women. Previous studies have shown that early detection can reduce breast cancer mortality, and ultrasound is one of the main breast cancer screening tools. Automated breast ultrasound scans 3-D images of the whole breast with a large transducer, and a computer-aided detection system is used alongside it to help physicians interpret the ultrasound images and screen for breast cancer. In recent years, convolutional neural networks have been widely applied across many fields; accordingly, this study proposes a computer-aided detection system based on convolutional neural networks to move beyond the limitations of earlier systems that rely on hand-crafted features.
In this study, a sliding-window approach is first used to extract volumes of interest from the 3-D ultrasound images. The extracted volumes are then fed into a 3-D ensemble convolutional neural network, which learns image features and estimates the probability that each volume contains a tumor. Finally, image post-processing is added to avoid excessive overlap among tumor candidates. The dataset contains 246 cases, divided into training, validation, and testing sets; the testing set, consisting of 81 cases and 473 passes, is used to evaluate the performance of the proposed system. The experiments show sensitivities of 100% (81/81), 95.3% (77/81), and 90.9% (74/81) at false-positive rates (per pass / per case) of 21.6/126.2, 6.0/34.8, and 4.6/27.1. In addition, the execution time is 28.3 seconds per pass and 174.6 seconds per case. In conclusion, applying 3-D convolutional neural networks to a computer-aided detection system effectively improves overall detection performance and reduces execution time. [zh_TW]
(A minimal illustrative sketch of this sliding-window pipeline is given after the metadata record below.)
dc.description.abstract: Breast cancer is one of the most common cancers in women. According to previous studies, detection at an early stage can help reduce breast cancer mortality. Automated breast ultrasound (ABUS), which uses a large transducer to record whole-breast images and provides a large 3-D volume of the breast, is useful for breast cancer examination. In addition, computer-aided detection (CADe) systems have been introduced to help radiologists review and diagnose ABUS images. Building on the success of convolutional neural networks (CNNs) in image tasks, a CADe system using a 3-D CNN is proposed to overcome the limitation of conventional CADe systems, which require hand-crafted features. In this study, a sliding-window-based volume of interest (VOI) extraction approach is adopted. The extracted VOIs are then used as the input of a 3-D ensemble CNN that estimates the tumor likelihood. Finally, a post-processing method is employed to resolve overlapping tumor candidates. The dataset contains 246 cases and is divided into training, validation, and testing sets. The proposed CADe system is evaluated on the testing set, which consists of 81 cases with 473 passes and 104 tumors. This study achieves sensitivities of 100% (81/81), 95.3% (77/81), and 90.9% (74/81) with false positives per pass/per case of 21.6/126.2, 6.0/34.8, and 4.6/27.1, respectively. In addition, the execution time is 28.3 seconds per pass and 174.6 seconds per case. In conclusion, the proposed CADe system using a 3-D CNN is much more time efficient and achieves better performance than previous works. [en]
(A rough sketch of how such sensitivity and false-positive figures are tallied is given after the metadata record below.)
dc.description.provenance: Made available in DSpace on 2021-06-17T06:01:13Z (GMT). No. of bitstreams: 1; ntu-107-R05922102-1.pdf: 1918064 bytes, checksum: 04bf53038c9374b7f598f5e9ee7bb72f (MD5). Previous issue date: 2018. [en]
dc.description.tableofcontents: Oral Examination Committee Approval ii
Acknowledgements iii
Abstract (Chinese) iv
Abstract v
Table of Contents vi
List of Figures vii
List of Tables viii
Chapter 1. Introduction 1
Chapter 2. Materials 4
Chapter 3. Methods 7
3.1 VOI Extraction 8
3.2 3-D Tumor Detection CNN 9
3.2.1. 3-D VGG-16 CNN 10
3.2.2. 3-D DenseNet 12
3.2.3. Focal Loss 15
3.2.4. 3-D CNNs Training 16
3.2.5. Ensemble Method 19
3.3 Post-processing 20
3.3.1. Hierarchical Clustering (HC) 20
3.3.2. Merge of tumor candidates 21
Chapter 4. Experiment Results and Discussions 23
4.1 Experiment Environment 23
4.2 Evaluation 23
4.3 Experiment Results 24
4.4 Discussions 31
Chapter 5. Conclusions and Future Works 35
References 36
dc.language.iso: en
dc.subject: 乳癌 (breast cancer) [zh_TW]
dc.subject: 電腦輔助診斷系統 (computer-aided detection system) [zh_TW]
dc.subject: 腫瘤 (tumor) [zh_TW]
dc.subject: 三維卷積類神經網路 (3-D convolutional neural network) [zh_TW]
dc.subject: 深度學習 (deep learning) [zh_TW]
dc.subject: Deep learning [en]
dc.subject: Breast cancer [en]
dc.subject: Convolutional neural network [en]
dc.subject: CADe [en]
dc.subject: ABUS [en]
dc.title: 使用三維卷積神經網路於全乳房自動超音波腫瘤偵測 (Tumor Detection in Automated Whole Breast Ultrasound Using 3-D Convolutional Neural Networks) [zh_TW]
dc.title: Tumor Detection in Automated Breast Ultrasound using 3-D CNN [en]
dc.type: Thesis
dc.date.schoolyear: 107-1
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 羅崇銘, 陳啟禎
dc.subject.keyword: 深度學習, 乳癌, 腫瘤, 三維卷積類神經網路, 電腦輔助診斷系統 (deep learning, breast cancer, tumor, 3-D convolutional neural network, computer-aided detection system) [zh_TW]
dc.subject.keyword: Deep learning, Breast cancer, Convolutional neural network, CADe, ABUS [en]
dc.relation.page: 39
dc.identifier.doi: 10.6342/NTU201900374
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2019-02-12
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
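The abstracts above describe a sliding-window extraction of volumes of interest (VOIs) from each ABUS volume, with every VOI scored by an ensemble of 3-D CNNs. The snippet below is only a minimal NumPy sketch of that general idea, not the thesis implementation; the window size, the stride, and the function names `extract_vois` and `ensemble_likelihood` are illustrative assumptions.

```python
import numpy as np

def extract_vois(volume, window=(32, 32, 32), stride=(16, 16, 16)):
    """Slide a fixed-size 3-D window over an ABUS volume and yield each
    candidate VOI together with its starting (z, y, x) coordinate.

    The window and stride sizes here are placeholders; the values used in
    the thesis are not stated in this record.
    """
    wd, wh, ww = window
    sd, sh, sw = stride
    depth, height, width = volume.shape
    for z in range(0, depth - wd + 1, sd):
        for y in range(0, height - wh + 1, sh):
            for x in range(0, width - ww + 1, sw):
                yield (z, y, x), volume[z:z + wd, y:y + wh, x:x + ww]

def ensemble_likelihood(voi, models):
    """Average the tumor likelihood predicted by several (hypothetical)
    3-D CNN models, mimicking the ensemble step described in the abstract."""
    return float(np.mean([model(voi) for model in models]))
```

In the pipeline described above, VOIs with a sufficiently high ensemble likelihood would presumably become tumor candidates, which the post-processing step then merges.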
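The reported results pair sensitivity with false positives per pass and per case, the usual operating points of a free-response (FROC-style) evaluation such as the one cited as reference [35]. The sketch below is a rough, hypothetical illustration of how one such operating point can be tallied; it is not the thesis's evaluation code, and the matching rule `is_hit` is an assumed parameter.

```python
def detection_operating_point(candidates, tumors, n_passes, n_cases, is_hit):
    """Compute sensitivity and false positives per pass / per case for one
    set of thresholded tumor candidates.

    candidates   : detected candidates that passed the likelihood threshold
    tumors       : annotated ground-truth tumors
    is_hit(c, t) : assumed matching rule, e.g. candidate centroid lying
                   inside the annotated tumor region
    """
    detected = set()
    false_positives = 0
    for cand in candidates:
        hits = [i for i, tumor in enumerate(tumors) if is_hit(cand, tumor)]
        if hits:
            detected.update(hits)
        else:
            false_positives += 1
    sensitivity = len(detected) / len(tumors)
    return sensitivity, false_positives / n_passes, false_positives / n_cases
```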
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
ntu-107-1.pdf, 1.87 MB, Adobe PDF (not authorized for public access)

