  1. NTU Theses and Dissertations Repository
  2. College of Science (理學院)
  3. Department of Physics (物理學系)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85932
Full metadata record

DC Field: Value (Language)
dc.contributor.advisor: 陳凱風 (Kai-Feng Chen)
dc.contributor.author: Tu-Jung Cheng (en)
dc.contributor.author: 鄭篤容 (zh_TW)
dc.date.accessioned: 2023-03-19T23:29:25Z
dc.date.copyright: 2022-09-30
dc.date.issued: 2022
dc.date.submitted: 2022-09-22
dc.identifier.citation:
[1] Weinzierl, Stefan. Introduction to Monte Carlo Methods, arXiv:hep-ph/0006269, 2000.
[2] GEANT4 website, geant4.web.cern.ch/.
[3] Kurokawa, S., and E. Kikutani. Overview of the KEKB Accelerators, Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, vol. 499, no. 1, 2003, pp. 1–7.
[4] Koiso, H., et al. Optimization of Beam Optics in the KEKB Rings, 2000.
[5] Sauvan, J.-B. Calorimetry in HEP from Concepts to Experiments, 2020.
[6] Abashian, A., et al. The Belle Detector, Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, vol. 479, no. 1, 2002, pp. 117–232.
[7] Ohnishi, Yukiyoshi, et al. Accelerator Design at SuperKEKB, Progress of Theoretical and Experimental Physics, no. 3, 2013, p. 03A011.
[8] Doležal, Z., and S. Uno. Belle II Technical Design Report, 2010.
[9] Adachi, I., et al. Detectors for Extreme Luminosity: Belle II, Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, vol. 907, 2018, pp. 46–59.
[10] Kaushik, Venkatesh S. Electromagnetic Showers and Shower Detectors, citeseerx.ist.psu.edu, 2002.
[11] Serway, Raymond A., et al. Modern Physics, Andover, Cengage, 2014, pp. 597–598.
[12] Sathya, R., and Annamma Abraham. Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification, International Journal of Advanced Research in Artificial Intelligence, vol. 2, no. 2, 2013.
[13] Berry, Michael W., et al. Supervised and Unsupervised Learning for Data Science, Cham, Switzerland, Springer, 2020.
[14] Gurney, Kevin. An Introduction to Neural Networks, New York, 1997.
[15] Singh, Himanshi. Neural Network | Introduction to Neural Network | Neural Network for DL, Analytics Vidhya, 1 Mar. 2021, www.analyticsvidhya.com/blog/2021/03/basics-of-neural-network/.
[16] CS231n Convolutional Neural Networks for Visual Recognition, cs231n.github.io/neural-networks-1/.
[17] Sharma, Pranshu. A Basic Introduction to Activation Function in Deep Learning, Analytics Vidhya, 3 Mar. 2022, www.analyticsvidhya.com/blog/2022/03/a-basic-introduction-to-activation-function-in-deep-learning/.
[18] Goodfellow, Ian, et al. Maxout Networks, 2013.
[19] O'Shea, Keiron, and Ryan Nash. An Introduction to Convolutional Neural Networks, 2015.
[20] Weng, Wei-Hung. Machine Learning for Clinical Predictive Analytics, Leveraging Data Science for Global Health, 2020, pp. 199–217.
[21] Srivastava, Nitish, et al. Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Journal of Machine Learning Research, vol. 15, no. 56, 2014, pp. 1929–1958.
[22] Ioffe, Sergey, and Christian Szegedy. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, arXiv.org, 2015.
[23] Ruder, Sebastian. An Overview of Gradient Descent Optimization Algorithms, 19 Jan. 2016, ruder.io/optimizing-gradient-descent/.
[24] Kingma, Diederik, and Jimmy Lei Ba. Adam: A Method for Stochastic Optimization, 2017.
[25] Choi, Dami, et al. On Empirical Comparisons of Optimizers for Deep Learning, 2020.
[26] Brownlee, Jason. How to Control the Stability of Training Neural Networks with the Batch Size, Machine Learning Mastery, 2019, machinelearningmastery.com/how-to-control-the-speed-and-stability-of-training-neural-networks-with-gradient-descent-batch-size/.
[27] Lin, Cheng-Wei. Overlapped Shower Splitting in Belle II ECL with Convolutional Neural Network, and Measurements of Two-Particle Correlations in e+e− Collisions at Belle, airitilibrary.com, 2016, pp. 1–110.
[28] LeCun, Yann, et al. Gradient-Based Learning Applied to Document Recognition, Proceedings of the IEEE, 1998.
[29] Krizhevsky, Alex, et al. ImageNet Classification with Deep Convolutional Neural Networks, Communications of the ACM, vol. 60, no. 6, 24 May 2012, pp. 84–90.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/85932
dc.description.abstract: In the Belle II electromagnetic calorimeter, when two particle showers are very close, their energies may overlap, making it difficult to assign the energy to each shower. In Monte Carlo simulation, a shower is grouped into a 5×5 array of CsI crystals with the 4 corners removed. The energy image maps are fed into a convolutional neural network (CNN): shower energy distributions below 2 GeV are chosen as the training dataset, while showers at 4 GeV serve as the dataset for testing the trained model. The main study is the influence of different CNN architectures on the energy resolution. In addition, a fully-connected network is tested as a simple baseline, and LeNet and AlexNet are tested as well-known CNNs, with the aim of improving the energy resolution. (zh_TW)
dc.description.abstract: In the Belle II electromagnetic calorimeter (ECL), when two particle showers lie close together their energies may overlap, and it is difficult to separate them. In Monte Carlo simulation, each shower is grouped into a 5×5 array of CsI crystals with the 4 corners removed. Using the image maps of the energies, a convolutional neural network (CNN) is used to split the photon showers. Shower energy distributions below 2 GeV form the training dataset, and showers at 4 GeV are used to test the trained model. The main study is the influence of different CNN architectures on the energy resolution. In addition, a fully-connected network is tested as a simple baseline, and LeNet and AlexNet are tested as well-known CNNs. The purpose is to improve the energy resolution. (en)
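The abstract describes two concrete data-preparation steps: each shower is represented as a 5×5 array of CsI crystal energies with the four corner crystals removed, and the model is trained on showers below 2 GeV and tested on 4 GeV showers. A minimal NumPy sketch of those steps follows; the helper names `make_energy_image` and `split_by_energy` are hypothetical and are not taken from the thesis or the Belle II software:

```python
import numpy as np

# The four corner crystals excluded from the 5x5 cluster, per the abstract.
CORNERS = [(0, 0), (0, 4), (4, 0), (4, 4)]

def make_energy_image(crystal_energies):
    """Arrange 25 crystal energy deposits (GeV) into a 5x5 image and zero
    out the four corners, matching the '5x5 without corners' cluster shape."""
    img = np.array(crystal_energies, dtype=float).reshape(5, 5)
    for r, c in CORNERS:
        img[r, c] = 0.0
    return img

def split_by_energy(images, energies, train_max=2.0, test_energy=4.0):
    """Select showers below 2 GeV for training and 4 GeV showers for testing,
    mirroring the dataset split described in the abstract."""
    images = np.asarray(images)
    energies = np.asarray(energies)
    train = images[energies < train_max]
    test = images[np.isclose(energies, test_energy)]
    return train, test

# Toy usage: three uniform showers of 0.5, 1.5, and 4.0 GeV spread over the
# 21 non-corner crystals.
imgs = np.stack([make_energy_image(np.full(25, e / 21.0)) for e in (0.5, 1.5, 4.0)])
train, test = split_by_energy(imgs, [0.5, 1.5, 4.0])
```

Zeroing the corner crystals, rather than dropping them, keeps the image rectangular, so it can be fed directly to a standard CNN input layer.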
dc.description.provenance: Made available in DSpace on 2023-03-19T23:29:25Z (GMT). No. of bitstreams: 1. U0001-1909202203571300.pdf: 8065630 bytes, checksum: 8a80388f01c341c807c614ea98e8ae68 (MD5). Previous issue date: 2022. (en)
dc.description.tableofcontents:
誌謝 (Acknowledgements) 1
摘要 (Chinese Abstract) 2
Abstract 3
Contents 4
List of Figures 7
List of Tables 9
Chapter 1 Introduction 1
1.1 Monte Carlo Method 1
1.1.1 Introduction 1
1.1.2 GEANT4 2
1.2 KEKB 3
1.3 Calorimeter 5
1.4 Belle Detector 6
1.5 SuperKEKB and Belle II Detector 11
1.6 Particle Shower 13
Chapter 2 Overlapped Shower Splitting in Belle II ECL with CNN 15
2.1 Introduction 15
2.2 Machine Learning Basic 16
2.2.1 Supervised Learning and Unsupervised Learning 16
2.2.2 Neural Network 17
2.2.3 Activation Function 20
2.2.4 Introduction to CNN 22
2.2.5 Loss Function 25
2.2.6 Dropout 27
2.2.7 Batch Normalization 27
2.2.8 L1 and L2 Regularization 28
2.2.9 Optimizer 29
2.3 Photon Shower in Belle II ECL 31
2.4 Analysis Strategy 33
2.5 Data Process 35
2.5.1 Dataset 35
2.5.2 Image Process 36
2.5.3 Initial Study to Separate Two Showers 42
2.5.4 Construct CNN Structure 44
2.5.5 Hyper-parameters 47
2.5.6 Prevent from Overtraining 51
2.6 Result 57
2.6.1 Classification 58
2.7 Conclusion 62
Chapter 3 Another Network 63
3.1 Fully-connected Network 63
3.2 LeNet 64
3.3 AlexNet 69
3.4 Conclusion 70
Bibliography 72
dc.language.iso: en
dc.title: 以卷積神經網路分離 Belle II 電磁量能器中重疊光子射叢 (Splitting Overlapped Photon Showers in the Belle II Electromagnetic Calorimeter with a Convolutional Neural Network) (zh_TW)
dc.title: Overlapped Shower Splitting in Belle II ECL with CNN (en)
dc.type: Thesis
dc.date.schoolyear: 110-2
dc.description.degree: Master's (碩士)
dc.contributor.oralexamcommittee: 張寶棣 (Pao-Ti Chang), 張敏娟 (Ming-Chuan Chang)
dc.subject.keyword: 卷積神經網路, 機器學習, 電磁量能器 (convolutional neural network, machine learning, electromagnetic calorimeter) (zh_TW)
dc.subject.keyword: convolutional neural network, machine learning, electromagnetic calorimeter (en)
dc.relation.page: 75
dc.identifier.doi: 10.6342/NTU202203548
dc.rights.note: Authorization granted (worldwide open access)
dc.date.accepted: 2022-09-23
dc.contributor.author-college: 理學院 (College of Science) (zh_TW)
dc.contributor.author-dept: 物理學研究所 (Graduate Institute of Physics) (zh_TW)
dc.date.embargo-lift: 2022-09-30
Appears in Collections: Department of Physics (物理學系)

Files in This Item:
File | Size | Format
U0001-1909202203571300.pdf | 7.88 MB | Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.
