NTU Theses and Dissertations Repository › College of Science › Institute of Statistics and Data Science
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96536
Full metadata record

DC Field: Value (Language)
dc.contributor.advisor: 張馨文 (zh_TW)
dc.contributor.advisor: Hsin-wen Chang (en)
dc.contributor.author: 林允文 (zh_TW)
dc.contributor.author: Yun-Wen Lin (en)
dc.date.accessioned: 2025-02-19T16:24:46Z
dc.date.available: 2025-02-20
dc.date.copyright: 2025-02-19
dc.date.issued: 2025
dc.date.submitted: 2025-02-06
dc.identifier.citation:
[1] S. Bates, T. Hastie, and R. Tibshirani. Cross-validation: what does it estimate and how well does it do it? Journal of the American Statistical Association, pages 1–12, 2023.
[2] P. Bayle, A. Bayle, L. Janson, and L. Mackey. Cross-validation confidence intervals for test error. Advances in Neural Information Processing Systems, 33:16339–16350, 2020.
[3] Y. Bengio. Practical recommendations for gradient-based training of deep architectures. In Neural Networks: Tricks of the Trade, pages 437–478. Springer, 2012.
[4] J. Bergstra and Y. Bengio. Random search for hyper-parameter optimization. Journal of Machine Learning Research, 13(2):281–305, 2012.
[5] D. Berrar et al. Cross-validation, 2019.
[6] F. Cao and R. Zhang. The errors of approximation for feedforward neural networks in the Lp metric. Mathematical and Computer Modelling, 49(7-8):1563–1572, 2009.
[7] V. Chernozhukov, D. Chetverikov, and K. Kato. Inference on causal and structural parameters using many moment inequalities. The Review of Economic Studies, 86(5):1867–1900, 2019.
[8] K. T. Chui, R. W. Liu, M. Zhao, and P. O. De Pablos. Predicting students' performance with school and family tutoring using generative adversarial network-based deep support vector machine. IEEE Access, 8:86745–86752, 2020.
[9] X. Dong, Z. Yu, W. Cao, Y. Shi, and Q. Ma. A survey on ensemble learning. Frontiers of Computer Science, 14:241–258, 2020.
[10] T. Elsken, J. H. Metzen, and F. Hutter. Neural architecture search: A survey. Journal of Machine Learning Research, 20(55):1–21, 2019.
[11] N. J. Guliyev and V. E. Ismailov. On the approximation by single hidden layer feed-forward neural networks with fixed weights. Neural Networks, 98:296–304, 2018.
[12] J. Khan, E. Lee, and K. Kim. A higher prediction accuracy–based alpha–beta filter algorithm using the feedforward artificial neural network. CAAI Transactions on Intelligence Technology, 2022.
[13] V. Koltchinskii and K. Lounici. Concentration inequalities and moment bounds for sample covariance operators. Bernoulli, pages 110–133, 2017.
[14] J. Lei. Cross-validation with confidence. Journal of the American Statistical Association, 115(532):1978–1997, 2020.
[15] Y. Liu. Create stable neural networks by cross-validation. In The 2006 IEEE International Joint Conference on Neural Network Proceedings, pages 3925–3928. IEEE, 2006.
[16] D. J. Montana and L. Davis. Training feedforward neural networks using genetic algorithms. In Proceedings of the 11th International Joint Conference on Artificial Intelligence (IJCAI), pages 762–767, 1989.
[17] R. Setiono. Feedforward neural network construction using cross validation. Neural Computation, 13(12):2865–2877, 2001.
[18] S. Singha, S. Pasupuleti, S. S. Singha, R. Singh, and S. Kumar. Prediction of groundwater quality using efficient machine learning technique. Chemosphere, 276:130265, 2021.
[19] D. Svozil, V. Kvasnicka, and J. Pospichal. Introduction to multi-layer feed-forward neural networks. Chemometrics and Intelligent Laboratory Systems, 39(1):43–62, 1997.
[20] R. Vershynin. High-Dimensional Probability: An Introduction with Applications in Data Science, volume 47. Cambridge University Press, 2018.
[21] S. Wang, C. Qin, Q. Feng, F. Javadpour, and Z. Rui. A framework for predicting the production performance of unconventional resources using deep learning. Applied Energy, 295:117016, 2021.
[22] L. Wasserman. Bayesian model selection and model averaging. Journal of Mathematical Psychology, 44(1):92–107, 2000.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96536
dc.description.abstract (zh_TW, translated): Cross-validation (CV) is a fundamental technique in machine learning for model selection and performance evaluation. However, traditional cross-validation methods may underestimate the variance of the prediction risk because of correlations among fold losses, and may not fully account for approximation error in the test data. The Cross-Validation with Confidence (CVC) method was proposed to address these limitations, providing statistical confidence for model-selection results, but its application has largely been restricted to linear models.

This study investigates applying the CVC method to feed-forward neural networks (FNNs), focusing on the problem of model selection. Through numerical studies, we observe that, compared with traditional cross-validation, this method may provide additional reference information for model selection.

The results show that applying CVC to neural networks may help reduce the instability commonly encountered in neural network training. Our approach attempts to bring additional statistical considerations to neural network architecture selection. This study is a preliminary exploration of applying CVC to neural networks, and we look forward to further theoretical development and practical application.
dc.description.abstract (en): Cross-Validation (CV) is a fundamental technique in machine learning for model selection and performance evaluation. However, traditional CV methods may underestimate prediction risk variance due to correlations among fold losses and fail to account for approximation errors in test data. Cross-Validation with Confidence (CVC) has been proposed to address these limitations by providing statistical confidence in model selection results, but its application has primarily been limited to linear models.

This thesis investigates the application of CVC to Feed-Forward Neural Networks (FNNs), focusing on architecture selection. Through simulation studies, we explore how this approach might provide additional insights for model selection compared to conventional CV methods. The investigation suggests that our approach could offer complementary information to existing model selection techniques.

The results indicate that applying CVC to neural networks might help reduce the instability typically associated with neural network training. This study represents an initial exploration of CVC in neural network architecture selection, contributing to the ongoing discussion of confident model selection in neural networks.
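The core idea behind CVC described in the abstract (keep every candidate whose cross-validated risk is not significantly worse than the best, rather than picking the bare argmin) can be illustrated with a toy sketch. The code below is a hypothetical, simplified stand-in: it uses two illustrative candidate models (intercept-only vs. simple linear regression), cross-fitted squared losses, and a plain normal-approximation paired test in place of the multiplier-bootstrap test of the actual CVC procedure; the data-generating process, model names, and significance level are assumptions for illustration, not the thesis's setup.

```python
import math
import random
import statistics

def fit_models(x, y):
    """Fit two illustrative candidate models on a training split:
    an intercept-only model and a simple linear regression."""
    xbar, ybar = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    return {
        "mean": lambda t, b=ybar: b,                                  # constant predictor
        "linear": lambda t, a=ybar - slope * xbar, b=slope: a + b * t # OLS line
    }

def cross_fitted_losses(x, y, V=5, seed=0):
    """V-fold CV: record the per-observation squared loss of each
    candidate model on the fold that held that observation out."""
    n = len(x)
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[v::V] for v in range(V)]
    losses = {"mean": [0.0] * n, "linear": [0.0] * n}
    for fold in folds:
        hold = set(fold)
        xtr = [x[i] for i in range(n) if i not in hold]
        ytr = [y[i] for i in range(n) if i not in hold]
        models = fit_models(xtr, ytr)
        for name, f in models.items():
            for i in fold:
                losses[name][i] = (y[i] - f(x[i])) ** 2
    return losses

def cvc_confidence_set(losses, alpha=0.05):
    """CVC-style selection: keep every model whose cross-fitted risk is
    not significantly worse than the best model's, via a one-sided paired
    test on per-observation loss differences (normal approximation used
    here as a simplified stand-in for CVC's bootstrap-based test)."""
    risks = {m: statistics.fmean(l) for m, l in losses.items()}
    best = min(risks, key=risks.get)
    keep = []
    for m, l in losses.items():
        if m == best:
            keep.append(m)
            continue
        d = [a - b for a, b in zip(l, losses[best])]
        se = statistics.stdev(d) / math.sqrt(len(d))
        z = statistics.fmean(d) / se
        p = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # one-sided p-value
        if p > alpha:  # not significantly worse than the best: keep it
            keep.append(m)
    return risks, keep

# Toy data with a strong linear signal (hypothetical DGP for illustration)
rng = random.Random(0)
xs = [rng.uniform(-1, 1) for _ in range(200)]
ys = [2.0 * xi + rng.gauss(0, 0.5) for xi in xs]
risks, selected = cvc_confidence_set(cross_fitted_losses(xs, ys))
```

With a signal this strong, the constant model is significantly worse, so the confidence set collapses to the linear model alone; with weaker signal or smaller samples the set would typically retain both candidates, which is the extra information CVC offers over plain argmin selection.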
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-02-19T16:24:46Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2025-02-19T16:24:46Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Acknowledgements i
摘要 (Chinese abstract) iii
Abstract v
Contents vii
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
Chapter 2 Preliminaries 5
2.1 Feed-Forward Neural Network . . . . . . . . . . . . . . . . . . . . . 5
2.2 Cross-Validation Methods . . . . . . . . . . . . . . . . . . . . . . . 8
2.2.1 V-fold Cross-Validation . . . . . . . . . . . . . . . . . . . . . . . . 8
2.2.2 Extended Cross-Validation: CVC . . . . . . . . . . . . . . . . . . . 10
Chapter 3 The Proposed Methodology 15
3.1 FNN with CVC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.2 Theoretical Results . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Chapter 4 Simulation Study 19
4.1 Data Generating Process . . . . . . . . . . . . . . . . . . . . . . . . 19
4.2 Study Design and Implementation . . . . . . . . . . . . . . . . . . . 21
4.3 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.3.1 Comparison between CVC and CV . . . . . . . . . . . . . . . . . . 23
4.3.2 Sensitivity Analysis of Significance Levels . . . . . . . . . . . . . 24
Chapter 5 Conclusion 27
References 29
Appendix A — Proof of Theorem 1 33
A.1 Technical Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . 33
A.2 Main Proof . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
dc.language.iso: en
dc.subject: 模型選擇 (zh_TW)
dc.subject: 前饋式神經網路 (zh_TW)
dc.subject: 信賴交叉驗證 (zh_TW)
dc.subject: Model Selection (en)
dc.subject: Feed-Forward Neural Networks (en)
dc.subject: Cross-Validation with Confidence (en)
dc.title: 前饋神經網路之信賴交叉驗證法 (zh_TW)
dc.title: Cross-Validation with Confidence in Feed-Forward Neural Network (en)
dc.type: Thesis
dc.date.schoolyear: 113-1
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 黃名鉞;楊鈞澔;黃世豪 (zh_TW)
dc.contributor.oralexamcommittee: Ming-Yueh Huang;Chun-Hao Yang;Shih-Hao Huang (en)
dc.subject.keyword: 信賴交叉驗證,前饋式神經網路,模型選擇 (zh_TW)
dc.subject.keyword: Cross-Validation with Confidence,Feed-Forward Neural Networks,Model Selection (en)
dc.relation.page: 40
dc.identifier.doi: 10.6342/NTU202500421
dc.rights.note: 同意授權(限校園內公開) (authorized, campus-only access)
dc.date.accepted: 2025-02-06
dc.contributor.author-college: 理學院 (College of Science)
dc.contributor.author-dept: 統計與數據科學研究所 (Institute of Statistics and Data Science)
dc.date.embargo-lift: 2027-02-28
Appears in collections: 統計與數據科學研究所 (Institute of Statistics and Data Science)

Files in this item:
ntu-113-1.pdf, 3.57 MB, Adobe PDF (restricted; not authorized for public access)