NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93861

Full metadata record

DC Field: Value (Language)
dc.contributor.advisor: 林守德 (zh_TW)
dc.contributor.advisor: Shou-De Lin (en)
dc.contributor.author: 郭濬睿 (zh_TW)
dc.contributor.author: Chun-Jui Kuo (en)
dc.date.accessioned: 2024-08-08T16:37:07Z
dc.date.available: 2024-08-09
dc.date.copyright: 2024-08-08
dc.date.issued: 2024
dc.date.submitted: 2024-08-05
dc.identifier.citation:
[1] K. Ahuja, K. Shanmugam, K. Varshney, and A. Dhurandhar. Invariant risk minimization games. In International Conference on Machine Learning, pages 145–155. PMLR, 2020.
[2] M. Arjovsky, L. Bottou, I. Gulrajani, and D. Lopez-Paz. Invariant risk minimization. arXiv preprint arXiv:1907.02893, 2019.
[3] Y. Chen, K. Zhou, Y. Bian, B. Xie, B. Wu, Y. Zhang, K. Ma, H. Yang, P. Zhao, B. Han, et al. Pareto invariant risk minimization: Towards mitigating the optimization dilemma in out-of-distribution generalization. arXiv preprint arXiv:2206.07766, 2022.
[4] Y. J. Choe, J. Ham, and K. Park. An empirical study of invariant risk minimization. arXiv preprint arXiv:2004.05007, 2020.
[5] E. Creager, J.-H. Jacobsen, and R. Zemel. Environment inference for invariant learning. In International Conference on Machine Learning, pages 2189–2200. PMLR, 2021.
[6] B.-W. Huang, K.-T. Liao, C.-S. Kao, and S.-D. Lin. Environment diversification with multi-head neural network for invariant learning. Advances in Neural Information Processing Systems, 35:915–927, 2022.
[7] D. Krueger, E. Caballero, J.-H. Jacobsen, A. Zhang, J. Binas, D. Zhang, R. Le Priol, and A. Courville. Out-of-distribution generalization via risk extrapolation (REx). In International Conference on Machine Learning, pages 5815–5826. PMLR, 2021.
[8] Y. Lin, S. Zhu, L. Tan, and P. Cui. ZIN: When and how to learn invariance without environment partition? Advances in Neural Information Processing Systems, 35:24529–24542, 2022.
[9] S. Sagawa, P. W. Koh, T. B. Hashimoto, and P. Liang. Distributionally robust neural networks for group shifts: On the importance of regularization for worst-case generalization. arXiv preprint arXiv:1911.08731, 2019.
[10] X. Tan, L. Yong, S. Zhu, C. Qu, X. Qiu, X. Yinghui, P. Cui, and Y. Qi. Provably invariant learning without domain information. In International Conference on Machine Learning, pages 33563–33580. PMLR, 2023.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93861
dc.description.abstract (zh_TW, translated): Achieving risk-robust out-of-distribution (OOD) generalization remains a major challenge for machine learning models trained with empirical risk minimization (ERM). Although invariant risk minimization (IRM) offers a promising approach, it requires an environment partition of the training data, which makes it difficult to use in practice. Recently emerged environment-free methods are effective but often rely on assumptions or auxiliary information. We propose EEPL, a novel approach that generates high-quality augmented environments from partially pre-defined environments. EEPL overcomes theoretical limitations and achieves state-of-the-art performance, surpassing existing methods in challenging scenarios. In addition, we identify "diversity" as a key property for effective invariant learning. Our work opens new research avenues toward reliable OOD generalization.
dc.description.abstract (en): Achieving robust out-of-distribution (OOD) generalization remains a challenge for machine learning models trained with Empirical Risk Minimization (ERM). While Invariant Risk Minimization (IRM) offers a promising approach, it requires impractical environment partitioning of the training data. Recent environment-free methods have emerged, but they often rely on assumptions or auxiliary information. We propose EEPL, a novel approach that leverages a small set of pre-defined environments to generate high-quality augmented environments. EEPL overcomes theoretical limitations and achieves state-of-the-art performance, surpassing existing methods in challenging scenarios. Additionally, we verify "diversity" as a crucial property for effective invariant learning. Our work opens new avenues for research in achieving reliable OOD generalization.
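The abstract contrasts ERM with IRM, which penalizes predictors whose optimality varies across training environments. For context only, here is a minimal numpy sketch of an IRMv1-style penalty in the spirit of reference [2] (a simplified squared-loss variant; it is not the thesis's EEPL method, and the function and argument names are illustrative):

```python
import numpy as np

def irmv1_penalty(preds_per_env, labels_per_env):
    """Sum over environments of the squared gradient of the squared-error
    risk with respect to a dummy scalar classifier w, evaluated at w = 1:
    R_e(w) = mean((w * f - y)^2),  dR_e/dw |_{w=1} = mean(2 * (f - y) * f).
    A nonzero gradient means rescaling the predictor would lower that
    environment's risk, i.e. the predictor is not simultaneously optimal
    in every environment."""
    penalty = 0.0
    for f, y in zip(preds_per_env, labels_per_env):
        grad = np.mean(2.0 * (f - y) * f)  # dR_e/dw at w = 1
        penalty += grad ** 2
    return penalty
```

A predictor with zero residual in every environment incurs zero penalty, while one whose residuals correlate with its own outputs differently per environment (an environment-dependent shortcut) is penalized.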
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-08-08T16:37:07Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2024-08-08T16:37:07Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Thesis Committee Certification (i)
Acknowledgements (ii)
Chinese Abstract (iv)
Abstract (v)
Contents (vi)
List of Figures (viii)
List of Tables (ix)
1 Introduction (1)
2 Preliminary and Related Work (4)
2.1 Preliminary (4)
2.2 Invariant Learning with Environment Labels (5)
3 Methodology (7)
3.1 Environment Diversification with Multi-head Neural Network for Invariant Learning (EDNIL) (7)
3.1.1 Environment Inference Model (7)
3.1.2 Invariant Learning Model (9)
3.1.3 Limitations of EDNIL (9)
3.2 Environment Enhancement with Partial Labeling (EEPL) (10)
3.2.1 Spurious Feature Extraction (11)
3.2.2 Environment Diversification (12)
4 Experiment (13)
4.1 Synthetic Datasets (14)
4.1.1 Color-MNIST (CMNIST) (14)
4.1.2 MCOLOR (15)
4.2 Real-world Datasets (16)
4.2.1 Waterbirds (16)
4.2.2 Waterbirds-rev (17)
4.2.3 CelebA (18)
4.2.4 HousePrice (19)
5 Conclusion (21)
References (22)
Appendix A - Ideal Environment Properties (24)
A.1 Definitions (24)
A.2 Empirical Results on Covariate Color-MNIST (25)
A.3 Inferred Environments of Color-MNIST (26)
Appendix B - Native Environment Size Experiments (27)
Appendix C - Hyperparameter Settings (28)
dc.language.iso: en
dc.subject: out-of-domain generalization (zh_TW, translated)
dc.subject: out-of-distribution generalization (zh_TW, translated)
dc.subject: invariant learning (zh_TW, translated)
dc.subject: IRM (en)
dc.subject: Out-of-Distribution generalization (en)
dc.subject: Invariant Learning (en)
dc.subject: Invariant Risk Minimization (en)
dc.subject: OOD (en)
dc.title: 基於部分標記環境下的不變式學習 [Invariant Learning under Partially Labeled Environments] (zh_TW)
dc.title: Invariant Learning with Partially Labeled Environments (en)
dc.type: Thesis
dc.date.schoolyear: 112-2
dc.description.degree: Master's (碩士)
dc.contributor.oralexamcommittee: 林軒田;葉彌妍;廖耿德 (zh_TW)
dc.contributor.oralexamcommittee: Hsuan-Tien Lin;Mi-Yen Yeh;Keng-Te Liao (en)
dc.subject.keyword: out-of-domain generalization, out-of-distribution generalization, invariant learning (zh_TW, translated)
dc.subject.keyword: Out-of-Distribution generalization, Invariant Learning, Invariant Risk Minimization, IRM, OOD (en)
dc.relation.page: 29
dc.identifier.doi: 10.6342/NTU202403378
dc.rights.note: Not authorized for public access (未授權)
dc.date.accepted: 2024-08-08
dc.contributor.author-college: College of Electrical Engineering and Computer Science (電機資訊學院)
dc.contributor.author-dept: Department of Computer Science and Information Engineering (資訊工程學系)
Appears in Collections: Department of Computer Science and Information Engineering (資訊工程學系)

Files in This Item:
File: ntu-112-2.pdf | Size: 1.43 MB | Format: Adobe PDF | Access: restricted (not authorized for public access)


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
