Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99390
Full metadata record
DC field: value [language]
dc.contributor.advisor: 李家岩 [zh_TW]
dc.contributor.advisor: Chia-Yen Lee [en]
dc.contributor.author: 洪睿謙 [zh_TW]
dc.contributor.author: Rui-Qian Hong [en]
dc.date.accessioned: 2025-09-10T16:08:27Z
dc.date.available: 2025-09-11
dc.date.copyright: 2025-09-10
dc.date.issued: 2025
dc.date.submitted: 2025-07-25
dc.identifier.citationBrunton, S. L., Proctor, J. L., and Kutz, J. N. (2016a). Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proceedings of the National Academy of Sciences, 113(15):3932–3937.
Brunton, S. L., Proctor, J. L., and Kutz, J. N. (2016b). Sparse identification of nonlinear dynamics with control (sindyc). IFAC-PapersOnLine, 49(18):710–715.
Cai, S., Wang, Z., Wang, S., Perdikaris, P., and Karniadakis, G. E. (2021). Physics-informed neural networks for heat transfer problems. Journal of Heat Transfer, 143(6):060801.
Chakraborty, S. (2021). Transfer learning based multi-fidelity physics informed deep neural network. Journal of Computational Physics, 426:109942.
Chen, Y., Hosseini, B., Owhadi, H., and Stuart, A. M. (2021a). Solving and learning nonlinear pdes with gaussian processes. Journal of Computational Physics, 447:110668.
Chen, Y., Lu, L., Karniadakis, G. E., and Dal Negro, L. (2020). Physics-informed neural networks for inverse problems in nano-optics and metamaterials. Optics Express, 28(8):11618–11633.
Chen, Z., Liu, Y., and Sun, H. (2021b). Physics-informed learning of governing equations from scarce data. Nature Communications, 12(1):6136.
Cheng, Y.-H., Lee, C.-Y., Tsai, C.-H., and Wu, J.-M. (2025). Two-phase data science framework for compensation of the friction force in cnc machines. International Journal of Computer Integrated Manufacturing, 38(4):520–535.
Chiu, P.-H., Wong, J. C., Ooi, C., Dao, M. H., and Ong, Y.-S. (2022). Can-pinn: A fast physics-informed neural network based on coupled-automatic–numerical differentiation method. Computer Methods in Applied Mechanics and Engineering, 395:114909.
Cranmer, M. (2023). Interpretable machine learning for science with pysr and symbolicregression. jl. arXiv preprint arXiv:2305.01582.
Cranmer, M., Sanchez Gonzalez, A., Battaglia, P., Xu, R., Cranmer, K., Spergel, D., and Ho, S. (2020). Discovering symbolic models from deep learning with inductive biases. Advances in Neural Information Processing Systems, 33:17429–17442.
de Silva, B. M., Champion, K., Quade, M., Loiseau, J.-C., Kutz, J. N., and Brunton, S. L. (2020). Pysindy: a python package for the sparse identification of nonlinear dynamics from data. arXiv preprint arXiv:2004.08424.
Desai, S., Mattheakis, M., Joy, H., Protopapas, P., and Roberts, S. (2021). One-shot transfer learning of physics-informed neural networks. arXiv preprint arXiv:2110.11286.
Dwivedi, V., Parashar, N., and Srinivasan, B. (2019). Distributed physics informed neural network for data-efficient solution to partial differential equations. arXiv preprint arXiv:1907.08967.
Fang, Z. and Zhan, J. (2019). Deep physical informed neural networks for metamaterial design. IEEE Access, 8:24506–24513.
Gao, H., Zahr, M. J., and Wang, J.-X. (2022). Physics-informed graph neural galerkin networks: A unified framework for solving pde-governed forward and inverse problems. Computer Methods in Applied Mechanics and Engineering, 390:114502.
Goswami, S., Anitescu, C., Chakraborty, S., and Rabczuk, T. (2020). Transfer learning enhanced physics informed neural network for phase-field modeling of fracture. Theoretical and Applied Fracture Mechanics, 106:102447.
Hennigh, O., Narasimhan, S., Nabian, M. A., Subramaniam, A., Tangsali, K., Fang, Z., Rietmann, M., Byeon, W., and Choudhry, S. (2021). Nvidia simnet™: An ai-accelerated multi-physics simulation framework. In International Conference on Computational Science, pages 447–461. Springer.
Hu, Z., Jagtap, A. D., Karniadakis, G. E., and Kawaguchi, K. (2023). Augmented physics-informed neural networks (apinns): A gating network-based soft domain decomposition methodology. Engineering Applications of Artificial Intelligence, 126:107183.
Iten, R., Metger, T., Wilming, H., Del Rio, L., and Renner, R. (2020). Discovering physical concepts with neural networks. Physical Review Letters, 124(1):010508.
Jagtap, A. D. and Karniadakis, G. E. (2020). Extended physics-informed neural networks (xpinns): A generalized space-time domain decomposition based deep learning framework for nonlinear partial differential equations. Communications in Computational Physics, 28(5).
Jagtap, A. D., Kharazmi, E., and Karniadakis, G. E. (2020). Conservative physics-informed neural networks on discrete domains for conservation laws: Applications to forward and inverse problems. Computer Methods in Applied Mechanics and Engineering, 365:113028.
Jin, P., Zhang, Z., Zhu, A., Tang, Y., and Karniadakis, G. E. (2020). Sympnets: Intrinsic structure-preserving symplectic networks for identifying hamiltonian systems. Neural Networks, 132:166–179.
Kemeth, F. P., Alonso, S., Echebarria, B., Moldenhawer, T., Beta, C., and Kevrekidis, I. G. (2023). Black and gray box learning of amplitude equations: Application to phase field systems. Physical Review E, 107(2):025305.
Kingma, D. P. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
Kissas, G., Yang, Y., Hwuang, E., Witschey, W. R., Detre, J. A., and Perdikaris, P. (2020). Machine learning in cardiovascular flows modeling: Predicting arterial blood pressure from non-invasive 4d flow mri data using physics-informed neural networks. Computer Methods in Applied Mechanics and Engineering, 358:112623.
Kiyani, E., Shukla, K., Karniadakis, G. E., and Karttunen, M. (2023). A framework based on symbolic regression coupled with extended physics-informed neural networks for gray-box learning of equations of motion from data. Computer Methods in Applied Mechanics and Engineering, 415:116258.
Koza, J. R. (1994). Genetic programming as a means for programming computers by natural selection. Statistics and Computing, 4:87–112.
Landajuela, M., Lee, C. S., Yang, J., Glatt, R., Santiago, C. P., Aravena, I., Mundhenk, T., Mulcahy, G., and Petersen, B. K. (2022). A unified framework for deep symbolic regression. Advances in Neural Information Processing Systems, 35:33985–33998.
Lee, K., Trask, N., and Stinis, P. (2022). Structure-preserving sparse identification of nonlinear dynamics for data-driven modeling. In Mathematical and Scientific Machine Learning, pages 65–80. PMLR.
Lee, S., Kooshkbaghi, M., Spiliotis, K., Siettos, C. I., and Kevrekidis, I. G. (2020). Coarse-scale pdes from fine-scale observations via machine learning. Chaos: An Interdisciplinary Journal of Nonlinear Science, 30(1).
Long, Z., Lu, Y., and Dong, B. (2019). Pde-net 2.0: Learning pdes from data with a numeric-symbolic hybrid deep network. Journal of Computational Physics, 399:108925.
Long, Z., Lu, Y., Ma, X., and Dong, B. (2018). PDE-net: Learning PDEs from data. In Dy, J. and Krause, A., editors, Proceedings of the 35th International Conference on Machine Learning, volume 80 of Proceedings of Machine Learning Research, pages 3208–3216. PMLR.
Lu, L., Jin, P., Pang, G., Zhang, Z., and Karniadakis, G. E. (2021). Learning nonlinear operators via deeponet based on the universal approximation theorem of operators. Nature Machine Intelligence, 3(3):218–229.
Lusch, B., Kutz, J. N., and Brunton, S. L. (2018). Deep learning for universal linear embeddings of nonlinear dynamics. Nature Communications, 9(1):4950.
Nabian, M. A., Gladstone, R. J., and Meidani, H. (2021). Efficient training of physics-informed neural networks via importance sampling. Computer-Aided Civil and Infrastructure Engineering, 36(8):962–977.
Ouyang, X., Chen, Y., and Agam, G. (2021). Accelerated wgan update strategy with loss change rate balancing. In Proceedings of the IEEE/CVF winter conference on applications of computer vision, pages 2546–2555.
Pang, G., Lu, L., and Karniadakis, G. E. (2019). fpinns: Fractional physics-informed neural networks. SIAM Journal on Scientific Computing, 41(4):A2603–A2626.
Raissi, M. (2018). Deep hidden physics models: Deep learning of nonlinear partial differential equations. Journal of Machine Learning Research, 19(25):1–24.
Raissi, M., Perdikaris, P., and Karniadakis, G. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686–707.
Raissi, M., Yazdani, A., and Karniadakis, G. E. (2020). Hidden fluid mechanics: Learning velocity and pressure fields from flow visualizations. Science, 367(6481):1026–1030.
Rudy, S. H., Brunton, S. L., Proctor, J. L., and Kutz, J. N. (2017). Data-driven discovery of partial differential equations. Science Advances, 3(4):e1602614.
Sahli Costabal, F., Yang, Y., Perdikaris, P., Hurtado, D. E., and Kuhl, E. (2020). Physics-informed neural networks for cardiac activation mapping. Frontiers in Physics, 8:42.
Schmidt, M. and Lipson, H. (2009). Distilling free-form natural laws from experimental data. Science, 324(5923):81–85.
Shukla, K., Jagtap, A. D., and Karniadakis, G. E. (2021). Parallel physics-informed neural networks via domain decomposition. Journal of Computational Physics, 447:110683.
Sun, L., Gao, H., Pan, S., and Wang, J.-X. (2020). Surrogate modeling for fluid flows based on physics-constrained deep learning without simulation data. Computer Methods in Applied Mechanics and Engineering, 361:112732.
Tancik, M., Srinivasan, P., Mildenhall, B., Fridovich-Keil, S., Raghavan, N., Singhal, U., Ramamoorthi, R., Barron, J., and Ng, R. (2020). Fourier features let networks learn high frequency functions in low dimensional domains. Advances in Neural Information Processing Systems, 33:7537–7547.
Taylor, J., Wang, W., Bala, B., and Bednarz, T. (2022). Optimizing the optimizer for data driven deep neural networks and physics informed neural networks. arXiv preprint arXiv:2205.07430.
Tenachi, W., Ibata, R., and Diakogiannis, F. I. (2023). Deep symbolic regression for physics guided by units constraints: toward the automated discovery of physical laws. The Astrophysical Journal, 959(2):99.
Toscano, J. D., Oommen, V., Varghese, A. J., Zou, Z., Ahmadi Daryakenari, N., Wu, C., and Karniadakis, G. E. (2025). From pinns to pikans: Recent advances in physics-informed machine learning. Machine Learning for Computational Science and Engineering, 1(1):1–43.
Vaddireddy, H., Rasheed, A., Staples, A. E., and San, O. (2020). Feature engineering and symbolic regression methods for detecting hidden physics from sparse sensor observation data. Physics of Fluids, 32(1).
Wang, C., Li, S., He, D., and Wang, L. (2022a). Is l^2 physics informed loss always suitable for training physics informed neural network? Advances in Neural Information Processing Systems, 35:8278–8290.
Wang, S., Teng, Y., and Perdikaris, P. (2021). Understanding and mitigating gradient flow pathologies in physics-informed neural networks. SIAM Journal on Scientific Computing, 43(5):A3055–A3081.
Wang, S., Yu, X., and Perdikaris, P. (2022b). When and why pinns fail to train: A neural tangent kernel perspective. Journal of Computational Physics, 449:110768.
Wu, C., Zhu, M., Tan, Q., Kartha, Y., and Lu, L. (2023). A comprehensive study of non-adaptive and residual-based adaptive sampling for physics-informed neural networks. Computer Methods in Applied Mechanics and Engineering, 403:115671.
Yaghoubi, S. and Fainekos, G. (2019). Gray-box adversarial testing for control systems with machine learning components. In Proceedings of the 22nd ACM International Conference on Hybrid Systems: Computation and Control, pages 179–184.
Yang, L., Meng, X., and Karniadakis, G. E. (2021). B-pinns: Bayesian physics-informed neural networks for forward and inverse pde problems with noisy data. Journal of Computational Physics, 425:109913.
Zhang, E., Dao, M., Karniadakis, G. E., and Suresh, S. (2022a). Analyses of internal structures and defects in materials using physics-informed neural networks. Science Advances, 8(7):eabk0644.
Zhang, Z., Shin, Y., and Em Karniadakis, G. (2022b). Gfinns: Generic formalism informed neural networks for deterministic and stochastic dynamical systems. Philosophical Transactions of the Royal Society A, 380(2229):20210207.
Zou, Z., Meng, X., and Karniadakis, G. E. (2024). Correcting model misspecification in physics-informed neural networks (pinns). Journal of Computational Physics, 505:112918.
-
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99390
dc.description.abstract: 物理資訊神經網路(PINNs)已成為模擬複雜動態系統的強大工具,透過將物理定律(以微分方程形式表示)融入神經網路架構中。然而,其效能受到模型錯誤設定的顯著限制,當物理先驗知識不完整或錯誤時,會導致非物理的解以及預測精度的下降。為了解決此挑戰,我們提出領域自適應物理資訊神經網路(DAPINNs)框架,並結合基於自動微分的物理校正(ADPC)模型。此框架透過三階段流程整合部分物理知識與資料驅動的校正:源域預訓練、目標域與ADPC的微調,以及用於差異識別的符號迴歸。ADPC模型利用自動微分技術動態校正錯誤設定的控制方程,捕捉包括高階與非線性交互作用在內的複雜物理現象。交替更新策略提升了訓練穩定性,而符號迴歸確保校正結果的可解釋性,從而增進科學理解。透過結合領域自適應與穩健的校正機制,DAPINNs與ADPC提供了一個多功能且具可解釋性的解決方案,適用於在不完整物理知識下模擬動態系統。 [zh_TW]
dc.description.abstract: Physics-Informed Neural Networks (PINNs) have emerged as a powerful paradigm for modeling complex dynamical systems by embedding physical laws, expressed as differential equations, into neural network architectures. However, their performance is critically limited by model misspecification, where incomplete or incorrect physical priors lead to non-physical solutions and diminished predictive accuracy. To address this challenge, we propose the Domain-Adaptive Physics-Informed Neural Networks (DAPINNs) framework augmented with an Auto-Differentiation-based Physics Correction (ADPC) model. This framework integrates partial physical knowledge with data-driven corrections through a three-stage pipeline: source-domain pre-training, target-domain fine-tuning with ADPC, and symbolic regression for discrepancy identification. The ADPC model leverages automatic differentiation to dynamically correct misspecified governing equations, capturing complex physical phenomena, including higher-order and nonlinear interactions. An alternating update scheme enhances training stability, while symbolic regression ensures interpretable corrections, improving scientific understanding. By combining domain adaptation with robust correction mechanisms, DAPINNs with ADPC offer a versatile and interpretable solution for modeling dynamical systems under incomplete physical knowledge. [en]
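The record does not include the thesis code, so the fragment below is only a minimal, hypothetical sketch of the idea the abstract describes: a PINN whose assumed residual is misspecified and is augmented by a learnable correction term evaluated through automatic differentiation, with the solution and correction networks trained by an alternating update scheme and the learned correction later handed to symbolic regression. All names (MLP, u_net, c_net, residual), the toy damped-oscillator data, and the training loop are assumptions for illustration, not the DAPINN/ADPC implementation.

# Hypothetical sketch (not the thesis implementation): a PINN whose assumed
# physics is misspecified and is patched by a learnable correction term
# evaluated with automatic differentiation, trained with alternating updates.
import torch
import torch.nn as nn

torch.manual_seed(0)

class MLP(nn.Module):
    def __init__(self, in_dim=1, out_dim=1, width=32, depth=3):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, width), nn.Tanh()]
            d = width
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

u_net = MLP(in_dim=1, out_dim=1)   # surrogate for the solution u(t)
c_net = MLP(in_dim=3, out_dim=1)   # correction term g(t, u, du/dt)

def residual(t):
    # Assumed (misspecified) physics: u'' + omega^2 u = 0, damping omitted.
    # The correction network is asked to absorb the missing dynamics.
    t = t.requires_grad_(True)
    u = u_net(t)
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, t, torch.ones_like(du), create_graph=True)[0]
    omega2 = 4.0
    return d2u + omega2 * u + c_net(torch.cat([t, u, du], dim=1))

# Sparse "measurements" from a true damped system u'' + 0.4 u' + 4 u = 0,
# generated here only so the sketch runs end to end.
t_data = torch.linspace(0.0, 5.0, 20).reshape(-1, 1)
u_data = torch.exp(-0.2 * t_data) * torch.cos(1.99 * t_data)
t_col = torch.rand(200, 1) * 5.0   # collocation points for the residual loss

opt_u = torch.optim.Adam(u_net.parameters(), lr=1e-3)
opt_c = torch.optim.Adam(c_net.parameters(), lr=1e-3)

for step in range(2000):
    # Alternating update: even steps adapt the solution network,
    # odd steps adapt the correction network.
    opt = opt_u if step % 2 == 0 else opt_c
    opt.zero_grad()
    loss = (residual(t_col) ** 2).mean() + ((u_net(t_data) - u_data) ** 2).mean()
    loss.backward()
    opt.step()

# The trained correction c_net can then be passed to a symbolic-regression
# tool to search for an interpretable closed form (e.g., a term close to 0.4*du).

In this reading, the "three-stage pipeline" corresponds to pre-training such a model on a source domain, fine-tuning it with the correction term on the target domain, and finally distilling the learned correction into a symbolic expression; the sketch shows only the middle stage.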
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-09-10T16:08:27Z. No. of bitstreams: 0 [en]
dc.description.provenance: Made available in DSpace on 2025-09-10T16:08:27Z (GMT). No. of bitstreams: 0 [en]
dc.description.tableofcontents:
Acknowledgements i
Abstract (in Chinese) ii
Abstract iii
Contents v
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
1.1 Background and Motivation 1
1.2 Research Objective 3
1.3 Thesis Architecture 4
Chapter 2 Literature Review 7
2.1 Data-Driven Modeling of Dynamical Systems 7
2.2 PINNs: Principles and Applications 10
2.3 Hybrid and Correction-Based Modeling Approaches 13
2.4 Summary 15
Chapter 3 Methodology 17
3.1 Overview of the DAPINNs with ADPC Framework 17
3.2 Pre-training Stage 19
3.3 Fine-tuning Stage 21
3.4 Discrepancy Identification 24
Chapter 4 Results 26
4.1 Overview of Case Studies 26
4.2 Case I: Damped Harmonic Oscillator 28
4.2.1 Problem Description 28
4.2.1.1 True and Misspecified Governing Equations 28
4.2.1.2 Inference Objectives and Experimental Setup 28
4.2.2 Model Performance and Evaluation 30
4.2.3 Performance Comparison Under Varying Levels of Data Scarcity 34
4.2.4 Robustness to Noise 38
4.3 Case II: Quadratic Damped Harmonic Oscillator 41
4.3.1 Problem Description 41
4.3.1.1 True and Misspecified Governing Equations 41
4.3.1.2 Inference Objectives and Experimental Setup 41
4.3.2 Model Performance and Evaluation 43
4.3.3 Performance Comparison Under Varying Levels of Data Scarcity 47
4.3.4 Robustness to Noise 51
4.4 Case III: Viscous Burgers’ Equation 54
4.4.1 Problem Description 54
4.4.1.1 True and Misspecified Governing Equations 54
4.4.1.2 Inference Objectives and Experimental Setup 54
4.4.2 Model Performance and Evaluation 55
4.4.3 Performance Comparison Under Varying Levels of Data Scarcity 62
4.4.4 Robustness to Noise 66
4.5 Ablation Study 68
4.5.1 Experimental Setup 68
4.5.2 Results and Discussion 68
Chapter 5 Conclusion 71
5.1 Summary 71
5.2 Key Contributions 72
5.3 Limitations 73
5.4 Future Works 74
Bibliography 77
Appendix A — Hyperparameters 84
Appendix B — Correction Model Comparison 85
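Each case study listed above contrasts a true governing equation with a deliberately misspecified one whose discrepancy the learned correction must absorb. The thesis's actual equations are not reproduced in this record; purely as a hypothetical illustration of such a pairing for Case I, the damped harmonic oscillator, the setup might read:

% Hypothetical illustration only; not taken from the thesis.
\ddot{u} + c\,\dot{u} + \omega^2 u = 0 \qquad \text{(true dynamics)}
\ddot{u} + \omega^2 u = 0 \qquad \text{(misspecified prior: damping omitted)}
r_\theta(t) = \ddot{u}_\phi + \omega^2 u_\phi + g_\theta\!\left(t, u_\phi, \dot{u}_\phi\right) \qquad \text{(residual with learnable correction } g_\theta\text{)}

Symbolic regression would then search for a closed form of the learned correction, ideally recovering a term proportional to the missing damping contribution.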
dc.language.iso: en
dc.subject: 物理資訊神經網絡 [zh_TW]
dc.subject: 稀疏數據 [zh_TW]
dc.subject: 領域自適應 [zh_TW]
dc.subject: 符號回歸 [zh_TW]
dc.subject: 模型錯誤設定 [zh_TW]
dc.subject: model misspecification [en]
dc.subject: scarce data [en]
dc.subject: domain adaptation [en]
dc.subject: symbolic regression [en]
dc.subject: physics-informed neural networks [en]
dc.title: 透過具可解釋自動微分校正的領域自適應物理資訊神經網路修正模型錯誤設定 [zh_TW]
dc.title: Correcting Model Misspecification by Domain-Adaptive Physics-Informed Neural Networks with Interpretable Auto-Differentiation-Based Correction [en]
dc.type: Thesis
dc.date.schoolyear: 113-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 柯坤呈; 陳明志; 舒宇宸 [zh_TW]
dc.contributor.oralexamcommittee: Kun-Cheng Ke; Ming-Jyh Chern; Yu-Chen Shu [en]
dc.subject.keyword: 物理資訊神經網絡, 稀疏數據, 領域自適應, 符號回歸, 模型錯誤設定 [zh_TW]
dc.subject.keyword: physics-informed neural networks, scarce data, domain adaptation, symbolic regression, model misspecification [en]
dc.relation.page: 86
dc.identifier.doi: 10.6342/NTU202502474
dc.rights.note: 同意授權(全球公開) (consent granted; worldwide open access)
dc.date.accepted: 2025-07-29
dc.contributor.author-college: 管理學院 (College of Management)
dc.contributor.author-dept: 資訊管理學系 (Department of Information Management)
dc.date.embargo-lift: 2028-08-01
Appears in collections: 資訊管理學系 (Department of Information Management)

Files in this item:
File: ntu-113-2.pdf (available online after 2028-08-01)
Size: 17.9 MB
Format: Adobe PDF


All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
