NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/83171
Full metadata record
DC field: value (language)
dc.contributor.advisor: 黃尹男 (zh_TW)
dc.contributor.advisor: Yin-Nan Huang (en)
dc.contributor.author: 郭柏志 (zh_TW)
dc.contributor.author: Po-Chih Kuo (en)
dc.date.accessioned: 2023-01-10T17:06:58Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-01-07
dc.date.issued: 2022
dc.date.submitted: 2022-12-29
dc.identifier.citation[1] Ruiyang Zhang, Zhao Chen, Su Chen, Jingwei Zheng, Oral Büyüköztürk, and Hao Sun. Deep long short-term memory networks for nonlinear structural seismic response prediction. Computers & Structures, 220:55–68, 2019.
[2] Ruiyang Zhang, Yang Liu, and Hao Sun. Physics-informed multi-lstm networks for metamodeling of nonlinear structures. Computer Methods in Applied Mechanics and Engineering, 369:113226, 2020.
[3] Teng Li, Yuxin Pan, Kaitai Tong, Carlos E Ventura, and Clarence W de Silva. Attention-based sequence-to-sequence learning for online structural response forecasting under seismic excitation. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 52(4):2184–2200, 2021.
[4] Mehmet Polat Saka and Zong Woo Geem. Mathematical and metaheuristic applications in design optimization of steel frame structures: an extensive review. Mathematical problems in engineering, 2013, 2013.
[5] Ahmed A Torky and Anas A Aburawwash. A deep learning approach to automated structural engineering of prestressed members. Int J Struct Civ Eng Res, 7(4):347– 352, 2018.
[6] Vinicius Alves, Alexandre Cury, Ney Roitman, Carlos Magluta, and Christian Cre- mona. Structural modification assessment using supervised learning methods applied to vibration data. Engineering Structures, 99:439–448, 2015.
[7] Mohsen Azimi and Gokhan Pekcan. Structural health monitoring using extremely compressed data through deep learning. Computer-Aided Civil and Infrastructure Engineering, 35(6):597–614, 2020.
[8] Panagiotis Seventekidis, Dimitrios Giagopoulos, Alexandros Arailopoulos, and Olga Markogiannaki. Structural health monitoring using deep learning with optimal finite element model generated data. Mechanical Systems and Signal Processing, 145:106972, 2020.
[9] I Karimi, N Khaji, MT Ahmadi, and M Mirzayee. System identification of concrete gravity dams using artificial neural networks based on a hybrid finite element– boundary element approach. Engineering structures, 32(11):3583–3591, 2010.
[10] Rih-Teng Wu and Mohammad R Jahanshahi. Deep convolutional neural network for structural dynamic response estimation and system identification. Journal of Engineering Mechanics, 145(1):04018125, 2019.
[11] Akbar A Javadi, Teng Puay Tan, and ASI Elkassas. Intelligent finite element method and application to simulation of behavior of soils under cyclic loading. In Foundations of Computational Intelligence Volume 5, pages 317–338. Springer, 2009.
[12] Christoph Zopf and Michael Kaliske. Numerical characterisation of uncured elastomers by a neural network based approach. Computers & Structures, 182:504–525, 2017.
[13] Qing Wang, Jianhui Wang, Xiaofang Huang, and Li Zhang. Semiactive nonsmooth control for building structure with deep learning. Complexity, 2017, 2017.
[14] Hyun-Su Kim. Development of seismic response simulation model for building structures with semi-active control devices using recurrent neural network. Applied Sciences, 10(11):3915, 2020.
[15] Gordon Lightbody and George W Irwin. Multi-layer perceptron based modelling of nonlinear systems. Fuzzy sets and systems, 79(1):93–112, 1996.
[16] Chiung-Shiann Huang, Shih-Lin Hung, CM Wen, and TT Tu. A neural network approach for structural identification and diagnosis of a building from seismic response data. Earthquake engineering & structural dynamics, 32(2):187–206, 2003.
[17] Kai-Hung Chang and Chin-Yi Cheng. Learning to simulate and design for structural engineering. In International Conference on Machine Learning, pages 1426–1436. PMLR, 2020.
[18] Yuan-Tung Chou, Wei-Tze Chang, Jimmy G. Jean, Kai-Hung Chang, and Chuin- Shan Chen. Structural analysis with graph neural networks. unpublished thesis, September 2022.
[19] Eamon Whalen and Caitlin Mueller. Toward reusable surrogate models: Graph-based transfer learning on trusses. Journal of Mechanical Design, 144(2), 2022.
[20] Michael M Bronstein, Joan Bruna, Yann LeCun, Arthur Szlam, and Pierre Van- dergheynst. Geometric deep learning: going beyond euclidean data. IEEE Signal Processing Magazine, 34(4):18–42, 2017.
[21] Sam T Roweis and Lawrence K Saul. Nonlinear dimensionality reduction by locally linear embedding. science, 290(5500):2323–2326, 2000.
[22] Shaosheng Cao, Wei Lu, and Qiongkai Xu. Grarep: Learning graph representations with global structural information. In Proceedings of the 24th ACM international on conference on information and knowledge management, pages 891–900, 2015.
[23] Bryan Perozzi, Rami Al-Rfou, and Steven Skiena. Deepwalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 701–710, 2014.
[24] AdityaGroverandJureLeskovec.node2vec:Scalablefeaturelearningfornetworks. In Proceedings of the 22nd ACM SIGKDD international conference on Knowledge discovery and data mining, pages 855–864, 2016.
[25] Daixin Wang, Peng Cui, and Wenwu Zhu. Structural deep network embedding. In Proceedings of the 22nd ACM SIGKDD international conference on Knowledge discovery and data mining, pages 1225–1234, 2016.
[26] Jian Tang, Meng Qu, Mingzhe Wang, Ming Zhang, Jun Yan, and Qiaozhu Mei. Line: Large-scale information network embedding. In Proceedings of the 24th international conference on world wide web, pages 1067–1077, 2015.
[27] Will Hamilton, Zhitao Ying, and Jure Leskovec. Inductive representation learning on large graphs. Advances in neural information processing systems, 30, 2017.
[28] Franco Scarselli, Marco Gori, Ah Chung Tsoi, Markus Hagenbuchner, and Gabriele Monfardini. The graph neural network model. IEEE transactions on neural networks, 20(1):61–80, 2008.
[29] Justin Gilmer, Samuel S Schoenholz, Patrick F Riley, Oriol Vinyals, and George E Dahl. Neural message passing for quantum chemistry. In International conference on machine learning, pages 1263–1272. PMLR, 2017.
[30] Thomas N Kipf and Max Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
[31] PetarVeličković,GuillemCucurull,ArantxaCasanova,AdrianaRomero,PietroLio, and Yoshua Bengio. Graph attention networks. arXiv preprint arXiv:1710.10903, 2017.
[32] Thomas N Kipf and Max Welling. Variational graph auto-encoders. arXiv preprint arXiv:1611.07308, 2016.
[33] Keyulu Xu, Weihua Hu, Jure Leskovec, and Stefanie Jegelka. How powerful are graph neural networks? arXiv preprint arXiv:1810.00826, 2018.
[34] ZonghanWu,ShiruiPan,FengwenChen,GuodongLong,ChengqiZhang,andSYu Philip. A comprehensive survey on graph neural networks. IEEE transactions on neural networks and learning systems, 32(1):4–24, 2020.
[35] MarcoGori,GabrieleMonfardini,andFrancoScarselli.Anewmodelforlearningin graph domains. In Proceedings. 2005 IEEE international joint conference on neural networks, volume 2, pages 729–734, 2005.
[36] Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
[37] AshishVaswani,NoamShazeer,NikiParmar,JakobUszkoreit,LlionJones,AidanN Gomez, Łukasz Kaiser, and Illia Polosukhin. Attention is all you need. Advances in neural information processing systems, 30, 2017.
[38] Minh-Thang Luong, Hieu Pham, and Christopher D Manning. Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025, 2015.
[39] Ilya Sutskever, Oriol Vinyals, and Quoc V Le. Sequence to sequence learning with neural networks. Advances in neural information processing systems, 27, 2014.
[40] Ronald J Williams and David Zipser. A learning algorithm for continually running fully recurrent neural networks. Neural computation, 1(2):270–280, 1989.
[41] Sebastian Ruder. An overview of gradient descent optimization algorithms. arXiv preprint arXiv:1609.04747, 2016.
[42] David E Rumelhart, Geoffrey E Hinton, and Ronald J Williams. Learning representations by back-propagating errors. nature, 323(6088):533–536, 1986.
[43] Paul J Werbos. Backpropagation through time: what it does and how to do it. Proceedings of the IEEE, 78(10):1550–1560, 1990.
[44] Yoshua Bengio, Patrice Simard, and Paolo Frasconi. Learning long-term dependencies with gradient descent is difficult. IEEE transactions on neural networks, 5(2):157–166, 1994.
[45] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural computation, 9(8):1735–1780, 1997.
[46] Ralf C Staudemeyer and Eric Rothstein Morris. Understanding lstm–a tutorial into long short-term memory recurrent neural networks. arXiv preprint arXiv:1909.09586, 2019.
[47] Felix A Gers, Jürgen Schmidhuber, and Fred Cummins. Learning to forget: Continual prediction with lstm. Neural computation, 12(10):2451–2471, 2000.
[48] Shih-Cheng Huang, Anuj Pareek, Saeed Seyyedi, Imon Banerjee, and Matthew P Lungren. Fusion of medical imaging and electronic health records using deep learning: a systematic review and implementation guidelines. NPJ digital medicine, 3(1):1–9, 2020.
[49] Bo-Zhou Lin, Ming-Chieh Chuang, and Keh-Chyuan Tsai. Object-oriented development and application of a nonlinear structural analysis framework. Advances in Engineering Software, 40(1):66–82, 2009.
[50] 中華民國內政部營建署. 「建築物耐震設計規範及解說」, 2011.
[51] Timothy D Ancheta, Robert B Darragh, Jonathan P Stewart, Emel Seyhan, Walter J Silva, Brian S-J Chiou, Katie E Wooddell, Robert W Graves, Albert R Kottke, David M Boore, et al. Nga-west2 database. Earthquake Spectra, 30(3):989–1005, 2014.
[52] Central Weather Bureau (CWB, Taiwan). Central weather bureau seismographic network, 2012.
-
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/83171
dc.description.abstract: Rapid progress in deep learning in recent years has greatly expanded its possible applications in structural engineering. Deep-learning-based surrogate models for building structures have been studied extensively, and many studies have proposed models that predict the seismic response history of a specific structure. Although these models achieve good accuracy on the structure they were trained for, they apply only to that structure and cannot predict the response histories of other structures; whenever the original design changes or another structural case must be evaluated, new data must be collected to train a new surrogate model.
This thesis proposes a new deep learning method for the structural response-history prediction task. Building-structure data are encapsulated in a graph data structure and, together with ground-acceleration histories, used as model input. A fusion deep learning architecture based on a graph embedding network and a long short-term memory (LSTM) network is proposed, which predicts the response history of the structure described by the input graph and earthquake information. The thesis develops a complete deep learning methodology for this task, examining optimization strategies for the sequence model and the ability of different aggregation functions in the graph embedding network to capture structural features. For the computational characteristics of the LSTM algorithm, a packing padded sequences learning strategy and a sequence compression learning strategy are proposed; experiments confirm that both strategies effectively improve prediction accuracy and training efficiency. For the graph embedding network, two graph neural networks, the graph convolutional network (GCN) and the graph attention network (GAT), are compared to study the predictive ability of different aggregation functions, and unsupervised dimensionality reduction is used to project the high-dimensional graph embeddings into two-dimensional space for visualization. The embeddings of different building structures cluster distinctly by first-mode period and building height, verifying that the graph embeddings inside the deep learning model carry physical information about the structures and demonstrating that the proposed architecture extracts building-structure features effectively. (zh_TW)
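The packing padded sequences (PPS) strategy described in the abstract exploits a simple counting argument: a recurrent network stepping over a naively padded batch performs batch_size × max_length timestep updates, while a packed batch (sequences sorted by length, with only still-active sequences processed at each step) performs only sum(lengths) updates. A minimal back-of-the-envelope sketch, with hypothetical record lengths, not the thesis code:

```python
# Illustrative sketch of the computation saved by packing padded
# sequences. The lengths below are made up for the example.

def padded_steps(lengths):
    """Timestep updates an RNN performs on a naively padded batch:
    every sequence is stepped out to the longest one."""
    return len(lengths) * max(lengths)

def packed_steps(lengths):
    """Timestep updates on a packed batch: at step t, only the
    sequences longer than t are still processed."""
    return sum(lengths)

lengths = [100, 60, 30, 10]              # hypothetical record lengths
saved = padded_steps(lengths) - packed_steps(lengths)
# Here half of all timestep updates (200 of 400) would be wasted padding.
```

The more unequal the ground-motion record lengths in a batch, the larger the saving, which is why the thesis reports both accuracy and training-efficiency gains from this strategy.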
dc.description.abstract: The development of deep learning in recent years has dramatically expanded its application possibilities in structural engineering. Structural surrogate models based on deep learning have been widely studied in the past few years, and many studies have proposed models that predict the response history of a specific structure. Although the models presented in past studies achieve good accuracy for the particular structure they were trained on, they cannot predict the responses of other structures, so data must be re-collected and a new surrogate model trained whenever the original design changes or the user needs to evaluate another structural case.
This study proposes a new deep learning method for predicting the response history of different structures. We encapsulate the structure data in a graph and use the graph as input along with the ground-motion data. We then propose a fusion deep learning architecture based on a graph embedding network and a sequence model, which predicts the response history of the corresponding structure from the input structural graph and seismic information. This study develops a complete deep learning methodology for structural response-history prediction, exploring optimization strategies for the sequence model and the ability of different aggregation functions in the graph embedding network to capture structural features. For the computational characteristics of the LSTM algorithm, we propose the Packing Padded Sequences (PPS) and Sequence Compression (SC) learning strategies; experiments show that both strategies effectively improve the model's prediction accuracy and training efficiency. For the graph embedding network, two graph neural networks, the Graph Convolutional Network (GCN) and the Graph Attention Network (GAT), are compared to study the effect of the aggregation function on prediction accuracy. For visualization, unsupervised dimensionality reduction is used to project the high-dimensional graph embeddings into two-dimensional space. We observed that the graph embeddings of different building structures cluster distinctly according to first-mode period and building height. This verifies that the graph embedding inside the deep learning model carries physical information about the structure, and it shows that the proposed architecture can effectively extract building-structure features. (en)
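The fusion architecture the abstract describes (pool a structure's graph into an embedding, then combine that embedding with each timestep of the ground-motion record before the sequence model) can be sketched in miniature as follows. This is a toy illustration under stated assumptions, not the thesis implementation: the single mean-aggregation pass stands in for a trained GCN/GAT layer, the concatenation stands in for the data-fusion step ahead of the LSTM, and all node features, the adjacency, and the record values are hypothetical.

```python
# Toy sketch of "structure-informed graph embedding + data fusion".
# A real model would use learned GCN/GAT weights and feed the fused
# sequence to an LSTM; here we only trace the data flow.

def aggregate(node_feats, adjacency):
    """One mean-aggregation pass: each node averages its own features
    with its neighbors' (a stand-in for a message-passing layer)."""
    out = []
    for i, feat in enumerate(node_feats):
        neigh = [node_feats[j] for j in adjacency[i]] + [feat]
        out.append([sum(vals) / len(neigh) for vals in zip(*neigh)])
    return out

def graph_embedding(node_feats, adjacency):
    """Mean-pool aggregated node features into one graph-level vector."""
    agg = aggregate(node_feats, adjacency)
    return [sum(col) / len(agg) for col in zip(*agg)]

def fuse(embedding, ground_motion):
    """Data fusion: concatenate the graph embedding onto every
    timestep of the ground-acceleration record."""
    return [[a] + embedding for a in ground_motion]

# Hypothetical 3-node chain 0-1-2 with (mass, stiffness) node features.
feats = [[1.0, 2.0], [1.0, 4.0], [2.0, 4.0]]
adj = {0: [1], 1: [0, 2], 2: [1]}

emb = graph_embedding(feats, adj)          # one vector per structure
seq = fuse(emb, [0.01, -0.02, 0.03])       # fused LSTM input, 3 timesteps
```

Because the embedding is computed once per structure and reused across every timestep and every earthquake record, a single trained model can be queried for different structures simply by swapping the input graph, which is the generalization the thesis targets.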
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-01-10T17:06:58Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2023-01-10T17:06:58Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Thesis Committee Certification i
Acknowledgements ii
Abstract (Chinese) iii
Abstract iv
Table of Contents vi
List of Figures ix
List of Tables xii
Chapter 1 Introduction 1
1.1 Background and Motivation 1
1.2 Literature Review 3
1.2.1 Sequence Models for Predicting Structural Dynamic Responses 4
1.2.2 Graph Data Structures and Graph Neural Networks for Representing Building Structures 5
1.3 Objectives 7
1.4 Academic Contributions 8
1.5 Thesis Organization 8
Chapter 2 Methodology 10
2.1 Graphs and Graph Embedding 10
2.1.1 Graph Data Structure 11
2.1.2 Graph Embedding 13
2.2 Graph Neural Network (GNN) 15
2.2.1 Graph Convolutional Network (GCN) 17
2.2.2 Graph Attention Network (GAT) 18
2.3 Sequence Models 21
2.3.1 Recurrent Neural Network (RNN) 22
2.3.2 Long Short-Term Memory (LSTM) 23
2.4 Fusion Model Based on Graph Neural Networks and LSTM 26
2.4.1 Structure-Informed Graph Embedding 27
2.4.2 Data Fusion of Sequence Data 28
Chapter 3 Data Collection and Preprocessing 31
3.1 Building-Structure Numerical Model Data 31
3.1.1 Scope of Building Collection 32
3.1.2 Building Site Information 32
3.1.3 Numerical Model Settings 33
3.2 Earthquake Data 36
3.2.1 Ground-Motion Record Selection 36
3.2.2 Ground-Motion Record Scaling 36
3.3 Structural Response-History Data 40
3.3.1 Linear and Nonlinear Time-History Analysis 40
3.4 Deep Learning Dataset 41
3.4.1 Data Generation Workflow 41
3.4.2 Data Preprocessing 43
3.4.3 Training and Test Datasets 44
Chapter 4 Deep Learning Experiment Methods 46
4.1 Learning Strategies 46
4.1.1 Packing Padded Sequences (PPS) 47
4.1.2 Sequence Compression (SC) 49
4.2 Experimental Setup 51
4.3 Evaluation Metrics 52
Chapter 5 Results and Discussion 53
5.1 Effect of the Packing Padded Sequences Strategy 53
5.2 Effect of the Sequence Compression Strategy 59
5.3 Interpretability of the Graph Embedding Network and Comparison of GNNs 66
Chapter 6 Conclusions and Recommendations 86
6.1 Conclusions 86
6.2 Recommendations 87
References 88
Appendix A: Earthquake Event Data 94
Appendix B: Section Data 96
dc.language.iso: zh_TW
dc.subject: 長短期記憶神經網路 (zh_TW)
dc.subject: 深度學習 (zh_TW)
dc.subject: 圖嵌入 (zh_TW)
dc.subject: 結構歷時反應 (zh_TW)
dc.subject: 圖神經網路 (zh_TW)
dc.subject: structural response history (en)
dc.subject: deep learning (en)
dc.subject: LSTM (en)
dc.subject: GNN (en)
dc.subject: graph embedding (en)
dc.title: 以結構信息圖嵌入結合長短期記憶深度神經網路預測結構物之動力反應 (zh_TW)
dc.title: Structural Dynamic Responses Prediction with Structure-informed Graph Embedding and Deep LSTM Neural Network (en)
dc.title.alternative: Structural Dynamic Responses Prediction with Structure-informed Graph Embedding and Deep LSTM Neural Network
dc.type: Thesis
dc.date.schoolyear: 111-1
dc.description.degree: Master's (碩士)
dc.contributor.oralexamcommittee: 吳日騰; 陳俊杉 (zh_TW)
dc.contributor.oralexamcommittee: Rih-Teng Wu; Chuin-Shan Chen (en)
dc.subject.keyword: 深度學習, 長短期記憶神經網路, 圖神經網路, 圖嵌入, 結構歷時反應 (zh_TW)
dc.subject.keyword: deep learning, LSTM, GNN, graph embedding, structural response history (en)
dc.relation.page: 96
dc.identifier.doi: 10.6342/NTU202210197
dc.rights.note: Authorized (open access, worldwide)
dc.date.accepted: 2023-01-03
dc.contributor.author-college: College of Engineering (工學院)
dc.contributor.author-dept: Department of Civil Engineering (土木工程學系)
Appears in collections: Department of Civil Engineering

Files in this item:
U0001-0803221229481205.pdf (15.15 MB, Adobe PDF)


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
