  1. NTU Theses and Dissertations Repository
  2. College of Electrical Engineering and Computer Science
  3. Graduate Institute of Communication Engineering
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101287
Full metadata record
dc.contributor.advisor (zh_TW): 林澤
dc.contributor.advisor (en): Che Lin
dc.contributor.author (zh_TW): 林少恩
dc.contributor.author (en): Shao-En Lin
dc.date.accessioned: 2026-01-13T16:13:22Z
dc.date.available: 2026-01-14
dc.date.copyright: 2026-01-13
dc.date.issued: 2025
dc.date.submitted: 2025-12-16
dc.identifier.citation:
T. Akiba, S. Sano, T. Yanase, T. Ohta, and M. Koyama. Optuna: A next-generation hyperparameter optimization framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pages 2623–2631, 2019.
L. Butler, A. Parada-Mayorga, and A. Ribeiro. Convolutional learning on multigraphs. IEEE Transactions on Signal Processing, 71:933–946, Jan. 2023. ISSN 1053-587X. doi:10.1109/TSP.2023.3259144.
I. Cantador, P. Brusilovsky, and T. Kuflik. Second workshop on information heterogeneity and fusion in recommender systems (HetRec2011). In Proceedings of the Fifth ACM Conference on Recommender Systems, pages 387–388, New York, NY, USA, 2011. Association for Computing Machinery. ISBN 9781450306836. doi:10.1145/2043932.2044016.
Y. Cen, X. Zou, J. Zhang, H. Yang, J. Zhou, and J. Tang. Representation learning for attributed multiplex heterogeneous network. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 1358–1368. ACM, 2019.
D. Chen, Y. Lin, W. Li, P. Li, J. Zhou, and X. Sun. Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04):3438–3445, Apr. 2020. doi:10.1609/aaai.v34i04.5747.
K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182–197, 2002. doi:10.1109/4235.996017.
M. Defferrard, X. Bresson, and P. Vandergheynst. Convolutional neural networks on graphs with fast localized spectral filtering. In Advances in Neural Information Processing Systems, 2016.
M. Fey, W. Hu, K. Huang, J. E. Lenssen, R. Ranjan, J. Robinson, R. Ying, J. You, and J. Leskovec. Position: Relational deep learning – graph representation learning on relational databases. In Proceedings of the 41st International Conference on Machine Learning, volume 235 of Proceedings of Machine Learning Research, pages 13592–13607. PMLR, Jul. 2024.
X. Fu, J. Zhang, Z. Meng, and I. King. MAGNN: Metapath aggregated graph neural network for heterogeneous graph embedding. In Proceedings of The Web Conference 2020, pages 2331–2341, New York, NY, USA, 2020. Association for Computing Machinery.
W. L. Hamilton, R. Ying, and J. Leskovec. Inductive representation learning on large graphs. In Advances in Neural Information Processing Systems, 2017.
M. He, Z. Wei, S. Feng, Z. Huang, W. Li, Y. Sun, and D. Yu. Spectral heterogeneous graph convolutions via positive noncommutative polynomials. In Proceedings of the ACM Web Conference 2024, pages 685–696, New York, NY, USA, 2024. Association for Computing Machinery.
X. He, K. Deng, X. Wang, Y. Li, Y. Zhang, and M. Wang. LightGCN: Simplifying and powering graph convolution network for recommendation. In Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 639–648, 2020.
W. Hu, M. Fey, M. Zitnik, Y. Dong, H. Ren, B. Liu, M. Catasta, and J. Leskovec. Open Graph Benchmark: Datasets for machine learning on graphs. In Advances in Neural Information Processing Systems, 2020.
Z. Hu, Y. Dong, K. Wang, and Y. Sun. Heterogeneous graph transformer. In Proceedings of The Web Conference 2020, pages 2704–2710, 2020.
S. Katoch, S. S. Chauhan, and V. Kumar. A review on genetic algorithm: past, present, and future. Multimedia Tools and Applications, 80(5):8091–8126, 2021.
T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. In International Conference on Learning Representations, 2017.
C. Li, Z. Guo, Q. He, H. Xu, and K. He. Long-range meta-path search on large-scale heterogeneous graphs, 2024.
Q. Lv, M. Ding, Q. Liu, Y. Chen, W. Feng, S. He, C. Zhou, J. Jiang, Y. Dong, and J. Tang. Are we really making much progress? Revisiting, benchmarking and refining heterogeneous graph neural networks. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, pages 1150–1160, 2021.
Q. Mao, Z. Liu, C. Liu, and J. Sun. HINormer: Representation learning on heterogeneous information networks with graph transformer. In Proceedings of the ACM Web Conference 2023, pages 599–610, 2023.
Y. Ozaki, Y. Tanigaki, S. Watanabe, and M. Onishi. Multiobjective tree-structured Parzen estimator for computationally expensive optimization problems. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference, pages 533–541, 2020.
M. Schlichtkrull, T. N. Kipf, P. Bloem, R. van den Berg, I. Titov, and M. Welling. Modeling relational data with graph convolutional networks. In The Semantic Web, pages 593–607. Springer, 2018.
Y. Sun, J. Han, X. Yan, P. S. Yu, and T. Wu. PathSim: Meta path-based top-k similarity search in heterogeneous information networks. Proceedings of the VLDB Endowment, 4(11):992–1003, 2011.
P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio. Graph attention networks. In International Conference on Learning Representations, 2018.
J. Wang, Y. Guo, L. Yang, and Y. Wang. Enabling homogeneous GNNs to handle heterogeneous graphs via relation embedding. IEEE Transactions on Big Data, 9:1697–1710, 2023.
M. Wang et al. 4DBInfer: A 4D benchmarking toolbox for graph-centric predictive modeling on relational databases. In Advances in Neural Information Processing Systems, volume 37, pages 27236–27273, 2024.
X. Wang, H. Ji, C. Shi, B. Wang, Y. Ye, P. Cui, and P. S. Yu. Heterogeneous graph attention network. In The World Wide Web Conference, pages 2022–2032, 2019.
F. Wu, A. Souza, T. Zhang, C. Fifty, T. Yu, and K. Weinberger. Simplifying graph convolutional networks. In Proceedings of the 36th International Conference on Machine Learning, pages 6861–6871, 2019.
B. Yang, W. T. Yih, X. He, J. Gao, and L. Deng. Embedding entities and relations for learning and inference in knowledge bases. In International Conference on Learning Representations, 2015.
C. Yang, Y. Xiao, Y. Zhang, Y. Sun, and J. Han. Heterogeneous network representation learning: A unified framework with survey and benchmark. IEEE Transactions on Knowledge and Data Engineering, 2020.
X. Yang, M. Yan, S. Pan, X. Ye, and D. Fan. Simple and efficient heterogeneous graph neural network. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9):10816–10824, 2023.
P. Yu, C. Fu, Y. Yu, C. Huang, Z. Zhao, and J. Dong. Multiplex heterogeneous graph convolutional network. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pages 2377–2387, 2022.
S. Yun, M. Jeong, R. Kim, J. Kang, and H. J. Kim. Graph transformer networks. In Advances in Neural Information Processing Systems, pages 11960–11970, 2019.
Z. Zhou, J. Shi, R. Yang, Y. Zou, and Q. Li. SlotGAT: Slot-based message passing for heterogeneous graphs. In Proceedings of the 40th International Conference on Machine Learning, 2023.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101287
dc.description.abstract (zh_TW): Heterogeneous Information Networks (HINs) provide an expressive framework for integrating entities and relations of different types, typically defined under a fixed schema. However, existing work largely relies on a predefined schema, overlooking how the data appear differently under alternative modeling choices and how those choices can in turn affect downstream tasks.
This work introduces the notion of entity-attribute duality: attributes can be atomized into entities with associated relations, and entities can in turn serve as attributes of other entities. This principle motivates the atomic HIN paradigm, which systematically normalizes structural modeling choices and improves the expressiveness of heterogeneous graph models.
Building on this, we further propose a task-oriented schema refinement framework that treats an existing dataset as the product of one particular design choice and systematically explores schema structures better suited to the downstream task. Experiments show that, with schema refinement, a simplified Relational GCN (sRGCN) alone achieves state-of-the-art performance on eight datasets covering node-level and link-level tasks, and advanced heterogeneous graph neural networks (HGNNs) yield further gains. This indicates that structural modeling choices are critical to learning performance on heterogeneous graphs.
Finally, we release the refined structures of the atomic HINs together with the full framework, laying a foundation for more principled benchmarking and opening new directions for schema-aware learning, automated structure discovery, and next-generation HGNNs.
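To make the entity-attribute duality described in the abstract concrete, here is a minimal, hypothetical sketch of attribute atomization on a toy graph. This is not the thesis's implementation: the function name `atomize_attribute`, the `has_genre` relation label, and the toy movie data are all illustrative assumptions; the idea shown is simply that a categorical attribute becomes a first-class node type connected by a new relation, making the schema choice explicit.

```python
from collections import defaultdict

def atomize_attribute(nodes, attr):
    """Turn a categorical attribute into first-class nodes plus a relation.

    nodes: dict mapping node id -> attribute dict.
    Returns (remaining_nodes, attr_value_nodes, edges), where each edge is
    a typed triple (node_id, "has_<attr>", attr_value_node).
    """
    attr_nodes = set()
    edges = []
    remaining = {}
    for nid, attrs in nodes.items():
        attrs = dict(attrs)            # copy so the input is not mutated
        value = attrs.pop(attr, None)  # strip the attribute from the node
        remaining[nid] = attrs
        if value is not None:
            attr_nodes.add(value)      # the value becomes its own node
            edges.append((nid, f"has_{attr}", value))
    return remaining, sorted(attr_nodes), edges

# Toy HIN: movies carrying a 'genre' attribute.
movies = {
    "m1": {"genre": "drama", "year": 1999},
    "m2": {"genre": "comedy", "year": 2004},
    "m3": {"genre": "drama", "year": 2010},
}
movies2, genres, edges = atomize_attribute(movies, "genre")
print(genres)    # ['comedy', 'drama']
print(edges[0])  # ('m1', 'has_genre', 'drama')
```

Under this view, the reverse direction of the duality is the same operation read backwards: collapsing the `has_genre` relation folds the genre nodes back into plain attributes, which is one example of the refinement choices the framework searches over.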
dc.description.abstract (en): Heterogeneous Information Networks (HINs) provide a powerful framework for modeling multi-typed entities and relations, typically defined under a fixed schema. Yet, most research assumes this structure is given, overlooking the fact that alternative designs can emphasize different aspects of the data and substantially influence downstream performance.
As a theoretical foundation for such designs, we introduce the principle of entity-attribute duality: attributes can be atomized as entities with their associated relations, while entities can, in turn, serve as attributes of others. This principle motivates atomic HIN, a canonical representation that makes all modeling choices explicit and achieves maximal expressiveness. Building on this foundation, we propose a systematic framework for task-specific schema refinement. Within this framework, we demonstrate that widely used benchmarks correspond to heuristic refinements of the atomic HIN, often far from optimal.
Across eight datasets, refinement alone enables a simplified Relational GCN (sRGCN) to reach state-of-the-art performance on node- and link-level tasks, with further gains from advanced HGNNs. These results highlight schema design as a key dimension in heterogeneous graph modeling.
By releasing the atomic HINs, searched schemas, and refinement framework, we enable principled benchmarking and open the way for future work on schema-aware learning, automated structure discovery, and next-generation HGNNs.
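The abstract credits a simplified Relational GCN (sRGCN); its exact formulation is given in Chapter 3 of the thesis. As an illustration only, not the author's definition, a minimal relational propagation step in the R-GCN family averages neighbor features per relation and sums the per-relation results with a self-contribution. Everything here (the function name, the degree normalization, the toy edge lists) is an assumption for the sketch.

```python
import numpy as np

def relational_propagate(h, edges_by_rel, self_weight=1.0):
    """One illustrative relational propagation step (R-GCN style sketch).

    h: (n, d) node feature matrix.
    edges_by_rel: dict mapping relation name -> list of (src, dst) edges.
    Each destination receives the mean of its in-neighbors' features per
    relation; per-relation results are summed, plus a scaled self term.
    """
    n, d = h.shape
    out = self_weight * h.copy()          # self-contribution
    for rel, edge_list in edges_by_rel.items():
        agg = np.zeros((n, d))
        deg = np.zeros(n)
        for src, dst in edge_list:        # accumulate neighbor features
            agg[dst] += h[src]
            deg[dst] += 1
        mask = deg > 0
        agg[mask] /= deg[mask, None]      # per-relation mean normalization
        out += agg
    return out

h = np.eye(3)  # 3 nodes with one-hot features
edges = {"writes": [(0, 2), (1, 2)], "cites": [(2, 0)]}
h1 = relational_propagate(h, edges)
print(h1[2])  # node 2 mixes in the mean of nodes 0 and 1 via "writes"
```

In a full model this step would typically be followed by a learned linear map and nonlinearity; in a precomputation setting (as the thesis's table of contents suggests for sRGCN), propagation like this can be run once up front and the learned layers applied afterwards.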
dc.description.provenance (en): Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2026-01-13T16:13:22Z. No. of bitstreams: 0
dc.description.provenance (en): Made available in DSpace on 2026-01-13T16:13:22Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
Acknowledgements i
Abstract (in Chinese) iii
Abstract v
Contents vii
List of Figures xi
List of Tables xiii
Chapter 1 Introduction 1
Section 1.1 Motivation 1
Section 1.2 Related Work 4
Section 1.2.1 Heterogeneous Graph Neural Networks 4
Section 1.2.2 Schema Augmentation Techniques 5
Section 1.2.3 Schema Design in Common Benchmarks 6
Chapter 2 Methodology 9
Section 2.1 Preliminaries 10
Section 2.1.1 Attributed Heterogeneous Information Networks 10
Section 2.1.2 Spectral Heterogeneous Graph Convolution (SHGC) 11
Section 2.2 From Attribute Atomization to Atomic HINs 12
Section 2.3 Schema Refinement via Node- and Edge-Type Selection 14
Section 2.4 Pre-propagation Feature Initialization 16
Section 2.5 Systematic Search for Schema Refinement 19
Section 2.6 HGNNs for Atomic HINs 20
Chapter 3 Simplified Relational GCN 27
Section 3.1 Formulation 28
Section 3.2 Layer-wise Update Rule 28
Section 3.3 Precomputation Setting 30
Section 3.4 Downstream Decoders 30
Section 3.5 Node Classification 31
Section 3.6 Link Prediction 31
Section 3.7 Training Enhancements 32
Section 3.8 L2-normalization 32
Section 3.9 Edge Dropout and Input Dropout 33
Section 3.10 Residual Connection 33
Section 3.11 Label Propagation 34
Section 3.12 Loss Function 35
Chapter 4 Experiments 37
Section 4.1 Datasets 37
Section 4.1.1 Human-Designed Schemas as Specific Refinements 38
Section 4.1.2 IMDB 39
Section 4.1.3 Freebase 40
Section 4.1.4 DBLP 41
Section 4.1.5 ACM 41
Section 4.1.6 OGBN-MAG 42
Section 4.1.7 Amazon 43
Section 4.1.8 LastFM 43
Section 4.1.9 PubMed 44
Section 4.2 Evaluation Protocol 45
Section 4.2.1 Evaluation Metrics 45
Section 4.2.1.1 Confusion Matrix 46
Section 4.2.1.2 Accuracy 46
Section 4.2.1.3 F1-Scores 47
Section 4.2.1.4 AUC-ROC 48
Section 4.2.1.5 MRR 49
Section 4.2.2 Dataset Split 49
Section 4.2.3 Negative Sampling 50
Section 4.3 Baselines 51
Section 4.4 Schema Refinement and Hyperparameter Setting 53
Section 4.4.1 Schema Tuning 53
Section 4.4.2 HGNN Hyperparameters and Tuning 53
Section 4.5 Results 54
Section 4.5.1 Results on Node Classification Datasets 54
Section 4.5.2 Results on Link Prediction Datasets 56
Chapter 5 Discussion 59
Section 5.1 Search Results on sRGCN 59
Section 5.2 Effect of Attribute Atomization and Schema Refinement 63
Section 5.3 Transferability and Generalization of Refined Schemas 64
Section 5.4 Search Results on More HGNNs 66
Section 5.5 Efficiency of Schema Refinement 69
Section 5.6 Discussion on Search Algorithms and Initialization 69
Section 5.7 Complexity Analysis 71
Chapter 6 Conclusion 75
References 79
Appendix A — Details of Pre-propagation Feature Initialization 87
Section A.1 Generalized Union Aggregation 87
Section A.2 Implementation 88
dc.language.iso: en
dc.subject (zh_TW): Heterogeneous Information Networks
dc.subject (zh_TW): Heterogeneous Graph Neural Networks
dc.subject (zh_TW): Graph Neural Networks
dc.subject (zh_TW): Graph Representation Learning
dc.subject (zh_TW): Deep Learning
dc.subject (en): Heterogeneous Information Networks
dc.subject (en): Heterogeneous Graph Neural Networks
dc.subject (en): Graph Neural Networks
dc.subject (en): Graph Representation Learning
dc.subject (en): Deep Learning
dc.title (zh_TW): Heterogeneous Graph Modeling Based on Entity-Attribute Duality
dc.title (en): Entity-Attribute Duality for Heterogeneous Graph Modeling
dc.type: Thesis
dc.date.schoolyear: 114-1
dc.description.degree: Master
dc.contributor.oralexamcommittee (zh_TW): 王志宇; 王釧茹; 蔡銘峰
dc.contributor.oralexamcommittee (en): Chih-Yu Wang; Chuan-Ju Wang; Ming-Feng Tsai
dc.subject.keyword (zh_TW): Heterogeneous Information Networks, Heterogeneous Graph Neural Networks, Graph Neural Networks, Graph Representation Learning, Deep Learning
dc.subject.keyword (en): Heterogeneous Information Networks, Heterogeneous Graph Neural Networks, Graph Neural Networks, Graph Representation Learning, Deep Learning
dc.relation.page: 89
dc.identifier.doi: 10.6342/NTU202504725
dc.rights.note: Access authorized (restricted to campus network)
dc.date.accepted: 2025-12-17
dc.contributor.author-college: College of Electrical Engineering and Computer Science
dc.contributor.author-dept: Graduate Institute of Communication Engineering
dc.date.embargo-lift: 2027-11-26
Appears in Collections: Graduate Institute of Communication Engineering

Files in This Item:
File: ntu-114-1.pdf (restricted access, not publicly available)
Size: 2.26 MB
Format: Adobe PDF