Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88401

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林守德 | zh_TW |
| dc.contributor.advisor | Shou-De Lin | en |
| dc.contributor.author | 劉燕芬 | zh_TW |
| dc.contributor.author | Yan-Fen Liu | en |
| dc.date.accessioned | 2023-08-15T16:07:36Z | - |
| dc.date.available | 2023-11-09 | - |
| dc.date.copyright | 2023-08-15 | - |
| dc.date.issued | 2023 | - |
| dc.date.submitted | 2023-07-28 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88401 | - |
| dc.description.abstract | 序列式推薦系統 (SR) 藉由考慮連續性的歷史行為,來捕捉使用者瀏覽行為上的興趣動向,使下一個推薦物品更能迎合用戶的需求。 在近期序列式推薦系統的發展裡,有一分支是將對比式學習與自監督信號結合起來,以解決數據集上的資料稀疏性所造成的推薦問題;而這些方法經常透過微調數據結構的方式,來構建出略為不同的數據形態。 然而,由於這些微調的手段將對瀏覽行為中的原始意圖造成隨機擾動,使得在建構出的數據與原始資料之間,兩者反映出的語義是否仍保有一致上備受挑戰。 因此,有別於使用微調數據結構來生成自監督信號的方法,近年來的提案傾向在原始數據中,定義一個局部監督標籤(例如目標項目)來緩解對比式訊號在語意一致性上的疑慮,然而,他們所提出的局部監督式標籤,恐怕並不足以反映整個瀏覽行為序列所代表的語意。 因此,為了能進一步改善上述的問題,我們在此提出一種新穎的對比式學習框架,用意在「透過識別出序列級上的監督式模板,解決序列式推薦系統的問題」。 與先前研究有所不同的地方是,我們旨在,通過從原始數據中所挖掘出序列級的監督式模板當作對比信號,來保持其與對比目標物之間的語義能在序列級上有著一致性。 此外,我們還提出了新的對比正則式,以提高對比式學習任務在假負樣本和真負樣本的辨識能力。 我們的模型在五個公共數據集上進行的大量實驗,其結果表明我們的模型與目前最先進的方法相比,有著卓越性能。 | zh_TW |
| dc.description.abstract | Sequential recommendation (SR) incorporates successive historical behaviors into next-item prediction to capture dynamic user interests. Recent SR methods combine contrastive learning with self-supervised signals to alleviate the data sparsity problem. These methods typically construct contrast signals through data augmentation, but random perturbation of the original intent challenges the semantic consistency between contrastive objectives. Existing works therefore tend to tackle this problem with a local supervision label such as the target item, which is, however, insufficient to reflect global sequence behaviors. To further address this issue, we propose a novel contrastive learning framework, named Supervised Pattern Recognition for Sequential Recommendation (SupeRec). Unlike previous studies, we aim to maintain sequence-level semantic consistency between contrastive objectives by mining supervised sequence behaviors from the raw data. Moreover, we present a novel contrastive regularization to improve the ability of contrastive tasks to discriminate between false and true negative samples. Extensive experiments on five public datasets demonstrate that the proposed SupeRec achieves superior performance compared to existing state-of-the-art baselines. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T16:07:36Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2023-08-15T16:07:36Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Thesis Committee Certification i; Acknowledgements ii; Abstract (Chinese) iii; Abstract iv; Contents vi; List of Figures viii; List of Tables ix; Chapter 1 Introduction 1; Chapter 2 Related Work 5; 2.1 Sequential Recommendation 5; 2.2 Contrastive Learning 6; Chapter 3 Problem Formulation 8; Chapter 4 Methodology 9; 4.1 Sequence Encoder for User Representation 9; 4.1.1 Embedding Layer 10; 4.1.2 Self-attention Module 11; 4.2 Supervised Pattern Recognition 11; 4.3 Adaptive Contrastive Regularization 13; 4.4 Multi-task Learning 15; 4.5 Discussion 16; 4.5.1 Novelty and Differences 16; 4.5.2 Time Complexity Analysis 17; Chapter 5 Experiments 18; 5.1 Experimental Settings 18; 5.1.1 Datasets 18; 5.1.2 Evaluation Metrics 19; 5.1.3 Baseline Models 19; 5.1.4 Implementation Details 21; 5.2 Overall Performance 21; 5.3 Further Analysis of SupeRec 23; 5.3.1 Different Contrastive Objectives 23; 5.3.2 Effect of AdaInfoNCE Loss 25; 5.3.3 Visualizing Representation Quality 26; 5.4 Hyperparameter Sensitivity 27; Chapter 6 Conclusion 30; References 31 | - |
| dc.language.iso | en | - |
| dc.subject | 序列式推薦系統 | zh_TW |
| dc.subject | 對比式學習 | zh_TW |
| dc.subject | 自監督式學習 | zh_TW |
| dc.subject | Contrastive Learning | en |
| dc.subject | Self-Supervised Learning | en |
| dc.subject | Sequential Recommendation | en |
| dc.title | 透過識別出序列級上的監督式模板解決序列式推薦系統的問題 | zh_TW |
| dc.title | Supervised Pattern Recognition for Sequential Recommendation | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 111-2 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 林軒田;鄭卜壬;蔡銘峰;葉彌妍 | zh_TW |
| dc.contributor.oralexamcommittee | Hsuan-Tien Lin;Pu-Jen Cheng;Ming-Feng Tsai;Mi-Yen Yeh | en |
| dc.subject.keyword | 對比式學習,序列式推薦系統,自監督式學習 | zh_TW |
| dc.subject.keyword | Contrastive Learning,Sequential Recommendation,Self-Supervised Learning | en |
| dc.relation.page | 40 | - |
| dc.identifier.doi | 10.6342/NTU202302004 | - |
| dc.rights.note | 未授權 | - |
| dc.date.accepted | 2023-07-31 | - |
| dc.contributor.author-college | 電機資訊學院 | - |
| dc.contributor.author-dept | 資訊工程學系 | - |
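The abstract describes contrastive objectives with false/true negative discrimination, and the table of contents names an "AdaInfoNCE" loss. As background only (the thesis's actual formulation is not reproduced in this record), here is a minimal NumPy sketch of the standard in-batch InfoNCE loss that such regularizations typically extend; the function name, batch layout, and temperature value are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Standard in-batch InfoNCE: row i of `positives` is the positive
    for row i of `anchors`; all other rows act as negatives."""
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                 # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # the positive pair for each anchor sits on the diagonal
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
batch = rng.normal(size=(8, 16))
# Aligned positives yield a much lower loss than mismatched ones
aligned = info_nce_loss(batch, batch)
mismatched = info_nce_loss(batch, np.roll(batch, 1, axis=0))
```

A false-negative-aware variant such as the one the abstract alludes to would reweight or mask off-diagonal terms that are likely to share the anchor's intent, rather than treating every other in-batch sample as a uniform negative.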
| Appears in Collections: | 資訊工程學系 (Department of Computer Science and Information Engineering) |
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-111-2.pdf (restricted access) | 3.63 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.