Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/62260
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林智仁(Chih-Jen Lin) | |
dc.contributor.author | Yu-Ting Huang | en |
dc.contributor.author | 黃郁庭 | zh_TW |
dc.date.accessioned | 2021-06-16T13:37:07Z | - |
dc.date.available | 2023-08-04 | |
dc.date.copyright | 2020-08-04 | |
dc.date.issued | 2020 | |
dc.date.submitted | 2020-06-16 | |
dc.identifier.citation | Rendle, Steffen (2010). “Factorization machines”. In: Proceedings of IEEE International Conference on Data Mining (ICDM), pp. 995–1000.
Blondel, Mathieu et al. (2016). “Polynomial Networks and Factorization Machines: New Insights and Efficient Training Algorithms”. In: Proceedings of the 33rd International Conference on Machine Learning (ICML).
Rendle, Steffen (2012). “Factorization machines with libFM”. In: ACM Transactions on Intelligent Systems and Technology (TIST) 3.3, p. 57.
Juan, Yuchin et al. (2016). “Field-aware factorization machines for CTR prediction”. In: Proceedings of the ACM Recommender Systems Conference (RecSys). URL: http://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf.
Ta, Anh-Phuong (2015). “Factorization machines with follow-the-regularized-leader for CTR prediction in display advertising”. In: Proceedings of the IEEE International Conference on Big Data.
Chin, Wei-Sheng et al. (2018). “An Efficient Alternating Newton Method for Learning Factorization Machines”. In: ACM Transactions on Intelligent Systems and Technology 9.6, 72:1–72:31. URL: https://www.csie.ntu.edu.tw/~cjlin/papers/fm/scalefm.pdf.
Lin, Chih-Jen (2007). “Projected Gradient Methods for Non-negative Matrix Factorization”. In: Neural Computation 19, pp. 2756–2779. URL: http://www.csie.ntu.edu.tw/~cjlin/papers/pgradnmf.pdf.
Yuan, Guo-Xun et al. (2010). “A Comparison of Optimization Methods and Software for Large-scale L1-regularized Linear Classification”. In: Journal of Machine Learning Research 11, pp. 3183–3234. URL: http://www.csie.ntu.edu.tw/~cjlin/papers/l1.pdf.
Lin, Chih-Jen, Ruby C. Weng, and S. Sathiya Keerthi (2008). “Trust region Newton method for large-scale logistic regression”. In: Journal of Machine Learning Research 9, pp. 627–650. URL: http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/62260 | - |
dc.description.abstract | Matrix factorization is a popular technique in recommender systems: from ratings already observed, it predicts a user's rating of an item. Matrix factorization is a special case of factorization machines, whose advantage is the ability to handle sparse data effectively. In general, Newton methods can efficiently solve large-scale regression problems; however, because the objective function of factorization machines is non-convex, Newton methods cannot be applied to them directly. In this thesis, we consider a variant of factorization machines whose objective function becomes convex when solving each sub-problem. Instead of alternately solving the sub-problems, we directly apply the Gauss-Newton method to this variant. In our implementation for matrix factorization, we further exploit its structure to reduce the space complexity. Our experiments show that the Gauss-Newton method applied to this variant of factorization machines is promising. | zh_TW |
dc.description.abstract | Matrix factorization (MF) is a popular technique for collaborative filtering. Given the observed ratings that $m$ users have given to $n$ items, MF aims to find a model that predicts the unobserved rating of a user on an item. MF is a special case of Factorization Machines (FM), which model all interactions between variables using factorized parameters, so FM is able to estimate interactions even in problems with huge sparsity (such as recommender systems). For training large-scale regression problems, Newton methods have been shown to be an effective approach, but it is difficult to apply such methods to FM because of the non-convexity of its objective function. We consider a modification of FM that is multi-block convex and is usually solved by alternating minimization algorithms. In this work, we apply the Gauss-Newton method to this modification of FM instead of alternately solving the sub-problems. Furthermore, we introduce an effective implementation for MF that reduces memory consumption. Through experiments, we compare the Gauss-Newton method with the alternating Newton method and show the effectiveness of our algorithm. | en |
dc.description.provenance | Made available in DSpace on 2021-06-16T13:37:07Z (GMT). No. of bitstreams: 1 ntu-109-R05922038-1.pdf: 2519816 bytes, checksum: 8508899ee039c12203a951ac2d02f508 (MD5) Previous issue date: 2020 | en |
dc.description.tableofcontents | 1. Introduction (1)
2. Newton Method for Unconstrained Minimization (4)
3. Newton Methods for Factorization Machines (6)
3.1 Gradient Calculation (7)
3.2 Hessian and Gauss-Newton Matrices (9)
3.3 Implementation of Line Search (10)
3.4 Overall Procedure (11)
3.5 Analysis and Implementation for Matrix Factorization (13)
4. A Review of Alternating Newton Methods (18)
4.1 Implementation for Matrix Factorization (22)
5. Experiments on Matrix Factorization (23)
5.1 Experimental Settings (23)
5.2 Environment and Implementation (24)
5.3 Comparison of the Gauss-Newton Method and the Alternating Newton Method (24)
6. Discussion and Conclusions (40) | |
dc.language.iso | en | |
dc.title | Newton Methods for Factorization Machines | zh_TW |
dc.title | Newton Methods for Factorization Machines | en |
dc.type | Thesis | |
dc.date.schoolyear | 108-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 林軒田(Hsuan-Tien Lin),李育杰(Yuh-Jye Lee) | |
dc.subject.keyword | Gauss-Newton method, Factorization Machines, Matrix Factorization, Recommender Systems, Collaborative Filtering, Newton method | zh_TW |
dc.subject.keyword | Gauss-Newton method, Factorization Machines, Matrix Factorization, Recommender Systems, Collaborative Filtering, Newton method | en |
dc.relation.page | 41 | |
dc.identifier.doi | 10.6342/NTU202000933 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2020-06-16 | |
dc.contributor.author-college | College of Electrical Engineering and Computer Science | zh_TW |
dc.contributor.author-dept | Graduate Institute of Computer Science and Information Engineering | zh_TW |
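The abstracts above build on the standard FM model (Rendle, 2010), whose prediction is a bias plus linear terms plus factorized pairwise interactions, and whose pairwise term can be computed in O(nk) rather than O(n²k) time. A minimal NumPy sketch of this prediction function (the function and variable names here are illustrative, not taken from the thesis):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Factorization-machine prediction:
        y(x) = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j,
    where x is an (n,) feature vector, w0 a scalar bias, w an (n,)
    linear-weight vector, and V an (n, k) matrix of latent factors
    (row V[i] is v_i).  The pairwise term is computed in O(nk) via
    the identity
        sum_{i<j} <v_i, v_j> x_i x_j
          = 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ].
    """
    linear = w0 + w @ x
    s = V.T @ x                    # (k,): per-factor weighted sums
    q = (V ** 2).T @ (x ** 2)      # (k,): per-factor sums of squares
    pairwise = 0.5 * np.sum(s ** 2 - q)
    return linear + pairwise
```

Because the pairwise interactions are parameterized through the shared factors V rather than one weight per feature pair, the model can estimate interactions between features that never co-occur in the training data, which is what makes FM suitable for the highly sparse inputs of recommender systems.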
Appears in Collections: | Department of Computer Science and Information Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-109-1.pdf (currently not authorized for public access) | 2.46 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.