NTU Theses and Dissertations Repository › 電機資訊學院 (College of Electrical Engineering and Computer Science) › 資訊工程學系 (Department of Computer Science and Information Engineering)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61600
Full metadata record
dc.contributor.advisor: 林守德 (Shou-De Lin)
dc.contributor.author: Kuan-Wei Wu [en]
dc.contributor.author: 吳冠緯 [zh_TW]
dc.date.accessioned: 2021-06-16T13:06:57Z
dc.date.available: 2013-08-06
dc.date.copyright: 2013-08-06
dc.date.issued: 2013
dc.date.submitted: 2013-08-02
dc.identifier.citation[1] Vembu and Gartner. “Label Ranking Algorithms: A Survey”, in Preference Learning (2010)
[2] Weiwei Cheng, Krzysztof Dembczynski, Eyke Hullermeier, “Label Ranking Methods based on the Plackett-Luce Model”, ICML 2010
[3] Weiwei Cheng, Jens C. Huhn, Eyke Hullermeier, “Decision tree and instance-based learning for label ranking”, ICML 2009
[4] Weiwei Cheng, Krzysztof Dembczynski, Eyke Hullermeier, “Graded Multilabel Classification: The Ordinal Case”. ICML 2010
[5] Shuiwang Ji, Jieping Ye, “Linear Dimensionality Reduction for Multi-label Classification”. IJCAI 2009
[6] Istvan Pilaszy and Domonkos Tikk, “Recommending New Movies: Even a Few Ratings Are More Valuable Than Metadata”, RecSys '09
[7] Bottou, L, “Stochastic learning”, In Bousquet, O. and von Luxburg, U. (eds.), Advanced Lectures on Machine Learning
[8] Trohidis, K., Tsoumakas, G., Kalliris, G., Vlahavas, I., “Multilabel classi‾cation of music into emotions”. ISMIR 2008
[9] Schapire, R.E. Singer, Y., “Boostexter: a boosting-based system for text catego- rization”. Machine Learning 39 2000, 135–168
[10] Boutell, M., Luo, J., Shen, X., Brown, C.: “Learning multi-label scene classification”. Pattern Recognition 37(9) 2004
[11] Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang, Shou-De Lin, “Costsensitive multi-label learning for audio tag annotation and retrieval”. IEEE TMM, 13(3), pp. 518-529, 2011.
[12] Turnbull, D., Barrington, L., Torres, D., Lanckriet, G.R.G. “Semantic Annotation and Retrieval of Music and Sound Effects”. IEEE Transactions on Audio, Speech and Language Processing, Vol. 16, pp. 467-476, 2008.
[13] Kalervo Jarvelin and Jaana Kekalainen, “Cumulated Gain-based Evaluation of IR Techniques”, ACM Transactions on Information Systems 2002
[14] Xia, Fen and Liu, Tie-Yan and Wang, Jue and Zhang, Wensheng and Li, Hang, “Listwise approach to learning to rank: theory and algorithm”, ICML 2008
[15] Zhe Cao, Tao Qin, Tie-Yan Liu, Ming-Feng Tsai, and Hang Li, “Learning to rank: from pairwise approach to listwise approach”, ICML2007
[16] Truyen, Tran The, Phung, Dinh Q. and Venkatesh, “Probabilistic models over ordered partitions with applications in document ranking and collaborative filtering”, SDM 2011
[17] Abele, A.E. and Stief, M. Die Prognose des Berufserfolgs von ochschulabsolventinnen und -absolventen. Befunde zur ersten und zweiten Erhebung der Erlanger Langsschnittstudie BELA-E. Zeitschrift fur Arbeits- und Organisationspsychologie, 48:4–16, 2004.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61600
dc.description.abstract: Multi-label classification is a common problem in machine learning. The problem can be extended further into what are called label ranking and graded multi-label prediction. In this thesis, we address a special case of these two problems in which the multi-labels are only partially observed. We propose a matrix-factorization-based model to handle this setting. The basic idea is that a matrix factorization model can learn label scores or rankings while modeling the correlations among labels; by exploiting these correlations, our model still achieves good performance. We also propose a method that combines instance-based and model-based approaches. Our experiments show that the matrix factorization model outperforms the linear model, and that combining the instance-based and model-based methods further improves performance. We also compare the matrix factorization model combined with listwise, pairwise, and pointwise objective functions; the results confirm that the listwise objective performs best. [zh_TW]
dc.description.abstract: Multi-label classification has attracted much attention in recent years. Extensions of the multi-label classification problem include the label ranking and graded multi-label prediction problems. In this thesis, we focus on a special case of these two extensions in which only a partial ranking or an incomplete label set is observed. We propose a matrix factorization approach to deal with these problems. The merit of the matrix factorization model is that it can learn the ratings or rankings of labels and model the correlations between labels simultaneously. With this model, we can still learn well because the model considers the correlations between labels during training. We also propose a method to combine an instance-based model with the model-based approach. The experiments show that the matrix factorization model can outperform the baseline model, especially when the target is a low-rank matrix or the training data are insufficient. Combining the instance-based method can further boost the performance of our model. We also compare different loss functions combined with matrix factorization, and show that the listwise loss can outperform the others. [en]
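The abstract above describes scoring labels through a low-rank factorization trained with a listwise (Plackett-Luce) objective. A minimal sketch of that idea follows; the names `W`, `V`, `fit_mf_label_ranker`, and all hyperparameters are illustrative assumptions, not the thesis's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def pl_nll(scores, ranking):
    """Plackett-Luce negative log-likelihood of an observed label ranking.

    scores  : (n_labels,) predicted utilities for every label
    ranking : array of label indices, best first (may be a partial ranking)
    """
    s = scores[ranking]
    # log-sum-exp over each suffix of the ranked list
    suffix_lse = np.logaddexp.accumulate(s[::-1])[::-1]
    return float(np.sum(suffix_lse - s))

def fit_mf_label_ranker(X, rankings, n_labels, k=4, lr=0.1, epochs=500):
    """SGD training sketch: score(x) = V @ (W @ x).

    W maps d features into a k-dim latent space; V holds one latent
    vector per label, so label correlations are shared through the
    common low-rank structure even when rankings are only partial.
    """
    d = X.shape[1]
    W = 0.1 * rng.standard_normal((k, d))
    V = 0.1 * rng.standard_normal((n_labels, k))
    for _ in range(epochs):
        for x, pi in zip(X, rankings):
            z = W @ x          # latent representation of the instance
            s = V @ z          # scores for all labels
            # gradient of the Plackett-Luce loss w.r.t. the scores
            g = np.zeros(n_labels)
            for i in range(len(pi)):
                suf = pi[i:]                   # labels still "in the race"
                p = np.exp(s[suf] - s[suf].max())
                g[suf] += p / p.sum()          # softmax over the suffix
                g[pi[i]] -= 1.0                # observed winner at step i
            gz = V.T @ g                       # back-propagate through s = V z
            V -= lr * np.outer(g, z)
            W -= lr * np.outer(gz, x)          # and through z = W x
    return W, V
```

Because the loss is listwise, each SGD step consumes the whole observed (possibly partial) ranking at once, and every update flows through the shared factors `V`, which is how correlated labels inform each other.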
dc.description.provenance: Made available in DSpace on 2021-06-16T13:06:57Z (GMT). No. of bitstreams: 1. ntu-102-R00922007-1.pdf: 1042843 bytes, checksum: c6bf950e347fbcd99173a4269d8144bd (MD5). Previous issue date: 2013. [en]
dc.description.tableofcontents: 誌謝 (Acknowledgements) i
中文摘要 (Chinese Abstract) ii
Abstract iii
List of Figures vi
List of Tables vii
Chapter 1 Introduction 1
1.1 Thesis Overview 4
Chapter 2 Related Works 5
Chapter 3 Methodology 7
3.1 Problem Definition 7
3.2 Goal 9
3.3 Matrix Factorization 10
3.4 Matrix Factorization for Label Ranking and Graded Multi-Label Prediction 11
3.5 Feature Extension by Neighborhood Information 12
3.6 Loss Functions 14
3.6.1 Square Loss 15
3.6.2 Pairwise Ranking 15
3.6.3 Weighted Pairwise Ranking 16
3.6.4 Loss Function Defined by Plackett-Luce Model 17
3.6.5 Subsample Plackett-Luce 18
3.6.6 Cross Entropy 18
3.7 Learning Algorithm 21
3.7.1 Gradient for Different Loss Functions 22
3.7.2 Gradient for Matrix Factorization Methods 24
3.7.3 Learning Algorithm with Stochastic Gradient Descent 24
Chapter 4 Experiments 26
4.1 Datasets 26
4.2 Evaluation Metrics 28
4.3 Linear model for Label Ranking 30
4.4 On Incomplete Dataset 31
4.5 Rank of Target Matrix 36
4.6 Models with Neighborhood Information 38
4.7 Experiments on Different Loss Functions 46
4.8 Matrix Factorization for Graded Multi-Label Prediction 52
4.9 Non-linear Matrix Factorization 54
Chapter 5 Conclusion and Discussion 55
References 57
dc.language.iso: en
dc.subject: 降維 (dimension reduction) [zh_TW]
dc.subject: 多標籤 (multi-label) [zh_TW]
dc.subject: 矩陣分解 (matrix factorization) [zh_TW]
dc.subject: dimension reduction [en]
dc.subject: Multi-label [en]
dc.subject: matrix factorization [en]
dc.title: 基於矩陣分解模型的多標籤排序與分級的多標籤預測 [zh_TW]
dc.title: Matrix Factorization Models for Label Ranking and Graded Multi-Label Prediction [en]
dc.type: Thesis
dc.date.schoolyear: 101-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 林智仁 (Chih-Jen Lin), 林軒田 (Hsuan-Tien Lin), 李育杰 (Yuh-Jye Lee), 駱宏毅 (Hung-Yi Lo)
dc.subject.keyword: 多標籤 (multi-label), 矩陣分解 (matrix factorization), 降維 (dimension reduction) [zh_TW]
dc.subject.keyword: Multi-label, matrix factorization, dimension reduction [en]
dc.relation.page: 59
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2013-08-02
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
File: ntu-102-1.pdf (未授權公開取用 — not authorized for public access)
Size: 1.02 MB
Format: Adobe PDF


Items in this system are protected by copyright, with all rights reserved, unless otherwise indicated.
