NTU Theses and Dissertations Repository › 電機資訊學院 (College of Electrical Engineering and Computer Science) › 資訊工程學系 (Department of Computer Science and Information Engineering)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70511

Full metadata record
DC Field | Value | Language
dc.contributor.advisor: 林軒田 (Hsuan-Tien Lin)
dc.contributor.author: Cheng-Yu Hsieh [en]
dc.contributor.author: 謝承佑 [zh_TW]
dc.date.accessioned: 2021-06-17T04:29:52Z
dc.date.available: 2018-08-21
dc.date.copyright: 2018-08-21
dc.date.issued: 2018
dc.date.submitted: 2018-08-13
dc.identifier.citation:
[1] A. Beygelzimer, J. Langford, and P. Ravikumar. Error-correcting tournaments. CoRR, abs/0902.3176, 2009.
[2] K. Bhatia, H. Jain, P. Kar, M. Varma, and P. Jain. Sparse local embeddings for extreme multi-label classification. In NIPS, 2015.
[3] M. R. Boutell, J. Luo, X. Shen, and C. M. Brown. Learning multi-label scene classification. Pattern Recognition, 37(9):1757–1771, 2004.
[4] K. Dembczynski, W. Cheng, and E. Hüllermeier. Bayes optimal multilabel classification via probabilistic classifier chains. In ICML, 2010.
[5] K. Dembczynski, W. Kotlowski, and E. Hüllermeier. Consistent multilabel ranking through univariate losses. In ICML, 2012.
[6] C. Elkan. The foundations of cost-sensitive learning. In IJCAI, 2001.
[7] W. Gao and Z. Zhou. On the consistency of multi-label learning. In COLT, 2011.
[8] Y. Gong, Y. Jia, T. Leung, A. Toshev, and S. Ioffe. Deep convolutional ranking for multilabel image annotation. CoRR, abs/1312.4894, 2013.
[9] K.-H. Huang and H.-T. Lin. Cost-sensitive label embedding for multi-label classification. Machine Learning, 106(9–10):1725–1746, Oct. 2017.
[10] C.-L. Li and H.-T. Lin. Condensed filter tree for cost-sensitive multi-label classification. In ICML, 2014.
[11] H. Lo, J. Wang, H. Wang, and S. Lin. Cost-sensitive multi-label learning for audio tag annotation and retrieval. IEEE Trans. Multimedia, 13(3):518–529, 2011.
[12] G. Madjarov, D. Kocev, D. Gjorgjevikj, and S. Dzeroski. An extensive experimental comparison of methods for multi-label learning. Pattern Recognition, 45(9):3084–3104, 2012.
[13] J. Nam, J. Kim, E. Loza Mencía, I. Gurevych, and J. Fürnkranz. Large-scale multi-label text classification - revisiting neural networks. In ECML PKDD, 2014.
[14] J. Petterson and T. S. Caetano. Reverse multi-label learning. In NIPS, 2010.
[15] J. Petterson and T. S. Caetano. Submodular multi-label learning. In NIPS, 2011.
[16] G. Qi, X. Hua, Y. Rui, J. Tang, T. Mei, and H. Zhang. Correlative multi-label video annotation. In Proceedings of the 15th International Conference on Multimedia, 2007.
[17] J. Read, B. Pfahringer, G. Holmes, and E. Frank. Classifier chains for multi-label classification. Machine Learning, 85(3):333–359, 2011.
[18] R. E. Schapire and Y. Singer. Boostexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135–168, 2000.
[19] G. Tsoumakas, I. Katakis, and I. P. Vlahavas. Mining multi-label data. In Data Mining and Knowledge Discovery Handbook, 2nd ed., pages 667–685. 2010.
[20] G. Tsoumakas, E. S. Xioufis, J. Vilcek, and I. P. Vlahavas. MULAN: A java library for multi-label learning. Journal of Machine Learning Research, 12:2411–2414, 2011.
[21] Y.-P. Wu and H.-T. Lin. Progressive k-labelsets for cost-sensitive multi-label classification. Machine Learning, 106(5):671–694, May 2017.
[22] B. Zadrozny, J. Langford, and N. Abe. Cost-sensitive learning by cost-proportionate example weighting. In ICDM, 2003.
[23] M. Zhang and Z. Zhou. Multi-label neural networks with applications to functional genomics and text categorization. IEEE Trans. Knowl. Data Eng., 18(10):1338–1351, 2006.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70511
dc.description.abstract: Multi-label learning extends the traditional multi-class classification problem. In multi-class classification, each instance is allowed only a single most-relevant label; in multi-label learning, each instance may be associated with multiple relevant labels simultaneously. Multi-label learning therefore has a wide range of applications and is an important research problem in machine learning. For example, in image classification, a single photo may contain several different objects; other applications include text, music, and video classification.
Because different applications focus on different aspects and evaluate multi-label learning algorithms with different criteria, designing cost-sensitive multi-label learning algorithms that automatically adapt to and optimize different evaluation criteria has become an important research topic. However, because these criteria are sophisticated and hard to optimize, designing a general cost-sensitive algorithm that adapts to a broad range of criteria is quite difficult; existing cost-sensitive algorithms remain limited to criteria of certain special forms and lack sufficient generality. In this work, our core idea is to repeatedly estimate a local surrogate loss for the sophisticated target criterion and use it to decide a descent direction for gradient-based optimization. We couple this idea with deep learning to propose a general cost-sensitive multi-label deep learning algorithm. [zh_TW]
dc.description.abstract: Multi-label learning is an important machine learning problem with a wide range of applications. The variety of criteria for satisfying different application needs calls for cost-sensitive algorithms, which can adapt to different criteria easily. Nevertheless, because of the sophisticated nature of the criteria for multi-label learning, cost-sensitive algorithms for general criteria are hard to design, and current cost-sensitive algorithms can at most deal with some special types of criteria. In this work, we propose a novel cost-sensitive multi-label learning model for any general criteria. Our key idea within the model is to iteratively estimate a surrogate loss that approximates the sophisticated criterion of interest near some local neighborhood, and use the estimate to decide a descent direction for optimization. The key idea is then coupled with deep learning to form our proposed model. Experimental results validate that our proposed model is superior to existing cost-sensitive algorithms and existing deep learning models across different criteria. [en]
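The local-surrogate idea described in the abstract can be illustrated with a small sketch. This is purely an illustration under assumed details, not the thesis's actual model (which couples the estimate with a deep network): perturb the current real-valued prediction scores, evaluate a black-box criterion (here a toy 1 − F1 cost) on the perturbed predictions, fit a local linear surrogate by least squares, and step along the surrogate's negative gradient. All function names and hyperparameters here are assumptions of this sketch.

```python
import numpy as np

def f1_cost(y_true, y_pred):
    # Toy non-differentiable criterion: 1 - F1 score of a binary label vector.
    tp = np.sum(y_true * y_pred)
    denom = np.sum(y_true) + np.sum(y_pred)
    return 1.0 - (2.0 * tp / denom if denom > 0 else 1.0)

def local_surrogate_gradient(scores, y_true, criterion,
                             n_samples=64, radius=0.1, rng=None):
    """Estimate a descent direction for a black-box criterion by fitting a
    local linear surrogate around the current real-valued scores."""
    if rng is None:
        rng = np.random.default_rng(0)
    perturbations = rng.normal(scale=radius, size=(n_samples, scores.size))
    costs = np.array([
        criterion(y_true, (scores + p > 0.5).astype(int))
        for p in perturbations
    ])
    # Least-squares fit: cost ~ c0 + g . perturbation.
    # The slope vector g approximates the local gradient of the criterion.
    X = np.hstack([np.ones((n_samples, 1)), perturbations])
    coef, *_ = np.linalg.lstsq(X, costs, rcond=None)
    return coef[1:]

# Gradient descent with the locally-estimated surrogate on a toy 5-label example.
y_true = np.array([1, 0, 1, 0, 1])
scores = np.full(5, 0.5)
for _ in range(50):
    g = local_surrogate_gradient(scores, y_true, f1_cost)
    scores -= 0.5 * g
pred = (scores > 0.5).astype(int)
```

After a few steps, scores for relevant labels are pushed above the 0.5 threshold and irrelevant ones below it, because the fitted slopes pick up how flipping each bit changes the criterion; in the thesis's setting this estimated direction would instead drive updates of a deep model's parameters.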
dc.description.provenance: Made available in DSpace on 2021-06-17T04:29:52Z (GMT). No. of bitstreams: 1
ntu-107-R05922048-1.pdf: 1769074 bytes, checksum: 8e8cbc76cf0b1b56a46cc81fe912b072 (MD5)
Previous issue date: 2018 [en]
dc.description.tableofcontents:
誌謝 (Acknowledgements) ... i
摘要 (Chinese Abstract) ... iii
Abstract ... iv
1. Introduction ... 1
2. Background ... 4
2.1 Problem Setup ... 4
2.2 Related Work ... 5
3. Proposed methods ... 7
3.1 Sample-weighting CSMLL Framework ... 7
3.2 A Simple Cost-sensitive Multi-label Deep Learning Model ... 8
3.3 Locally-learned Surrogate Loss for General Cost-sensitive Multi-label Deep Learning ... 11
4. Experiments ... 16
4.1 Experiment Setup ... 16
4.2 Comparisons with Cost-sensitive Algorithm ... 17
4.3 Comparisons between Deep Learning Models ... 17
4.4 Scaling Up to Datasets with Many Labels ... 21
5. Conclusion ... 23
Bibliography ... 24
Appendices ... 27
dc.language.iso: en
dc.subject: 成本導向 (cost-sensitive) [zh_TW]
dc.subject: 多標籤學習 (multi-label learning) [zh_TW]
dc.subject: 代理損失函數 (surrogate loss) [zh_TW]
dc.subject: 局部估計 (local approximation) [zh_TW]
dc.subject: 梯度下降 (gradient descent) [zh_TW]
dc.subject: 深度學習 (deep learning) [zh_TW]
dc.subject: Multi-label learning [en]
dc.subject: cost-sensitive [en]
dc.subject: gradient descent [en]
dc.subject: deep learning [en]
dc.subject: surrogate loss [en]
dc.subject: local approximation [en]
dc.title: 運用局部代理損失函數之深度模型於廣泛成本導向多標籤學習 [zh_TW]
dc.title: A Deep Model with Local Surrogate Loss for General Cost-sensitive Multi-label Learning [en]
dc.type: Thesis
dc.date.schoolyear: 106-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 林守德 (Shou-De Lin), 陳縕儂 (Yun-Nung Chen)
dc.subject.keyword: 多標籤學習, 成本導向, 代理損失函數, 局部估計, 梯度下降, 深度學習 [zh_TW]
dc.subject.keyword: Multi-label learning, cost-sensitive, surrogate loss, local approximation, gradient descent, deep learning [en]
dc.relation.page: 28
dc.identifier.doi: 10.6342/NTU201803150
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2018-08-13
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
File | Size | Format
ntu-107-1.pdf (restricted; not publicly available) | 1.73 MB | Adobe PDF


Items in this system are protected by copyright, with all rights reserved, unless otherwise indicated in their licensing terms.
