Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51017

Full metadata record

DC Field | Value | Language
dc.contributor.advisor | 林軒田 | -
dc.contributor.author | Kuan-Hao Huang | en
dc.contributor.author | 黃冠豪 | zh_TW
dc.date.accessioned | 2021-06-15T13:23:56Z | -
dc.date.available | 2018-07-06 | -
dc.date.copyright | 2016-07-06 | -
dc.date.issued | 2016 | -
dc.date.submitted | 2016-06-21 | -
dc.identifier.citation | [1] G. Carneiro, A. B. Chan, P. J. Moreno, and N. Vasconcelos. Supervised learning of
semantic classes for image annotation and retrieval. IEEE Transactions on Pattern
Analysis and Machine Intelligence, 29(3):394–410, 2007.
[2] K. Trohidis, G. Tsoumakas, G. Kalliris, and I. P. Vlahavas. Multi-label classification
of music into emotions. In ISMIR, pages 325–330, 2008.
[3] Z. Barutçuoglu, R. E. Schapire, and O. G. Troyanskaya. Hierarchical multi-label
prediction of gene function. Bioinformatics, 22(7):830–836, 2006.
[4] G. Tsoumakas, I. Katakis, and I. P. Vlahavas. Mining multi-label data. In Data
Mining and Knowledge Discovery Handbook, pages 667–685, 2010.
[5] F. Tai and H.-T. Lin. Multilabel classification with principal label space transformation.
Neural Computation, 24(9):2508–2542, 2012.
[6] Y.-N. Chen and H.-T. Lin. Feature-aware label space dimension reduction for multilabel
classification. In NIPS, pages 1538–1546, 2012.
[7] Z. Lin, G. Ding, M. Hu, and J. Wang. Multi-label classification via feature-aware
implicit label space encoding. In ICML, pages 325–333, 2014.
[8] K. Bhatia, H. Jain, P. Kar, M. Varma, and P. Jain. Sparse local embeddings for
extreme multi-label classification. In NIPS, pages 730–738, 2015.
[9] G. Tsoumakas, I. Katakis, and I. P. Vlahavas. Random k-labelsets for multilabel
classification. IEEE Transactions on Knowledge and Data Engineering, 23(7):1079–
1089, 2011.
[10] C.-S. Ferng and H.-T. Lin. Multilabel classification using error-correcting codes of
hard or soft bits. IEEE Transactions on Neural Networks and Learning Systems,
24(11):1888–1900, 2013.
[11] K. Dembczynski, W. Cheng, and E. Hüllermeier. Bayes optimal multilabel classification
via probabilistic classifier chains. In ICML, pages 279–286, 2010.
[12] K. Dembczynski, W. Waegeman, W. Cheng, and E. Hüllermeier. An exact algorithm
for F-measure maximization. In NIPS, pages 1404–1412, 2011.
[13] C.-L. Li and H.-T. Lin. Condensed filter tree for cost-sensitive multi-label classification.
In ICML, pages 423–431, 2014.
[14] H.-Y. Lo, J.-C. Wang, H.-M. Wang, and S.-D. Lin. Cost-sensitive multi-label learning
for audio tag annotation and retrieval. IEEE Transactions on Multimedia, 13(3):518–529,
2011.
[15] G. Tsoumakas and I. Katakis. Multi-label classification: An overview. International
Journal of Data Warehousing and Mining, 3(3):1–13, 2007.
[16] D. Hsu, S. Kakade, J. Langford, and T. Zhang. Multi-label prediction via compressed
sensing. In NIPS, pages 772–780, 2009.
[17] B. Schölkopf, A. Smola, and K. Müller. Nonlinear component analysis as a kernel
eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.
[18] J. Weston, O. Chapelle, V. Vapnik, A. Elisseeff, and B. Schölkopf. Kernel dependency
estimation. In NIPS, pages 873–880, 2002.
[19] J. B. Kruskal. Multidimensional scaling by optimizing goodness of fit to a nonmetric
hypothesis. Psychometrika, 29(1):1–27, 1964.
[20] J. De Leeuw. Applications of convex analysis to multidimensional scaling. Recent
Developments in Statistics, pages 133–145, 1977.
[21] G. Tsoumakas, E. Spyromitros-Xioufis, J. Vilcek, and I. P. Vlahavas. MULAN:
A Java library for multi-label learning. Journal of Machine Learning Research,
12:2411–2414, 2011.
[22] L. Breiman. Random forests. Machine Learning, 45(1):5–32, 2001.
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51017 | -
dc.description.abstract | Among approaches to multi-label classification, label embedding is a common and important family of algorithms. Label embedding considers the relations among all labels jointly and thereby achieves better classification performance. Different multi-label classification applications usually evaluate performance with different cost functions, but existing label embedding algorithms consider only a few cost functions, or even just one, and can therefore perform worse than expected under the cost functions they do not consider. To address this problem, this thesis proposes a cost-sensitive label embedding algorithm, Cost-sensitive Label Embedding with Multidimensional Scaling. The algorithm uses multidimensional scaling to compute embedded vectors for the labels, encoding the cost information into the distances between the embedded vectors; the final prediction is obtained from the predicted embedded vector through our nearest-neighbor decoding function. Because the embedded vectors implicitly carry the cost information, the predictions are cost-sensitive and achieve good performance under different cost functions. Although the distances between embedded vectors are symmetric, the proposed algorithm handles not only symmetric but also asymmetric cost functions. In addition, we provide theoretical guarantees for the proposed algorithm, and extensive experimental results verify that it indeed outperforms existing label embedding algorithms and cost-sensitive algorithms under different cost functions. | zh_TW
dc.description.abstract | Label embedding (LE) is an important family of multi-label classification algorithms that digest the label information jointly for better performance. Different real-world applications evaluate performance by different cost functions of interest. Current LE algorithms often aim to optimize one specific cost function, but they can suffer from bad performance with respect to other cost functions. In this paper, we resolve the performance issue by proposing a novel cost-sensitive LE algorithm that takes the cost function of interest into account. The proposed algorithm, cost-sensitive label embedding with multidimensional scaling (CLEMS), approximates the cost information with the distances of the embedded vectors using the classic multidimensional scaling approach for manifold learning. CLEMS is able to deal with both symmetric and asymmetric cost functions, and effectively makes cost-sensitive decisions by nearest-neighbor decoding within the embedded vectors. Theoretical results justify that CLEMS achieves cost-sensitivity, and extensive experimental results demonstrate that CLEMS is significantly better than a wide spectrum of existing LE algorithms and state-of-the-art cost-sensitive algorithms across different cost functions. | en
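The abstract above outlines the core CLEMS idea: approximate the cost information with inter-point distances via multidimensional scaling, then make predictions by nearest-neighbor decoding in the embedded space. The following is a minimal illustrative sketch of that idea, not the authors' implementation; it assumes scikit-learn's `MDS`, a toy candidate set of label vectors, and Hamming loss as the example (symmetric) cost function:

```python
import numpy as np
from sklearn.manifold import MDS

# Toy candidate set: label vectors observed in training data.
candidates = np.array([
    [0, 0, 1],
    [0, 1, 1],
    [1, 0, 0],
    [1, 1, 0],
])

def hamming_cost(y, y_hat):
    """Example symmetric cost: fraction of labels that disagree."""
    return float(np.mean(y != y_hat))

# Pairwise cost matrix over the candidate label vectors.
n = len(candidates)
C = np.array([[hamming_cost(candidates[i], candidates[j])
               for j in range(n)] for i in range(n)])

# Metric MDS finds embedded vectors whose Euclidean distances approximate
# a monotone transform of the costs (sqrt here, one common isotonic choice).
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
Z = mds.fit_transform(np.sqrt(C))

def decode(z):
    """Nearest-neighbor decoding: map a point in the embedded space back
    to the candidate label vector with the closest embedding."""
    return candidates[np.argmin(np.linalg.norm(Z - z, axis=1))]

pred = decode(Z[1])  # the exact embedding of candidate 1 decodes back to it
```

In the full algorithm, a multi-output regressor is additionally trained to map input features to points in the embedded space, and `decode` is applied to the regressor's output at prediction time, so the cost information baked into the embedding shapes every decision.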
dc.description.provenance | Made available in DSpace on 2021-06-15T13:23:56Z (GMT). No. of bitstreams: 1; ntu-105-R03922062-1.pdf: 1972818 bytes, checksum: 06ebf44ac789a7e2502a906a03cb5010 (MD5); Previous issue date: 2016 | en
dc.description.tableofcontents | Certification by the Oral Defense Committee i
Acknowledgements ii
Abstract (Chinese) iii
Abstract iv
1 Introduction 1
2 Cost-sensitive Label Embedding 4
3 Proposed Algorithm 8
3.1 Calculating the embedded vectors by multidimensional scaling 10
3.2 Choosing a proper isotonic function using theoretical guarantee 11
3.3 Approximating the asymmetric cost function with MDS 12
4 Experiments 15
4.1 Comparison with LSDR algorithms 16
4.2 Comparison with LSDE algorithms 16
4.3 Candidate set and dimension of embedded space 17
4.4 Comparison with CFT 17
5 Conclusion 22
Bibliography 23
dc.language.iso | zh-TW | -
dc.subject | 標籤嵌入法 (label embedding) | zh_TW
dc.subject | 多標籤分類 (multi-label classification) | zh_TW
dc.subject | 成本導向 (cost-sensitive) | zh_TW
dc.subject | multi-label classification | en
dc.subject | cost-sensitive | en
dc.subject | label embedding | en
dc.title | 以成本導向標籤嵌入法解決多標籤分類問題 | zh_TW
dc.title | Cost-sensitive Label Embedding for Multi-label Classification | en
dc.type | Thesis | -
dc.date.schoolyear | 104-2 | -
dc.description.degree | 碩士 (Master) | -
dc.contributor.oralexamcommittee | 林守德, 林智仁 | -
dc.subject.keyword | 多標籤分類, 成本導向, 標籤嵌入法 | zh_TW
dc.subject.keyword | multi-label classification, cost-sensitive, label embedding | en
dc.relation.page | 24 | -
dc.identifier.doi | 10.6342/NTU201600441 | -
dc.rights.note | 有償授權 (authorized for a fee) | -
dc.date.accepted | 2016-06-22 | -
dc.contributor.author-college | 電機資訊學院 (College of Electrical Engineering and Computer Science) | zh_TW
dc.contributor.author-dept | 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) | zh_TW
Appears in collections: Department of Computer Science and Information Engineering

Files in this item:

File | Size | Format
ntu-105-1.pdf (restricted; not authorized for public access) | 1.93 MB | Adobe PDF


Except where otherwise noted, all items in this system are protected by copyright, with all rights reserved.
