Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51017

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林軒田 | |
| dc.contributor.author | Kuan-Hao Huang | en |
| dc.contributor.author | 黃冠豪 | zh_TW |
| dc.date.accessioned | 2021-06-15T13:23:56Z | - |
| dc.date.available | 2018-07-06 | |
| dc.date.copyright | 2016-07-06 | |
| dc.date.issued | 2016 | |
| dc.date.submitted | 2016-06-21 | |
| dc.identifier.citation | [1] G. Carneiro, A. B. Chan, P. J. Moreno, and N. Vasconcelos. Supervised learning of semantic classes for image annotation and retrieval. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(3):394–410, 2007. [2] K. Trohidis, G. Tsoumakas, G. Kalliris, and I. P. Vlahavas. Multi-label classification of music into emotions. In ISMIR, pages 325–330, 2008. [3] Z. Barutçuoglu, R. E. Schapire, and O. G. Troyanskaya. Hierarchical multi-label prediction of gene function. Bioinformatics, 22(7):830–836, 2006. [4] G. Tsoumakas, I. Katakis, and I. P. Vlahavas. Mining multi-label data. In Data Mining and Knowledge Discovery Handbook, pages 667–685, 2010. [5] F. Tai and H.-T. Lin. Multilabel classification with principal label space transformation. Neural Computation, 24(9):2508–2542, 2012. [6] Y.-N. Chen and H.-T. Lin. Feature-aware label space dimension reduction for multilabel classification. In NIPS, pages 1538–1546, 2012. [7] Z. Lin, G. Ding, M. Hu, and J. Wang. Multi-label classification via feature-aware implicit label space encoding. In ICML, pages 325–333, 2014. [8] K. Bhatia, H. Jain, P. Kar, M. Varma, and P. Jain. Sparse local embeddings for extreme multi-label classification. In NIPS, pages 730–738, 2015. [9] G. Tsoumakas, I. Katakis, and I. P. Vlahavas. Random k-labelsets for multilabel classification. IEEE Transactions on Knowledge and Data Engineering, 23(7):1079–1089, 2011. [10] C.-S. Ferng and H.-T. Lin. Multilabel classification using error-correcting codes of hard or soft bits. IEEE Transactions on Neural Networks and Learning Systems, 24(11):1888–1900, 2013. [11] K. Dembczynski, W. Cheng, and E. Hüllermeier. Bayes optimal multilabel classification via probabilistic classifier chains. In ICML, pages 279–286, 2010. [12] K. Dembczynski, W. Waegeman, W. Cheng, and E. Hüllermeier. An exact algorithm for F-measure maximization. In NIPS, pages 1404–1412, 2011. [13] C.-L. Li and H.-T. Lin. Condensed filter tree for cost-sensitive multi-label classification. In ICML, pages 423–431, 2014. [14] H.-Y. Lo, J.-C. Wang, H.-M. Wang, and S.-D. Lin. Cost-sensitive multi-label learning for audio tag annotation and retrieval. IEEE Transactions on Multimedia, 13(3):518–529, 2011. [15] G. Tsoumakas and I. Katakis. Multi-label classification: An overview. International Journal of Data Warehousing and Mining, 3(3):1–13, 2007. [16] D. Hsu, S. Kakade, J. Langford, and T. Zhang. Multi-label prediction via compressed sensing. In NIPS, pages 772–780, 2009. [17] B. Schölkopf, A. Smola, and K. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998. [18] J. Weston, O. Chapelle, V. Vapnik, A. Elisseeff, and B. Schölkopf. Kernel dependency estimation. In NIPS, pages 873–880, 2002. [19] J. B. Kruskal. Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis. Psychometrika, 29(1):1–27, 1964. [20] J. De Leeuw. Applications of convex analysis to multidimensional scaling. Recent Developments in Statistics, pages 133–145, 1977. [21] G. Tsoumakas, E. Spyromitros-Xioufis, J. Vilcek, and I. P. Vlahavas. MULAN: A Java library for multi-label learning. Journal of Machine Learning Research, 12:2411–2414, 2011. [22] L. Breiman. Random forests. Machine Learning, 45(1):5–32, 2001. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51017 | - |
| dc.description.abstract | 在解決多標籤分類(Multi-label Classification)的方法當中,標籤嵌入法(Label Embedding)是一系列常見且重要的演算法。標籤嵌入法同時考量全部標籤之間的關係,並得到更好的分類表現。不同的多標籤分類應用通常會使用不同的成本計算方式來衡量演算法的分類表現,但是目前的標籤嵌入演算法都只考慮少數幾種、甚至只有一種成本計算方式,因此這些演算法在未考慮到的成本計算方式上的表現就會不如預期。為了解決這個問題,本論文提出了成本導向標籤嵌入演算法(Cost-sensitive Label Embedding with Multidimensional Scaling)。此演算法利用多維標度法(Multidimensional Scaling)找出標籤的嵌入向量(Embedded Vectors),再將成本計算方式的資訊嵌入到嵌入向量彼此之間的距離,最終的預測結果會透過所預測的嵌入向量以及我們設計的最近距離解碼函數(Nearest-neighbor Decoding Function)得出。由於嵌入向量隱含了成本計算方式的資訊,所以預測結果是成本導向的,在不同的成本計算方式上都能夠得到好的分類表現。雖然嵌入向量彼此之間的距離是對稱的,不過所提出的演算法不但能夠處理對稱式成本計算方式,也能處理非對稱式成本計算方式。除此之外,我們為所提出的演算法提供了理論保證,最後再用大量的實驗結果來驗證所提出的演算法的確能夠在不同的成本計算方式上表現得比現有的標籤嵌入法以及成本導向演算法還要好。 | zh_TW |
| dc.description.abstract | Label embedding (LE) is an important family of multi-label classification algorithms that digest the label information jointly for better performance. Different real-world applications evaluate performance by different cost functions of interest. Current LE algorithms often aim to optimize one specific cost function, but they can suffer from poor performance with respect to other cost functions. In this paper, we resolve the performance issue by proposing a novel cost-sensitive LE algorithm that takes the cost function of interest into account. The proposed algorithm, cost-sensitive label embedding with multidimensional scaling (CLEMS), approximates the cost information with the distances of the embedded vectors using the classic multidimensional scaling approach for manifold learning. CLEMS is able to deal with both symmetric and asymmetric cost functions, and effectively makes cost-sensitive decisions by nearest-neighbor decoding within the embedded vectors. Theoretical results justify that CLEMS achieves cost-sensitivity, and extensive experimental results demonstrate that CLEMS is significantly better than a wide spectrum of existing LE algorithms and state-of-the-art cost-sensitive algorithms across different cost functions. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-15T13:23:56Z (GMT). No. of bitstreams: 1 ntu-105-R03922062-1.pdf: 1972818 bytes, checksum: 06ebf44ac789a7e2502a906a03cb5010 (MD5) Previous issue date: 2016 | en |
| dc.description.tableofcontents | 口試委員會審定書 i; 誌謝 ii; 摘要 iii; Abstract iv; 1 Introduction 1; 2 Cost-sensitive Label Embedding 4; 3 Proposed Algorithm 8; 3.1 Calculating the embedded vectors by multidimensional scaling 10; 3.2 Choosing a proper isotonic function using theoretical guarantee 11; 3.3 Approximating the asymmetric cost function with MDS 12; 4 Experiments 15; 4.1 Comparison with LSDR algorithms 16; 4.2 Comparison with LSDE algorithms 16; 4.3 Candidate set and dimension of embedded space 17; 4.4 Comparison with CFT 17; 5 Conclusion 22; Bibliography 23 | |
| dc.language.iso | zh-TW | |
| dc.subject | 標籤嵌入法 | zh_TW |
| dc.subject | 多標籤分類 | zh_TW |
| dc.subject | 成本導向 | zh_TW |
| dc.subject | multi-label classification | en |
| dc.subject | cost-sensitive | en |
| dc.subject | label embedding | en |
| dc.title | 以成本導向標籤嵌入法解決多標籤分類問題 | zh_TW |
| dc.title | Cost-sensitive Label Embedding for Multi-label Classification | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 104-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.oralexamcommittee | 林守德,林智仁 | |
| dc.subject.keyword | 多標籤分類,成本導向,標籤嵌入法 | zh_TW |
| dc.subject.keyword | multi-label classification, cost-sensitive, label embedding | en |
| dc.relation.page | 24 | |
| dc.identifier.doi | 10.6342/NTU201600441 | |
| dc.rights.note | 有償授權 | |
| dc.date.accepted | 2016-06-22 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
| Appears in Collections: | 資訊工程學系 | |
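
The English abstract above describes the CLEMS pipeline at a high level: embed candidate label vectors so that their pairwise distances approximate the cost function of interest, learn a regressor from input features to the embedded space, and decode a prediction by returning the candidate label vector whose embedding is nearest to the predicted point. The sketch below only illustrates that idea under several assumptions and is not the thesis implementation: it uses a symmetric Hamming cost as the example cost function, scikit-learn's `MDS` and `RandomForestRegressor` as stand-ins for the embedding and regression steps, and the set of distinct training label vectors as the candidate set.

```python
# Illustrative sketch of a cost-sensitive label-embedding pipeline in the
# spirit of CLEMS (not the thesis code): embed candidate label vectors with
# multidimensional scaling so that embedded distances mirror a chosen cost,
# regress from features to embeddings, and decode by nearest neighbor.
import numpy as np
from sklearn.manifold import MDS
from sklearn.ensemble import RandomForestRegressor


def hamming_cost(y_true, y_pred):
    """Example (symmetric) cost function: Hamming loss between label vectors."""
    return np.mean(y_true != y_pred)


def fit_clems_sketch(X, Y, n_components=8, random_state=0):
    # Candidate set: distinct label vectors observed in the training data.
    candidates = np.unique(Y, axis=0)

    # Pairwise cost matrix over the candidate set.
    m = len(candidates)
    costs = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            costs[i, j] = hamming_cost(candidates[i], candidates[j])

    # MDS finds embedded vectors whose Euclidean distances approximate the costs.
    mds = MDS(n_components=n_components, dissimilarity="precomputed",
              random_state=random_state)
    embedded = mds.fit_transform(costs)

    # Map each training label vector to the embedding of its candidate.
    index = {tuple(c): k for k, c in enumerate(candidates)}
    Z = np.array([embedded[index[tuple(y)]] for y in Y])

    # Learn a multi-output regressor from features to embedded vectors.
    reg = RandomForestRegressor(random_state=random_state).fit(X, Z)
    return reg, embedded, candidates


def predict_clems_sketch(reg, embedded, candidates, X):
    # Nearest-neighbor decoding: return, for each instance, the candidate
    # label vector whose embedding is closest to the predicted point.
    Z_hat = reg.predict(X)
    dists = ((Z_hat[:, None, :] - embedded[None, :, :]) ** 2).sum(axis=-1)
    return candidates[dists.argmin(axis=1)]
```

Because the cost information lives entirely in the embedded distances, swapping `hamming_cost` for another (possibly asymmetric) cost only changes the dissimilarity matrix fed to MDS; the thesis handles the asymmetric case with a dedicated construction (Section 3.3 in the table of contents above), which this sketch does not reproduce.
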
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-105-1.pdf (restricted access) | 1.93 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.