Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61660
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林軒田 | |
dc.contributor.author | Chun-Liang Li | en |
dc.contributor.author | 李俊良 | zh_TW |
dc.date.accessioned | 2021-06-16T13:08:46Z | - |
dc.date.available | 2015-08-06 | |
dc.date.copyright | 2013-08-06 | |
dc.date.issued | 2013 | |
dc.date.submitted | 2013-08-01 | |
dc.identifier.citation | [1] A. Beygelzimer, V. Dani, T. Hayes, J. Langford, and B. Zadrozny. Error limiting reductions between classification tasks. In Proceedings of the 22nd International Conference on Machine Learning, 2005.
[2] A. Beygelzimer, J. Langford, and P. Ravikumar. Multiclass classification with filter trees. 2007.
[3] A. Beygelzimer, J. Langford, and B. Zadrozny. Weighted one-against-all. In Proceedings of the 20th National Conference on Artificial Intelligence, 2005.
[4] M. R. Boutell, J. Luo, X. Shen, and C. M. Brown. Learning multi-label scene classification. Pattern Recognition, 2004.
[5] K. Dembczynski, W. Cheng, and E. Hullermeier. Bayes optimal multilabel classification via probabilistic classifier chains. In Proceedings of the 27th International Conference on Machine Learning, 2010.
[6] K. Dembczynski, W. Waegeman, W. Cheng, and E. Hullermeier. An exact algorithm for F-measure maximization. In Advances in Neural Information Processing Systems 24, 2011.
[7] P. Domingos. MetaCost: a general method for making classifiers cost-sensitive. In Proceedings of the Fifth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1999.
[8] A. Elisseeff and J. Weston. A kernel method for multi-labelled classification. In Advances in Neural Information Processing Systems 14, 2002.
[9] R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin. LIBLINEAR: a library for large linear classification. Journal of Machine Learning Research, 2008.
[10] R.-E. Fan and C.-J. Lin. A study on threshold selection for multi-label classification. 2007.
[11] C.-S. Ferng and H.-T. Lin. Multi-label classification with error-correcting codes. Journal of Machine Learning Research - Proceedings Track, 2011.
[12] H.-Y. Lo, J.-C. Wang, H.-M. Wang, and S.-D. Lin. Cost-sensitive multi-label learning for audio tag annotation and retrieval. IEEE Transactions on Multimedia, 2011.
[13] P. Mineiro. Cost sensitive multi label: an observation, 2011.
[14] J. Read. MEKA: a multi-label extension to WEKA, 2012.
[15] J. Read, B. Pfahringer, G. Holmes, and E. Frank. Classifier chains for multi-label classification. In Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, 2009.
[16] C. J. van Rijsbergen. Foundation of evaluation. Journal of Documentation, 1974.
[17] C. G. M. Snoek, M. Worring, J. C. van Gemert, J.-M. Geusebroek, and A. W. M. Smeulders. The challenge problem for automated detection of 101 semantic concepts in multimedia. In Proceedings of the 14th Annual ACM International Conference on Multimedia, 2006.
[18] A. Srivastava and B. Zane-Ulman. Discovering recurring anomalies in text reports regarding complex space systems. In IEEE Aerospace Conference, 2005.
[19] G. Tsoumakas, I. Katakis, and I. Vlahavas. Mining multi-label data. In Data Mining and Knowledge Discovery Handbook. Springer US, 2010.
[20] G. Tsoumakas, E. Spyromitros-Xioufis, J. Vilcek, and I. Vlahavas. Mulan: a Java library for multi-label learning. Journal of Machine Learning Research, 2011.
[21] G. Tsoumakas and I. Vlahavas. Random k-labelsets: an ensemble method for multilabel classification. In Machine Learning: the European Conference on Machine Learning, 2007.
[22] G. Tsoumakas, M.-L. Zhang, and Z.-H. Zhou. Introduction to the special issue on learning from multi-label data. Journal of Machine Learning Research, 2012.
[23] H.-H. Tu and H.-T. Lin. One-sided support vector regression for multiclass cost-sensitive classification. In Proceedings of the 27th International Conference on Machine Learning, 2010. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61660 | - |
dc.description.abstract | 近年來許多真實世界的應用需要好的多標籤分類演算法，而不同的應用往往需要考慮不同的衡量標準。為此，我們針對此需求提出更一般性的架構─成本導向多標籤分類問題 (Cost-Sensitive Multi-Label Classification)，希望藉由在學習的過程中考慮成本資訊，來一般化這個需求。然而，大部分現存的演算法只能專注在最佳化部分特定的衡量標準，無法有系統的處理各種不同的標準。在此論文中，我們提出了壓縮篩選樹演算法 (Condensed Filter Tree) 來最佳化任何不同的標準。壓縮篩選樹演算法係由在成本導向多重分類問題 (Cost-Sensitive Multi-Class Classification) 裡著名的篩選樹演算法，藉由標籤冪集 (Label Powerset) 的轉化推導而來。我們透過特殊的樹狀結構設計與專注在關鍵的樹節點，成功的解決指數多的轉換類別在表現 (Representation)、學習 (Training) 與預測 (Prediction) 中的困難。最後，在真實世界的資料上的實驗結果顯示，比起其他已提出專注於特定衡量標準的演算法，此論文中所提出的壓縮篩選樹在不同的衡量標準下皆有較好的表現。 | zh_TW |
dc.description.abstract | In recent years, many real-world applications have called for better multi-label classification algorithms, and different applications often need to consider different evaluation criteria. We formalize this need with a general setup, cost-sensitive multi-label classification (CSMLC), which takes the evaluation criteria into account during the learning process. Nevertheless, most existing algorithms can only focus on optimizing a few specific evaluation criteria and cannot systematically deal with different criteria. In this paper, we propose a novel algorithm, called condensed filter tree (CFT), for optimizing any criterion in CSMLC. CFT is derived by reducing CSMLC to the well-known filter tree algorithm for cost-sensitive multi-class classification via the simple label powerset approach. We successfully cope with the difficulty of having exponentially many extended classes within the powerset for representation, training and prediction by carefully designing the tree structure and focusing on the key nodes. Experimental results across many real-world datasets validate that the proposed CFT algorithm yields better performance under many general evaluation criteria when compared with existing special-purpose algorithms. | en |
dc.description.provenance | Made available in DSpace on 2021-06-16T13:08:46Z (GMT). No. of bitstreams: 1 ntu-102-R01922001-1.pdf: 425930 bytes, checksum: e3147c572960f266e66539c713f4b3a3 (MD5) Previous issue date: 2013 | en |
dc.description.tableofcontents | 誌謝 v
摘要 vii
Abstract ix
1 Introduction 1
2 Cost-Sensitive Multi-Label Classification 5
3 Proposed Algorithm 11
3.1 Tree-based Algorithms for Cost-Sensitive Classification 12
3.2 Predicting with Tree 13
3.3 Training with Tree-based Algorithms 15
3.3.1 Top-down Tree 15
3.3.2 Ideal-path Filter Tree 16
3.3.3 Uniform Filter Tree 19
3.3.4 Condensed Filter Tree 20
3.4 Comparison 21
4 Experiment 23
4.1 Comparison with CC and Tree-based Algorithms 24
4.2 Comparison between PCC and CFT 26
5 Conclusion 29
Bibliography 31 | |
dc.language.iso | zh-TW | |
dc.title | 用壓縮篩選樹演算法處理成本導向多標籤分類問題 | zh_TW |
dc.title | Condensed Filter Tree For Cost Sensitive Multi-Label Classification | en |
dc.type | Thesis | |
dc.date.schoolyear | 101-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 呂及人,林智仁,林守德,李育杰 | |
dc.subject.keyword | 機器學習, 多標籤分類, 成本資訊 | zh_TW |
dc.subject.keyword | Machine Learning, Multi-label Classification, Cost Information | en |
dc.relation.page | 33 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2013-08-01 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
Appears in Collections: | Department of Computer Science and Information Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-102-1.pdf (currently not authorized for public access) | 415.95 kB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.