Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68219

Full metadata record

dc.contributor.advisor: 林軒田 (Hsuan-Tien Lin)
dc.contributor.author: Hsien-Chun Chiu (en)
dc.contributor.author: 邱顯鈞 (zh_TW)
dc.date.accessioned: 2021-06-17T02:15:02Z
dc.date.available: 2018-01-04
dc.date.copyright: 2018-01-04
dc.date.issued: 2017
dc.date.submitted: 2017-10-26
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68219
dc.description.abstract: Multi-label classification (MLC) is an important learning problem where each instance is annotated with multiple labels. Label embedding (LE) is an important family of methods for MLC that extracts and utilizes the latent structure of labels towards better performance. Within the family, feature-aware LE methods, which jointly consider the feature and label information during extraction, have been shown to reach better performance than feature-unaware ones. Nevertheless, current feature-aware LE methods are not designed to flexibly adapt to different evaluation criteria. In this work, we propose a novel feature-aware LE method that takes the desired evaluation criterion into account during training. The method, named Feature-aware Cost-sensitive Label Embedding (FaCLE), encodes the criterion into the distance between embedded vectors with a deep Siamese network. The feature-aware characteristic of FaCLE is achieved with a loss function that jointly considers the embedding error and the feature-to-embedding error. Moreover, FaCLE is coupled with an additional-bit trick to deal with possibly asymmetric criteria. Experiment results across different datasets and evaluation criteria demonstrate that FaCLE is superior to other state-of-the-art feature-aware LE methods and cost-sensitive LE methods. (en)
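As a rough illustration of the approach summarized in the abstract, the Python sketch below pairs a Siamese label-embedding network with a feature-to-embedding regressor and trains them with a loss that adds the two error terms. This is a minimal sketch only, assuming a PyTorch implementation; the layer sizes, the Hamming-based cost, the balancing weight alpha, and all names (LabelEmbedder, FeatureRegressor, loss_fn) are illustrative assumptions rather than the architecture used in the thesis, and the additional-bit trick for asymmetric criteria is omitted.

# Illustrative sketch only: a Siamese embedding trained so that distances
# between embedded label vectors approximate a pairwise cost, plus a
# feature-to-embedding regressor trained jointly (the feature-aware term).
# Layer sizes, the cost function, and alpha are assumptions, not the thesis's.
import torch
import torch.nn as nn

class LabelEmbedder(nn.Module):
    def __init__(self, num_labels, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_labels, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, y):
        return self.net(y)

class FeatureRegressor(nn.Module):
    def __init__(self, num_features, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, x):
        return self.net(x)

def hamming_cost(y1, y2):
    # Example evaluation criterion; any pairwise cost could be plugged in here.
    return (y1 != y2).float().mean(dim=1)

def loss_fn(embedder, regressor, x, y1, y2, alpha=1.0):
    z1, z2 = embedder(y1), embedder(y2)
    # Embedding error: pairwise embedding distance should match the pairwise cost.
    dist = torch.norm(z1 - z2, dim=1)
    embed_err = (dist - hamming_cost(y1, y2)).pow(2).mean()
    # Feature-to-embedding error: features should predict the label embedding.
    feat_err = (regressor(x) - z1).pow(2).mean()
    return embed_err + alpha * feat_err

At prediction time one would presumably map a test feature vector into the embedding space with the regressor and decode it to the nearest embedded candidate label vector; that step is not shown here.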
dc.description.provenance: Made available in DSpace on 2021-06-17T02:15:02Z (GMT). No. of bitstreams: 1. ntu-106-R04922004-1.pdf: 1292583 bytes, checksum: e73f0832cf8daab51be364c22d09ade4 (MD5). Previous issue date: 2017. (en)
dc.description.tableofcontents:
Acknowledgements
Abstract (Chinese)
Abstract (English)
1 Introduction
2 Related Work
3 The Proposed Approach
3.1 Deep Cost-sensitive Label Embedding
3.2 Feature-aware Component
4 Experiments
4.1 Comparing with Cost-sensitive Label Embedding Methods
4.2 Comparing with Feature-aware Label Embedding Methods
4.3 Balancing and Sampling Parameters
5 Conclusion
Bibliography
dc.language.iso: en
dc.subject: multi-label classification (en)
dc.subject: feature-aware (en)
dc.subject: cost-sensitive (en)
dc.subject: label embedding (en)
dc.title: 以特徵感知的成本導向標籤嵌入法解決多標籤分類問題 (zh_TW)
dc.title: Multi-label Classification with Feature-aware Cost-sensitive Label Embedding (en)
dc.type: Thesis
dc.date.schoolyear: 106-1
dc.description.degree: Master (碩士)
dc.contributor.oralexamcommittee: 王鈺強 (Yu-Chiang Wang), 陳縕儂 (Yun-Nung Chen)
dc.subject.keyword: multi-label classification, feature-aware, cost-sensitive, label embedding (en)
dc.relation.page: 26
dc.identifier.doi: 10.6342/NTU201704326
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2017-10-26
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering)
Appears in Collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in This Item:
File: ntu-106-1.pdf
Access: Restricted (not publicly available)
Size: 1.26 MB
Format: Adobe PDF