Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6715
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林軒田(Hsuan-Tien Lin) | |
dc.contributor.author | Chun-Sung Ferng | en |
dc.contributor.author | 馮俊菘 | zh_TW |
dc.date.accessioned | 2021-05-17T09:16:47Z | - |
dc.date.available | 2017-08-09 | |
dc.date.available | 2021-05-17T09:16:47Z | - |
dc.date.copyright | 2012-08-09 | |
dc.date.issued | 2012 | |
dc.date.submitted | 2012-07-31 | |
dc.identifier.citation | R. C. Bose and D. K. Ray-Chaudhuri. On a class of error correcting binary group codes. Information and Control, 3(1), 1960.
M. R. Boutell, J. Luo, X. Shen, and C. M. Brown. Learning multi-label scene classification. Pattern Recognition, 37(9), 2004.
C.-C. Chang and C.-J. Lin. LIBSVM: a library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
K. Dembczynski, W. Waegeman, W. Cheng, and E. Hullermeier. On label dependence in multi-label classification. In Proc. of the 2nd International Workshop on Learning from Multi-Label Data, 2010.
T. G. Dietterich and G. Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2, 1995.
S. Diplaris, G. Tsoumakas, P. Mitkas, and I. Vlahavas. Protein classification with multiple algorithms. In Proc. of Panhellenic Conference on Informatics, 2005.
A. Elisseeff and J. Weston. A kernel method for multi-labelled classification. In Advances in Neural Information Processing Systems 14, 2002.
R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin. LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research, 9, 2008.
R. G. Gallager. Low Density Parity Check Codes, Monograph. M.I.T. Press, 1963.
J. Hagenauer, E. Offer, and L. Papke. Iterative decoding of binary block and convolutional codes. IEEE Transactions on Information Theory, 42(2), 1996.
M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten. The WEKA data mining software: an update. SIGKDD Explorations, 11(1), 2009.
R. W. Hamming. Error detecting and error correcting codes. Bell System Technical Journal, 26(2), 1950.
A. Hocquenghem. Codes correcteurs d'erreurs. Chiffres, 2, 1959.
D. Hsu, S. Kakade, J. Langford, and T. Zhang. Multi-label prediction via compressed sensing. In Advances in Neural Information Processing Systems 22, 2009.
A. Z. Kouzani. Multilabel classification using error correction codes. In Advances in Computation and Intelligence - 5th International Symposium, 2010.
A. Z. Kouzani and G. Nasireding. Multilabel classification by BCH code and random forests. International Journal of Recent Trends in Engineering, 2(1), 2009.
L. Li. Multiclass boosting with repartitioning. In Proc. of the 23rd International Conference on Machine Learning, 2006.
H.-T. Lin, C.-J. Lin, and R. C. Weng. A note on Platt's probabilistic outputs for support vector machines. Machine Learning, 68(3), 2007.
D. J. C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 1st edition, 2003.
J. P. Pestian, C. Brew, P. Matykiewicz, D. J. Hovermale, N. Johnson, K. B. Cohen, and W. Duch. A shared task involving multi-label classification of clinical free text. In Proc. of the Workshop on BioNLP 2007: Biological, Translational, and Clinical Language Processing, 2007.
J. C. Platt. Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. In Advances in Large Margin Classifiers, 1999.
R. E. Schapire. Using output codes to boost multiclass learning problems. In Proc. of the 14th International Conference on Machine Learning, 1997.
C. E. Shannon. A mathematical theory of communication. Bell System Technical Journal, 27, 1948.
F. Tai and H.-T. Lin. Multi-label classification with principal label space transformation. Neural Computation, 2012.
K. Trohidis, G. Tsoumakas, G. Kalliris, and I. Vlahavas. Multilabel classification of music into emotions. In Proc. of the 9th International Conference on Music Information Retrieval, 2008.
G. Tsoumakas and I. Vlahavas. Random k-labelsets: An ensemble method for multilabel classification. In Proc. of the 18th European Conference on Machine Learning, 2007.
G. Tsoumakas, J. Vilcek, and E. S. Xioufis. MULAN: A Java library for multi-label learning, 2010.
J. K. Wolf. Efficient maximum likelihood decoding of linear block codes using a trellis. IEEE Transactions on Information Theory, 24(1), 1978.
Y. Zhang and J. Schneider. Multi-label output codes using canonical correlation analysis. In Proc. of the 14th International Conference on Artificial Intelligence and Statistics, 2011.
Y. Zhang and J. Schneider. Maximum margin output coding. In Proc. of the 29th International Conference on Machine Learning, 2012. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6715 | - |
dc.description.abstract | 我們提出一個將錯誤更正碼 (error-correcting codes, ECC) 應用於多標籤分類問題 (multi-label classification) 的架構。 在這個架構中,我們以一些基礎學習器 (base learner) 當作有干擾的傳輸頻道, 並用錯誤更正碼來更正這些基礎學習器的預測錯誤。 透過這個架構,我們可以用簡單的重複碼 (repetition code) 來解釋現有的隨機 k 標籤組演算法 (random k-label-sets, RAKEL) 。 我們也實驗了各種錯誤更正碼應用在多標籤分類問題的效果, 實驗結果顯示,利用較強的錯誤更正碼可以改善隨機 k 標籤組演算法的表現; 此外,讓傳統的二元關聯演算法 (binary relevance) 學習一些校驗標籤 (parity-checking labels) 也會讓它有更好的表現。 而且,由不同的錯誤更正碼的實驗結果可以看出,錯誤更正碼的強度會影響基礎學習器的難度,妥善平衡兩者可以讓結果變得更好。 最後,我們也設計了一個新的解碼器來處理剛性(二元值)與柔性(實數值)的線性錯誤更正碼, 實驗結果也證實這個新的解碼器可以提昇這個架構的表現。 | zh_TW |
dc.description.abstract | We formulate a framework for applying error-correcting codes (ECC) on multi-label classification problems. The framework treats some base learners as noisy channels and uses ECC to correct the prediction errors made by the learners. An immediate use of the framework is a novel ECC-based explanation of the popular random k-label-sets (RAKEL) algorithm using a simple repetition ECC. Using the framework, we empirically compare a broad spectrum of ECC designs for multi-label classification. The results not only demonstrate that RAKEL can be improved by applying some stronger ECC, but also show that the traditional Binary Relevance approach can be enhanced by learning more parity-checking labels. Our study on different ECC also helps understand the trade-off between the strength of ECC and the hardness of the base learning tasks. Furthermore, we extend our study to linear ECC for either hard (binary) or soft (real-valued) bits, and design a novel decoder for the ECC. We demonstrate that the decoder improves the performance of our framework. | en |
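The repetition-code instance of the ML-ECC framework described in the abstract (label bits encoded into redundant codeword bits, each bit predicted by a noisy base learner, errors corrected at decoding time) can be sketched as follows. This is an illustrative toy only, not the thesis's implementation: the function names and the simulated "channel noise" standing in for base-learner errors are assumptions for the example.

```python
import numpy as np

def encode_repetition(y, k):
    # Encode an (n, L) binary label matrix into an (n, L*k) codeword
    # matrix by repeating each label bit k times.
    return np.repeat(y, k, axis=1)

def decode_repetition(bits, k):
    # Majority-vote decode an (n, L*k) bit matrix back to (n, L) labels;
    # corrects up to floor((k-1)/2) erroneous bits per label.
    n, m = bits.shape
    groups = bits.reshape(n, m // k, k)
    return (groups.sum(axis=2) * 2 > k).astype(int)

# Toy demonstration: flip one bit per example to simulate a base
# learner's prediction error; majority decoding still recovers the
# original labels when k = 3.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=(4, 2))      # 4 examples, 2 labels
code = encode_repetition(y, k=3)         # 4 x 6 codeword bits
noisy = code.copy()
for i in range(noisy.shape[0]):
    j = rng.integers(0, noisy.shape[1])
    noisy[i, j] ^= 1                     # one bit error per "channel use"
decoded = decode_repetition(noisy, k=3)
assert np.array_equal(decoded, y)        # single bit errors are corrected
```

In the actual framework, each codeword bit (or block of bits) would be predicted by a trained base learner rather than read off a noisy copy; stronger codes such as BCH replace the repetition structure with parity-checking bits while keeping the same encode/learn/decode pipeline.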
dc.description.provenance | Made available in DSpace on 2021-05-17T09:16:47Z (GMT). No. of bitstreams: 1 ntu-101-R99922054-1.pdf: 1004191 bytes, checksum: 6a06a6051d433dc993de235a32878db7 (MD5) Previous issue date: 2012 | en |
dc.description.tableofcontents | 致謝 (Acknowledgements) iii
中文摘要 (Chinese Abstract) v
Abstract vii
1 Introduction 1
1.1 Problem Setup 3
1.2 Related Works 4
2 ECC for Multi-label Classification 7
2.1 ML-ECC Framework 7
2.2 Review of Classic ECC 9
2.2.1 Repetition Code 9
2.2.2 Hamming on Repetition Code 9
2.2.3 Bose-Chaudhuri-Hocquenghem Code 10
2.2.4 Low-density Parity-check Code 10
2.3 ECC View of RAKEL 11
2.4 Experimental Results 11
2.4.1 Validity of ML-ECC Framework 12
2.4.2 Comparison of Codeword Length 15
2.4.3 Bit Error Analysis 19
2.4.4 Comparison with Binary Relevance 21
3 New Decoder for Hard and Soft Decoding of Error-correcting Codes 25
3.1 Geometric Decoder for Linear Codes 26
3.2 Experimental Results of Geometric Decoder 28
3.2.1 Bit Error Analysis 30
3.3 Soft-input Decoding and Bitwise Confidence Estimation for k-powerset Learners 32
3.4 Experimental Results of Soft-input Geometric Decoder 35
3.4.1 Soft-input Decoding for Binary Relevance Learners 35
3.4.2 Soft-input Decoding for k-powerset Learners 37
3.5 Comparison with Real-valued ECC 41
4 Conclusion 43
Bibliography 44
A Additional Experiment Results 49
A.1 ML-ECC Using 3-powerset Base Learners 49
A.2 ML-ECC With Different Codeword Lengths 51
A.3 ML-ECC Using Binary Relevance Base Learners 54
A.4 ML-ECC With Geometric Decoders 56 | |
dc.language.iso | en | |
dc.title | 剛性與柔性解碼之錯誤更正碼於多標籤分類學習之應用 | zh_TW |
dc.title | Multi-label Classification with Hard-/soft-decoded Error-correcting Codes | en |
dc.type | Thesis | |
dc.date.schoolyear | 100-2 | |
dc.description.degree | 碩士 (Master's) | |
dc.contributor.oralexamcommittee | 林守德 (Shou-De Lin), 李育杰 (Yuh-Jye Lee) | |
dc.subject.keyword | 機器學習, 多標籤分類, 錯誤更正碼, 柔性解碼, 幾何解碼 | zh_TW |
dc.subject.keyword | Machine Learning, Multi-label Classification, Error-correcting Codes, Soft Decoding, Geometric Decoding | en |
dc.relation.page | 58 | |
dc.rights.note | Authorized (open access worldwide) | |
dc.date.accepted | 2012-07-31 | |
dc.contributor.author-college | College of Electrical Engineering and Computer Science | zh_TW |
dc.contributor.author-dept | Graduate Institute of Computer Science and Information Engineering | zh_TW |
Appears in Collections: | Computer Science and Information Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-101-1.pdf | 980.66 kB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.