Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6865
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林智仁(Chih-Jen Lin) | |
dc.contributor.author | Chia-Hua Ho | en |
dc.contributor.author | 何家華 | zh_TW |
dc.date.accessioned | 2021-05-17T09:19:50Z | - |
dc.date.available | 2012-07-18 | |
dc.date.available | 2021-05-17T09:19:50Z | - |
dc.date.copyright | 2012-07-18 | |
dc.date.issued | 2012 | |
dc.date.submitted | 2012-06-25 | |
dc.identifier.citation | T. Bertin-Mahieux, D. P. Ellis, B. Whitman, and P. Lamere. The million song dataset. In Proceedings of the Twelfth International Society for Music Information Retrieval Conference (ISMIR 2011), 2011.
B. E. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pages 144–152. ACM Press, 1992.
C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2:27:1–27:27, 2011. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20:273–297, 1995.
R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin. LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research, 9:1871–1874, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/liblinear.pdf.
A. Frank and A. Asuncion. UCI machine learning repository, 2010. URL http://archive.ics.uci.edu/ml.
A. E. Hoerl and R. W. Kennard. Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12(1):55–67, 1970.
C.-J. Hsieh, K.-W. Chang, C.-J. Lin, S. S. Keerthi, and S. Sundararajan. A dual coordinate descent method for large-scale linear SVM. In Proceedings of the Twenty Fifth International Conference on Machine Learning (ICML), 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cddual.pdf.
T. Joachims. Making large-scale SVM learning practical. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods – Support Vector Learning, pages 169–184, Cambridge, MA, 1998. MIT Press.
T. Joachims. Training linear SVMs in linear time. In Proceedings of the Twelfth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2006.
S. S. Keerthi and D. DeCoste. A modified finite Newton method for fast solution of large scale linear SVMs. Journal of Machine Learning Research, 6:341–361, 2005.
S. Kogan, D. Levin, B. R. Routledge, J. S. Sagi, and N. A. Smith. Predicting risk from financial reports with regression. In Proceedings of the North American Association for Computational Linguistics Human Language Technologies Conference, pages 272–280, 2009.
S.-P. Liao, H.-T. Lin, and C.-J. Lin. A note on the decomposition methods for support vector regression. Neural Computation, 14:1267–1281, 2002.
C.-J. Lin and J. J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100–1127, 1999.
C.-J. Lin, R. C. Weng, and S. S. Keerthi. Trust region Newton method for large-scale logistic regression. Journal of Machine Learning Research, 9:627–650, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf.
O. L. Mangasarian. A finite Newton method for classification. Optimization Methods and Software, 17(5):913–929, 2002.
J. C. Platt. Fast training of support vector machines using sequential minimal optimization. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods – Support Vector Learning, Cambridge, MA, 1998. MIT Press.
S. Shalev-Shwartz, Y. Singer, and N. Srebro. Pegasos: primal estimated sub-gradient solver for SVM. In Proceedings of the Twenty Fourth International Conference on Machine Learning (ICML), 2007.
P. Tseng and S. Yun. A coordinate gradient descent method for nonsmooth separable minimization. Mathematical Programming, 117:387–423, 2009.
V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York, NY, 1995.
G.-X. Yuan, K.-W. Chang, C.-J. Hsieh, and C.-J. Lin. A comparison of optimization methods and software for large-scale l1-regularized linear classification. Journal of Machine Learning Research, 11:3183–3234, 2010. URL http://www.csie.ntu.edu.tw/~cjlin/papers/l1.pdf. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6865 | - |
dc.description.abstract | 在機器學習中,支持向量迴歸(SVR)與支持向量分類(SVC)是常見的方法,但是使用再生核函數後,他們的訓練過程常常很費時。近年來的研究發現,不使用再生核函數的線性SVC在特定領域上有良好的準確度以及快速的訓練及預測時間。然而,很少研究是著重在線性的SVR上。我們在這篇論文將快速的線性SVC訓練演算法拓展到線性SVR上。這些方法中,有些方法可以直接套用,有些方法則需要一些調整。實驗結果發現,我們提出的線性SVR訓練方法可以快速地產生和非線性SVR一樣好的模型。 | zh_TW |
dc.description.abstract | Support vector regression (SVR) and support vector classification (SVC) are popular learning techniques, but their use with kernels is often time consuming. Recently, linear SVC without kernels has been shown to give competitive accuracy for some applications while enjoying much faster training and testing. However, few studies have focused on linear SVR. In this thesis, we extend state-of-the-art training methods for linear SVC to linear SVR. We show that the extension is straightforward for some methods but nontrivial for others. Our experiments demonstrate that for some problems, the proposed linear-SVR training methods can very efficiently produce models that are as good as kernel SVR. | en
dc.description.provenance | Made available in DSpace on 2021-05-17T09:19:50Z (GMT). No. of bitstreams: 1 ntu-101-R99922033-1.pdf: 3628778 bytes, checksum: 7f5f6f7c6c0ea698b2b546d726c1bda6 (MD5) Previous issue date: 2012 | en |
dc.description.tableofcontents | Oral Examination Committee Approval
Chinese Abstract
ABSTRACT
LIST OF FIGURES
LIST OF TABLES
CHAPTER
I. Introduction
II. Linear Support Vector Regression
2.1 Differences Between SVC and SVR
III. Optimization Methods for Training Linear SVR
3.1 A Trust Region Newton Method (TRON) for L2-loss SVR
3.2 Dual Coordinate Descent Methods (DCD)
3.2.1 A Direct Extension from Classification to Regression
3.2.2 A New Coordinate Descent Method by Solving α+ and α− Together
3.3 Difference Between Linear and Nonlinear SVR
IV. Experiments
4.1 Experimental Settings
4.2 A Comparison Between Two DCD Algorithms
4.3 A Comparison Between Linear and Nonlinear SVR
4.4 A Comparison Between TRON and DCD on Data with/without Normalization
4.5 With and Without the Bias Term in the SVR Prediction Function
4.6 Aggressiveness of DCD's Shrinking Scheme
V. Discussions and Conclusions
APPENDICES
BIBLIOGRAPHY | |
dc.language.iso | en | |
dc.title | 大規模線性支持向量迴歸 | zh_TW |
dc.title | Large-scale Linear Support Vector Regression | en |
dc.type | Thesis | |
dc.date.schoolyear | 100-2 | |
dc.description.degree | 碩士 (Master's) | |
dc.contributor.oralexamcommittee | 林軒田(Hsuan-Tien Lin),李育杰(Yuh-Jye Lee) | |
dc.subject.keyword | 大規模學習, 支持向量迴歸 | zh_TW |
dc.subject.keyword | large-scale learning, support vector regression | en |
dc.relation.page | 49 | |
dc.rights.note | Authorized (open access worldwide) | |
dc.date.accepted | 2012-06-25 | |
dc.contributor.author-college | 電機資訊學院 (College of Electrical Engineering and Computer Science) | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) | zh_TW |
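The abstract and table of contents above refer to L2-loss linear SVR and the TRON/DCD solvers studied in the thesis. As a rough illustration of the kind of objective involved, the sketch below minimizes the standard primal L2-loss (squared ε-insensitive) linear SVR problem with plain gradient descent on synthetic data. This is an assumption-laden toy, not the thesis's TRON or DCD algorithms; all function names and parameter values are illustrative.

```python
import numpy as np

# Toy sketch (not the thesis's solvers): minimize the primal L2-loss SVR
# objective  f(w) = 0.5*||w||^2 + C * sum_i max(0, |w.x_i - y_i| - eps)^2
# by plain gradient descent. The squared eps-insensitive loss is
# differentiable, so ordinary gradient descent applies.
def train_linear_svr(X, y, C=10.0, eps=0.1, lr=1e-4, iters=2000):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        r = X @ w - y                             # residuals w.x_i - y_i
        viol = np.maximum(0.0, np.abs(r) - eps)   # eps-insensitive excess
        grad = w + 2.0 * C * (X.T @ (np.sign(r) * viol))
        w -= lr * grad
    return w

# Synthetic regression data: y is (nearly) linear in X.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=200)

w = train_linear_svr(X, y)
```

Because the noise level is well inside the ε-tube, the learned `w` lands close to `w_true`; a production solver (such as the thesis's TRON or DCD methods, or LIBLINEAR) would reach higher accuracy far faster than this fixed-step loop.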
Appears in Collections: | 資訊工程學系 (Department of Computer Science and Information Engineering) |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-101-1.pdf | 3.54 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.