Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63813

Full metadata record
| DC 欄位 | 值 | 語言 |
|---|---|---|
| dc.contributor.advisor | 林守德 | |
| dc.contributor.author | EN-HSU YEN | en |
| dc.contributor.author | 嚴恩勗 | zh_TW |
| dc.date.accessioned | 2021-06-16T17:19:49Z | - |
| dc.date.available | 2014-12-06 | |
| dc.date.copyright | 2012-12-06 | |
| dc.date.issued | 2012 | |
| dc.date.submitted | 2012-08-17 | |
| dc.identifier.citation | [1] R. Collobert, F. Sinz, J. Weston, and L. Bottou. "Trading Convexity for Scalability." In ICML, 2006.
[2] P. Jain, S. Vijayanarasimhan, and K. Grauman. "Hashing Hyperplane Queries to Near Points with Applications to Large-Scale Active Learning." In NIPS, 2010.
[3] T. Liu, A. W. Moore, A. Gray, and K. Yang. "An Investigation of Practical Approximate Nearest Neighbor Algorithms." In NIPS, 2004.
[4] H.-F. Yu, C.-J. Hsieh, K.-W. Chang, and C.-J. Lin. "Large Linear Classification When Data Cannot Fit in Memory." In ACM KDD, 2010.
[5] K.-W. Chang and D. Roth. "Selective Block Minimization for Faster Convergence of Limited Memory Large-Scale Linear Models." In ACM KDD, 2011.
[6] S. Ertekin, L. Bottou, and C. L. Giles. "Nonconvex Online Support Vector Machines." IEEE TPAMI, 33(2):368–381, 2011.
[7] L. Wang, H. Jia, and J. Li. "Training Robust Support Vector Machine with Smooth Ramp Loss in the Primal Space." Neurocomputing, 71:3020–3025, 2008.
[8] A. Guillory, E. Chastain, and J. Bilmes. "Active Learning as Non-Convex Optimization." In AISTATS, 2009.
[9] Z. Wang and S. Vucetic. "Fast Online Training of Ramp Loss Support Vector Machines." In ICDM, 2009.
[10] G. Loosli, S. Canu, and L. Bottou. "Training Invariant Support Vector Machines Using Selective Sampling." In Large Scale Kernel Machines (L. Bottou, O. Chapelle, D. DeCoste, and J. Weston, eds.), pp. 301–320. Cambridge, MA: MIT Press, 2007.
[11] S. Shalev-Shwartz, Y. Singer, and N. Srebro. "Pegasos: Primal Estimated Sub-Gradient Solver for SVM." In ICML, 2007.
[12] C.-J. Hsieh, K.-W. Chang, C.-J. Lin, S. S. Keerthi, and S. Sundararajan. "A Dual Coordinate Descent Method for Large-Scale Linear SVM." In ICML, 2008.
[13] Z.-Q. Luo and P. Tseng. "On the Convergence of the Coordinate Descent Method for Convex Differentiable Minimization." J. Optim. Theory Appl., 72(1):7–35, 1992.
[14] H. Yu, J. Yang, and J. Han. "Classifying Large Data Sets Using SVMs with Hierarchical Clusters." In ACM KDD, 2003.
[15] I. Steinwart. "Sparseness of Support Vector Machines." Journal of Machine Learning Research, 2003.
[16] Q. Lv, W. Josephson, Z. Wang, M. Charikar, and K. Li. "Multi-Probe LSH: Efficient Indexing for High-Dimensional Similarity Search." In VLDB, pp. 950–961, 2007.
[17] J. Beis and D. G. Lowe. "Shape Indexing Using Approximate Nearest-Neighbor Search in High-Dimensional Spaces." In CVPR, pp. 1000–1006, 1997.
[18] R. Collobert, S. Bengio, and Y. Bengio. "A Parallel Mixture of SVMs for Very Large Scale Problems." Neural Computation, 14:1105–1114, 2002. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63813 | - |
| dc.description.abstract | 多維索引已頻繁用於在各種應用中的次線性時間近鄰搜索。在本文中,我們展示這種技術如何與次線性稀疏學習問題結合,像是Ramp損失函數的SVM。我們提出一種outlier-free的凸鬆弛來解Ramp損失函數的SVM,並利用索引達到次線性時間的最佳化演算法,該演算法能在數據不能裝入記憶體的情境下解決大規模的問題。我們分別在充足和有限記憶體的條件下,與既有的線性SVM、Ramp損失SVM比較,其中我們的演算法不僅快許多倍,在大規模嘈雜的資料上亦能達到更好的精準度。 | zh_TW |
| dc.description.abstract | Multidimensional indexing has frequently been used for sublinear-time nearest neighbor search in various applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation for the ramp-loss SVM and an indexed optimization algorithm which solves large-scale problems in sublinear time even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient and limited memory conditions, where our algorithm not only learns several times faster but also achieves more accurate results on noisy, large-scale datasets. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-16T17:19:49Z (GMT). No. of bitstreams: 1 ntu-101-R00922017-1.pdf: 456800 bytes, checksum: 73c6b0e7f52b61e56963b21e88d45d1a (MD5) Previous issue date: 2012 | en |
| dc.description.tableofcontents | Approval Certificate 1
Acknowledgements 2
Chinese Abstract 3
ABSTRACT 4
CONTENTS 5
LIST OF FIGURES 7
LIST OF TABLES 8
Chapter 1 Introduction 9
1.1 Background 9
1.2 Contributions and Thesis Organization 10
Chapter 2 Convex-Relaxation for Ramp-Loss SVM 12
2.1 Concave-Convex Procedure (CCCP) 12
2.2 Outlier-Free Convex Relaxation 13
Chapter 3 Indexing and ANN Search 15
3.1 Nearest-To-Hyperplane Search 15
3.2 Indexing with Metric Forest 16
Chapter 4 Algorithm and Analysis 19
4.1 Block Minimization 21
4.2 Shrinking and Memory Management 23
4.3 Index Reusage 24
Chapter 5 Experiments 25
5.1 Datasets and Index 25
5.2 Results and Discussion 27
Chapter 6 Limitation 30
REFERENCE 31 | |
| dc.language.iso | en | |
| dc.subject | 次線性時間 | zh_TW |
| dc.subject | 高維度索引 | zh_TW |
| dc.subject | Ramp損失函數 | zh_TW |
| dc.subject | 大規模 | zh_TW |
| dc.subject | 有限記憶體 | zh_TW |
| dc.subject | 支持向量機 | zh_TW |
| dc.subject | 凸鬆弛 | zh_TW |
| dc.subject | Large-scale | en |
| dc.subject | Sublinear-time | en |
| dc.subject | Ramp-loss | en |
| dc.subject | SVM | en |
| dc.subject | Convex-Relaxation | en |
| dc.subject | Multidimensional Indexing | en |
| dc.subject | Limited-Memory | en |
| dc.title | 利用索引技術達到低於線性時間的支持向量機訓練 | zh_TW |
| dc.title | Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 100-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.oralexamcommittee | 林軒田,鮑興國,李育杰 | |
| dc.subject.keyword | 高維度索引,次線性時間,Ramp損失函數,支持向量機,凸鬆弛,大規模,有限記憶體 | zh_TW |
| dc.subject.keyword | Multidimensional Indexing,Sublinear-time,Ramp-loss,SVM,Convex-Relaxation,Large-scale,Limited-Memory | en |
| dc.relation.page | 32 | |
| dc.rights.note | 有償授權 | |
| dc.date.accepted | 2012-08-17 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
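The ramp loss named in the title and abstracts above is the clipped hinge loss of Collobert et al. (reference [1] in the citation field): R_s(z) = H_1(z) − H_s(z), where z is the classification margin and H_a(z) = max(0, a − z) is a shifted hinge. A minimal NumPy sketch, with illustrative function names not taken from the thesis:

```python
import numpy as np

# Illustrative sketch (not the thesis's code): the standard ramp loss
# R_s(z) = H_1(z) - H_s(z), where z = y * <w, x> is the margin and
# H_a(z) = max(0, a - z) is a shifted hinge loss.

def hinge(z):
    """Hinge loss H_1(z) = max(0, 1 - z)."""
    return np.maximum(0.0, 1.0 - z)

def ramp(z, s=-1.0):
    """Ramp loss: the hinge clipped at 1 - s, so badly misclassified
    points (z << 0) contribute at most a bounded penalty."""
    return hinge(z) - np.maximum(0.0, s - z)

margins = np.array([2.0, 0.0, -5.0])
print(hinge(margins))  # [0. 1. 6.]  -- hinge grows without bound
print(ramp(margins))   # [0. 1. 2.]  -- ramp saturates at 1 - s = 2
```

Because the ramp loss is bounded, outliers cannot dominate the objective, which is the robustness property the thesis's outlier-free convex relaxation builds on.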
| Appears in Collections: | Department of Computer Science and Information Engineering |
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-101-1.pdf (Restricted Access) | 446.09 kB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated in their respective license terms.
