  1. NTU Theses and Dissertations Repository
  2. College of Electrical Engineering and Computer Science (電機資訊學院)
  3. Department of Computer Science and Information Engineering (資訊工程學系)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63813
Full metadata record
DC Field | Value | Language
dc.contributor.advisor: 林守德
dc.contributor.author: EN-HSU YEN (en)
dc.contributor.author: 嚴恩勗 (zh_TW)
dc.date.accessioned: 2021-06-16T17:19:49Z
dc.date.available: 2014-12-06
dc.date.copyright: 2012-12-06
dc.date.issued: 2012
dc.date.submitted: 2012-08-17
dc.identifier.citation:
[1] R. Collobert, F. Sinz, J. Weston, and L. Bottou. "Trading Convexity for Scalability." In ICML, 2006.
[2] P. Jain, S. Vijayanarasimhan, and K. Grauman. "Hashing Hyperplane Queries to Near Points with Applications to Large-Scale Active Learning." In NIPS, 2010.
[3] T. Liu, A. W. Moore, A. Gray, and K. Yang. "An Investigation of Practical Approximate Nearest Neighbor Algorithms." In NIPS, 2004.
[4] H.-F. Yu, C.-J. Hsieh, K.-W. Chang, and C.-J. Lin. "Large Linear Classification When Data Cannot Fit in Memory." In ACM KDD, 2010.
[5] K.-W. Chang and D. Roth. "Selective Block Minimization for Faster Convergence of Limited Memory Large-Scale Linear Models." In ACM KDD, 2011.
[6] S. Ertekin, L. Bottou, and C. Giles. "Nonconvex Online Support Vector Machines." IEEE TPAMI, 33(2):368–381, 2011.
[7] L. Wang, H. Jia, and J. Li. "Training Robust Support Vector Machine with Smooth Ramp Loss in the Primal Space." Neurocomputing, 71:3020–3025, 2008.
[8] A. Guillory, E. Chastain, and J. Bilmes. "Active Learning as Non-Convex Optimization." In AISTATS, 2009.
[9] Z. Wang and S. Vucetic. "Fast Online Training of Ramp Loss Support Vector Machines." In ICDM, 2009.
[10] G. Loosli, S. Canu, and L. Bottou. "Training Invariant Support Vector Machines Using Selective Sampling." In Large Scale Kernel Machines (L. Bottou, O. Chapelle, D. DeCoste, and J. Weston, eds.), pp. 301–320. MIT Press, Cambridge, MA, 2007.
[11] S. Shalev-Shwartz, Y. Singer, and N. Srebro. "Pegasos: Primal Estimated Sub-Gradient Solver for SVM." In ICML, 2007.
[12] C.-J. Hsieh, K.-W. Chang, C.-J. Lin, S. S. Keerthi, and S. Sundararajan. "A Dual Coordinate Descent Method for Large-Scale Linear SVM." In ICML, 2008.
[13] Z.-Q. Luo and P. Tseng. "On the Convergence of the Coordinate Descent Method for Convex Differentiable Minimization." J. Optim. Theory Appl., 72(1):7–35, 1992.
[14] H. Yu, J. Yang, and J. Han. "Classifying Large Data Sets Using SVMs with Hierarchical Clusters." In ACM KDD, 2003.
[15] I. Steinwart. "Sparseness of Support Vector Machines." Journal of Machine Learning Research, 2003.
[16] Q. Lv, W. Josephson, Z. Wang, M. Charikar, and K. Li. "Multi-Probe LSH: Efficient Indexing for High-Dimensional Similarity Search." In VLDB, pp. 950–961, 2007.
[17] J. Beis and D. G. Lowe. "Shape Indexing Using Approximate Nearest-Neighbour Search in High-Dimensional Spaces." In CVPR, pp. 1000–1006, 1997.
[18] R. Collobert, S. Bengio, and Y. Bengio. "A Parallel Mixture of SVMs for Very Large Scale Problems." Neural Computation, 14:1105–1114, 2002.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63813
dc.description.abstract: (translated from zh_TW) Multidimensional indexing has been widely used for sublinear-time nearest-neighbor search in a variety of applications. In this thesis we show how the technique can be combined with learning problems of sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation of the ramp-loss SVM, together with an index-based sublinear-time optimization algorithm that can solve large-scale problems even when the data cannot fit in memory. Under both sufficient- and limited-memory conditions we compare against existing linear SVM and ramp-loss SVM solvers; our algorithm is not only several times faster but also reaches better accuracy on large-scale noisy data. (zh_TW)
dc.description.abstract: Multidimensional indexing has been frequently used for sublinear-time nearest-neighbor search in various applications. In this paper, we demonstrate how this technique can be integrated into learning problems with sublinear sparsity, such as the ramp-loss SVM. We propose an outlier-free convex relaxation for the ramp-loss SVM and an indexed optimization algorithm that solves large-scale problems in sublinear time, even when the data cannot fit into memory. We compare our algorithm with state-of-the-art linear hinge-loss and ramp-loss solvers under both sufficient- and limited-memory conditions; our algorithm not only learns several times faster but also achieves more accurate results on noisy, large-scale datasets. (en)
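The ramp loss named in the abstracts has a standard difference-of-hinges form, which is what makes both the concave-convex treatment and the outlier bounding possible. A minimal NumPy sketch of that parameterization (my own illustration using the common convention R_s(z) = H_1(z) − H_s(z); not code from the thesis):

```python
import numpy as np

def hinge(z, a=1.0):
    # H_a(z) = max(0, a - z): convex hinge with margin parameter a
    return np.maximum(0.0, a - z)

def ramp(z, s=-1.0):
    # Ramp loss R_s(z) = H_1(z) - H_s(z): a hinge clipped at 1 - s,
    # so a single outlier contributes at most a constant to the objective.
    return hinge(z, 1.0) - hinge(z, s)

z = np.array([-3.0, -1.0, 0.0, 0.5, 1.0, 2.0])  # margins y * f(x)
print(ramp(z))  # large-negative margins (outliers) are capped at 1 - s = 2
```

Because the loss is the difference of two convex hinges, each point's contribution is bounded, which is why ramp-loss solvers are robust to label noise where the plain hinge is not.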
dc.description.provenance: Made available in DSpace on 2021-06-16T17:19:49Z (GMT). No. of bitstreams: 1; ntu-101-R00922017-1.pdf: 456800 bytes, checksum: 73c6b0e7f52b61e56963b21e88d45d1a (MD5). Previous issue date: 2012. (en)
dc.description.tableofcontents:
審定書 (Thesis Certification) 1
誌謝 (Acknowledgements) 2
中文摘要 (Chinese Abstract) 3
ABSTRACT 4
CONTENTS 5
LIST OF FIGURES 7
LIST OF TABLES 8
Chapter 1 Introduction 9
1.1 Background 9
1.2 Contributions and Thesis Organization 10
Chapter 2 Convex Relaxation for Ramp-Loss SVM 12
2.1 Concave-Convex Procedure (CCCP) 12
2.2 Outlier-Free Convex Relaxation 13
Chapter 3 Indexing and ANN Search 15
3.1 Nearest-To-Hyperplane Search 15
3.2 Indexing with Metric Forest 16
Chapter 4 Algorithm and Analysis 19
4.1 Block Minimization 21
4.2 Shrinking and Memory Management 23
4.3 Index Reuse 24
Chapter 5 Experiments 25
5.1 Datasets and Index 25
5.2 Results and Discussion 27
Chapter 6 Limitation 30
REFERENCES 31
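To make the Concave-Convex Procedure of §2.1 concrete: each outer CCCP step linearizes the concave part −H_s of the ramp loss at the current iterate and solves the resulting convex hinge-style problem. The following is a hedged, minimal sketch of that outer loop (my own illustration with a plain subgradient inner solver; the thesis's actual algorithm uses indexed, sublinear-time optimization instead):

```python
import numpy as np

def cccp_ramp_svm(X, y, C=1.0, s=-1.0, outer=10, inner=300, lr=0.01):
    """Illustrative CCCP for a linear ramp-loss SVM.

    Objective: 0.5*||w||^2 + C*sum H_1(m_i) - C*sum H_s(m_i),
    with margins m_i = y_i * (w @ x_i) and H_a(m) = max(0, a - m).
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer):
        # Linearize the concave part at the current w: points whose margin
        # falls below s get a fixed slope beta_i = C (the "outliers").
        beta = C * (y * (X @ w) < s)
        for _ in range(inner):
            margins = y * (X @ w)
            active = margins < 1.0                     # hinge subgradient mask
            grad = w - C * (X.T @ (y * active)) + X.T @ (y * beta)
            w -= lr * grad
    return w

# Toy data: four clean 1-D points plus one mislabeled outlier at x = -8.
X = np.array([[2.0], [3.0], [-2.0], [-3.0], [-8.0]])
y = np.array([1.0, 1.0, -1.0, -1.0, 1.0])
w = cccp_ramp_svm(X, y)
# The linearized outlier term cancels the outlier's hinge gradient, so the
# separator keeps the correct sign on the four clean points.
print(np.sign(X[:4] @ w))
```

Note how once a point is flagged as an outlier (beta_i = C), its hinge gradient and its linearized term cancel exactly, which is the mechanism behind the outlier-free relaxation the abstract describes.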
dc.language.iso: en
dc.subject: 次線性時間 (Sublinear time) (zh_TW)
dc.subject: 高維度索引 (Multidimensional indexing) (zh_TW)
dc.subject: Ramp損失函數 (Ramp loss) (zh_TW)
dc.subject: 大規模 (Large-scale) (zh_TW)
dc.subject: 有限記憶體 (Limited memory) (zh_TW)
dc.subject: 支持向量機 (Support vector machine) (zh_TW)
dc.subject: 凸鬆弛 (Convex relaxation) (zh_TW)
dc.subject: Large-scale (en)
dc.subject: Sublinear-time (en)
dc.subject: Ramp-loss (en)
dc.subject: SVM (en)
dc.subject: Convex-Relaxation (en)
dc.subject: Multidimensional Indexing (en)
dc.subject: Limited-Memory (en)
dc.title: 利用索引技術達到低於線性時間的支持向量機訓練 (Sublinear-Time Support Vector Machine Training via Indexing Techniques) (zh_TW)
dc.title: Indexed Optimization: Learning Ramp-Loss SVM in Sublinear Time (en)
dc.type: Thesis
dc.date.schoolyear: 100-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 林軒田, 鮑興國, 李育杰
dc.subject.keyword: 高維度索引, 次線性時間, Ramp損失函數, 支持向量機, 凸鬆弛, 大規模, 有限記憶體 (zh_TW)
dc.subject.keyword: Multidimensional Indexing, Sublinear-time, Ramp-loss, SVM, Convex-Relaxation, Large-scale, Limited-Memory (en)
dc.relation.page: 32
dc.rights.note: 有償授權 (authorized for a fee)
dc.date.accepted: 2012-08-17
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) (zh_TW)
Appears in Collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in This Item:
File | Size | Format
ntu-101-1.pdf (restricted; not authorized for public access) | 446.09 kB | Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
