NTU Theses and Dissertations Repository / College of Electrical Engineering and Computer Science / Department of Electrical Engineering
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/40262
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | 王勝德 | -
dc.contributor.author | Kuo-Ping Wu | en
dc.contributor.author | 吳國賓 | zh_TW
dc.date.accessioned | 2021-06-14T16:43:36Z | -
dc.date.available | 2008-08-08 | -
dc.date.copyright | 2008-08-08 | -
dc.date.issued | 2008 | -
dc.date.submitted | 2008-08-01 | -
dc.identifier.citation | [1] V. Vapnik, Statistical Learning Theory. New York: Wiley, 1998.
[2] C.-W. Hsu and C.-J. Lin, 'A comparison of methods for multiclass support vector machines,' IEEE Transactions on Neural Networks, vol. 13, pp. 415-425, 2002.
[3] N. Ancona, G. Cicirelli, E. Stella, and A. Distante, 'Object detection in images: run-time complexity and parameter selection of support vector machines,' 16th International Conference on Pattern Recognition, vol. 2, pp. 426-429, 2002.
[4] S. Rojas and D. Fernandez-Reyes, 'Adapting multiple kernel parameters for support vector machines using genetic algorithms,' The 2005 IEEE Congress on Evolutionary Computation, vol. 1, pp. 626-631, 2005.
[5] A. T. Quang, Q.-L. Zhang, and X. Li, 'Evolving support vector machine parameters,' Proc. 2002 International Conference on Machine Learning and Cybernetics, vol. 1, pp. 548-551, 2002.
[6] F. Imbault and K. Lebart, 'A stochastic optimization approach for parameter tuning of support vector machines,' Proc. 17th International Conference on Pattern Recognition, vol. 4, pp. 597-600, 2004.
[7] D. Hush and C. Scovel, 'Polynomial-time decomposition algorithms for support vector machines,' Machine Learning, vol. 51, pp. 51-71, 2003.
[8] J. Platt, 'Sequential minimal optimization: A fast algorithm for training support vector machines,' Microsoft Research, Technical Report MSR-TR-98-14, 1998.
[9] K.-I. Maruyama, M. Maruyama, H. Miyao, and Y. Nakano, 'A method to make multiple hypotheses with high cumulative recognition rate using SVMs,' Pattern Recognition, vol. 37, pp. 241-251, 2004.
[10] R. Debnath and H. Takahashi, 'An efficient method for tuning kernel parameter of the support vector machine,' IEEE International Symposium on Communications and Information Technology, vol. 2, pp. 1023-1028, 2004.
[11] L.-P. Bi, H. Huang, Z.-Y. Zheng, and H.-T. Song, 'New heuristic for determination Gaussian kernel's parameter,' Proceedings of 2005 International Conference on Machine Learning and Cybernetics, vol. 7, pp. 4299-4304, 2005.
[12] H. Li, S. Wang, and F. Qi, 'SVM model selection with the VC bound,' Lecture Notes in Computer Science, vol. 3314, pp. 1067-1071, 2005.
[13] B. de Souza, A. de Carvalho, R. Calvo, and R. Ishii, 'Multiclass SVM model selection using particle swarm optimization,' Sixth International Conference on Hybrid Intelligent Systems (HIS '06), pp. 31-34, 2006.
[14] C. Chatelain, S. Adam, Y. Lecourtier, L. Heutte, and T. Paquet, 'Multi-objective optimization for SVM model selection,' Ninth International Conference on Document Analysis and Recognition (ICDAR 2007), vol. 1, pp. 23-26, 2007.
[15] H.-K. Pao, S.-C. Chang, and Y.-J. Lee, 'Model trees for hybrid data type classification,' 6th International Conference on Intelligent Data Engineering and Automated Learning, pp. 32-39, 2005.
[16] C.-F. Lin and S.-D. Wang, 'Fuzzy support vector machines,' IEEE Transactions on Neural Networks, vol. 13, pp. 461-471, 2002.
[17] C.-F. Lin and S.-D. Wang, 'Training algorithms for fuzzy support vector machines with noisy data,' IEEE 13th Workshop on Neural Networks for Signal Processing, pp. 517-52, 2003.
[18] K.-P. Wu and S.-D. Wang, 'A weighting initialization strategy for weighted support vector machines,' Third International Conference on Advances in Pattern Recognition, pp. 288-296, 2005.
[19] N. Cristianini and J. Shawe-Taylor, An introduction to support vector machines. Cambridge: Cambridge University Press, 2000.
[20] S. S. Keerthi and C.-J. Lin, 'Asymptotic behaviors of support vector machines with Gaussian kernel,' Neural Computation, vol. 15, pp. 1667-1689, 2003.
[21] H.-T. Lin and C.-J. Lin, 'A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods,' Technical Report, Department of Computer Science and Information Engineering, National Taiwan University.
[22] F. Takahashi and S. Abe, 'Optimizing directed acyclic graph support vector machines,' Proc. Artificial Neural Networks in Pattern Recognition (ANNPR 2003), pp. 166-170, 2003.
[23] T. Phetkaew, B. Kijsirikul, and W. Rivepiboon, 'Reordering adaptive directed acyclic graphs: An improved algorithm for multiclass support vector machines,' Proc. Internat. Joint Conf. on Neural Networks (IJCNN 2003), vol. 2, pp. 1605-1610, 2003.
[24] K.-P. Wu and S.-D. Wang, 'Choosing the kernel parameters of support vector machines according to the inter-cluster distance,' Proc. Int. Joint Conf. on Neural Networks (IJCNN 2006), pp. 1205-1211, 2006.
[25] K. Morik, P. Brockhausen, and T. Joachims, 'Combining statistical learning with a knowledge-based approach: a case study in intensive care monitoring,' Proc. 16th International Conference on Machine Learning, pp. 268-277, 1999.
[26] D. Li, S. Du, and T. Wu, 'A weighted support vector machine method and its application,' Proc. 5th World Congress on Intelligent Control and Automation, pp. 1834-1837, 2004.
[27] L. Bottou, C. Cortes, J. Denker, H. Drucker, I. Guyon, L. Jackel, Y. LeCun, U. Muller, E. Sackinger, P. Simard, and V. Vapnik, 'Comparison of classifier methods: A case study in handwritten digit recognition,' Proc. Int. Conf. Pattern Recognition, pp. 77-87, 1994.
[28] U. Kreßel, 'Pairwise classification and support vector machines,' in Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. J. C. Burges, and A. J. Smola, Eds. Cambridge, MA: MIT Press, 1999.
[29] J. C. Platt, N. Cristianini, and J. Shawe-Taylor, 'Large margin DAGs for multiclass classification,' in Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 2000.
[30] K.-P. Wu and S.-D. Wang, 'Choosing the kernel parameters for the directed acyclic graph support vector machines,' Proc. Int. Conf. on Machine Learning and Data Mining, pp. 276-285, 2007.
[31] J. C. Bezdek and N. Pal, 'Some new indexes of cluster validity,' IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 28, pp. 301-315, 1998.
[32] J.-C. Chiang and J. Wan, 'A validity-guided support vector clustering algorithm for identification of optimal cluster configuration,' IEEE International Conference on Systems, Man and Cybernetics, vol. 4, pp. 3613-3618, 2004.
[33] M. Awad, L. Khan, F. Bastani, and I.-L. Yen, 'An effective support vector machines (SVMs) performance using hierarchical clustering,' Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence, pp. 663-667, 2004.
[34] J. C. Platt, Fast training of support vector machines using sequential minimal optimization. in Advances in Kernel Methods - Support Vector Learning, B. Scholkopf, C. J. C. Burges and A. J. Smola Eds. Cambridge, MA: MIT Press, 1999.
[35] D. Michie, D. J. Spiegelhalter, and C. C. Taylor, Machine Learning, Neural and Statistical Classification, 1994. Online: ftp.ncc.up.pt/pub/statlog/.
[36] C.-C. Chang and C.-J. Lin, LIBSVM: a library for support vector machines. Online: http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[37] K.-P. Wu and S.-D. Wang, 'Choosing the parameters of 2-norm soft margin support vector machines according to the cluster validity,' IEEE International Conference on Systems, Man, and Cybernetics 2006, pp. 4825-4831, 2006.
[38] A. Schwaighofer, SVM toolbox for Matlab. Online: http://ida.first.fraunhofer.de/~anton/software.html.
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/40262 | -
dc.description.abstract | 決定支撐向量機的核心函數參數與誤差懲罰參數,在實用上是與待解問題非常相依的。網格搜尋是最常建議使用的方法。在訓練的過程中,各個參數組合用來訓練對應的分類器,其中效果最好的分類器及其參數將被保留使用。這個方法可以找到具有良好推論能力的分類器及其參數,然而訓練許多分類器將耗費大量時間。本論文提出,使用群組分離指標,以估計在不同特徵空間中,分類器的推論能力。該指標為在特徵空間中求得的群組內以及群組間距離,以及他們的組合。計算指標花費的時間,通常比訓練支撐向量機分類器少,因此可以快速選擇較佳的參數組合。實驗結果顯示,適當設計的指標,可以選到優良的參數組合,對應的分類器之偵測準確性,約與網格搜尋相當,然而訓練時間可以大幅減少。 | zh_TW
dc.description.abstract | Determining the kernel and error penalty parameters for support vector machines (SVMs) is highly problem-dependent in practice. The most popular way to choose the parameters is grid search: in the training process, a classifier is trained for each parameter combination, and only the best-performing one is kept for the testing process. This method can find a parameter combination with good generalization ability, but it makes the training process time-consuming. In this thesis we propose using separation indexes to estimate the generalization ability of the classifiers. These indexes are derived from the inter- and intra-cluster distances in the feature spaces. Calculating such indexes usually costs much less computation time than training the corresponding SVM classifiers, so proper parameters can be chosen much faster. Experimental results show that some of the indexes choose kernel parameters with which the testing accuracy of the trained SVMs is competitive with that of grid search, while the training time is significantly shortened. | en
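The idea in the abstract can be sketched concretely. The snippet below scores candidate RBF kernel parameters by a simple ratio of inter- to intra-cluster distance computed in the feature space, using only kernel evaluations (no SVM training). This is a generic illustration of the approach, not the thesis's exact index definitions; the index formula, function names, and the candidate grid are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Pairwise RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def separation_index(X1, X2, gamma):
    """Inter- over intra-cluster distance in the RBF feature space.

    With an RBF kernel, k(x, x) = 1, so for feature-space class
    means m1, m2:
      ||m1 - m2||^2            = mean(K11) + mean(K22) - 2 * mean(K12)
      mean sq. dist. to mean i = 1 - mean(Kii)
    """
    K11 = rbf_kernel(X1, X1, gamma)
    K22 = rbf_kernel(X2, X2, gamma)
    K12 = rbf_kernel(X1, X2, gamma)
    inter = K11.mean() + K22.mean() - 2.0 * K12.mean()
    intra = (1.0 - K11.mean()) + (1.0 - K22.mean())
    return inter / (intra + 1e-12)  # guard against zero spread

# Two toy classes; keep the gamma with the best separation index
# (a hypothetical candidate grid, standing in for grid search).
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 0.3, size=(30, 2))
X2 = rng.normal(2.0, 0.3, size=(30, 2))
candidates = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {g: separation_index(X1, X2, g) for g in candidates}
best_gamma = max(scores, key=scores.get)
```

Each candidate costs only kernel evaluations over the training data, which is the source of the claimed speedup over training one SVM per parameter combination.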
dc.description.provenance | Made available in DSpace on 2021-06-14T16:43:36Z (GMT). No. of bitstreams: 1. ntu-97-D86921026-1.pdf: 401946 bytes, checksum: 8534f830586e098e85c926853a0f332c (MD5). Previous issue date: 2008 | en
dc.description.tableofcontents | List of Tables - v
List of Figures - vi
Chapter 1 Introduction - 1
Chapter 2 Support Vector Machines and The Kernels - 5
2.1 Support Vector Machines for Binary-Class Classification - 5
2.1.1 Linearly Separable Case - 5
2.1.2 Non-Linearly Separable Case - 9
2.2 Kernel Functions and the Penalty Parameter - 12
2.2.1 Kernel Functions and their Parameters - 12
2.2.2 Penalty Parameter C as the Weight - 16
2.3 Support Vector Machines for Multi-Class Classification - 17
Chapter 3 Separation Indexes for Choosing SVM Kernel Parameters - 21
3.1 Inter- and Intra-cluster Distances - 21
3.2 Distances in the Feature Space - 25
Chapter 4 Experiments and Results - 30
4.1 Binary-Class Cases - 31
4.2 Multi-Class Cases - 36
4.3 2-Norm SVM Application - 41
Chapter 5 Conclusions - 45
References - 47
dc.language.iso | en | -
dc.subject | 群組間距離 | zh_TW
dc.subject | 支撐向量機 | zh_TW
dc.subject | 特徵空間 | zh_TW
dc.subject | 核心函數參數 | zh_TW
dc.subject | 模式選擇 | zh_TW
dc.subject | 群組內距離 | zh_TW
dc.subject | support vector machine | en
dc.subject | inter-cluster distance | en
dc.subject | intra-cluster distance | en
dc.subject | model selection | en
dc.subject | kernel function | en
dc.subject | feature space | en
dc.title | 應用特徵空間距離訓練支撐向量機分類器 | zh_TW
dc.title | Applications of feature space distance measures in training support vector classifiers | en
dc.type | Thesis | -
dc.date.schoolyear | 96-2 | -
dc.description.degree | 博士 (Ph.D.) | -
dc.contributor.oralexamcommittee | 劉長遠, 李漢銘, 鍾國亮, 李育杰, 李嘉晃 | -
dc.subject.keyword | 支撐向量機, 特徵空間, 核心函數參數, 模式選擇, 群組內距離, 群組間距離 | zh_TW
dc.subject.keyword | support vector machine, feature space, kernel function, model selection, intra-cluster distance, inter-cluster distance | en
dc.relation.page | 50 | -
dc.rights.note | 有償授權 (paid authorization) | -
dc.date.accepted | 2008-08-01 | -
dc.contributor.author-college | 電機資訊學院 (College of Electrical Engineering and Computer Science) | zh_TW
dc.contributor.author-dept | 電機工程學研究所 (Graduate Institute of Electrical Engineering) | zh_TW
Appears in collections: 電機工程學系 (Department of Electrical Engineering)

Files in this item:
File | Size | Format
ntu-97-1.pdf (restricted access; not publicly available) | 392.53 kB | Adobe PDF


Except where a specific copyright license is noted, all items in the system are protected by copyright, with all rights reserved.
