Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/38091
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳正剛 | |
dc.contributor.author | Yi-Hung Chen | en |
dc.contributor.author | 陳宜鴻 | zh_TW |
dc.date.accessioned | 2021-06-13T16:26:17Z | - |
dc.date.available | 2005-07-20 | |
dc.date.copyright | 2005-07-20 | |
dc.date.issued | 2005 | |
dc.date.submitted | 2005-07-15 | |
dc.identifier.citation | [1] I. T. Jolliffe. Principal Component Analysis. Springer-Verlag, New York, 1986.
[2] R. A. Fisher (1936). The use of multiple measurements in taxonomic problems. Annals of Eugenics 7, 179-188.
[3] Harry Hochstadt. Integral Equations. Wiley Classics Library. John Wiley & Sons Inc., New York, 1989. ISBN 0-471-50404-1. Reprint of the 1973 original, a Wiley-Interscience publication.
[4] B. Schölkopf, C. J. C. Burges, & A. J. Smola, editors. Advances in Kernel Methods – Support Vector Learning. MIT Press, Cambridge, MA, 1999.
[5] B. Schölkopf, A. Smola, & K.-R. Müller. Kernel principal component analysis. In B. Schölkopf, C. J. C. Burges, & A. J. Smola, editors, Advances in Kernel Methods – Support Vector Learning, 327-352. MIT Press, Cambridge, MA, 1999.
[6] S. Mika, G. Rätsch, J. Weston, B. Schölkopf, & K.-R. Müller (1999). Fisher discriminant analysis with kernels. IEEE International Workshop on Neural Networks for Signal Processing IX, Madison, USA, August 1999, 41-48.
[7] V. Vapnik & A. Chervonenkis (1974). Theory of Pattern Recognition [in Russian]. Nauka, Moscow. (German translation: W. Wapnik & A. Tscherwonenkis, Theorie der Zeichenerkennung, Akademie-Verlag, Berlin.)
[8] B. Schölkopf, A. Smola, & K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10: 1299-1319, 1998. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/38091 | - |
dc.description.abstract | 分類方法大致上包含非監督式分類(unsupervised classification)和監督式分類(supervised classification)兩個範疇,其中主成分分析(Principal Component Analysis)是屬於非監督式分類方法,費雪線性區別方法(Fisher Linear Discriminant)則是屬於監督式分類方法;由於這兩種方法都是線性分析方法,為了能夠處理蘊含非線性特徵的實例(instance),因此從主成分分析衍生出以核函數(kernel function)為根基的主成分分析(Kernel Principal Component Analysis),另外也從費雪線性區別方法衍生出以核函數為根基的費雪區別方法(Kernel Fisher Discriminants),這兩種方法都是把實例從原本的屬性空間(attribute space)投射到一個較高維度的特徵空間(feature space)裡,並在特徵空間裡有效的對實例做樣式辨認(pattern recognition)或特徵提取(feature extraction),但是在特徵空間裡同時也會失去屬性解釋(attribute interpretation)的功能,因此對於以核函數為根基的主成分分析,我們先在特徵空間裡利用適當的主成分得點(score)將實例分成兩個分區(partition),再回到原本的屬性空間解釋屬性的意義,這個方法可以極小化重建誤差(reconstruction error)並突顯出較重要的屬性;對於以核函數為根基的費雪區別方法,我們先在特徵空間裡利用適當的區別得點將實例分成兩個分區,然後建構出區別函數使得預測的正確率可以達到最高。最後經由一些模擬和實際的例子證實,這兩個方法可以分別對非線性的非監督式和監督式分類方法提供屬性解釋,並且比線性的分類方法更有效率。 | zh_TW |
dc.description.abstract | In general, classification methods fall into two categories: unsupervised classification and supervised classification. PCA and FLD are linear multivariate analysis methods, where PCA is an unsupervised classification method and FLD belongs to supervised classification methods. Because linear methods are not sufficient for analyzing data with nonlinear patterns, the nonlinear methods KPCA and KFD were extended from PCA and FLD, respectively. Both transform the instances from the original attribute space to a feature space that can be arbitrarily large, possibly of infinite dimension. The feature space is effective for feature extraction and pattern recognition, but the meaning of the original attributes is lost there. For attribute interpretation, we need to segment the instances with nonlinear patterns into several partitions, each of which has its own linear patterns; we can then apply linear methods in each partition for attribute interpretation. For KPCA, the segmentation is carried out in the feature space through the second and higher KPC score(s), and we then return to the original attribute space for attribute interpretation. For KFD, we segment the training instances by the appropriate second and higher KFD score(s) to construct a linear predictive model for future prediction. This segmentation turns KPCA into segmented PCA by minimizing the reconstruction error, and turns KFD into segmented FLD by maximizing the classification accuracy. To verify these methods, simulated examples are first used to examine the proposed methods and compare them against conventional ones. Real-world data sets are then used to validate the proposed methodologies. | en
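The segmentation idea in the abstract can be sketched as follows. This is a minimal illustration assuming scikit-learn, not the thesis's exact algorithm: the RBF kernel, the `gamma` value, and splitting on the sign of the second KPC score are illustrative choices. It computes KPCA scores in the feature space, splits the instances into two partitions by the second score, and then fits an ordinary PCA in each partition so that loadings and reconstruction error can be read in the original attribute space.

```python
# Sketch (assumed setup, not the thesis's exact procedure): KPCA-based
# segmentation followed by per-partition linear PCA for attribute
# interpretation.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)

# Toy data with a nonlinear (two-arc) pattern in two attributes.
t = rng.uniform(0, np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(scale=0.05, size=(200, 2))
X[100:, 1] *= -1  # mirror half the points to create the second arc

# 1) KPC scores in the feature space (kernel and gamma are illustrative).
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0)
scores = kpca.fit_transform(X)

# 2) Segment by the sign of the second KPC score (one simple cut rule).
mask = scores[:, 1] >= 0

# 3) Linear PCA per partition: loadings stay interpretable in the original
#    attribute space, and each partition's reconstruction error is reported.
for name, part in [("partition A", X[mask]), ("partition B", X[~mask])]:
    pca = PCA(n_components=1).fit(part)
    recon = pca.inverse_transform(pca.transform(part))
    err = np.mean(np.sum((part - recon) ** 2, axis=1))
    print(name, "loadings:", np.round(pca.components_[0], 3),
          "reconstruction error:", round(err, 4))
```

Because KernelPCA centers the kernel matrix, each score column has roughly zero mean, so a sign-based split yields two non-empty partitions on which the linear fits can be interpreted separately.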
dc.description.provenance | Made available in DSpace on 2021-06-13T16:26:17Z (GMT). No. of bitstreams: 1 ntu-94-R92546012-1.pdf: 558618 bytes, checksum: 8e67655a6727e8ff74c39cedf4d41ffc (MD5) Previous issue date: 2005 | en |
dc.description.tableofcontents | Abstract i
論文摘要 ii
Contents iii
Contents of Figures v
Contents of Tables vii
Chapter 1: Introduction 1
1.1 Background 1
1.2 Current Classification Approaches and Importance of Attribute Interpretation 3
1.2.1 Principal Component Analysis 3
1.2.2 Fisher Linear Discriminant 6
1.3 Problems of Current Nonlinear Classification Approaches and Research Objectives 12
1.4 Thesis Organization 13
Chapter 2: Kernel Principal Component Analysis with Attribute Interpretation 14
2.1 Kernel Principal Component Analysis 14
2.2 Attribute Interpretation and Classification Rules for KPCA 18
2.3 Illustration with a Simulated Example 26
Chapter 3: Kernel Fisher Discriminants with Attribute Interpretation 35
3.1 Kernel Fisher Discriminants 35
3.2 Attribute Interpretation and Classification Rules for KFD 38
3.3 Illustration with a Simulated Example 44
Chapter 4: Case Study 48
4.1 Liver-disorders Data Set 48
4.2 Balance Scale Data Set 56
Chapter 5: Conclusions and Suggestions on Future Research 63
References 65 | |
dc.language.iso | en | |
dc.title | KPCA和KFD的屬性解釋以及分類法則之研究 | zh_TW |
dc.title | Attribute Interpretation and Classification Rules behind Kernel Principal Component Analysis and Kernel Fisher Discriminants | en |
dc.type | Thesis | |
dc.date.schoolyear | 93-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 仲偉強,林智仁,宗福季,范治民 | |
dc.subject.keyword | 分類方法,以核函數為根基的主成分分析,以核函數為根基的費雪區別方法 | zh_TW |
dc.subject.keyword | Classification method,Kernel principal component analysis,Kernel Fisher discriminants | en |
dc.relation.page | 65 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2005-07-15 | |
dc.contributor.author-college | 工學院 | zh_TW |
dc.contributor.author-dept | 工業工程學研究所 | zh_TW |
Appears in Collections: | 工業工程學研究所 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-94-1.pdf (currently not authorized for public access) | 545.53 kB | Adobe PDF |
All items in the system are protected by copyright, with all rights reserved, unless otherwise indicated.