Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/62424

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 陳素雲,陳宏 | |
| dc.contributor.author | You-Ren Chen | en |
| dc.contributor.author | 陳宥任 | zh_TW |
| dc.date.accessioned | 2021-06-16T16:02:21Z | - |
| dc.date.available | 2014-07-18 | |
| dc.date.copyright | 2013-07-18 | |
| dc.date.issued | 2013 | |
| dc.date.submitted | 2013-07-08 | |
| dc.identifier.citation | [1] A. Azzalini and A. Capitanio. Statistical applications of the multivariate skew-normal distribution. J. Roy. Statist. Soc. Series B, 1999. [2] A. Banerjee, S. Merugu, I. S. Dhillon, and J. Ghosh. Clustering with Bregman divergences. Journal of Machine Learning Research, 2005. [3] L. M. Bregman. The relaxation method of finding the common points of convex sets and its application to the solution of problems in convex programming. USSR Computational Mathematics and Mathematical Physics, 1967. [4] J. Davis, B. Kulis, S. Sra, and I. S. Dhillon. Information-theoretic metric learning. In Proc. 24th International Conference on Machine Learning (ICML), 2007. [5] I. S. Dhillon and D. S. Modha. Concept decompositions for large sparse text data using clustering. Machine Learning, 2001. [6] B. Kulis, M. A. Sustik, and I. S. Dhillon. Learning low-rank kernel matrices. In Proc. 23rd International Conference on Machine Learning (ICML), 2006. [7] B. Kulis, M. A. Sustik, and I. S. Dhillon. Low-rank kernel learning with Bregman matrix divergences. Journal of Machine Learning Research, 2009. [8] J. Sherman and W. J. Morrison. Adjustment of an inverse matrix corresponding to changes in the elements of a given column or a given row of the original matrix. Annals of Mathematical Statistics, 1949. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/62424 | - |
| dc.description.abstract | If a crowd of data (interior data) is surrounded by another set of data (exterior data) and we want to find a closed surface, centered at the mean of the interior data, that wraps the interior data and separates it from the exterior data, the first candidate for this closed surface is an ellipse. Kulis et al. (2006) apply the method of Bregman (1967), here called Bregman's modification, to find a suitable ellipse separating this kind of data. More specifically, the interior data can be adequately described by its mean and covariance matrix, while the exterior data can be thought of as an unknown number of crowds of data (e.g., a mixture of multivariate skew-normal distributions). However, as conventional PCA shows, ellipses (i.e., positive semidefinite matrices) have a rigid structure, namely symmetry and orthogonality. These two properties are too restrictive in the following situations, where the interior data resembles a unit ball (A) and the exterior data resembles two unit balls (B and C). First, suppose B is 4 units from A on the left while C is 40 units from A on the right. The best cuts would be 2 units from A on the left and 20 units from A on the right, but an ellipse centered at the center of A can only cut at equal distances on its left and right. Second, suppose B is 4 units from A on the left while C is 8 units from A on the upper right. Since an ellipse has orthogonal major and minor axes while the directions AB and AC are not orthogonal, ellipses are again unsuitable separators. Therefore, we propose a method, Nonorthogonal Ray Decomposition (NRD), to remedy this situation. (A toy numerical sketch of the first situation is given after the metadata table below.) | en |
| dc.description.provenance | Made available in DSpace on 2021-06-16T16:02:21Z (GMT). No. of bitstreams: 1 ntu-102-R99221013-1.pdf: 2996723 bytes, checksum: c1681a6cf061306ea000cbb5d0d28cba (MD5) Previous issue date: 2013 | en |
| dc.description.tableofcontents | Acknowledgements i; Abstract (in Chinese) ii; Abstract (in English) iii; Contents iv; List of Figures v; 1 Introduction 1; 1.1 Classification 1; 2 Preliminary 3; 2.1 Two Geometric Interpretations for PCA 3; 2.2 Bregman's Modification 4; 2.3 Spherical k-means Clustering 10; 3 Nonorthogonal Ray Decomposition 12; 3.1 An extension from QDA: one vs the rest 12; 3.2 Observation (Bregman's Modification) 12; 3.3 Nonorthogonal Ray Decomposition (NRD) 15; 3.4 Evaluation of NRD 18; 3.5 Bregman's modification with slack variables 19; 4 Pendigits 27; 5 Conclusion 36; References 37 | |
| dc.language.iso | en | |
| dc.subject | 對稱 | zh_TW |
| dc.subject | 垂直 | zh_TW |
| dc.subject | 橢圓 | zh_TW |
| dc.subject | 非垂直射線分解 | zh_TW |
| dc.subject | 布雷格曼 | zh_TW |
| dc.subject | 外部資料 | zh_TW |
| dc.subject | 內部資料 | zh_TW |
| dc.subject | Interior data | en |
| dc.subject | Exterior data | en |
| dc.subject | Ellipse | en |
| dc.subject | Symmetry | en |
| dc.subject | Orthogonality | en |
| dc.subject | Nonorthogonal Ray Decomposition | en |
| dc.title | "用局部橢圓來分類資料, 基於布雷格曼矩陣分歧" | zh_TW |
| dc.title | Piecewise elliptical classification based on Bregman matrix divergence | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 101-2 | |
| dc.description.degree | Master's | |
| dc.contributor.oralexamcommittee | 陳鵬文,鮑興國 | |
| dc.subject.keyword | 內部資料,外部資料,布雷格曼,橢圓,對稱,垂直,非垂直射線分解 | zh_TW |
| dc.subject.keyword | Interior data,Exterior data,Ellipse,Symmetry,Orthogonality,Nonorthogonal Ray Decomposition | en |
| dc.relation.page | 37 | |
| dc.rights.note | Paid authorization | |
| dc.date.accepted | 2013-07-08 | |
| dc.contributor.author-college | 理學院 | zh_TW |
| dc.contributor.author-dept | 數學研究所 | zh_TW |
Appears in Collections: Department of Mathematics
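
To make the first situation described in the abstract concrete, here is a minimal numerical sketch, not taken from the thesis and not the author's NRD method: it assumes interior data A is a standard-normal unit ball at the origin, exterior balls B and C sit 4 units to the left and 40 units to the right along the first axis, and the separator is a single ellipse (a Mahalanobis ball) centered at the mean of A. The names `mahalanobis_sq`, `left_cut`, and `right_cut` are illustrative and do not appear in the thesis.

```python
# A minimal numerical sketch of the symmetry limitation (an illustrative
# assumption, not the thesis' NRD implementation): interior crowd A is a
# unit ball at the origin, exterior balls B and C sit 4 units to the left
# and 40 units to the right of A along the first axis.
import numpy as np

rng = np.random.default_rng(0)
d = 2
A = rng.normal(size=(500, d))                          # interior data
B = rng.normal(size=(200, d)) + np.array([-4.0, 0.0])  # exterior ball, left
C = rng.normal(size=(200, d)) + np.array([40.0, 0.0])  # exterior ball, right

# Elliptical separator centered at the mean of A: keep x whenever
# (x - mu)^T Sigma^{-1} (x - mu) <= t for a single threshold t.
mu = A.mean(axis=0)
Sigma_inv = np.linalg.inv(np.cov(A, rowvar=False))

def mahalanobis_sq(X):
    """Squared Mahalanobis distance of each row of X from mu."""
    diff = X - mu
    return np.einsum('ij,jk,ik->i', diff, Sigma_inv, diff)

# Desired cuts: 2 units to the left of A and 20 units to the right of A.
left_cut = np.array([[-2.0, 0.0]])
right_cut = np.array([[20.0, 0.0]])
print(mahalanobis_sq(left_cut), mahalanobis_sq(right_cut))
```

The two printed squared distances differ by roughly a factor of 100 because the quadratic form is symmetric about the mean of A, so no single threshold t can place the boundary 2 units to the left and 20 units to the right at the same time; this is the restriction that NRD is designed to relax.
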
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-102-1.pdf (not authorized for public access) | 2.93 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
