Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/9508
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳正剛 | |
dc.contributor.author | Yu-Wei Lin | en |
dc.contributor.author | 林育維 | zh_TW |
dc.date.accessioned | 2021-05-20T20:25:59Z | - |
dc.date.available | 2018-02-23 | |
dc.date.available | 2021-05-20T20:25:59Z | - |
dc.date.copyright | 2008-09-02 | |
dc.date.issued | 2008 | |
dc.date.submitted | 2008-08-27 | |
dc.identifier.citation | 1. Longley, J. W. (1967). An Appraisal of Least Squares Programs for the Electronic Computer from the Point of View of the User. Journal of the American Statistical Association, 62(319), 819-841. 2. Strang, G. Linear Algebra and Its Applications, pp. 174-185. 3. Johnson, R. M. (1966). The Minimal Transformation to Orthonormality. Psychometrika. 4. Lin, C.-S. Clustering Analysis by Attributes Interactions and its Application to Clustering of Differentially Expressed Data. Graduate Institute of Industrial Engineering, National Taiwan University. 5. Bowerman, B. L. & O'Connell, R. T. (1993). Forecasting and Time Series: An Applied Approach. Belmont, CA: Wadsworth. 6. Maddala, G. S. (1977). Econometrics. New York: McGraw-Hill. 7. Draper, N. & Smith, H. (1981). Applied Regression Analysis. New York: Wiley. 8. Lin, F.-J. Solving Multicollinearity in the Process of Fitting Regression Model Using the Nested Estimate Procedure. Department of Applied Economics, National I-Lan University. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/9508 | - |
dc.description.abstract | 迴歸分析是最常被使用的統計方法,然而,在迴歸分析中常會遇到的問題就是共線性,共線性是來自於獨立變數之間的高度相關所引起的,也就是說,資料向量之間存在微小角度的問題。
文獻上,有兩種用來垂直化向量群的方法,一個是知名的葛蘭-史密特垂直化過程(Gram-Schmidt Process),另一個是由R. M. Johnson學者在1966年所提出來的,我們稱之為R. M. Johnson method。然而,Gram-Schmidt Process沒有一個有意義的機制來決定向量群垂直化的優先順序,而經由R. M. Johnson method所轉換出來的垂直向量群也無法具有解釋能力,特別是在具有高相關的資料向量的情況下。 本篇研究中,我們嘗試發展一個演算法來決定向量群垂直化的優先順序並且達到資訊轉換最小化,稱之為Gram-Schmidt轉換過程最小化演算法(Gram-Schmidt Transformation Minimization algorithm, GSTM algorithm),它把向量投影過程中的資訊轉換最小化。在執行GSTM algorithm之前,有一些前置處理需要先進行;進行完之後,再針對這些資料向量執行GSTM algorithm,並對這些垂直向量群執行迴歸分析,最後再針對這些分析結果做解釋。 我們發現此演算法不僅克服迴歸分析中的共線性問題並最小化向量投影過程中的資訊轉換,也使得分析的結果更具有解釋能力。 | zh_TW |
dc.description.abstract | Regression analysis is the most widely used statistical method. However, regression analysis often runs into the problem of multicollinearity. Multicollinearity arises from high correlation among the independent variables, that is, from small angles among the data vectors of those variables.
In the literature, there are two methods for orthogonalizing a set of vectors. One is the well-known Gram-Schmidt Process; the other was proposed by R.M. Johnson in 1966 and is referred to here as the R.M. Johnson method. However, the Gram-Schmidt Process has no meaningful mechanism for determining the order in which vectors are orthogonalized, while the vectors produced by the R.M. Johnson method cannot be interpreted meaningfully, especially when the data vectors are highly correlated. In this research, we develop an algorithm that determines the sequence order of the Gram-Schmidt Process while minimizing the information transformed, called the Gram-Schmidt Transformation Minimization (GSTM) algorithm. It minimizes the information subtracted during the vector projection steps. Before the GSTM algorithm is run, some preprocessing procedures are required. After those procedures, we apply the GSTM algorithm to the data vectors, perform regression analysis on the resulting orthogonal vectors, and finally interpret the analysis results obtained with the GSTM algorithm. We find that the proposed algorithm not only overcomes the multicollinearity problem in regression analysis and minimizes the information subtracted during vector projection, but also makes the analysis results more interpretable. | en |
dc.description.provenance | Made available in DSpace on 2021-05-20T20:25:59Z (GMT). No. of bitstreams: 1 ntu-97-R95546025-1.pdf: 550579 bytes, checksum: f65e63df54d35d831ee9087920dabc95 (MD5) Previous issue date: 2008 | en |
dc.description.tableofcontents | List of Tables VI
List of Figures IX
Chapter 1. Introduction 1
1.1. Multicollinearity in Regression Analysis 1
1.2. Angle between Vectors and Statistical Correlation 6
1.3. Vector Orthogonalization 9
1.3.1. Gram-Schmidt Process 9
1.3.2. The Minimal Transformation to Orthonormality 14
1.4. Problem Definition 20
1.5. Thesis Organization 22
Chapter 2. Gram-Schmidt Transformation Minimization (GSTM) Algorithm 23
2.1. Preprocessing of Data 23
2.2. GSTM Algorithm 27
2.3. Performance Evaluation of GSTM Algorithm 39
Chapter 3. Regression Analysis with the GSTM Algorithm 44
3.1. Clustering of Features 44
3.2. Regression Analysis with the GSTM Algorithm 51
3.3. Interpretation 57
Chapter 4. Case Study 60
4.1. The CDU Dataset 60
Chapter 5. Conclusion 68
5.1. Conclusion 68
5.2. Future Research 69
Reference 71 | |
dc.language.iso | en | |
dc.title | Gram-Schmidt轉換過程最小化之演算法及其應用在具有共線性之迴歸分析 | zh_TW |
dc.title | Gram-Schmidt Transformation Minimization Algorithm and Its Applications to Regression Analysis with Multicollinearity | en |
dc.type | Thesis | |
dc.date.schoolyear | 96-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 任恒毅,柯志明,蔡雅蓉,陳俊宏,張時中 | |
dc.subject.keyword | 迴歸分析,共線性,葛蘭-史密特垂直化過程,資訊轉換最小化,向量投影, | zh_TW |
dc.subject.keyword | Regression Analysis,Multicollinearity,Gram-Schmidt Process,Information Transformation Minimization,Vector Projection, | en |
dc.relation.page | 71 | |
dc.rights.note | 同意授權(全球公開) | |
dc.date.accepted | 2008-08-27 | |
dc.contributor.author-college | 工學院 | zh_TW |
dc.contributor.author-dept | 工業工程學研究所 | zh_TW |
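The Gram-Schmidt Process that the abstract builds on can be sketched as follows. This is a generic illustration of classical Gram-Schmidt in plain Python, not the thesis's GSTM algorithm (whose ordering rule is defined in Chapter 2 of the full text); the `gram_schmidt` and `dot` helpers and their names are illustrative assumptions, not code from the thesis.

```python
# Minimal sketch of the classical Gram-Schmidt process.
# Vectors are plain lists of floats; no external libraries needed.

def dot(u, v):
    """Inner product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalize `vectors` in the given order.

    Each vector has its projections onto the previously accepted
    orthogonal vectors subtracted -- the "information subtraction"
    that the GSTM algorithm seeks to minimize by choosing a better
    processing order.
    """
    ortho = []
    for v in vectors:
        w = list(v)
        for q in ortho:
            coef = dot(v, q) / dot(q, q)  # projection coefficient onto q
            w = [wi - coef * qi for wi, qi in zip(w, q)]
        ortho.append(w)
    return ortho

# Two correlated vectors (small angle between them) become orthogonal:
q1, q2 = gram_schmidt([[1.0, 0.0], [1.0, 1.0]])
print(dot(q1, q2))  # ≈ 0
```

Note that the result depends on the input order: the first vector is kept unchanged, and every later vector loses its components along the earlier ones, which is why the sequencing mechanism the abstract describes matters for interpretability.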
Appears in Collections: | 工業工程學研究所 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-97-1.pdf | 537.67 kB | Adobe PDF | View/Open |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.