NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67958
Full metadata record (DC field: value [language])
dc.contributor.advisor: 張晏誠(Yen-Cheng Chang)
dc.contributor.author: Han Chiu [en]
dc.contributor.author: 裘涵 [zh_TW]
dc.date.accessioned: 2021-06-17T02:00:35Z
dc.date.available: 2020-07-27
dc.date.copyright: 2017-07-27
dc.date.issued: 2017
dc.date.submitted: 2017-07-19
dc.identifier.citation:
Chen, Y., Wiesel, A., & Hero, A. O. (2009, April). Shrinkage estimation of high dimensional covariance matrices. In 2009 IEEE International Conference on Acoustics, Speech and Signal Processing (pp. 2937-2940). IEEE.
Chen, B., Huang, S. F., & Pan, G. (2015). High dimensional mean-variance optimization through factor analysis. Journal of Multivariate Analysis, 133, 140-159.
Berk, K. N. (1974). Consistent autoregressive spectral estimates. The Annals of Statistics, 2, 489-502.
Markowitz, H. (1952). Portfolio selection. The Journal of Finance, 7(1), 77-91.
McNamara, J. R. (1998). Portfolio selection using stochastic dominance criteria. Decision Sciences, 29(4), 785-801.
Laloux, L., Cizeau, P., Bouchaud, J. P., & Potters, M. (1999). Noise dressing of financial correlation matrices. Physical Review Letters, 83(7), 1467.
Levina, E., Rothman, A., & Zhu, J. (2008). Sparse estimation of large covariance matrices via a nested Lasso penalty. The Annals of Applied Statistics, 2(1), 245-263.
Ing, C. K., Chiou, H. T., & Guo, M. (2016). Estimation of inverse autocovariance matrices for long memory processes. Bernoulli, 22(3), 1301-1330.
Ing, C. K., & Lai, T. L. (2011). A stepwise regression method and consistent model selection for high-dimensional sparse linear models. Statistica Sinica, 1473-1513.
Bickel, P. J., & Levina, E. (2008). Regularized estimation of large covariance matrices. The Annals of Statistics, 199-227.
Lam, C., & Fan, J. (2009). Sparsistency and rates of convergence in large covariance matrix estimation. The Annals of Statistics, 37(6B), 4254.
El Karoui, N. (2010). High-dimensionality effects in the Markowitz problem and other quadratic programs with linear constraints: Risk underestimation. The Annals of Statistics, 38(6), 3487-3566.
Banerjee, O., Ghaoui, L. E., & d’Aspremont, A. (2008). Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. Journal of Machine Learning Research, 9(Mar), 485-516.
Friedman, J., Hastie, T., & Tibshirani, R. (2008). Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3), 432-441.
Rothman, A. J., Bickel, P. J., Levina, E., & Zhu, J. (2008). Sparse permutation invariant covariance estimation. Electronic Journal of Statistics, 2, 494-515.
Fan, J., Fan, Y., & Lv, J. (2008). High dimensional covariance matrix estimation using a factor model. Journal of Econometrics, 147(1), 186-197.
Fan, J., Liao, Y., & Mincheva, M. (2011). High dimensional covariance matrix estimation in approximate factor models. The Annals of Statistics, 39(6), 3320.
Cai, T., & Liu, W. (2011). Adaptive thresholding for sparse covariance matrix estimation. Journal of the American Statistical Association, 106(494), 672-684.
Wu, W. B., & Pourahmadi, M. (2003). Nonparametric estimation of large covariance matrices of longitudinal data. Biometrika, 831-844.
Ing, C. K., Wang, H. Y., & Kuang, H. C. (2016). Estimation of large precision matrix for high dimensional mean-variance optimization.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/67958
dc.description.abstract: In the mean-variance portfolio optimization problem, we often need to estimate the inverse of the covariance matrix in order to compute the optimal portfolio weights. When the number of assets is large and the sample size is small, inverting the covariance matrix becomes difficult and unstable. A common approach to estimating a high-dimensional covariance matrix imposes a sparsity assumption on the sample covariance so that the high-dimensional matrix becomes invertible. This study proposes a statistical framework that uses the modified Cholesky decomposition to convert high-dimensional covariance matrix estimation into a regression coefficient estimation problem, and uses the orthogonal greedy algorithm (OGA) for model selection in the high-dimensional regressions. The simulation study compares the difference between the estimators and the population covariance matrix. In addition, the empirical study shows that, under reasonable parameter assumptions, the OGA estimates outperform the adaptive thresholding and linear shrinkage methods. [zh_TW]
dc.description.abstract: The classical mean-variance portfolio optimization requires the estimation of an inverse covariance matrix. This is a challenging task given the large number of assets in the market and the limited historical data available. Commonly used methods for estimating a large covariance matrix exploit sparsity in the sample covariance matrix. In this study, I propose a statistical framework to estimate high-dimensional variance-covariance matrices under a small sample size via the modified Cholesky decomposition with the orthogonal greedy algorithm (OGA). This framework transforms covariance matrix estimation into a regression coefficient estimation problem, where the OGA serves as a fast stepwise regression method for high-dimensional model selection and coefficient estimation. Simulation studies measure the difference between the estimators and the population covariance matrix, and empirical results show that the OGA estimators outperform the adaptive thresholding and linear shrinkage approaches under reasonable parameter assumptions. [en]
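The abstract's framework can be made concrete with a short sketch. The Python code below is a minimal illustration written around the ideas named in the abstract, not the thesis's own implementation: it builds a precision-matrix estimate from the sequential regressions implied by the modified Cholesky decomposition (Sigma^{-1} = T' D^{-1} T, with T unit lower triangular and D the diagonal matrix of residual variances), using a greedy forward selection in the spirit of the OGA for each regression. The function names, the plain least-squares refitting, and the max_terms cap are illustrative assumptions; the thesis additionally combines this step with a factor model and applies it to factor-model residuals rather than to raw returns.

    import numpy as np

    def oga_select(X, y, max_terms):
        # Greedy forward selection in the spirit of the orthogonal greedy algorithm:
        # at each step pick the predictor with the largest normalized correlation
        # with the current residual, then refit least squares on the selected set.
        n, m = X.shape
        selected, residual, beta = [], y.copy(), np.zeros(0)
        for _ in range(min(max_terms, m)):
            scores = np.abs(X.T @ residual) / (np.linalg.norm(X, axis=0) + 1e-12)
            if selected:
                scores[selected] = -np.inf       # never pick the same column twice
            selected.append(int(np.argmax(scores)))
            beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
            residual = y - X[:, selected] @ beta
        return selected, beta

    def precision_via_modified_cholesky(E, max_terms=5):
        # E: n x p matrix of centered observations.  Regress the j-th variable on
        # its predecessors, eps_j = sum_{k<j} t_{jk} eps_k + e_j, collect the
        # coefficients in a unit lower-triangular T and the residual variances in
        # D; the modified Cholesky decomposition gives Sigma^{-1} = T' D^{-1} T.
        n, p = E.shape
        T = np.eye(p)
        d = np.empty(p)
        d[0] = E[:, 0].var(ddof=1)
        for j in range(1, p):
            X, y = E[:, :j], E[:, j]
            sel, beta = oga_select(X, y, max_terms)
            T[j, sel] = -beta
            d[j] = (y - X[:, sel] @ beta).var(ddof=1)
        return T.T @ np.diag(1.0 / d) @ T

    # Hypothetical usage (names and dimensions are illustrative, not from the thesis):
    # rng = np.random.default_rng(0)
    # R = rng.standard_normal((60, 100))          # 60 return observations on 100 assets
    # P = precision_via_modified_cholesky(R - R.mean(axis=0))
    # mu = R.mean(axis=0)
    # w = P @ mu / (np.ones(100) @ P @ mu)        # Markowitz-type weights, w proportional to Sigma^{-1} mu

Capping each regression at max_terms selected predictors is what keeps the estimate usable when the number of assets exceeds the sample size; without such a cap the later regressions would be ill-posed.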
dc.description.provenance: Made available in DSpace on 2021-06-17T02:00:35Z (GMT). No. of bitstreams: 1. ntu-106-R04H41011-1.pdf: 1643761 bytes, checksum: cf1ab75578b25cc5e6ce27dafc21b658 (MD5). Previous issue date: 2017. [en]
dc.description.tableofcontents:
Oral Examination Committee Approval Form (口試委員審定書) ... i
Acknowledgements (誌謝) ... ii
Chinese Abstract (摘要) ... iv
ABSTRACT ... v
CONTENTS ... vi
LIST OF FIGURES ... viii
LIST OF TABLES ... ix
1 Introduction ... 1
2 Literature Review ... 4
3 Model and Estimation ... 6
3.1 Main model ... 7
3.2 Factor analysis based estimator ... 7
3.3 Estimation of large error covariance matrix ... 8
3.3.1 Modified Cholesky decomposition ... 8
3.3.2 Orthogonal greedy algorithm (OGA) ... 9
4 Alternative method ... 13
4.1 Adaptive thresholding estimation ... 13
4.2 Shrinkage estimation ... 14
5 Simulation ... 15
5.1 Factor model structure data and simulations ... 15
5.2 Simulation Results ... 16
6 Empirical study ... 19
6.1 Data and Descriptive Statistics ... 19
6.2 Methodology ... 20
6.3 Empirical Results ... 23
6.3.1 Portfolio Performance (2009) ... 23
6.3.2 Portfolio Performance (2009-2016) ... 28
7 Conclusion ... 31
References ... 32
dc.language.iso: en
dc.subject: 因素分析 (factor analysis) [zh_TW]
dc.subject: 共變異數矩陣 (covariance matrix) [zh_TW]
dc.subject: 修正Cholesky分解法 (modified Cholesky decomposition) [zh_TW]
dc.subject: 正交貪婪演算法 (orthogonal greedy algorithm) [zh_TW]
dc.subject: 平均-變異數最佳化 (mean-variance optimization) [zh_TW]
dc.subject: Mean-variance optimization [en]
dc.subject: Variance-covariance matrix [en]
dc.subject: Modified Cholesky decomposition [en]
dc.subject: Factor analysis [en]
dc.subject: Orthogonal greedy algorithm [en]
dc.title: 高維度平均-變異數最佳化之共變異數矩陣估計:以台灣資料為例 [zh_TW]
dc.title: Variance-Covariance Matrix Estimation for High Dimensional Mean-Variance Optimization: Evidence from Taiwan [en]
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master's)
dc.contributor.coadvisor: 葉小蓁(Hsiaw-Chan Yeh)
dc.contributor.oralexamcommittee: 鄭宏文(Hung-Wen Cheng)
dc.subject.keyword: 因素分析, 共變異數矩陣, 修正Cholesky分解法, 正交貪婪演算法, 平均-變異數最佳化 (factor analysis, covariance matrix, modified Cholesky decomposition, orthogonal greedy algorithm, mean-variance optimization) [zh_TW]
dc.subject.keyword: Factor analysis, Variance-covariance matrix, Modified Cholesky decomposition, Orthogonal greedy algorithm, Mean-variance optimization [en]
dc.relation.page: 33
dc.identifier.doi: 10.6342/NTU201700667
dc.rights.note: 有償授權 (access authorized for a fee)
dc.date.accepted: 2017-07-19
dc.contributor.author-college: 共同教育中心 (Center for General Education) [zh_TW]
dc.contributor.author-dept: 統計碩士學位學程 (Master's Program in Statistics) [zh_TW]
Appears in collections: 統計碩士學位學程 (Master's Program in Statistics)

Files in this item:
File: ntu-106-1.pdf (restricted access; not authorized for public release)
Size: 1.61 MB
Format: Adobe PDF


Unless their copyright terms state otherwise, all items in the repository are protected by copyright, with all rights reserved.
