NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/2374
Full metadata record (DC field: value [language])
dc.contributor.advisor: 王偉仲
dc.contributor.author: Da-Wei Chang [en]
dc.contributor.author: 張大衛 [zh_TW]
dc.date.accessioned: 2021-05-13T06:39:33Z
dc.date.available: 2017-08-11
dc.date.available: 2021-05-13T06:39:33Z
dc.date.copyright: 2017-08-11
dc.date.issued: 2017
dc.date.submitted: 2017-08-10
dc.identifier.citation:
[1] P.-A. Absil, R. Mahony, and R. Sepulchre. Optimization algorithms on matrix manifolds. Princeton University Press, 2009.
[2] E. Anderson, Z. Bai, C. Bischof, S. Blackford, J. Demmel, J. Dongarra, J. Du Croz, A. Greenbaum, S. Hammarling, A. McKenney, and D. Sorensen. LAPACK Users’ Guide. Society for Industrial and Applied Mathematics, Philadelphia, PA, third edition, 1999.
[3] J. Barzilai and J. M. Borwein. Two-point step size gradient methods. IMA Journal of Numerical Analysis, 8(1):141–148, 1988.
[4] T.-L. Chen, D. D. Chang, S.-Y. Huang, H. Chen, C. Lin, and W. Wang. Integrating multiple random sketches for singular value decomposition. arXiv preprint arXiv:1608.08285, 2016.
[5] S. Fiori, T. Kaneko, and T. Tanaka. Mixed maps for learning a Kolmogoroff-Nagumo-type average element on the compact Stiefel manifold. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pages 4518–4522, 2014.
[6] I. Griva, S. G. Nash, and A. Sofer. Linear and nonlinear optimization. SIAM, 2009.
[7] N. Halko, P.-G. Martinsson, and J. A. Tropp. Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions. SIAM Review, 53(2):217–288, 2011.
[8] T. Kaneko, S. Fiori, and T. Tanaka. Empirical arithmetic averaging over the compact Stiefel manifold. IEEE Transactions on Signal Processing, 61(4):883–894, 2013.
[9] J. R. Magnus and H. Neudecker. The commutation matrix: some properties and applications. The Annals of Statistics, pages 381–394, 1979.
[10] V. Rokhlin, A. Szlam, and M. Tygert. A randomized algorithm for principal component analysis. SIAM Journal on Matrix Analysis and Applications, 31(3):1100–1124, 2009.
[11] Z. Wen and W. Yin. A feasible method for optimization with orthogonality constraints. Mathematical Programming, 142(1-2):397–434, 2013.
[12] H. Zhang and W. W. Hager. A nonmonotone line search technique and its application to unconstrained optimization. SIAM Journal on Optimization, 14(4):1043–1056, 2004.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/2374
dc.description.abstract: Dimension reduction and feature extraction are important techniques in the big-data era; they reduce the dimensionality of data and lower the computational cost of further analysis. Low-rank singular value decomposition (low-rank SVD) is a key component of these techniques. To compute a low-rank SVD faster, several studies have proposed randomized subspace sketching methods that yield approximate results. In this work we propose a new approach, called the integrated SVD, that integrates the results of several randomized sketches to obtain a more accurate approximation. We analyze the properties of the algorithm and of the different integration methods both theoretically and through numerical experiments. The integration scheme is a constrained optimization problem with a unique local maximizer. The integrated subspace is computed by line search, Kolmogorov-Nagumo averaging, and reduction-type methods, whose theoretical background and computational complexity are analyzed; the similarities and differences between the integrated SVD and earlier randomized SVDs are also explained. Numerical results show that, on the examples considered, the line search method for the integrated SVD requires fewer iterations than a randomized SVD with the same number of sketches, and that using the reduction-type result as the initial value of the line search further reduces the number of iterations needed to converge. [zh_TW]
dc.description.abstract: Dimension reduction and feature extraction are important techniques in the big-data era for reducing the dimension of data and the computational cost of further analysis. Low-rank singular value decomposition (low-rank SVD) is a key part of these techniques. To compute a low-rank SVD faster, researchers have proposed randomized subspace sketching algorithms that produce an approximate result (rSVD). In this research, we propose integrating the results of several randomized sketches to obtain a more accurate approximation, a scheme we call the integrated singular value decomposition (iSVD). We analyze iSVD and its integration methods through theoretical analysis and numerical experiments. The integration scheme is a constrained optimization problem with a unique local maximizer up to orthogonal transformation. A line search type method, a Kolmogorov-Nagumo type average method, and a reduction type method are introduced, and their theoretical background and computational complexity are analyzed. The similarities and differences between iSVD and rSVD with the same number of sketches are also explained. The numerical experiments show that the line search method in iSVD converges faster than the one in rSVD on our test examples, and that using the integrated subspace from the reduction method as the initial value of the line search reduces the number of iterations needed to converge. (An illustrative formulation and a toy code sketch of the integration idea follow this metadata record.) [en]
dc.description.provenance: Made available in DSpace on 2021-05-13T06:39:33Z (GMT). No. of bitstreams: 1.
ntu-106-R04246002-1.pdf: 954171 bytes, checksum: 657b125a18ee644f1728ac42ef2ecfb7 (MD5).
Previous issue date: 2017 [en]
dc.description.tableofcontents:
口試委員會審定書 (Oral Examination Committee Certification) iii
誌謝 (Acknowledgements in Chinese) v
Acknowledgements vii
摘要 (Abstract in Chinese) ix
Abstract xi
1 Introduction 1
2 Overview of Integrated Singular Value Decomposition 5
3 Properties of Integrated Subspace 9
3.1 Solution of the Optimization Problem 10
3.2 Asymptotic Behavior of the Integrated Subspace 12
3.3 Uniqueness of Local Maximizer 13
4 Integration Method 21
4.1 Line Search Type Method 21
4.2 Kolmogorov-Nagumo-Type Average 29
4.3 Reduction-Type Average 33
5 Comparison of rSVD and iSVD 37
6 Numerical Experiment 41
6.1 Different Number of Sketched Subspaces 42
6.2 Comparison of KN and WY 44
6.3 Comparison of iSVD, rSVD and Reduction 47
7 Discussion and Conclusion 51
Bibliography 53
dc.language.iso: en
dc.subject: 維度降低 (Dimension Reduction) [zh_TW]
dc.subject: 數值線性代數 (Numerical Linear Algebra) [zh_TW]
dc.subject: 奇異值分解 (Singular Value Decomposition) [zh_TW]
dc.subject: 隨機演算法 (Randomized Algorithm) [zh_TW]
dc.subject: 數值優化 (Numerical Optimization) [zh_TW]
dc.subject: Singular Value Decomposition [en]
dc.subject: Dimension Reduction [en]
dc.subject: Numerical Optimization [en]
dc.subject: Randomized Algorithm [en]
dc.subject: Numerical Linear Algebra [en]
dc.title: 整合多重隨機奇異值分解與理論分析 (Integrating Multiple Randomized Singular Value Decompositions, with Theoretical Analysis) [zh_TW]
dc.title: Theoretical and Performance Analysis for Integrated Randomized Singular Value Decomposition [en]
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 陳定立, 陳素雲
dc.subject.keyword: 數值線性代數, 奇異值分解, 隨機演算法, 數值優化, 維度降低 (Numerical Linear Algebra, Singular Value Decomposition, Randomized Algorithm, Numerical Optimization, Dimension Reduction) [zh_TW]
dc.subject.keyword: Numerical Linear Algebra, Singular Value Decomposition, Randomized Algorithm, Numerical Optimization, Dimension Reduction [en]
dc.relation.page: 54
dc.identifier.doi: 10.6342/NTU201702973
dc.rights.note: Authorization granted (worldwide open access)
dc.date.accepted: 2017-08-10
dc.contributor.author-college: 理學院 (College of Science) [zh_TW]
dc.contributor.author-dept: 應用數學科學研究所 (Institute of Applied Mathematical Sciences) [zh_TW]
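The abstracts above describe the core of the thesis: an orthogonally constrained optimization that merges several randomly sketched subspaces into a single integrated subspace. As a reading aid only, and not quoted from the thesis, a plausible formulation consistent with the abstract and with reference [4] is the trace maximization

\[
\max_{Q \in \mathbb{R}^{m \times k},\; Q^{\top} Q = I_k} \;
\frac{1}{N} \sum_{i=1}^{N} \operatorname{tr}\!\left( Q^{\top} U_i U_i^{\top} Q \right),
\]

where U_1, ..., U_N are orthonormal bases of the N sketched subspaces. Any maximizer Q remains a maximizer after right-multiplication by a k-by-k orthogonal matrix, which matches the abstract's "unique local maximizer up to orthogonal transformation".

The toy Python sketch below illustrates only the overall pipeline (sketch several subspaces, integrate them, recover the SVD on the integrated subspace). The helper names sketch_subspace, integrate_subspaces, and isvd_sketch are hypothetical, and the integration step simply averages the projectors Q_i Q_i^T and keeps the leading eigenvectors, which is in the spirit of the reduction-type method rather than the thesis's line search or Kolmogorov-Nagumo procedures.

```python
import numpy as np

def sketch_subspace(A, k, p=10, rng=None):
    """One randomized sketch (Halko-Martinsson-Tropp style): an orthonormal
    basis that approximately captures the dominant left subspace of A."""
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[1]
    Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis of the sketch
    return Q

def integrate_subspaces(Qs, k):
    """Illustrative 'reduction-type' integration: average the projectors
    Q_i Q_i^T and keep the k leading eigenvectors of the average."""
    P = sum(Q @ Q.T for Q in Qs) / len(Qs)
    _, V = np.linalg.eigh(P)                  # eigenvalues in ascending order
    return V[:, -k:][:, ::-1]                 # k leading eigenvectors, descending

def isvd_sketch(A, k, num_sketches=4, seed=0):
    """Sketch several random subspaces, integrate them, then recover a
    rank-k SVD of A restricted to the integrated subspace."""
    rng = np.random.default_rng(seed)
    Qs = [sketch_subspace(A, k, rng=rng) for _ in range(num_sketches)]
    Qbar = integrate_subspaces(Qs, k)         # m-by-k integrated basis
    Uk, s, Vt = np.linalg.svd(Qbar.T @ A, full_matrices=False)
    return Qbar @ Uk, s, Vt

if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((500, 200))
    U, s, Vt = isvd_sketch(A, k=10)
    print("relative error:", np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))
```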
Appears in collections: 應用數學科學研究所 (Institute of Applied Mathematical Sciences)

Files in this item:
File            Size        Format
ntu-106-1.pdf   931.81 kB   Adobe PDF


Except where otherwise noted, all items in this repository are protected by copyright, with all rights reserved.
