NTU Theses and Dissertations Repository
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/32613
Full metadata record
DC Field / Value / Language
dc.contributor.advisor: 劉長遠 (Cheng-Yuan Liou)
dc.contributor.author: I-Chun Lin (en)
dc.contributor.author: 林義淳 (zh_TW)
dc.date.accessioned: 2021-06-13T04:12:19Z
dc.date.available: 2006-07-27
dc.date.copyright: 2006-07-27
dc.date.issued: 2006
dc.date.submitted: 2006-07-24
dc.identifier.citation:
[1] R.G. Andrzejak, K. Lehnertz, F. Mormann, C. Rieke, P. David, and C.E. Elger. Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state. Physical Review E, 64:061907, 2001.
[2] A. Babloyantz. Chaotic dynamics in brain activity. In E. Basar (Ed.), Dynamics of Sensory and Cognitive Processing by the Brain, 1998.
[3] S. Boccaletti, C. Grebogi, Y.C. Lai, H. Mancini, and D. Maza. The control of chaos: Theory and applications. Physics Reports, 329:108–109, 2000.
[4] R. Brown, P. Bryant, and H.D.I. Abarbanel. Computing the Lyapunov spectrum of a dynamical system from an observed time series. Physical Review A, 43:2787, 1991.
[5] M.C. Casdagli, L.D. Iasemidis, R.S. Savit, R.L. Gilmore, S.N. Roper, and J.C. Sackellares. Non-linearity in invasive EEG recordings from patients with temporal lobe epilepsy. Electroencephalogr. Clin. Neurophysiol., 1(102), 1998.
[6] A. Das, P. Das, and A.B. Roy. Applicability of Lyapunov exponent in EEG data analysis. Complexity International, 9, 2002.
[7] A.P. Dempster, N.M. Laird, and D.B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. J. R. Statist. Soc., 39:1–38, 1977.
[8] R.O. Duda and P.E. Hart. Pattern Classification and Scene Analysis. New York: Wiley, 1973.
[9] J.P. Eckmann and D. Ruelle. Ergodic theory of chaos and strange attractors. Rev. Mod. Phys., 57:617–656, 1985.
[10] J.P. Eckmann, S.O. Kamphorst, D. Ruelle, and S. Ciliberto. Liapunov exponents from time series. Physical Review A, 34:2787, 1986.
[11] A. Mees (Ed.). Nonlinear Dynamics and Statistics. Springer, 2001.
[12] A.M. Fraser and H.L. Swinney. Independent coordinates for strange attractors from mutual information. Physical Review A, 33:1134, 1986.
[13] R. Gencay and W. Dechert. The identification of spurious Lyapunov exponents in Jacobian algorithms. Studies in Nonlinear Dynamics and Econometrics, 1:145–154, 1996.
[14] N. Gershenfeld. The Nature of Mathematical Modeling. MIT Press, 1998.
[15] N. Gershenfeld, B. Schoner, and E. Metois. Cluster-weighted modeling for time-series analysis. Nature, 397:329–332, 1999.
[16] N. Gershenfeld and A.S. Weigend. In: Time Series Prediction: Forecasting the Future and Understanding the Past (A.S. Weigend and N.A. Gershenfeld, Eds.). Addison-Wesley, 1993.
[17] A. Ghosh and R. Ramaswamy. Cluster-weighted modeling: Estimation of the Lyapunov spectrum in driven systems. Physical Review E, 71:016224, 2005.
[18] S. Haykin. Neural Networks. Prentice Hall, 1999.
[19] R. Hegger, H. Kantz, and T. Schreiber. Practical implementation of nonlinear time series methods: The TISEAN package. Chaos, 9:413, 1999.
[20] M. Jordan and R. Jacobs. Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 6:181–214, 1994.
[21] M.C. Mackey and L. Glass. Oscillation and chaos in physiological control systems. Science, 197:716–723, 1977.
[22] D.F. McCaffrey, S. Ellner, R. Gallant, and D.W. Nychka. Estimating the Lyapunov exponent of a chaotic system with nonparametric regression. Journal of the American Statistical Association, 87:682–695, 1992.
[23] D. Nychka, S. Ellner, R. Gallant, and D. McCaffrey. Finding chaos in noisy systems. Journal of the Royal Statistical Society, Series B, 54:399–462, 1992.
[24] V.I. Oseledec. A multiplicative ergodic theorem. Lyapunov characteristic numbers for dynamical systems. Transactions of the Moscow Mathematical Society, 19:197–221, 1968.
[25] T. Poggio and F. Girosi. Networks for approximation and learning. In Proceedings of the IEEE, volume 78, pages 1481–1497, 1990.
[26] M. Rosenstein, J.J. Collins, and C. De Luca. A practical method for calculating largest Lyapunov exponents from small data sets. Physica D, 65:117–134, 1993.
[27] T. Sauer, J.A. Yorke, and M. Casdagli. Embedology. Journal of Statistical Physics, 65:579, 1991.
[28] R.H. Shumway and D.S. Stoffer. Time Series Analysis and Its Applications. Springer, 2000.
[29] R.A.M. van der Linden and the SIDC team. Online catalogue of the sunspot index, 2006.
[30] F. Takens. Dynamical systems and turbulence. Lecture Notes in Mathematics, 898, 1980.
[31] J. Theiler. Don't bleach chaotic data. Chaos, 3:771–782, 1993.
[32] M. Thomason. A basic neural network-based trading system: project revisited (parts 1 and 2). J. Comput. Intell. Finance, 7(3):36–45, 1999.
[33] M. Thomason. A basic neural network-based trading system: project revisited (parts 3 and 4). J. Comput. Intell. Finance, 7(4):35–45, 1999.
[34] D. Wettschereck and T. Dietterich. Improving the performance of radial basis function networks by learning center locations. In Advances in Neural Information Processing Systems, volume 4, pages 1133–1140, 1992.
[35] A. Wolf, J.B. Swift, H. Swinney, and J. Vastano. Determining Lyapunov exponents from a time series. Physica D, 16:285–317, 1985.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/32613
dc.description.abstract (zh_TW): 本論文以叢聚權重模型為基礎;此模型藉由估計輸入-輸出資料的聯合機率密度,可視為一個優良的函數逼近模型。叢聚權重模型是以期望-最大化 (EM) 演算法來進行訓練。在本論文中,最小平方法 (LMS) 被用來更進一步訓練叢聚權重模型的參數,可視為一種互補的訓練方法。因為期望-最大化演算法和最小平方法的目標函數並不相同,兩者的區域極小值也不會相同;最小平方法的訓練結果可以用來重新初始化叢聚權重模型的參數,因此提供了一個避免陷入區域極小值問題的方法。本論文包含時間序列預測、颱風路徑預測以及 Lyapunov 指數估測的實驗。
dc.description.abstract (en): This thesis is based on Cluster-Weighted Modeling (CWM), which can be viewed as a novel universal function approximator built on input-output joint density estimation. CWM is trained by the Expectation-Maximization (EM) algorithm. In this thesis, Least-Mean-Square (LMS) learning is applied to further train the model parameters and can be viewed as a complementary training method for CWM. Because EM and LMS optimize different objective functions, their local minima are generally not the same; the result of LMS learning can therefore be used to reinitialize CWM's model parameters, which provides an approach to mitigating the local-minimum problem. Experiments on time-series prediction, hurricane track prediction, and Lyapunov exponent estimation are presented in this thesis.
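The abstract describes a two-stage scheme: fit a cluster-weighted model by EM, then apply LMS gradient steps as a complementary refinement. The sketch below is a rough illustration of that scheme, not the thesis code: it fits a minimal one-dimensional CWM with Gaussian input clusters and local linear models by EM, then runs LMS-style stochastic gradient steps on the local-model coefficients only. The sine toy data, four clusters, iteration counts, learning rate, and the restriction of the LMS pass to the linear coefficients are all illustrative assumptions.

# Minimal 1-D cluster-weighted model: EM training followed by an LMS-style pass.
# A sketch under the assumptions stated above, not the thesis implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy nonlinear function y = sin(3x) + noise (assumed example).
x = rng.uniform(-1.0, 1.0, 400)
y = np.sin(3.0 * x) + 0.05 * rng.standard_normal(400)

M = 4                                   # number of clusters (assumption)
mu = rng.choice(x, M)                   # input-space cluster means
var = np.full(M, 0.1)                   # input-space variances
w = np.zeros((M, 2))                    # local linear models y = w0 + w1*x
sig2 = np.full(M, 0.1)                  # output variances
pi = np.full(M, 1.0 / M)                # cluster weights

def gauss(z, m, v):
    return np.exp(-0.5 * (z - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

def local_pred(xx):
    # Shape (M, N): prediction of every local linear model at every input.
    return w[:, [0]] + w[:, [1]] * xx[None, :]

for it in range(50):                    # EM iterations
    # E-step: posterior responsibility of each cluster for each data point.
    lik = pi[:, None] * gauss(x[None, :], mu[:, None], var[:, None]) \
                      * gauss(y[None, :], local_pred(x), sig2[:, None])
    r = lik / lik.sum(axis=0, keepdims=True)
    # M-step: responsibility-weighted updates of all cluster parameters.
    nk = r.sum(axis=1)
    pi = nk / nk.sum()
    mu = (r * x).sum(axis=1) / nk
    var = (r * (x - mu[:, None]) ** 2).sum(axis=1) / nk + 1e-6
    X = np.column_stack([np.ones_like(x), x])
    for m in range(M):                  # weighted least squares per cluster
        W = r[m]
        w[m] = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
        sig2[m] = (W * (y - X @ w[m]) ** 2).sum() / nk[m] + 1e-6

def predict(xx):
    g = pi[:, None] * gauss(xx[None, :], mu[:, None], var[:, None])
    g = g / g.sum(axis=0, keepdims=True)
    return (g * local_pred(xx)).sum(axis=0)

print("MSE after EM :", np.mean((predict(x) - y) ** 2))

# LMS-style refinement: stochastic gradient steps on the squared prediction
# error with respect to the local-model coefficients only (a simplification).
eta = 0.02                              # learning rate (assumption)
for epoch in range(20):
    for n in rng.permutation(len(x)):
        g = pi * gauss(x[n], mu, var)
        g = g / g.sum()                 # gating weights at x[n]
        err = (g * (w[:, 0] + w[:, 1] * x[n])).sum() - y[n]
        w[:, 0] -= eta * err * g        # gradient step on the intercepts
        w[:, 1] -= eta * err * g * x[n] # gradient step on the slopes

print("MSE after LMS:", np.mean((predict(x) - y) ** 2))

Per the abstract, the thesis then reuses the LMS result to reinitialize CWM's parameters for further EM training; the sketch stops after the gradient pass.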
dc.description.provenance (en): Made available in DSpace on 2021-06-13T04:12:19Z (GMT). No. of bitstreams: 1. ntu-95-R93922140-1.pdf: 717286 bytes, checksum: 715934e25daf9cd1d676607bf2af5301 (MD5). Previous issue date: 2006.
dc.description.tableofcontents:
1 Introduction ... 8
2 Cluster-Weighted Modeling ... 10
  2.0.1 Architecture ... 11
  2.0.2 Model Estimation ... 14
3 Least-Mean-Square Training of CWM ... 22
  3.1 Unconstrained Optimization Techniques ... 22
  3.2 Least-Mean-Square Algorithm ... 24
  3.3 Using LMS to Train CWM ... 24
  3.4 Experiments ... 30
    3.4.1 Simulated Data Experiments ... 30
    3.4.2 Real-World Data Experiments ... 38
    3.4.3 Local Minimum ... 52
4 Using CWM to Estimate Lyapunov Exponents ... 58
  4.1 Lyapunov Exponents ... 58
  4.2 Lyapunov Exponents Estimation ... 60
  4.3 Lyapunov Exponents Estimation of EEG Time-Series Data ... 62
  4.4 Data Collection ... 62
    4.4.1 Results ... 65
5 Conclusion ... 66
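Chapter 4 of the thesis estimates Lyapunov exponents from a fitted CWM. The Jacobian-based approach of the kind cited in [10] and [17] propagates an orthonormal frame with the Jacobian of the dynamics along an orbit and reads the exponents off accumulated QR factors. The sketch below illustrates only that QR accumulation step on the Henon map, using the map's known analytic Jacobian in place of the model-derived Jacobians the thesis would use; the map parameters and step counts are assumptions, and this is not the thesis implementation.

# Generic QR-based Lyapunov-exponent accumulation, demonstrated on the Henon map.
import numpy as np

a, b = 1.4, 0.3                          # standard Henon parameters (assumption)
x = np.array([0.1, 0.1])

def henon(v):
    return np.array([1.0 - a * v[0] ** 2 + b * v[1], v[0]])

def jacobian(v):
    # Analytic Jacobian of the map; a fitted model's Jacobian would go here.
    return np.array([[-2.0 * a * v[0], b],
                     [1.0, 0.0]])

# Discard transients so the orbit sits on the attractor.
for _ in range(1000):
    x = henon(x)

n_steps = 50000
Q = np.eye(2)
log_sums = np.zeros(2)
for _ in range(n_steps):
    # Propagate an orthonormal frame with the Jacobian, then re-orthonormalize.
    Q, R = np.linalg.qr(jacobian(x) @ Q)
    log_sums += np.log(np.abs(np.diag(R)))
    x = henon(x)

print("Lyapunov exponents:", log_sums / n_steps)
# For the Henon map this should come out near (+0.42, -1.62).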
dc.language.iso: en
dc.subject: 函數逼近 (zh_TW)
dc.subject: 叢聚權重模型 (zh_TW)
dc.subject: 最小平方法 (zh_TW)
dc.subject: 時間序列 (zh_TW)
dc.subject: least-mean-square (en)
dc.subject: function approximation (en)
dc.subject: time series (en)
dc.subject: cluster-weighted modeling (en)
dc.title: 以最小平方法訓練叢聚權重模型 (zh_TW)
dc.title: Least-Mean-Square Training of Cluster-Weighted Modeling (en)
dc.type: Thesis
dc.date.schoolyear: 94-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 林智仁 (Chih-Jen Lin), 程爾觀 (Philip E. Cheng), John Aston
dc.subject.keyword: 叢聚權重模型, 最小平方法, 時間序列, 函數逼近 (zh_TW)
dc.subject.keyword: cluster-weighted modeling, least-mean-square, time series, function approximation (en)
dc.relation.page: 69
dc.rights.note: 有償授權 (authorized for a fee)
dc.date.accepted: 2006-07-26
dc.contributor.author-college: 電機資訊學院 (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (zh_TW)
Appears in Collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in This Item:
File: ntu-95-1.pdf (Restricted Access), 700.47 kB, Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
