Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50823
Full metadata record (DC field: value [language])
dc.contributor.advisor: 陳義裕 (Yih-Yuh Chen)
dc.contributor.author: Hao-Yuan He [en]
dc.contributor.author: 何浩源 [zh_TW]
dc.date.accessioned: 2021-06-15T13:00:30Z
dc.date.available: 2016-07-26
dc.date.copyright: 2016-07-26
dc.date.issued: 2016
dc.date.submitted: 2016-07-12
dc.identifier.citation:
1. Enrico Simonotto, Massimo Riani, Charles Seife, Mark Roberts, Jennifer Twitty, and Frank Moss. Visual perception of stochastic resonance. Physical Review Letters, 78(6):1186, 1997.
2. Danielle Smith Bassett and ED Bullmore. Small-world brain networks. The Neuroscientist, 12(6):512–523, 2006.
3. Wolfgang Maass. Liquid state machines: motivation, theory, and applications. Computability in Context: Computation and Logic in the Real World, pages 275–296, 2010.
4. Henry Markram, Yun Wang, and Misha Tsodyks. Differential signaling via the same axon of neocortical pyramidal neurons. Proceedings of the National Academy of Sciences, 95(9):5323–5328, 1998.
5. Misha V Tsodyks and Henry Markram. The neural code between neocortical pyramidal neurons depends on neurotransmitter release probability. Proceedings of the National Academy of Sciences, 94(2):719–723, 1997.
6. JJ Collins, Carson C Chow, Thomas T Imhoff, et al. Stochastic resonance without tuning. Nature, 376(6537):236–238, 1995.
7. Duncan J Watts and Steven H Strogatz. Collective dynamics of 'small-world' networks. Nature, 393(6684):440–442, 1998.
8. Alessandro Treves. Mean-field analysis of neuronal spike dynamics. Network: Computation in Neural Systems, 4(3):259–284, 1993.
9. Ji-Zheng Chu, Shyan-Shu Shieh, Shi-Shang Jang, Chuan-I Chien, Hou-Peng Wan, and Hsu-Hsun Ko. Constrained optimization of combustion in a simulated coal-fired boiler using artificial neural network model and information analysis. Fuel, 82(6):693–703, 2003.
10. Dilip Goswami, Klaus Schuch, Yi Zheng, Tom DeMarse, and Jose C Principe. Towards the modeling of dissociated cortical tissue in the liquid state machine framework. In Neural Networks, 2005. IJCNN'05. Proceedings. 2005 IEEE International Joint Conference on, volume 4, pages 2179–2183. IEEE, 2005.
11. Kurt Hornik, Maxwell Stinchcombe, and Halbert White. Multilayer feedforward networks are universal approximators. Neural Networks, 2(5):359–366, 1989.
12. Guang-Bin Huang, Qin-Yu Zhu, and Chee-Kheong Siew. Extreme learning machine: a new learning scheme of feedforward neural networks. In Neural Networks, 2004. Proceedings. 2004 IEEE International Joint Conference on, volume 2, pages 985–990. IEEE, 2004.
13. Sen Song, Kenneth D Miller, and Larry F Abbott. Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience, 3(9):919–926, 2000.
14. Wolfgang Maass, Thomas Natschläger, and Henry Markram. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural Computation, 14(11):2531–2560, 2002.
15. Amir F Atiya and Alexander G Parlos. New results on recurrent network training: unifying the algorithms and accelerating convergence. IEEE Transactions on Neural Networks, 11(3):697–709, 2000.
16. Nils Bertschinger and Thomas Natschläger. Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation, 16(7):1413–1436, 2004.
17. Elad Schneidman, Michael J Berry, Ronen Segev, and William Bialek. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature, 440(7087):1007–1012, 2006.
18. Thierry Mora, Stéphane Deny, and Olivier Marre. Dynamical criticality in the collective activity of a population of retinal neurons. Physical Review Letters, 114(7):078105, 2015.
19. Han Ju, Jian-Xin Xu, and Antonius MJ VanDongen. Classification of musical styles using liquid state machines. In Neural Networks (IJCNN), The 2010 International Joint Conference on, pages 1–7. IEEE, 2010.
20. Stefan Schliebs and Doug Hunt. Continuous classification of spatio-temporal data streams using liquid state machines. In International Conference on Neural Information Processing, pages 626–633. Springer, 2012.
21. Hananel Hazan and Larry M Manevitz. Topological constraints and robustness in liquid state machines. Expert Systems with Applications, 39(2):1597–1606, 2012.
22. Luca Gammaitoni, Peter Hänggi, Peter Jung, and Fabio Marchesoni. Stochastic resonance. Reviews of Modern Physics, 70(1):223, 1998.
23. Roberto Benzi, Giorgio Parisi, Alfonso Sutera, and Angelo Vulpiani. Stochastic resonance in climatic change. Tellus, 34(1):10–16, 1982.
24. John K Douglass, Lon Wilkens, Eleni Pantazelou, Frank Moss, et al. Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance. Nature, 365(6444):337–340, 1993.
25. Per Bak, Chao Tang, and Kurt Wiesenfeld. Self-organized criticality: An explanation of the 1/f noise. Physical Review Letters, 59(4):381, 1987.
26. Kai J Miller, Larry B Sorensen, Jeffrey G Ojemann, and Marcel Den Nijs. Power-law scaling in the brain surface electric potential. PLoS Computational Biology, 5(12):e1000609, 2009.
27. Wolfgang Maass and Henry Markram. On the computational power of circuits of spiking neurons. Journal of Computer and System Sciences, 69(4):593–616, 2004.
28. Ismail Uysal and John G Harris. Biologically plausible speech recognition using spike-based phase locking cues. In Circuits and Systems, 2009. ISCAS 2009. IEEE International Symposium on, pages 101–104. IEEE, 2009.
29. Ronald A Fisher. The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7(2):179–188, 1936.
30. David E Rumelhart, Geoffrey E Hinton, and Ronald J Williams. Learning representations by back-propagating errors. Cognitive Modeling, 5(3):1, 1988.
31. Frank Rosenblatt. The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Review, 65(6):386, 1958.
32. David Norton and Dan Ventura. Improving liquid state machines through iterative refinement of the reservoir. Neurocomputing, 73(16):2893–2904, 2010.
33. Ulf D Schiller and Jochen J Steil. Analyzing the weight dynamics of recurrent learning algorithms. Neurocomputing, 63:5–23, 2005.
34. Anthony N Burkitt. A review of the integrate-and-fire neuron model: I. Homogeneous synaptic input. Biological Cybernetics, 95(1):1–19, 2006.
35. Prashant Joshi. From memory-based decisions to decision-based movements: A model of interval discrimination followed by action selection. Neural Networks, 20(3):298–311, 2007.
36. Shan Yu, Debin Huang, Wolf Singer, and Danko Nikolić. A small world of neuronal synchrony. Cerebral Cortex, 18(12):2891–2901, 2008.
37. Stanley Milgram. The small world problem. Psychology Today, 2(1):60–67, 1967.
38. Bruce J Gluckman, Theoden I Netoff, Emily J Neel, William L Ditto, Mark L Spano, and Steven J Schiff. Stochastic resonance in a neuronal network from mammalian brain. Physical Review Letters, 77(19):4098, 1996.
39. Clayton Haldeman and John M Beggs. Critical branching captures activity in living neural networks and maximizes the number of metastable states. Physical Review Letters, 94(5):058101, 2005.
40. Osame Kinouchi and Mauro Copelli. Optimal dynamical range of excitable networks at criticality. Nature Physics, 2(5):348–351, 2006.
41. Donald B Percival and Andrew T Walden. Spectral Analysis for Physical Applications. Cambridge University Press, 1993.
42. Leandro M Alonso, Alex Proekt, Theodore H Schwartz, Kane O Pryor, Guillermo A Cecchi, and Marcelo O Magnasco. Dynamical criticality during induction of anesthesia in human ECoG recordings. Frontiers in Neural Circuits, 8, 2014.
43. Arnold Neumaier and Tapio Schneider. Estimation of parameters and eigenmodes of multivariate autoregressive models. ACM Transactions on Mathematical Software (TOMS), 27(1):27–57, 2001.
44. Lars Onsager. Crystal statistics. I. A two-dimensional model with an order-disorder transition. Physical Review, 65(3-4):117, 1944.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50823
dc.description.abstract: Liquid state machine (LSM)藉由模擬神經網路來達成機器學習中的分類工作。為了增進LSM的工作效率,我們針對LSM中的非線性性質進行進一步的研究。首先,我們研究一種被稱為隨機共振的非線性現象。雖然一般而言雜訊對於一個機器的效能是破壞性的,但在一些非線性系統內,卻可藉由加入雜訊來增加訊噪比。我們首先在LSM中觀察到隨機共振,並且觀察到一個現象:對同樣的一組資料,LSM在有雜訊的環境下進行學習的穩定性反而更高,這顯示雜訊的存在不但可以幫助訊號傳遞,還可以幫助神經網路進行學習。第二,我們研究LSM中的自組織臨界現象。許多人相信神經網路在臨界狀態中可以有更好的工作效率,因此研究臨界現象對於LSM效率的提升是重要的。與前人多使用power-law分布來決定臨界狀態不同,我們使用自回歸模型研究神經間的相關性來決定臨界狀態。我們發現在自回歸模型的特徵值最接近1的時候,LSM有著最佳的工作效率,說明可以使用這個方法來判斷神經網路的臨界特性。 [zh_TW]
dc.description.abstract: Liquid state machine (LSM) is an artificial neural network that performs classification tasks by simulating spiking neurons. To improve the performance of the LSM, we analyze the nonlinear phenomena behind it. Two topics are studied. First, we examine stochastic resonance, a nonlinear effect describing the influence of noise. While noise is unwelcome in most systems, in a nonlinear system it can enhance performance. We observe that stochastic resonance can occur in an LSM, and show that the presence of noise can also help a neural network to learn. Second, we study critical phenomena in the LSM. Since many believe a neural network performs best in the critical state, determining whether our LSM is at criticality is an important issue. Criticality is usually identified with statistical signatures such as power-law distributions; instead, we use an auto-regressive model, which estimates the dynamical correlations between neurons, to locate the critical state. We show that the LSM performs best when the eigenvalues of the auto-regressive model are closest to 1. [en]
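The stochastic-resonance effect described in the abstract, where an intermediate amount of noise maximizes signal transmission through a nonlinear element, can be illustrated with a minimal threshold-detector toy model. This is a generic sketch only, not the thesis's spiking LSM; the threshold, signal amplitude, and noise levels are arbitrary illustrative choices.

```python
import numpy as np

def threshold_detector_response(noise_std, seed=0):
    """Correlation between a subthreshold periodic signal and the output
    of a simple threshold unit, as a function of the noise level.

    Toy model of stochastic resonance (illustrative parameters only).
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 50.0, 5000)
    # Subthreshold signal: peak amplitude 0.8 never crosses threshold 1.0 alone.
    signal = 0.8 * np.sin(2.0 * np.pi * 0.2 * t)
    noisy = signal + rng.normal(0.0, noise_std, t.size)
    output = (noisy > 1.0).astype(float)  # binary threshold crossings
    if output.std() == 0.0:               # no crossings at all -> no response
        return 0.0
    return float(np.corrcoef(signal, output)[0, 1])

# Sweep the noise level: too little noise gives no crossings, too much
# decorrelates the crossings from the signal; an intermediate level
# maximizes the signal-output correlation (the stochastic-resonance peak).
levels = [0.01, 0.3, 3.0]
responses = [threshold_detector_response(s) for s in levels]
```

The non-monotonic dependence of `responses` on the noise level is the qualitative signature the thesis looks for in the LSM's classification performance.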
dc.description.provenance: Made available in DSpace on 2021-06-15T13:00:30Z (GMT). No. of bitstreams: 1. ntu-105-R03222046-1.pdf: 23140224 bytes, checksum: 44f558bed2b666c360fce25ae0f53a96 (MD5). Previous issue date: 2016. [en]
dc.description.tableofcontents:
Chapter 1 Introduction 3
1.1 Neural Network 3
1.2 Artificial Neural Network 3
1.3 Liquid State Machine 5
1.4 Stochastic Resonance 7
1.5 Critical Phenomenon 8
Chapter 2 Method 11
2.1 Liquid State Machine 11
2.1.1 Input Layer 11
2.1.2 Liquid Layer 12
2.1.3 Readout Layer 13
2.2 Models of Neuron and Synapse 14
2.2.1 Neuron 14
2.2.2 Leaky Integrate-and-Fire Model 15
2.2.3 Tsodyks-Markram Model (TM Model) 16
2.3 Connecting Topology 17
Chapter 3 Stochastic Resonance 21
3.1 Introduction to Stochastic Resonance 21
3.2 Simulation Setup 22
3.3 Results 24
3.3.1 Noise on Membrane Potential 24
3.3.2 Adding Noise during Learning 26
3.3.3 Noise on Synapses 28
3.4 Summary 30
Chapter 4 Auto-Regressive Model 33
4.1 Introduction 33
4.2 How the AR Model Works 33
4.3 Simulation Setup 35
4.3.1 Ising Model 35
4.3.2 Simulation Setup for LSM 39
4.4 Results 41
4.4.1 AR Model on Liquid State Machine 42
Chapter 5 Conclusion and Discussion 51
Appendix A 55
A.1 Parameters for LSM 55
A.2 Setup of LSM 58
A.2.1 Input Pattern Generation 58
A.2.2 Small World Network Generation 58
A.2.3 Neuron Dynamics 58
Reference 61
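The auto-regressive criticality check outlined in Chapter 4 of the table of contents can be sketched roughly as follows. This is a minimal one-lag least-squares fit on synthetic data, not the thesis's actual setup (which applies the Neumaier-Schneider estimator of ref. 43 to LSM activity); the matrix `A_true` and noise level below are invented purely for illustration.

```python
import numpy as np

def fit_ar1(x):
    """Least-squares fit of a first-order vector auto-regressive model
    x[t+1] = A @ x[t] + noise, returning the coefficient matrix A.

    Minimal sketch of the idea; real analyses use more careful
    estimators (e.g. the Neumaier-Schneider ARfit algorithm).
    """
    X, Y = x[:-1], x[1:]                       # predictors and one-step targets
    # Row convention: Y ≈ X @ A.T, so lstsq recovers A.T.
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B.T

rng = np.random.default_rng(1)
# Ground-truth linear dynamics whose leading eigenvalue 0.95 is close
# to the critical value 1 (hypothetical example data).
A_true = np.diag([0.95, 0.5, 0.2])
x = np.zeros((5000, 3))
for t in range(4999):
    x[t + 1] = A_true @ x[t] + rng.normal(0.0, 0.1, 3)

A_hat = fit_ar1(x)
# Criticality indicator: how close the largest eigenvalue magnitude of
# the fitted AR matrix is to 1 (closer to 1 = closer to criticality).
lam_max = float(np.max(np.abs(np.linalg.eigvals(A_hat))))
```

The same indicator, computed on the LSM's neuron activity instead of synthetic data, is what the abstract reports as peaking in performance when closest to 1.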
dc.language.iso: en
dc.subject: 機器學習 [zh_TW]
dc.subject: 類神經網路 [zh_TW]
dc.subject: 隨機共振 [zh_TW]
dc.subject: 臨界現象 [zh_TW]
dc.subject: 自回歸模型 [zh_TW]
dc.subject: Machine learning [en]
dc.subject: Artificial neural network [en]
dc.subject: Stochastic resonance [en]
dc.subject: Critical phenomenon [en]
dc.subject: Auto-regressive model [en]
dc.title: 液體狀態機中的非線性現象:隨機共振與臨界現象之研究 [zh_TW]
dc.title: Nonlinear phenomenon in liquid state machine: stochastic resonance and critical phenomenon [en]
dc.type: Thesis
dc.date.schoolyear: 104-2
dc.description.degree: 碩士
dc.contributor.coadvisor: 陳志強 (Chi-Keung Chen)
dc.contributor.oralexamcommittee: 陳俊仲 (Chun-Chung Chen)
dc.subject.keyword: 機器學習, 類神經網路, 隨機共振, 臨界現象, 自回歸模型 [zh_TW]
dc.subject.keyword: Machine learning, Artificial neural network, Stochastic resonance, Critical phenomenon, Auto-regressive model [en]
dc.relation.page: 65
dc.identifier.doi: 10.6342/NTU201600747
dc.rights.note: 有償授權
dc.date.accepted: 2016-07-12
dc.contributor.author-college: 理學院 [zh_TW]
dc.contributor.author-dept: 物理學研究所 [zh_TW]
Appears in collections: 物理學系 (Department of Physics)

Files in this item:
ntu-105-1.pdf — 22.6 MB, Adobe PDF (not authorized for public access)

