NTU Theses and Dissertations Repository › College of Science › Institute of Applied Mathematical Sciences
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88816
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 崔茂培 (zh_TW)
dc.contributor.advisor: Mao-Pei Tsui (en)
dc.contributor.author: 劉環齊 (zh_TW)
dc.contributor.author: Huan-Chi Liu (en)
dc.date.accessioned: 2023-08-15T17:54:17Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-15
dc.date.issued: 2023
dc.date.submitted: 2023-08-08
dc.identifier.citation:
[1] Benjamin Scellier. A deep learning theory for neural networks grounded in physics. ArXiv, abs/2103.09985, 2021.
[2] John Hopfield. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the United States of America, 79:2554–2558, 1982.
[3] Jehoshua Bruck. On the convergence properties of the Hopfield model. Proceedings of the IEEE, 78(10):1579–1585, 1990.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88816
dc.description.abstract: In recent years, artificial intelligence and neural networks have flourished, and related theories and applications continue to appear. In "A deep learning theory for neural networks grounded in physics" [1] by Benjamin Scellier, we found that a neural network's iterative and convergence behavior can be described and quantified in an energy-like form, which led us to take an interest in the Hopfield Network on which that theory is grounded.
In Hopfield's formulation [2], the state of every neuron is binary, so the network as a whole has only finitely many possible states. On this premise, the convergence properties of a network can be determined from an energy function and a corresponding algorithm [3]. Building on this, we use Hebbian learning to construct networks that provably converge to prescribed target patterns. In handwritten-digit or license-plate recognition, for example, we expect a recognition system to settle only on the specified alphanumeric characters or symbols, and not on other, unintended targets. However, the behavior of networks in high dimensions with multiple stored patterns is too complex to treat in full, so we discuss, in several simplified settings, whether additional convergence targets that we do not expect can exist. (en)
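The mechanism the abstract describes can be sketched in a few lines — this is a minimal illustrative example, not code from the thesis, assuming ±1 neuron states, the standard Hebbian rule W = (1/N) Σ_p x_p x_pᵀ with zero diagonal, and asynchronous updates that monotonically decrease the energy E(s) = −½ sᵀWs:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/N) * sum_p x_p x_p^T, with no self-connections."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    """Hopfield energy E(s) = -1/2 s^T W s; non-increasing under async updates."""
    return -0.5 * s @ w @ s

def recall(w, s, max_sweeps=20):
    """Update neurons one at a time until a fixed point (local energy minimum)."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if w[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:      # fixed point reached
            break
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pattern = rng.choice([-1, 1], size=64)       # one stored target pattern
    w = hebbian_weights(pattern[None, :])
    noisy = pattern.copy()
    noisy[:8] *= -1                              # corrupt a few neurons
    print(np.array_equal(recall(w, noisy), pattern))  # → True
```

In this single-pattern case the stored pattern is the unique deep minimum of the energy, so recall from a mildly corrupted state recovers it; the thesis's question is whether, with multiple stored patterns, other unintended minima appear.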
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T17:54:17Z; No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2023-08-15T17:54:17Z (GMT); No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Acknowledgements i
摘要 (Abstract in Chinese) iii
Abstract v
Contents vii
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
  1.1 Introduction to Hopfield Network 1
  1.2 Operation Modes of Network 3
  1.3 Stability of Network 6
  1.4 Energy Function and Algorithm 8
Chapter 2 Applications of Hopfield Network 11
  2.1 Hebbian Learning on Weight Matrix 11
  2.2 Single Pattern Model 13
  2.3 Multiple Patterns Model 14
Chapter 3 Maximizer of the Hopfield Model 17
  3.1 2-Patterns Hopfield Model 17
  3.2 Equidistant Model 22
References 31
dc.language.iso: en
dc.subject: 霍普菲爾德神經網路 (zh_TW)
dc.subject: 赫布型學習 (zh_TW)
dc.subject: Hebbian Learning (en)
dc.subject: Hopfield Neural Network (en)
dc.title: Hopfield 神經網路理論之探討 (zh_TW)
dc.title: A Survey on the Theory of Hopfield Network (en)
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: Master (碩士)
dc.contributor.oralexamcommittee: 楊鈞澔; 林澤佑 (zh_TW)
dc.contributor.oralexamcommittee: Chun-Hao Yang; Tse-Yu Lin (en)
dc.subject.keyword: 霍普菲爾德神經網路, 赫布型學習 (zh_TW)
dc.subject.keyword: Hopfield Neural Network, Hebbian Learning (en)
dc.relation.page: 31
dc.identifier.doi: 10.6342/NTU202302929
dc.rights.note: Not authorized for public access (未授權)
dc.date.accepted: 2023-08-09
dc.contributor.author-college: College of Science (理學院)
dc.contributor.author-dept: Institute of Applied Mathematical Sciences (應用數學科學研究所)
Appears in collections: Institute of Applied Mathematical Sciences

Files in this item:
ntu-111-2.pdf — 414.03 kB, Adobe PDF (restricted; not authorized for public access)


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
