NTU Theses and Dissertations Repository › College of Science › Institute of Applied Mathematical Sciences
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88816
Title: A Survey on the Theory of Hopfield Network (Hopfield 神經網路理論之探討)
Authors: Huan-Chi Liu (劉環齊)
Advisor: Mao-Pei Tsui (崔茂培)
Keywords: Hopfield Neural Network, Hebbian Learning (霍普菲爾德神經網路, 赫布型學習)
Publication Year: 2023
Degree: Master's
Abstract: In recent years, work on artificial intelligence and neural networks has flourished, and related theories and applications keep entering public view. In "A deep learning theory for neural networks grounded in physics" [1] by Benjamin Scellier, we found that the iterative and convergence behavior of a neural network can be described and quantified in an energy-like form, which led us to become interested in the Hopfield Network on which that theory is based.
In Hopfield's formulation [2], the state of every neuron is binary, so the number of possible states of the network is finite. On this premise, we can determine the convergence properties of a neural network through an energy function and its corresponding algorithm [3]. Building on this, we construct the network with Hebbian learning so that it provably converges to targets given in advance. Taking handwritten-digit or license-plate recognition as an example, we expect a recognition system to settle only on the specified alphanumeric characters or symbols, and not on other, unintended targets. However, the behavior of high-dimensional networks with multiple targets is too complex, so we discuss, in some simplified settings, whether other convergence targets that we do not expect can exist.
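To make the ideas summarized in the abstract concrete, the following is a minimal sketch (not code from the thesis) of a binary Hopfield network: Hebbian weights storing a target pattern, the energy function that never increases under asynchronous updates, and recall from a corrupted probe. All names and the example pattern are our own illustrative choices.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/N) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E(s) = -1/2 s^T W s; asynchronous updates never increase it."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=20):
    """Update neurons asynchronously until a fixed point (a local energy minimum)."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Store one target pattern and recover it from a corrupted probe.
target = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebbian_weights(target[None, :])
probe = target.copy()
probe[:2] *= -1                  # flip two bits
print(recall(W, probe))          # converges back to the stored target
print(recall(W, -target))        # the mirror state -target is also a fixed point
```

The second call illustrates the phenomenon the abstract raises: even with a single stored pattern, the mirror state is an unintended fixed point, and with multiple high-dimensional targets further spurious minima can appear.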
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88816
DOI: 10.6342/NTU202302929
Fulltext Rights: Not authorized
Appears in Collections: Institute of Applied Mathematical Sciences

Files in This Item:
ntu-111-2.pdf — 414.03 kB, Adobe PDF (Restricted Access)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
