Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72823
Title: A Dimensionality Reduction Layer by Projection in a Convolutional Neural Network
Author: Toshinari Morimoto (森元俊成)
Advisor: 陳素雲
Keywords: Convolutional Neural Network, Dimensionality Reduction, Pooling, Truncated Orthogonal Matrix, Projection, Backpropagation Algorithm
Publication Year: 2019
Degree: Master's
Abstract: In this research, we propose a dimensionality reduction method that takes the place of pooling. A pooling layer is usually placed after a convolutional layer to summarize that layer's output images; at present, max pooling and average pooling are the most widely used methods in CNNs. Our proposed method instead transforms an output image of a convolutional layer into a lower-dimensional image by multiplying it by truncated orthogonal matrices. We regard these truncated orthogonal matrices as trainable parameters of the neural network and derive the derivatives that appear in the backpropagation algorithm. We also verify the feasibility of the proposed method by implementing it as a computer program, and we compare its performance against the pooling methods under conditions kept as similar as possible. In our experiments, the proposed method outperforms the pooling methods.
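The abstract's core operation can be illustrated with a minimal sketch. Assuming (this parameterization is an assumption for illustration, not necessarily the exact one in the thesis) that an H×W feature map X is reduced to h×w via Y = U X Vᵀ, where U (h×H) and V (w×W) are truncated orthogonal matrices with orthonormal rows:

```python
import numpy as np

def truncated_orthogonal(rows, cols, rng):
    # Build a (rows x cols) matrix with orthonormal rows (rows <= cols)
    # from the reduced QR factorization of a random Gaussian matrix.
    q, _ = np.linalg.qr(rng.standard_normal((cols, rows)))
    return q[:, :rows].T

def project(X, U, V):
    # Reduce an H x W feature map to h x w: Y = U X V^T.
    return U @ X @ V.T

rng = np.random.default_rng(0)
H, W, h, w = 28, 28, 14, 14        # hypothetical sizes for illustration
U = truncated_orthogonal(h, H, rng)  # shape (14, 28)
V = truncated_orthogonal(w, W, rng)  # shape (14, 28)
X = rng.standard_normal((H, W))      # stand-in convolutional output
Y = project(X, U, V)                 # shape (14, 14)
```

Under this parameterization, the backpropagation derivatives the abstract refers to would take the form ∂L/∂X = Uᵀ G V, ∂L/∂U = G V Xᵀ, and ∂L/∂V = Gᵀ U X, where G = ∂L/∂Y; the thesis derives the exact expressions for its own formulation.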
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72823
DOI: 10.6342/NTU201901691
Full-Text Authorization: Paid access
Appears in Collections: Institute of Applied Mathematical Sciences (應用數學科學研究所)
Files in This Item:

File | Size | Format
---|---|---
ntu-108-1.pdf (currently not authorized for public access) | 528.39 kB | Adobe PDF
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.