NTU Theses and Dissertations Repository › 電機資訊學院 (College of Electrical Engineering and Computer Science) › 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia)
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93313
Title: 深度神經網路行列剪枝的近似算法和模擬退火啟發式算法
Approximation Algorithms and Simulated Annealing Heuristics for Row-and-Column Pruning of Deep Neural Networks
Authors: 杜秉翰
Ping-Han Tu
Advisor: 劉邦鋒
Pangfeng Liu
Keyword: 深度學習,參數剪枝,行列剪枝,NP 完備,近似演算法,模擬退火
Deep learning,Parameter pruning,Row-and-column pruning,NP-completeness,Approximation algorithms,Simulated annealing
Publication Year: 2024
Degree: 碩士 (Master's)
Abstract: 卷積神經網路(CNN)在計算機視覺和其他科學領域取得了巨大成功。儘管取得了成就,但最先進的 CNN 模型已經變得龐大,需要大量的計算能力和記憶體資源。作為一種模型壓縮技術,參數剪枝可以降低 CNN 模型所需的計算和記憶體。之前的工作提出了一種參數剪枝方法,通過消除權重矩陣的行和列來壓縮神經網路。移除行和列可以保留權重矩陣的密集結構,而不是非結構化剪枝產生的稀疏結構。儘管行和列剪枝有效地減小了模型的大小,但在不犧牲準確性的前提下,使模型盡可能地小巧仍然是一個挑戰。我們證明了行和列剪枝問題是一個 NP 完全問題,並提出了兩個解為最佳解兩倍以內的近似演算法。我們還表明,這兩種算法的近似比無法更低。最後,我們提出了兩種基於模擬退火的方案來解決行和列剪枝問題。這兩種提出的方案在使用 Food-101 資料集的 ResNet-18 模型上,在 93% 的稀疏度下分別將準確度提高了 2.7% 和 1.6%。
Convolutional neural networks (CNNs) have achieved immense success in computer vision and other fields of science. Despite these achievements, state-of-the-art CNN models have grown to gigantic sizes, demanding substantial computing power and memory resources. As a model compression technique, parameter pruning can lower the computation and memory a CNN model needs. Previous work has proposed a parameter pruning method that compresses neural networks by eliminating rows and columns of the weight matrices. Removing rows and columns preserves the dense structure of the weight matrix, in contrast to the sparse structure produced by unstructured pruning. While row-and-column pruning effectively reduces the model size, making the model as compact as possible without sacrificing accuracy remains a challenge. We prove that the row-and-column pruning problem is NP-complete, and we propose two approximation algorithms whose solutions are within twice the cost of the optimal solution. We also show that this approximation ratio cannot be improved for these two algorithms. Finally, we propose two schemes based on simulated annealing to solve the row-and-column pruning problem. The two proposed schemes improve accuracy by 2.7% and 1.6% respectively on a ResNet-18 model at 93% sparsity, using the Food-101 dataset.
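The two ideas in the abstract — keeping a dense submatrix by deleting whole rows and columns, and searching over which rows/columns to keep with simulated annealing — can be sketched as follows. This is an illustrative toy only: the magnitude-based selection, the Frobenius-norm cost, and all function names are assumptions for demonstration, not the thesis's approximation algorithms or its annealing schemes.

```python
import numpy as np

def prune_rows_and_columns(W, row_keep, col_keep):
    """Keep the row_keep rows and col_keep columns of W with the largest
    L2 norms; the result is a smaller *dense* matrix, unlike the sparse
    masks produced by unstructured pruning."""
    row_idx = np.sort(np.argsort(np.linalg.norm(W, axis=1))[-row_keep:])
    col_idx = np.sort(np.argsort(np.linalg.norm(W, axis=0))[-col_keep:])
    return W[np.ix_(row_idx, col_idx)], row_idx, col_idx

def anneal_selection(W, row_keep, col_keep, steps=2000, t0=1.0, seed=1):
    """Simulated-annealing search over which rows/columns to keep,
    minimizing the Frobenius norm of the discarded entries (a stand-in
    cost; the thesis optimizes accuracy under a sparsity budget)."""
    rng = np.random.default_rng(seed)
    n, m = W.shape

    def cost(r, c):
        mask = np.zeros(W.shape, dtype=bool)
        mask[np.ix_(r, c)] = True
        return np.linalg.norm(W[~mask])  # magnitude of what pruning throws away

    rows = rng.choice(n, row_keep, replace=False)
    cols = rng.choice(m, col_keep, replace=False)
    cur = cost(rows, cols)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9  # linear cooling schedule
        r2, c2 = rows.copy(), cols.copy()
        if rng.random() < 0.5:  # neighbor move: swap one kept row out...
            r2[rng.integers(row_keep)] = rng.choice(np.setdiff1d(np.arange(n), r2))
        else:                   # ...or one kept column
            c2[rng.integers(col_keep)] = rng.choice(np.setdiff1d(np.arange(m), c2))
        new = cost(r2, c2)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if new < cur or rng.random() < np.exp((cur - new) / t):
            rows, cols, cur = np.sort(r2), np.sort(c2), new
    return rows, cols, cur

W = np.random.default_rng(0).standard_normal((8, 8))
Wp, r, c = prune_rows_and_columns(W, row_keep=4, col_keep=4)  # Wp.shape == (4, 4)
rows, cols, err = anneal_selection(W, row_keep=4, col_keep=4)
```

The annealing move set (swap one kept index for one discarded index) keeps every candidate feasible, so the search never leaves the space of valid row-and-column selections.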
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93313
DOI: 10.6342/NTU202401008
Fulltext Rights: 同意授權(限校園內公開) (authorized for release; access limited to campus)
Appears in Collections: 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia)

Files in This Item:
File: ntu-112-2.pdf (access limited to NTU IP range)
Size: 861.15 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
