NTU Theses and Dissertations Repository
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/8475
Title: 綜觀模型架構之代表性: 基於濾波器重要性學習之注意力機制通道剪枝
Dominance in an Overall Sight: Attentive Channel Pruning via Filter Importance Learning
Authors: Ting-An Chen
陳庭安
Advisor: 陳銘憲(Ming-Syan Chen)
Co-Advisor: 楊得年(De-Nian Yang)
Keywords: 卷積神經網路模型壓縮, 通道剪枝, 結構重要性學習, 韋伯機率分佈
CNN model compression, Channel pruning, Structure importance learning, Weibull probabilistic distribution
Publication Year: 2020
Degree: Master's
Abstract: 通道剪枝(Channel pruning)為一模型壓縮的方法以加速卷積神經網路 (Convolutional neural networks, CNNs)。然而,近期研究決定結構重要性時並無考慮特徵之冗餘性,以及濾波器於接續數層卷積層之遞減影響。因此本文提出一個兩階段剪枝框架,DOMINOS。在不損失過多準確率下,首先衡量結構重要性以有效移除冗餘模型結構。確立輕量化的模型架構後,DOMINOS 進一步使用注意力機制加以強調富含預測資訊的結構。本文定義 feature representativeness 用以衡量特徵的獨特性,以及定義 domino influences 用以表示濾波器於接續卷積神經網路層逐層衰減的影響,並使用存活分析當中的韋伯機率模型衡量該影響。此外,本文設計了一個 channel-to-block 剪枝方法有效移除模型當中不重要的塊(blocks)以解決連結層輸出特徵之通道數不一致的問題。實驗結果顯示 CNN 模型基於本文所提出的剪枝框架,在不失過多準確率情況下,大幅減少浮點數運算量以及參數量,其效果顯著超出近期相關研究之成果。
Channel pruning is a model compression approach to accelerate convolutional neural networks (CNNs). However, previous approaches determine the importance of structures without considering the redundancy of features or the decaying influence of filters on subsequent layers. In this thesis, we therefore propose a two-stage pruning framework, DOMINOS, which first leverages structure importance to effectively remove redundant structures without a significant accuracy loss. After identifying the lightweight architecture, DOMINOS applies an attention mechanism to emphasize the structures that are informative for prediction. We define feature representativeness to measure the uniqueness of feature patterns, and domino influences to represent the decaying influence of filters on subsequent CNN layers, modeled with Weibull probability distributions from survival analysis. Moreover, we devise a channel-to-block pruning approach that efficiently removes unimportant blocks and thereby resolves the mismatch of channel sizes between the output feature maps of linked layers. Experimental results show that CNN models pruned with the DOMINOS framework achieve substantial reductions in FLOPs and parameters without compromising accuracy, significantly outperforming the state-of-the-art.
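The full thesis text is not part of this record, but the score-then-prune flow described in the abstract can be illustrated with a short sketch. The snippet below is only a rough illustration under assumptions of our own, not the DOMINOS implementation: it uses an L1-norm filter score as a crude stand-in for feature representativeness, a Weibull survival function with arbitrary shape and scale values as the decaying "domino influence" over downstream layers, and a fixed pruning ratio. None of these choices are taken from the thesis, and the helper names (weibull_survival, channel_importance, prune_mask) are hypothetical.

# A minimal, illustrative sketch of importance-based channel pruning.
# NOT the DOMINOS method from the thesis: the L1-norm proxy, the Weibull
# shape/scale values, and the pruning ratio are placeholder assumptions
# chosen only to make the example runnable.

import math
import torch
import torch.nn as nn


def weibull_survival(t: float, scale: float = 2.0, shape: float = 1.5) -> float:
    """Weibull survival function S(t) = exp(-(t/scale)**shape).

    Here it models how a filter's influence decays over the next t layers
    (the "domino influence" idea); scale and shape are assumed values.
    """
    return math.exp(-((t / scale) ** shape))


def channel_importance(convs: list[nn.Conv2d]) -> list[torch.Tensor]:
    """Score each output channel of each conv layer.

    Importance of channel c in layer l = (L1 norm of its filter, a crude
    stand-in for feature representativeness) x (sum of Weibull-decayed
    influence on the remaining layers).
    """
    scores = []
    depth = len(convs)
    for l, conv in enumerate(convs):
        base = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # per output channel
        decay = sum(weibull_survival(t) for t in range(1, depth - l + 1))
        scores.append(base * decay)
    return scores


def prune_mask(score: torch.Tensor, ratio: float = 0.5) -> torch.Tensor:
    """Boolean mask keeping the top-(1 - ratio) channels by score."""
    k = max(1, int(round(score.numel() * (1.0 - ratio))))
    keep = torch.topk(score, k).indices
    mask = torch.zeros_like(score, dtype=torch.bool)
    mask[keep] = True
    return mask


if __name__ == "__main__":
    convs = [nn.Conv2d(3, 16, 3), nn.Conv2d(16, 32, 3)]
    for l, score in enumerate(channel_importance(convs)):
        mask = prune_mask(score, ratio=0.5)
        print(f"layer {l}: keep {int(mask.sum())}/{score.numel()} channels")

Running the script prints how many channels each layer would keep under the assumed scores. In the framework described by the abstract, the importance measures are learned rather than hand-set, and the pruned model is further refined with an attention mechanism and channel-to-block pruning, none of which this sketch attempts to reproduce.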
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/8475
DOI: 10.6342/NTU202001198
Fulltext Rights: Authorization granted (open access worldwide)
Embargo Lift Date: 2025-08-20
Appears in Collections: Data Science Degree Program (資料科學學位學程)

Files in This Item:
File: U0001-2906202022384800.pdf    Size: 1.15 MB    Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
