
DSpace

The DSpace institutional repository system is dedicated to preserving digital materials of all kinds (e.g., text, images, PDF) and making them easy to access.

Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89105
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 王勝德 (zh_TW)
dc.contributor.advisor: Sheng-De Wang (en)
dc.contributor.author: 謝長軒 (zh_TW)
dc.contributor.author: Chang-Hsuan Hsieh (en)
dc.date.accessioned: 2023-08-16T17:09:19Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-16
dc.date.issued: 2023
dc.date.submitted: 2023-08-08
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89105
dc.description.abstract: 近年來,模型壓縮技術已成為將大型模型部署於資源受限系統中的重要步驟。本研究提出了一種基於線性區間分析的混合式濾波器剪枝方法,通過設置距離閥值和範數閥值,將叢集式剪枝法和範數剪枝法的特性結合起來。此外,利用神經網路架構搜尋領域中的線性區間分析法,可以快速判斷模型架構的鑑別度特性。透過以上設計,混合式濾波器剪枝法可以在短時間內搜尋出剪枝後表現最佳的結構和對應閥值,從而達到壓縮模型的效果。使用此方法可以在 CIFAR10 資料集上將 ResNet56 網路之浮點數運算量 (FLOPs) 壓縮至原本的 20%,同時僅有不到2% 的精確度損失,在其他資料集與網路架構上也可以在高壓縮率下達到較低的精確度損失。 (zh_TW)
dc.description.abstract: Model compression techniques have become crucial for deploying large models on resource-limited systems. This study proposes a hybrid filter pruning method based on linear region analysis. Our approach combines the strengths of cluster pruning and norm-based filter pruning by introducing a Euclidean-distance threshold and a norm threshold. Moreover, we incorporate the linear region analysis method from neural architecture search to quickly assess the discriminative capacity of candidate model architectures. This enables us to efficiently search for the optimal pruned structure and the corresponding threshold values, achieving effective model compression. Experimental results on the CIFAR10 dataset with ResNet56 demonstrate that our hybrid pruning method achieves an 80% reduction in FLOPs with less than 2% accuracy loss. Furthermore, our method consistently maintains a low accuracy drop at high compression rates across various datasets and network architectures. (en)
(See the illustrative code sketch after the metadata record below.)
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-16T17:09:19Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2023-08-16T17:09:19Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Verification Letter from the Oral Examination Committee
Acknowledgements
摘要 (Chinese abstract)
Abstract
Contents
List of Figures
List of Tables
Chapter 1 Introduction
Chapter 2 Related Works
  2.1 Channel Pruning
  2.2 Norm-Based Channel Pruning
  2.3 Cluster Pruning
  2.4 NASWOT
Chapter 3 Approach
  3.1 Overview of Hybrid Filter Pruning Method
  3.2 Norm Threshold-Based Channel Pruning
  3.3 Hybrid Pruning Threshold Search
  3.4 Pruning Rate Constraint
Chapter 4 Experiments
  4.1 Experiment Setup
    4.1.1 Dataset
    4.1.2 Baseline and Retraining
  4.2 ResNet on CIFAR10
  4.3 ResNet on CIFAR100
  4.4 ResNet on Tiny ImageNet
  4.5 Reduction in Searching Time
Chapter 5 Ablation Study
  5.1 Different Pruning Rate Constraints
  5.2 The Evaluation Ability of the NASWOT Scoring Method
Chapter 6 Conclusion
References
dc.language.iso: en
dc.subject: 通道剪枝 (zh_TW)
dc.subject: 模型剪枝 (zh_TW)
dc.subject: 深度學習 (zh_TW)
dc.subject: 模型壓縮 (zh_TW)
dc.subject: Network pruning (en)
dc.subject: Deep learning (en)
dc.subject: Network compression (en)
dc.subject: Channel pruning (en)
dc.title: 基於線性區間分析之混合式濾波器剪枝法 (zh_TW)
dc.title: A Hybrid Filter Pruning Method based on Linear Region Analysis (en)
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 鄧惟中;于天立 (zh_TW)
dc.contributor.oralexamcommittee: Wei-Chung Teng;Tian-Li Yu (en)
dc.subject.keyword: 深度學習, 模型壓縮, 模型剪枝, 通道剪枝 (zh_TW)
dc.subject.keyword: Deep learning, Network pruning, Network compression, Channel pruning (en)
dc.relation.page: 32
dc.identifier.doi: 10.6342/NTU202303049
dc.rights.note: 同意授權(限校園內公開) (authorized; access restricted to campus)
dc.date.accepted: 2023-08-09
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept: 電機工程學系 (Department of Electrical Engineering)
Appears in Collections: 電機工程學系 (Department of Electrical Engineering)
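The abstract above gives only a high-level description of the hybrid criterion (a norm threshold for weak filters plus a Euclidean-distance threshold for redundant ones) and of the training-free, NASWOT-style scoring used to rank candidate pruned structures. The following Python/PyTorch sketch illustrates those two ideas under stated assumptions; the function names, threshold handling, and other details are hypothetical and are not taken from the thesis's actual implementation.

import torch
import torch.nn as nn

def select_filters_to_prune(conv_weight, norm_threshold, distance_threshold):
    """For one Conv2d weight of shape (out_channels, in_channels, k, k), return a
    boolean mask marking filters to prune under the hybrid criterion (illustrative)."""
    num_filters = conv_weight.shape[0]
    flat = conv_weight.reshape(num_filters, -1)
    norms = torch.linalg.vector_norm(flat, ord=2, dim=1)

    # Norm criterion: filters whose L2 norm falls below the norm threshold are weak.
    prune = norms < norm_threshold

    # Distance (cluster-style) criterion: a filter lying within the distance threshold
    # of a stronger filter is treated as redundant and also marked for pruning.
    dists = torch.cdist(flat, flat, p=2)
    for i in range(num_filters):
        if prune[i]:
            continue
        near_stronger = (dists[i] < distance_threshold) & (norms > norms[i])
        near_stronger[i] = False
        if near_stronger.any():
            prune[i] = True
    return prune

@torch.no_grad()
def naswot_score(model, batch):
    """Training-free NASWOT-style score (Mellor et al.): log-determinant of the
    ReLU activation-pattern kernel over one mini-batch; a higher score suggests the
    network separates the inputs into more distinct linear regions."""
    codes = []

    def hook(_module, _inputs, output):
        # Record the binary activation pattern (sign of each ReLU unit) per input.
        codes.append((output.detach() > 0).reshape(output.shape[0], -1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    model.eval()
    model(batch)
    for h in handles:
        h.remove()

    c = torch.cat(codes, dim=1)                    # binary codes: (batch, #ReLU units)
    k = c @ c.t() + (1.0 - c) @ (1.0 - c).t()      # pairwise count of matching signs
    return torch.linalg.slogdet(k).logabsdet.item()

In a search of the kind the abstract describes, one would sweep pairs of (norm_threshold, distance_threshold) subject to a FLOPs or pruning-rate constraint, build the corresponding pruned sub-network, score it with naswot_score on a single mini-batch, and keep the highest-scoring structure for retraining.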

Files in this item:
File: ntu-111-2.pdf (1.8 MB, Adobe PDF)
Access restricted to NTU campus IP addresses (off-campus users should connect through the NTU VPN service)


Except where otherwise noted, all items in this repository are protected by copyright, with all rights reserved.
