NTU Theses and Dissertations Repository

Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88375
Full metadata record
dc.contributor.advisor: 王勝德 (zh_TW)
dc.contributor.advisor: Sheng-De Wang (en)
dc.contributor.author: 楊仁傑 (zh_TW)
dc.contributor.author: JEN-CHIEH YANG (en)
dc.date.accessioned: 2023-08-09T16:47:05Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-09
dc.date.issued: 2023
dc.date.submitted: 2023-07-25
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88375
dc.description.abstract (zh_TW, translated): This study proposes a new method named NSGAP, an automated pruning approach based on the Non-Dominated Sorting Genetic Algorithm. Compared with existing methods, NSGAP is competitive in compression rate, accuracy, and search time. A notable feature of NSGAP is that it produces a Pareto front as its search result, avoiding the need to run multiple searches to obtain architectures at different compression rates.

To strengthen the search process, this study also introduces an Asymmetric Gaussian Distribution method, applied to the genetic algorithm's initial population, which effectively improves the search results. Overall, NSGAP is competitive and efficient in the field of model pruning and compression. Its contributions lie in its multi-objective optimization capability, the generation of the Pareto front, and the introduction of the Asymmetric Gaussian Distribution, achieving strong performance in compression rate, accuracy, and search time.
dc.description.abstract (en): This study proposes NSGAP, an architecture for automatic pruning compression that incorporates the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). Compared with conventional methods, NSGAP is competitive in compression rate, accuracy, and search duration. A notable feature of NSGAP is its ability to generate a Pareto front as the search result, eliminating the need for multiple searches to obtain architectures with different compression rates.

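As a sketch of the Pareto-front selection the abstract describes, the snippet below extracts the non-dominated set from a population of candidate pruned architectures scored on two objectives, accuracy and compression rate (both maximized here). This is an illustration only, not the thesis's implementation; the Candidate fields and example values are assumptions.

```python
# Minimal sketch (not the thesis's code): extract the first Pareto
# front from candidate pruned architectures. Each candidate is scored
# on the two objectives named in the abstract: accuracy and
# compression rate, both to be maximized. The Candidate fields and
# the example numbers below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    accuracy: float     # validation accuracy of the pruned model
    compression: float  # fraction of parameters/FLOPs removed

def dominates(a: Candidate, b: Candidate) -> bool:
    """a dominates b if a is no worse on both objectives and strictly
    better on at least one."""
    return (a.accuracy >= b.accuracy
            and a.compression >= b.compression
            and (a.accuracy > b.accuracy or a.compression > b.compression))

def pareto_front(population: list[Candidate]) -> list[Candidate]:
    """Keep every candidate that no other candidate dominates."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

if __name__ == "__main__":
    population = [
        Candidate(accuracy=0.93, compression=0.40),
        Candidate(accuracy=0.91, compression=0.60),
        Candidate(accuracy=0.90, compression=0.55),  # dominated
        Candidate(accuracy=0.88, compression=0.75),
    ]
    print(pareto_front(population))  # three non-dominated candidates
```

In NSGA-II this non-dominated filter is applied repeatedly to rank the whole population into successive fronts; the sketch above recovers only the first front, which is the set of trade-off architectures the abstract refers to.
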
To optimize the search procedure, this study incorporates the Asymmetric Gaussian Distribution (AGD) strategy. Applying AGD to the genetic algorithm's initial population improves the search outcomes. In conclusion, NSGAP demonstrates strong competitiveness and efficiency in the field of model pruning and compression. Its principal contributions are its multi-objective optimization capability, the generation of the Pareto front, and the integration of AGD, which together yield superior performance in compression rate, accuracy, and search duration.
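To make the AGD initialization concrete, here is a minimal sketch under stated assumptions: each individual encodes one pruning ratio per prunable layer, and each ratio is drawn from a Gaussian whose left and right sides of the mode use different standard deviations. The mode and sigma values are hypothetical, not taken from the thesis.

```python
# Minimal sketch of Asymmetric Gaussian Distribution (AGD)
# initialization, under stated assumptions: each individual is one
# pruning ratio per prunable layer, and each ratio is drawn from a
# Gaussian whose spread differs on the two sides of the mode. The
# mode and sigma values are hypothetical, not taken from the thesis.
import random

def agd_sample(mode: float, sigma_left: float, sigma_right: float) -> float:
    """Draw one value from an asymmetric Gaussian: choose a side with
    probability proportional to that side's sigma (this keeps the
    density continuous at the mode), then sample a half-Gaussian."""
    if random.random() < sigma_left / (sigma_left + sigma_right):
        x = mode - abs(random.gauss(0.0, sigma_left))
    else:
        x = mode + abs(random.gauss(0.0, sigma_right))
    return min(max(x, 0.0), 1.0)  # clamp to a valid pruning ratio

def init_population(pop_size: int, n_layers: int) -> list[list[float]]:
    """One pruning ratio per prunable layer for each individual."""
    return [[agd_sample(mode=0.5, sigma_left=0.15, sigma_right=0.30)
             for _ in range(n_layers)]
            for _ in range(pop_size)]

if __name__ == "__main__":
    for individual in init_population(pop_size=4, n_layers=6):
        print([round(r, 2) for r in individual])
```

Weighting each side by its own sigma keeps the density continuous at the mode, and a larger right-hand sigma, as in this toy example, biases the initial population toward higher pruning ratios.
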
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-09T16:47:05Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2023-08-09T16:47:05Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Verification Letter i
Acknowledgements ii
摘要 (Chinese Abstract) iii
Abstract iv
Contents v
List of Figures viii
List of Tables ix
Chapter 1 Introduction 1
1.1 Problems 1
1.2 Contribution 2
Chapter 2 Related Works 3
2.1 Approach of Pruning 3
2.1.1 Soft Filter Pruning (SFP) 3
2.1.2 Filter Pruning via Geometric Median (FPGM) 3
2.1.3 Cluster Pruning (CUP) 4
2.1.4 Channel Pruning via Automatic Structure Search (ABCPruner) 4
2.2 Filter Pruning 5
2.3 Non-Dominated Sorting Genetic Algorithm II 6
2.3.1 Non-Dominated Sorting 7
2.3.2 Crowding Distance 10
Chapter 3 Approach 12
3.1 Problem Definition 12
3.1.1 Traditional Definition 13
3.1.2 Multi-Objective Problem Definition 14
3.2 Initialization 15
3.2.1 Random Initialization 15
3.2.2 AGD: Asymmetric Gaussian Distribution Initialization 16
3.3 Genetic Algorithm 19
3.3.1 Continuous Crossover 19
3.3.2 Random Mutation 20
Chapter 4 Experiments 23
4.1 Experiment Overview 23
4.2 Environment 24
4.3 Baseline Models 25
4.3.1 ResNet56/ResNet110 25
4.3.2 ResNet18 25
4.4 Dataset 27
4.4.1 CIFAR-10 28
4.4.2 Tiny-ImageNet 28
4.5 Ablation Experiment 30
4.5.1 Consider the Initialization 30
4.5.2 Consider the Balance of Parameters and FLOPs 31
4.6 Results and Discussion 33
4.6.1 Results for the Different Baseline Models 33
4.6.2 Results for the Same Baseline Models 34
4.6.3 Time Consumption of the Same Baseline Models 38
Chapter 5 Conclusion 40
References 41
Appendix A — Experiment Results 44
dc.language.iso: en
dc.subject: Filter Pruning (zh_TW)
dc.subject: Asymmetric Gaussian Distribution (zh_TW)
dc.subject: Non-Dominated Sorting (zh_TW)
dc.subject: Non-Dominated Sorting Genetic Algorithm II (zh_TW)
dc.subject: Pareto Front (zh_TW)
dc.subject: Crowding Distance (zh_TW)
dc.subject: Pareto Fronts (en)
dc.subject: Crowding Distance (en)
dc.subject: Asymmetric Gaussian Distribution (en)
dc.subject: Non-Dominated Sorting (en)
dc.subject: Non-Dominated Sorting Genetic Algorithm II (en)
dc.subject: Filter Pruning (en)
dc.title: 基於非支配排序遺傳算法的深度學習模型剪枝 (Deep Learning Model Pruning Based on the Non-Dominated Sorting Genetic Algorithm) (zh_TW)
dc.title: NSGAP: Filter Pruning for Deep Learning Models using NSGA-II (en)
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 于天立;鄧惟中 (zh_TW)
dc.contributor.oralexamcommittee: Tian-Li Yu; Wei-Chung Teng (en)
dc.subject.keyword: Non-Dominated Sorting Genetic Algorithm II, Pareto Fronts, Non-Dominated Sorting, Crowding Distance, Asymmetric Gaussian Distribution, Filter Pruning (zh_TW)
dc.subject.keyword: Non-Dominated Sorting Genetic Algorithm II, Pareto Fronts, Non-Dominated Sorting, Crowding Distance, Asymmetric Gaussian Distribution, Filter Pruning (en)
dc.relation.page: 47
dc.identifier.doi: 10.6342/NTU202301925
dc.rights.note: Authorized for release (access restricted to campus network)
dc.date.accepted: 2023-07-27
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept: 電機工程學系 (Department of Electrical Engineering)
Appears in Collections: 電機工程學系 (Department of Electrical Engineering)

Files in This Item:
File: ntu-111-2.pdf (access restricted to NTU campus IPs; off-campus users should connect via the VPN service)
Size: 3.66 MB
Format: Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
