Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101347

| Title: | Structural Pruning without Losing Model Weights (在結構剪枝中保留模型權重的方法) |
|---|---|
| Author: | Kuei-Hung Tseng (曾貴鴻) |
| Advisor: | Chi-Sheng Shih (施吉昇) |
| Keywords: | Structural Pruning, Weight Compression, Weight Redistribution, One-shot Pruning, Fine-tuning, Efficient Model Compression |
| Publication Year: | 2026 |
| Degree: | Master's |
| Abstract: | Deep neural networks have shown promising results across a wide range of applications, but their space and computation requirements pose significant challenges for deployment on resource-constrained devices. Model pruning addresses this issue by removing less important weights, typically followed by fine-tuning to recover performance. While prior work has shown that updating weights after pruning can improve results, these approaches are often tied to specific pruning criteria, limiting their generality. This work proposes WeightCompression, an effective and flexible method that redistributes pruned weights into the remaining ones within each layer before fine-tuning. This redistribution provides a better initialization for fine-tuning, enabling the proposed method to match the performance of iterative pruning approaches using only a single pruning step. Compared to existing methods, WeightCompression accelerates the pruning process by 3.27x to 12.58x while maintaining competitive accuracy. Moreover, it is agnostic to the choice of pruning criterion and scales well to large models. The results suggest that WeightCompression is a flexible and efficient framework for post-pruning weight updates. |
| URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101347 |
| DOI: | 10.6342/NTU202600167 |
| Full-Text License: | Authorized (campus access only) |
| Electronic Full-Text Release Date: | 2028-01-20 |
| Appears in Collections: | Department of Computer Science and Information Engineering |
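The abstract describes redistributing the weights of pruned structures into the remaining weights of each layer before fine-tuning. As an illustration only, the sketch below shows one plausible form such a redistribution could take for a single fully connected layer: channels are removed by L1 magnitude, and the removed weight mass is folded into the kept rows in proportion to each kept weight's magnitude. Both the pruning criterion and the proportional rule (and the `prune_and_redistribute` helper itself) are assumptions for illustration, not the formulation published in the thesis.

```python
import numpy as np

def prune_and_redistribute(W, keep_ratio=0.5):
    """One-shot structured pruning of a weight matrix W (out_channels x in_features).

    Hypothetical sketch: prune output channels by L1 magnitude, then
    redistribute the pruned rows' column-wise mass into the kept rows,
    proportionally to each kept weight's magnitude.
    """
    norms = np.abs(W).sum(axis=1)                 # per-channel L1 importance
    n_keep = max(1, int(round(keep_ratio * W.shape[0])))
    keep = np.sort(np.argsort(norms)[-n_keep:])   # channels to keep
    prune = np.setdiff1d(np.arange(W.shape[0]), keep)

    W_kept = W[keep].copy()
    pruned_mass = W[prune].sum(axis=0)            # mass removed per input feature

    # Proportional redistribution rule (assumption): each kept weight absorbs
    # a share of the pruned mass proportional to its own magnitude.
    mag = np.abs(W_kept)
    share = mag / np.maximum(mag.sum(axis=0, keepdims=True), 1e-12)
    W_kept += share * pruned_mass
    return W_kept, keep
```

A side effect of this particular rule is that each column's total weight mass is preserved (`W_kept.sum(axis=0)` equals `W.sum(axis=0)`), so the layer's aggregate response to a constant input is unchanged; the actual thesis rule may preserve a different quantity.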
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-114-1.pdf (restricted access) | 13.9 MB | Adobe PDF | View/Open |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
