A Neural Network Pruning Strategy Based on an Optimization Algorithm


Date

2021

Abstract

With the continuous progress of deep learning, neural network architectures have come to require far more parameters and memory than before, and correspondingly place higher demands on hardware. Achieving comparable recognition accuracy under limited memory and hardware budgets has therefore become a critical issue. Network pruning is the most direct way to address the problem of excessive parameters: by removing unnecessary parameters from a network, a large amount of memory can be saved. Existing pruning methods mainly prune the small weights, on the assumption that smaller weights have less impact on the network and can safely be discarded. We argue that this assumption does not always hold for neural networks. In this thesis, we posit that small weights can also be important. We propose an optimization-based pruning strategy that retains not only the larger weights but also smaller weights selected by the optimization; that is, we treat the retention of parameters in a network model as an optimization problem. Experimental results show that keeping important small weights is beneficial to the accuracy of the pruned network: at the same pruning ratio, the network achieves higher accuracy than when only larger weights are kept, yielding a better trade-off between low parameter count and high accuracy.
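The contrast between baseline magnitude pruning and the idea of retaining selected small weights can be sketched in plain Python. This is a minimal illustration under stated assumptions, not the authors' actual algorithm: the `keep` index set stands in for the small weights that the thesis's optimization strategy would select.

```python
def magnitude_prune(weights, prune_ratio, keep=()):
    """Baseline magnitude pruning: zero out the smallest-|w| entries.

    `keep` holds indices of small weights to retain anyway. In the thesis
    these would be chosen by an optimization strategy; here the caller
    supplies them directly (a hypothetical stand-in).
    """
    n_prune = int(len(weights) * prune_ratio)
    # Indices sorted by ascending magnitude: prune candidates first.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    removed = 0
    for i in order:
        if removed >= n_prune:
            break
        if i in keep:
            continue  # a small but "important" weight survives pruning
        pruned[i] = 0.0
        removed += 1
    return pruned

w = [0.1, -0.5, 0.05, 2.0, 0.2]
print(magnitude_prune(w, 0.4))            # plain magnitude pruning
print(magnitude_prune(w, 0.4, keep={2}))  # index 2 (0.05) is retained
```

With `keep={2}`, the smallest weight survives and the pruning budget falls on the next-smallest candidates instead, at the same overall pruning ratio.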

Keywords

Deep learning, neural network, network pruning
