Title
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
Abstract
In this paper, we analyze two popular network compression techniques, i.e., filter pruning and low-rank decomposition, from a unified perspective. By simply changing the way the sparsity regularization is enforced, either filter pruning or low-rank decomposition can be derived. This provides a flexible choice for network compression because the two techniques complement each other. For example, in popular network architectures with shortcut connections (e.g., ResNet), filter pruning cannot deal with the last convolutional layer in a ResBlock, while low-rank decomposition methods can. In addition, we propose to compress the whole network jointly instead of in a layer-wise manner. Our approach proves its potential as it compares favorably to the state of the art on several benchmarks.
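To make the abstract's central claim concrete, here is a minimal PyTorch sketch (not the authors' released code) of how the choice of sparsity groups can steer compression toward either filter pruning or low-rank decomposition. It assumes an auxiliary 1x1 matrix A appended after a convolutional layer W; the group-Lasso penalty, the manual zeroing that stands in for regularized training, and all tensor shapes are illustrative assumptions.

```python
import torch

torch.manual_seed(0)
n_out, n_in, k = 16, 8, 3
W = torch.randn(n_out, n_in, k, k)          # original conv filters

def group_lasso(A, dim):
    # group-Lasso regularizer: sum of L2 norms over the chosen groups of A
    return A.norm(dim=dim).sum()

# Case 1: sparsity on the columns of A -> filter pruning of the merged layer.
A_col = torch.randn(n_out, n_out)
with torch.no_grad():
    A_col[:, 10:] = 0                       # pretend training zeroed 6 columns
# merge A into W: merged filter j = sum_i A[i, j] * W[i]
W_merged = torch.einsum('ij,ichw->jchw', A_col, W)
keep = A_col.norm(dim=0) > 0
W_pruned = W_merged[keep]                   # (10, n_in, k, k): fewer filters

# Case 2: sparsity on the rows of A -> low-rank decomposition into two layers.
A_row = torch.randn(n_out, n_out)
with torch.no_grad():
    A_row[10:, :] = 0                       # pretend training zeroed 6 rows
keep = A_row.norm(dim=1) > 0
W_low = W[keep]                             # (10, n_in, k, k): k x k conv of rank 10
A_low = A_row[keep]                         # (10, n_out): following 1x1 conv

print(group_lasso(A_col, 0).item(), group_lasso(A_row, 1).item())
print(W_pruned.shape, W_low.shape, A_low.shape)
```

In both cases the compression ratio is read off from how many groups the regularizer drives to zero, which is why switching the grouping alone is enough to switch between the two compression styles.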
Year
2020
DOI
10.1109/CVPR42600.2020.00804
Venue
CVPR
DocType
Conference
Citations
3
PageRank
0.39
References
37
Authors
5
Name              Order  Citations  PageRank
Yawei Li          1      31         5.58
Shuhang Gu        2      701        28.25
Christoph Mayer   3      51         4.53
Luc Van Gool      4      27566      1819.51
Radu Timofte      5      1880       118.45