Title
Revisiting Random Channel Pruning for Neural Network Compression
Abstract
Channel (or 3D filter) pruning serves as an effective way to accelerate the inference of neural networks. There has been a flurry of algorithms that try to solve this practical problem, each claimed to be effective in some way. Yet, a benchmark for comparing those algorithms directly is lacking, mainly due to the complexity of the algorithms and custom settings such as particular network configurations or training procedures. A fair benchmark is important for the further development of channel pruning. Meanwhile, recent investigations reveal that the channel configurations discovered by pruning algorithms are at least as important as the pre-trained weights. This gives channel pruning a new role, namely searching for the optimal channel configuration. In this paper, we try to determine the channel configuration of the pruned models by random search. The proposed approach provides a new way to compare different methods, namely how well they behave compared with random pruning. We show that this simple strategy works quite well compared with other channel pruning methods. We also show that, under this setting, there are surprisingly no clear winners among different channel importance evaluation methods, which may tilt research efforts toward advanced channel configuration search methods. Code will be released at https://github.com/ofsoundof/random_channel_pruning.
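To make the idea in the abstract concrete, below is a minimal sketch of random channel pruning in PyTorch: sample per-layer channel counts at random, keep a random subset of filters in each layer, and retain the best candidate found by the search. All names here (TinyCNN, sample_config, prune_random) are hypothetical and for illustration only; the authors' actual implementation is in the repository linked above.

    import random
    import torch
    import torch.nn as nn

    class TinyCNN(nn.Module):
        """A toy sequential CNN used only to illustrate the idea."""
        def __init__(self, c1=16, c2=32, num_classes=10):
            super().__init__()
            self.conv1 = nn.Conv2d(3, c1, 3, padding=1)
            self.conv2 = nn.Conv2d(c1, c2, 3, padding=1)
            self.head = nn.Linear(c2, num_classes)

        def forward(self, x):
            x = torch.relu(self.conv1(x))
            x = torch.relu(self.conv2(x))
            x = x.mean(dim=(2, 3))  # global average pooling
            return self.head(x)

    def sample_config(full=(16, 32), low=0.25, high=1.0):
        """Randomly sample per-layer channel counts (the channel configuration)."""
        return [max(1, int(c * random.uniform(low, high))) for c in full]

    def prune_random(model, config):
        """Build a pruned copy that keeps a random subset of channels per layer."""
        c1, c2 = config
        keep1 = sorted(random.sample(range(model.conv1.out_channels), c1))
        keep2 = sorted(random.sample(range(model.conv2.out_channels), c2))
        pruned = TinyCNN(c1, c2)
        # conv1: keep a random subset of output filters.
        pruned.conv1.weight.data = model.conv1.weight.data[keep1].clone()
        pruned.conv1.bias.data = model.conv1.bias.data[keep1].clone()
        # conv2: slice output filters and the input channels surviving from conv1.
        pruned.conv2.weight.data = model.conv2.weight.data[keep2][:, keep1].clone()
        pruned.conv2.bias.data = model.conv2.bias.data[keep2].clone()
        # head: slice input features to match conv2's surviving channels.
        pruned.head.weight.data = model.head.weight.data[:, keep2].clone()
        pruned.head.bias.data = model.head.bias.data.clone()
        return pruned

    model = TinyCNN()
    best, best_acc = None, -1.0
    for _ in range(20):  # random search over channel configurations
        candidate = prune_random(model, sample_config())
        _ = candidate(torch.randn(1, 3, 8, 8))  # sanity check: forward pass works
        # In practice each candidate is fine-tuned and evaluated on a validation
        # set; the random score below is a placeholder for that evaluation.
        acc = random.random()
        if acc > best_acc:
            best, best_acc = candidate, acc
    print(best_acc)

In the paper's setting, the evaluation step (fine-tuning plus validation accuracy) is what separates good channel configurations from bad ones; the random score above only stands in for that step to keep the sketch self-contained.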
Year
2022
DOI
10.1109/CVPR52688.2022.00029
Venue
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Keywords
Machine learning; Recognition: detection, categorization, retrieval
DocType
Conference
ISSN
1063-6919
ISBN
978-1-6654-6947-0
Citations
0
PageRank
0.34
References
6
Authors
6
Name               Order  Citations  PageRank
Yawei Li           1      31         5.58
Kamil Adamczewski  2      0          0.34
Wen Li             3      373        21.87
Shuhang Gu         4      701        28.25
Radu Timofte       5      1880       118.45
Luc Van Gool       6      0          0.34