Dynamic channel pruning: Feature boosting and suppression

Published version
Peer-reviewed

Type

Conference Object

Authors

Gao, X 
Zhao, Y 
Dudziak, L 
Xu, C-Z

Abstract

© 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved. Making deep convolutional neural networks more accurate typically comes at the cost of increased computational and memory resources. In this paper, we reduce this cost by exploiting the fact that the importance of features computed by convolutional layers is highly input-dependent, and propose feature boosting and suppression (FBS), a new method to predictively amplify salient convolutional channels and skip unimportant ones at run-time. FBS introduces small auxiliary connections to existing convolutional layers. In contrast to channel pruning methods which permanently remove channels, it preserves the full network structures and accelerates convolution by dynamically skipping unimportant input and output channels. FBS-augmented networks are trained with conventional stochastic gradient descent, making FBS readily applicable to many state-of-the-art CNNs. We compare FBS to a range of existing channel pruning and dynamic execution schemes and demonstrate large improvements on ImageNet classification. Experiments show that FBS can respectively provide 5× and 2× savings in compute on VGG-16 and ResNet-18, both with less than 0.6% top-5 accuracy loss.
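The abstract describes FBS as a small auxiliary connection attached to each convolutional layer that predicts per-channel saliencies from the layer's input and gates the output channels at run-time. The sketch below is a minimal, illustrative PyTorch rendering of that idea, not the authors' implementation: the module name `FBSConv2d`, the single-linear saliency predictor, and the `keep_ratio` parameter are assumptions made for the example. For simplicity it masks the suppressed channels after computing them; a real FBS layer would skip their computation entirely, which is where the reported 5× and 2× compute savings come from.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FBSConv2d(nn.Module):
    """Convolution with an FBS-style auxiliary branch (illustrative sketch).

    The side branch pools the input, predicts a saliency per output channel,
    keeps the top-k channels per sample (winner-take-all) and scales them,
    suppressing (zeroing) the remaining channels.
    """

    def __init__(self, in_ch, out_ch, kernel_size, keep_ratio=0.5, **kw):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, **kw)
        self.saliency = nn.Linear(in_ch, out_ch)    # small auxiliary connection
        self.k = max(1, int(out_ch * keep_ratio))   # channels kept per sample

    def forward(self, x):
        # Predict channel importance from a global summary of the input.
        g = F.relu(self.saliency(x.mean(dim=(2, 3))))          # (N, out_ch)
        topk, idx = torch.topk(g, self.k, dim=1)
        mask = torch.zeros_like(g).scatter(1, idx, topk)       # winner-take-all gate
        y = self.conv(x)
        # Boost the salient channels, suppress the rest.
        # (A real implementation would avoid computing suppressed channels.)
        return y * mask.unsqueeze(-1).unsqueeze(-1)


# Example usage with assumed shapes: keep half of 128 output channels per sample.
layer = FBSConv2d(64, 128, 3, keep_ratio=0.5, padding=1)
out = layer(torch.randn(8, 64, 32, 32))
```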

Description

Keywords

Journal Title

7th International Conference on Learning Representations, ICLR 2019

Conference Name

International Conference on Learning Representations

Journal ISSN

Volume Title

Publisher

Sponsorship
This work is supported in part by the National Key R&D Program of China (No. 2018YFB1004804) and the National Natural Science Foundation of China (No. 61806192). We thank the EPSRC for funding Yiren Zhao's doctoral scholarship.