Dynamic channel pruning: Feature boosting and suppression
7th International Conference on Learning Representations, ICLR 2019
Gao, X., Zhao, Y., Dudziak, L., Mullins, R., & Xu, C.-Z. (2019). Dynamic channel pruning: Feature boosting and suppression. 7th International Conference on Learning Representations, ICLR 2019. https://iclr.cc/Conferences/2019
© 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved.

Making deep convolutional neural networks more accurate typically comes at the cost of increased computational and memory resources. In this paper, we reduce this cost by exploiting the fact that the importance of features computed by convolutional layers is highly input-dependent, and propose feature boosting and suppression (FBS), a new method to predictively amplify salient convolutional channels and skip unimportant ones at run-time. FBS introduces small auxiliary connections to existing convolutional layers. In contrast to channel pruning methods, which permanently remove channels, FBS preserves the full network structure and accelerates convolution by dynamically skipping unimportant input and output channels. FBS-augmented networks are trained with conventional stochastic gradient descent, so the method is readily applicable to many state-of-the-art CNNs. We compare FBS to a range of existing channel pruning and dynamic execution schemes and demonstrate large improvements on ImageNet classification. Experiments show that FBS can respectively provide 5× and 2× savings in compute on VGG-16 and ResNet-18, both with less than 0.6% top-5 accuracy loss.
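The auxiliary connection described in the abstract can be illustrated with a minimal sketch: subsample each input channel to a scalar, predict a per-output-channel saliency with a small learned layer, then keep only the top-k saliencies as multiplicative gates and skip the rest. The predictor shape, the use of global average pooling as the subsampling, and the function name `fbs_gate` are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fbs_gate(x, weight, k):
    """Sketch of a feature-boosting-and-suppression style gate.

    x      : input feature map, shape (N, C_in, H, W)
    weight : assumed linear saliency predictor, shape (C_in, C_out)
    k      : number of output channels kept active per example

    Returns gates of shape (N, C_out); zero entries mark output
    channels whose convolution can be skipped, nonzero entries
    both select and amplify (boost) the surviving channels.
    """
    # Subsample each input channel to one scalar (global average pooling).
    s = x.mean(axis=(2, 3))                    # (N, C_in)
    # Predict a non-negative saliency for every output channel.
    g = np.maximum(s @ weight, 0.0)            # (N, C_out)
    # Winner-take-all: zero everything below the k-th largest saliency.
    thresh = np.sort(g, axis=1)[:, -k][:, None]
    return np.where(g >= thresh, g, 0.0)
```

At inference time, a zero gate means the corresponding output channel (and its use as an input to the next layer) need not be computed at all, which is where the compute savings come from.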
This work is supported in part by the National Key R&D Program of China (No. 2018YFB1004804) and the National Natural Science Foundation of China (No. 61806192). We thank EPSRC for funding Yiren Zhao's doctoral scholarship.
External link: https://iclr.cc/Conferences/2019
This record's URL: https://www.repository.cam.ac.uk/handle/1810/299948