Convolutional Neural Networks (CNNs) have achieved tremendous success in a range of learning tasks, including image classification. Residual-like networks, such as ResNets, rely mainly on skip connections to avoid vanishing gradients. However, the skip-connection mechanism limits the utilization of intermediate features because of its simple iterative updates. To mitigate the redundancy of residual-like networks, we design Attentive Feature Integration (AFI) modules, which are widely applicable to most residual-like network architectures, leading to new architectures named AFI-Nets. AFI-Nets explicitly model the correlations among different levels of features and selectively transfer features with little overhead. AFI-ResNet-152 obtains a 1.24% relative improvement on the ImageNet dataset while decreasing FLOPs by about 10% and the number of parameters by about 9.2% compared to ResNet-152.
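To illustrate the general idea of attentively integrating multi-level features, the following is a minimal NumPy sketch, not the paper's actual AFI module: the function name `afi_integrate`, the pooled per-level feature vectors, and the learned scalar scores are all assumptions for illustration. A softmax over the scores yields attention weights that form a convex combination of the level-wise features.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def afi_integrate(features, scores):
    """Attention-weighted integration of multi-level features (illustrative).

    features: (L, C) array -- L levels of pooled feature vectors.
    scores:   (L,) array   -- per-level importance scores (hypothetical;
                              in practice these would be learned).
    Returns a (C,) vector: a convex combination of the feature levels.
    """
    weights = softmax(scores)   # attention weights, sum to 1
    return weights @ features   # weighted sum over levels

rng = np.random.default_rng(0)
feats = rng.standard_normal((3, 8))   # 3 feature levels, 8 channels each
scores = np.array([0.5, 2.0, -1.0])
out = afi_integrate(feats, scores)
```

Because the weights are data-dependent rather than fixed, such a module can suppress redundant levels and emphasize informative ones, which is the selective-transfer behavior the abstract attributes to AFI.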
Keywords: Attention; CNN; Feature integration; Image classification.
Copyright © 2022 Elsevier Ltd. All rights reserved.