AFINet: Attentive Feature Integration Networks for image classification

Neural Netw. 2022 Nov:155:360-368. doi: 10.1016/j.neunet.2022.08.026. Epub 2022 Sep 5.

Abstract

Convolutional Neural Networks (CNNs) have achieved tremendous success in a number of learning tasks, including image classification. Residual-like networks, such as ResNets, rely mainly on skip connections to avoid vanishing gradients. However, the skip-connection mechanism limits the utilization of intermediate features because of its simple iterative updates. To mitigate the redundancy of residual-like networks, we design Attentive Feature Integration (AFI) modules, which are widely applicable to most residual-like network architectures, leading to new architectures named AFI-Nets. AFI-Nets explicitly model the correlations among different levels of features and selectively transfer features with little overhead. AFI-ResNet-152 obtains a 1.24% relative improvement on the ImageNet dataset while decreasing the FLOPs by about 10% and the number of parameters by about 9.2% compared to ResNet-152.
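The abstract does not specify the exact form of the AFI module. As a rough illustration only, the idea of attentively integrating multiple levels of features can be sketched as a softmax-weighted sum of pooled per-level feature vectors; the function name, the per-level logits `w`, and the reduction to a weighted sum are all assumptions, not the paper's actual design.

```python
import numpy as np

def afi_integrate(features, w):
    """Hypothetical sketch of attentive feature integration.

    features: list of L feature vectors (e.g., globally pooled channel
              descriptors from different network stages), each shape (C,).
    w:        array of shape (L,), per-level attention logits (assumed
              learnable in a real network; fixed here for illustration).

    Returns the attention-weighted sum of the level features and the
    softmax attention weights themselves.
    """
    a = np.exp(w - w.max())          # numerically stable softmax
    a = a / a.sum()                  # attention weight per feature level
    out = sum(ai * fi for ai, fi in zip(a, features))
    return out, a
```

With equal logits the module degenerates to a plain average of the levels; in a trained network the logits (or a small subnetwork producing them) would let the model selectively emphasize informative levels, which is the behavior the abstract attributes to AFI modules.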

Keywords: Attention; CNN; Feature integration; Image classification.

MeSH terms

  • Image Processing, Computer-Assisted* / methods
  • Neural Networks, Computer*