A homotopy training algorithm for fully connected neural networks

Proc Math Phys Eng Sci. 2019 Nov;475(2231):20190662. doi: 10.1098/rspa.2019.0662. Epub 2019 Nov 13.

Abstract

In this paper, we present a homotopy training algorithm (HTA) to solve optimization problems arising from fully connected neural networks with complicated structures. The HTA dynamically builds the neural network, starting from a simplified version and ending with the fully connected network, by adding layers and nodes adaptively. The corresponding optimization problem is therefore easy to solve at the beginning and is connected to the original model via a continuous path guided by the HTA, which increases the probability of obtaining a global minimum. By gradually increasing the complexity of the model along this continuous path, the HTA yields a good solution to the original loss function. This is confirmed by various numerical results, including VGG models on CIFAR-10. For example, on the VGG13 model with batch normalization, the HTA reduces the error rate by 11.86% on the test dataset compared with the traditional method. Moreover, the HTA also allows us to find the optimal structure for a fully connected neural network by building the neural network adaptively.
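To illustrate the idea behind the abstract, the following is a minimal sketch (not the paper's exact formulation) of homotopy-style training in PyTorch: a parameter t blends a simplified one-hidden-layer network with a deeper fully connected network, and training increases t from 0 to 1 so the easy problem is deformed continuously into the original one. The architecture, interpolation scheme, and all names below are illustrative assumptions.

```python
# Hypothetical sketch of a homotopy between a simple and a full network.
import torch
import torch.nn as nn


class HomotopyMLP(nn.Module):
    def __init__(self, d_in=784, d_hidden=256, d_out=10):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_hidden)       # extra layer, activated as t -> 1
        self.head_simple = nn.Linear(d_hidden, d_out)  # shortcut head used when t is small
        self.head_full = nn.Linear(d_hidden, d_out)
        self.act = nn.ReLU()

    def forward(self, x, t):
        h1 = self.act(self.fc1(x))
        h2 = self.act(self.fc2(h1))
        # Convex combination of the simplified and the full network outputs.
        return (1.0 - t) * self.head_simple(h1) + t * self.head_full(h2)


def train_homotopy(model, loader, t_steps=(0.0, 0.25, 0.5, 0.75, 1.0), epochs_per_step=2):
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for t in t_steps:                    # gradually increase model complexity
        for _ in range(epochs_per_step):
            for x, y in loader:
                opt.zero_grad()
                loss = loss_fn(model(x.flatten(1), t), y)
                loss.backward()
                opt.step()
    return model
```

At t = 0 only the shallow path is trained, giving an easier optimization problem; as t approaches 1 the deeper path takes over, tracing a continuous path from the simplified model to the full one in the spirit of the HTA.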

Keywords: homotopy method; machine learning; neural network; training algorithm.