The New ERA in Supervised Learning

Neural Netw. 1997 Mar;10(2):343-352. doi: 10.1016/s0893-6080(96)00090-1.

Abstract

Conventional methods of supervised learning inevitably face the problem of local minima; evidence is presented that second-order methods such as conjugate gradient and quasi-Newton techniques are particularly susceptible to being trapped in sub-optimal solutions. A new technique, expanded range approximation (ERA), is presented which, by applying a homotopy to the range of the target outputs, allows supervised learning methods to find a global minimum of the error function in almost every case. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.
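The abstract does not spell out the homotopy, but one plausible reading is a continuation scheme on the targets: start with all targets collapsed to their mean (a trivially smooth problem), then gradually expand their range back toward the true values, warm-starting the optimizer at each stage from the previous solution. The sketch below illustrates only that scheduling idea on a toy convex model (plain linear regression by gradient descent); the model, learning rate, and number of stages are all illustrative assumptions, and a convex toy cannot, of course, exhibit the local-minima escape that motivates ERA.

```python
def train_linear(xs, ys, w=0.0, b=0.0, lr=0.01, steps=2000):
    """Fit y ~ w*x + b by plain gradient descent on mean squared error."""
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = (w * x + b) - y
            gw += 2.0 * err * x / n
            gb += 2.0 * err / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def era_fit(xs, ys, stages=5):
    """Homotopy on the target range (assumed reading of ERA):
    interpolate targets from their mean toward the true values,
    re-training from the previous stage's solution each time."""
    mean_y = sum(ys) / len(ys)
    w, b = 0.0, 0.0
    for k in range(1, stages + 1):
        lam = k / stages  # homotopy parameter, 0 -> 1
        # Stage-k targets: range expanded by a factor lam around the mean.
        expanded = [mean_y + lam * (y - mean_y) for y in ys]
        w, b = train_linear(xs, expanded, w, b)  # warm start
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # generated by y = 2x + 1
w, b = era_fit(xs, ys)
print(round(w, 2), round(b, 2))  # -> 2.0 1.0
```

At the final stage (lam = 1) the expanded targets coincide with the true ones, so the last fit solves the original problem; the earlier stages only shape the trajectory by which the optimizer arrives there.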