Global convergence and limit cycle behavior of weights of perceptron

IEEE Trans Neural Netw. 2008 Jun;19(6):938-47. doi: 10.1109/TNN.2007.914187.

Abstract

In this paper, it is shown that the weights of a perceptron are bounded for all initial weights if there exists a nonempty set of initial weights for which the weights remain bounded. Hence, the boundedness of the perceptron's weights is independent of the initial weights. Also, a necessary and sufficient condition for the weights of the perceptron to exhibit limit cycle behavior is derived, and the range of the number of weight updates required to reach the limit cycle is estimated. Finally, it is suggested that a perceptron exhibiting limit cycle behavior can be employed to solve a recognition problem when downsampled sets of bounded training feature vectors are linearly separable. Numerical simulation results show that a perceptron exhibiting limit cycle behavior can achieve better recognition performance than a multilayer perceptron.
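To make the limit cycle notion concrete, the following is a minimal sketch (not the authors' construction) that runs the classical perceptron update rule with a fixed cyclic presentation order on a non-linearly-separable data set (XOR labels are assumed here, with zero initial weights) and detects when the full training state recurs; because the dynamics are deterministic, recurrence means the weight sequence has entered a limit cycle.

```python
import numpy as np

def detect_limit_cycle(X, y, max_epochs=1000):
    """Run the classical perceptron rule with a fixed cyclic presentation
    order and report when the full state (weights, position in the sweep)
    recurs; deterministic dynamics make recurrence equivalent to a limit cycle."""
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append a bias input
    w = np.zeros(Xb.shape[1])                     # initial weights (assumption)
    seen = {}                                     # (weights, sample index) -> update count
    updates = 0
    for _ in range(max_epochs):
        for i, (xi, yi) in enumerate(zip(Xb, y)):
            state = (tuple(w), i)
            if state in seen:
                return w, seen[state], updates    # state recurred: limit cycle
            seen[state] = updates
            if yi * np.dot(w, xi) <= 0:           # misclassified sample: update
                w = w + yi * xi
                updates += 1
    return w, None, updates                       # no recurrence within budget

# XOR labels are not linearly separable, so the weights cannot converge
# and the training state must eventually recur.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, 1, 1, -1])
w, first, again = detect_limit_cycle(X, y)
print(f"state first seen after {first} updates recurred after {again} updates; w = {w}")
```

In this toy run the weights return to their starting state after four updates, illustrating the bounded, cyclic behavior that the paper characterizes; the paper's actual bounds on the number of updates are derived analytically, not by simulation of this kind.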

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Biological Clocks / physiology*
  • Female
  • Humans
  • Male
  • Neural Networks, Computer*
  • Nonlinear Dynamics
  • Pattern Recognition, Automated / methods*
  • Time Factors
  • Voice