IEEE Trans Neural Netw. 2005 May;16(3):533-40.

Deterministic convergence of an online gradient method for BP neural networks.

Author information

Applied Mathematics Department, Dalian University of Technology, Dalian 116023, China. wuweiw@dlut.edu.cn

Abstract

Online gradient methods are widely used for training feedforward neural networks. In this paper we prove a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most existing convergence results, which are probabilistic and nonmonotone in nature, the result established here is deterministic and monotone.
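For readers unfamiliar with the setting, the following is a minimal Python sketch of the general technique the abstract refers to: an online (sample-by-sample) gradient method for a network with one hidden layer, using a variable step size that decays over updates. This is an illustration only, not the paper's algorithm or its step-size conditions; the network size, toy data, schedule eta0/(1+n), and all names are assumptions.

    # Minimal sketch (assumed, not the paper's method): online gradient
    # training of a one-hidden-layer network with a decaying step size.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data (assumed; any (x, y) pairs would do).
    X = rng.normal(size=(200, 3))
    y = np.tanh(X @ np.array([1.0, -2.0, 0.5]))[:, None]

    n_in, n_hidden, n_out = 3, 8, 1
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))   # input -> hidden
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))  # hidden -> output

    eta0 = 0.5   # initial step size (assumed value)
    step = 0

    for epoch in range(20):
        for i in rng.permutation(len(X)):
            x_i, y_i = X[i:i+1], y[i:i+1]

            # Forward pass through the single hidden layer.
            h = np.tanh(x_i @ W1)    # hidden activations
            out = h @ W2             # linear output

            # Squared-error residual on this one sample.
            err = out - y_i

            # Backpropagated gradients for this sample only: the weights
            # are updated immediately, which is what makes the method
            # "online" rather than batch.
            grad_W2 = h.T @ err
            grad_W1 = x_i.T @ ((err @ W2.T) * (1.0 - h**2))

            # Variable step size; this 1/(1+n) decay is one common choice
            # in convergence analyses, used here purely for illustration.
            eta = eta0 / (1.0 + step)
            step += 1

            W2 -= eta * grad_W2
            W1 -= eta * grad_W1

The deterministic, monotone nature of the paper's result means the error is shown to decrease at every update under its conditions, rather than converging only in probability as in most stochastic analyses of such schemes.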

PMID: 15940984
DOI: 10.1109/TNN.2005.844903
[Indexed for MEDLINE]