IEEE Trans Neural Netw. 2004 May;15(3):545-58.

Efficient learning algorithms for three-layer regular feedforward fuzzy neural networks.

Author information

Department of Mathematics, Beijing Normal University, Beijing 100875, China.


A key step in using gradient descent methods to develop learning algorithms for a regular feedforward fuzzy neural network (FNN) is differentiating max-min functions, which contain max and min operations. The paper pursues several objectives. First, it investigates further the differentiation of max-min functions. Second, it employs general fuzzy numbers, which include triangular and trapezoidal fuzzy numbers as special cases, to define a three-layer regular FNN. The general fuzzy numbers involved can be approximately determined by their corresponding finite level sets, so the input-output (I/O) relationship of the regular FNN can be approximately represented as functions of the endpoints of all finite level sets. Third, a fuzzy back-propagation algorithm is presented; to speed up convergence, a fuzzy conjugate gradient algorithm for the fuzzy weights and biases is developed, and its convergence is analyzed systematically. Finally, real simulations demonstrate the efficiency of the learning algorithms. The regular FNN is applied to the approximate realization of fuzzy inference rules and of fuzzy functions defined on given compact sets.
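Two ideas in the abstract can be sketched concretely: representing a fuzzy number by the endpoints of finitely many level sets (alpha-cuts), and the piecewise derivative used when back-propagating through a max operation. The sketch below is a hypothetical illustration for a triangular fuzzy number, not the paper's actual formulation; the function names and the choice of five levels are assumptions.

```python
import numpy as np

def alpha_cut_endpoints(a, b, c, levels):
    """Left/right endpoints of the alpha-cuts of a triangular
    fuzzy number (a, b, c) at the given membership levels.
    At level alpha, the cut is [a + alpha*(b-a), c - alpha*(c-b)]."""
    left = a + levels * (b - a)
    right = c - levels * (c - b)
    return left, right

def d_max_wrt_x(x, y):
    """Piecewise (sub)derivative of max(x, y) with respect to x:
    1 where x attains the max, 0 otherwise. This is the kind of
    case analysis a fuzzy back-propagation step relies on."""
    return 1.0 if x >= y else 0.0

# Approximate the triangular fuzzy number (0, 1, 3) by five level sets.
levels = np.linspace(0.0, 1.0, 5)
L, R = alpha_cut_endpoints(0.0, 1.0, 3.0, levels)
# At alpha = 0 the cut is the support [0, 3]; at alpha = 1 it
# collapses to the peak [1, 1].
```

The network's I/O map can then be treated as an ordinary function of these finitely many endpoints, which is what makes a crisp gradient-based update applicable.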

[Indexed for MEDLINE]
