Improving Leung's bidirectional learning rule for associative memories

IEEE Trans Neural Netw. 2001;12(5):1222-6. doi: 10.1109/72.950150.

Abstract

Leung (1994) introduced a perceptron-like learning rule to enhance the recall performance of bidirectional associative memories (BAMs). He proved that his so-called bidirectional learning scheme always yields a solution within a finite number of learning iterations, provided that a solution exists. Unfortunately, in Leung's setting a solution exists only if the training set is strongly linearly separable by hyperplanes through the origin. We extend Leung's approach by considering conditionally strongly linearly separable sets, which admit separating hyperplanes that do not contain the origin. Moreover, we deal with generalized BAMs equipped with so-called dilation and translation parameters, which enlarge their capacity while leaving their complexity almost unaffected. The whole approach leads to a generalized bidirectional learning rule that generates BAMs with dilation and translation performing perfectly on the training set whenever the latter satisfies the conditionally strong linear separability assumption. In the sense of Leung, this yields an optimal learning strategy that contains Leung's initial idea as a special case.
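
To make the mechanism concrete, the following is a minimal sketch, assuming a standard bipolar BAM, of how a perceptron-like bidirectional learning loop with translation (threshold) terms might look. It is not the paper's exact rule: the dilation parameters are omitted for brevity, and all identifiers (train_bam, eta, b_x, b_y) are illustrative assumptions.

    import numpy as np

    def train_bam(X, Y, eta=1.0, max_epochs=100):
        """Perceptron-like bidirectional training sketch for a bipolar BAM.

        X: (p, n) array of bipolar (+1/-1) input patterns.
        Y: (p, m) array of bipolar (+1/-1) output patterns.
        The thresholds b_x, b_y play the role of translation parameters,
        so the separating hyperplanes need not pass through the origin.
        """
        n, m = X.shape[1], Y.shape[1]
        W = np.zeros((m, n))   # connection matrix
        b_y = np.zeros(m)      # translation parameters, forward direction
        b_x = np.zeros(n)      # translation parameters, backward direction
        for _ in range(max_epochs):
            converged = True
            for x, y in zip(X, Y):
                # forward recall: y_hat = sgn(W x + b_y)
                y_hat = np.where(W @ x + b_y >= 0, 1, -1)
                err = (y - y_hat) / 2          # +-1 on wrongly recalled units
                if err.any():
                    W += eta * np.outer(err, x)
                    b_y += eta * err
                    converged = False
                # backward recall: x_hat = sgn(W^T y + b_x)
                x_hat = np.where(W.T @ y + b_x >= 0, 1, -1)
                err = (x - x_hat) / 2
                if err.any():
                    W += eta * np.outer(y, err)
                    b_x += eta * err
                    converged = False
            if converged:                      # perfect recall in both directions
                break
        return W, b_x, b_y

    # toy usage: two bipolar pattern pairs
    X = np.array([[1, -1, 1], [-1, 1, 1]])
    Y = np.array([[1, -1], [-1, 1]])
    W, b_x, b_y = train_bam(X, Y)

As in the classical perceptron convergence argument invoked by Leung, such a loop terminates after finitely many corrections whenever the training pairs are (conditionally strongly) linearly separable; the thresholds are what relax the requirement that the separating hyperplanes contain the origin.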