Experimental study of Perceptron-type local learning rule for Hopfield associative memory

Arun Jagota, Jacek Mańdziuk

Abstract

In 1990, Kamp and Hasler proposed a Perceptron-type learning rule for storing binary patterns in the Hopfield associative memory, and proved its convergence. In this paper this rule is evaluated experimentally. Its performance is compared with that of the commonly used Hebb rule. Both rules are tested on a variety of randomly generated collections of library patterns, parametrized by the number of patterns M in a collection, the density p of the patterns, and the measure of correlation B of the bits in a pattern. The results are evaluated on two criteria: stability of the library patterns, and error-correction ability during the recall phase. The Perceptron-type rule was found to be perfect in ensuring stability of the stored library patterns under all evaluated conditions. The Hebb rule, on the other hand, was found to degrade rapidly as M was increased, the density p was decreased, or the correlation B was increased. For not too large M, the Perceptron-type rule was also found to work much better than the Hebb rule, under a variety of conditions, in correcting errors in probe vectors. The conditions under which the Perceptron-type rule worked better included a range of pattern densities p, a range of correlations B, and several degrees of error e in a probe vector. The uniformly random case (p = 0.5, B = 1) was the main exception, in which the Hebb rule systematically equalled or outperformed the Perceptron-type rule in the error-correction experiments. The experiments also revealed interesting aspects of the evolution of the Perceptron-type rule. Unlike the Hebb rule, the Perceptron-type rule evolved over multiple epochs (multiple presentations of the library patterns). Its error was found not to decrease monotonically, though it always eventually became zero. When M was sufficiently large, it was found that, for uniformly random patterns, the Perceptron-type rule produced a degenerate weight matrix whose diagonal terms dominated (stability was achieved as a byproduct).
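The abstract contrasts the two rules without giving formulas, so a minimal sketch follows. It assumes bipolar (+/-1) patterns, a learning rate eta, and a generic per-unit Perceptron update with a zero-error stopping test; the exact Kamp and Hasler formulation is given in the paper itself. It shows the one-shot nature of the Hebb rule versus the multi-epoch evolution of the Perceptron-type rule, and checks what fraction of the library patterns end up stable.

    import numpy as np

    def hebb_rule(patterns):
        # One-shot Hebb rule: W = (1/N) * sum over patterns of outer products,
        # with the diagonal zeroed, as is conventional for Hopfield networks.
        M, N = patterns.shape
        W = patterns.T.astype(float) @ patterns / N
        np.fill_diagonal(W, 0.0)
        return W

    def perceptron_type_rule(patterns, eta=0.1, max_epochs=1000):
        # Generic per-unit Perceptron update (eta and the stopping scheme are
        # assumptions here, not the paper's exact formulation). Each epoch
        # presents every library pattern once; wherever a unit's local field
        # disagrees in sign with its target bit, that unit's incoming weights
        # receive a local Hebbian-style correction.
        M, N = patterns.shape
        W = np.zeros((N, N))
        for _ in range(max_epochs):
            errors = 0
            for x in patterns:
                h = W @ x                              # local fields
                wrong = np.sign(h) != x                # misclassified units
                errors += int(wrong.sum())
                W += eta * np.outer(wrong * x, x)      # correct only wrong rows
            if errors == 0:                            # every pattern is a fixed point
                break
        return W

    # Bipolar (+/-1) library patterns; p is the density of +1 bits.
    rng = np.random.default_rng(0)
    M, N, p = 10, 64, 0.5
    patterns = np.where(rng.random((M, N)) < p, 1, -1)

    for name, W in (("Hebb", hebb_rule(patterns)),
                    ("Perceptron-type", perceptron_type_rule(patterns))):
        stable = np.all(np.sign(patterns @ W.T) == patterns, axis=1).mean()
        print(f"{name} rule: fraction of stable library patterns = {stable:.2f}")

Note that this sketch leaves the diagonal of W unconstrained: whenever unit i is wrong, W[i][i] grows by eta. The abstract's observation that, for large M and uniformly random patterns, the learned matrix degenerates into one dominated by its diagonal terms arises from exactly this freedom, since a large W[i][i] trivially stabilizes bit i.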
Authors: Arun Jagota; Jacek Mańdziuk (ZZIMN, Department of Applied Computer Science and Computation Methods)
Journal series: Information Sciences, ISSN 0020-0255
Issue year: 1998
Vol: 111
No: 1–4
Pages: 65–81
Keywords in English: error correction, neural networks, recurrent networks, storage capacity
DOI: 10.1016/S0020-0255(98)00005-X
URL: http://www.sciencedirect.com/science/article/pii/S002002559800005X
Score (nominal): 40
Publication indicators: WoS Impact Factor: 2006 = 1.003 (2); 2007 = 1.813 (5)
Citation count*: 6 (2014-12-30)
* The presented citation count is obtained through Internet information analysis and is close to the number calculated by the Publish or Perish system.