Categorization in a Hopfield network trained with weighted examples: extensive number of concepts

Date
2000
Abstract
We consider the categorization problem in a Hopfield network with an extensive number of concepts p = αN, trained with s examples of weights λτ, τ = 1, ..., s, in the presence of synaptic noise represented by a dimensionless "temperature" T. We find that the retrieval capacity of an example with weight λ₁, and the corresponding categorization error, also depend on the arithmetic mean λm of the other weights. The categorization process is similar to that in a network trained with Hebb's rule, but for λ₁/λm > 1 the retrieval phase is enhanced. We present the phase diagram in the T-α plane, together with the de Almeida-Thouless line of instability. The phase diagrams in the α-s plane are discussed in the absence of synaptic noise and for several values of the correlation parameter b.
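The training scheme described in the abstract can be sketched numerically: examples correlated with their concepts through a parameter b are stored with a weighted Hebbian rule, and retrieval overlaps with an example and with its concept are measured. This is a minimal illustration, not the paper's mean-field calculation; the parameter names (N, p, s, b, λ) follow the abstract, while the synchronous zero-temperature dynamics and the specific numbers are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# N neurons, p concepts, s examples per concept, example-concept correlation b.
N, p, s, b = 500, 2, 3, 0.3

concepts = rng.choice([-1, 1], size=(p, N))
# Each example bit agrees with its concept with probability (1 + b) / 2.
flip = rng.random((p, s, N)) < (1 - b) / 2
examples = concepts[:, None, :] * np.where(flip, -1, 1)

# Weights lam_tau: the first example is weighted more than the mean of the rest,
# the regime (lam_1 / lam_m > 1) where the abstract says retrieval is enhanced.
lam = np.array([2.0] + [1.0] * (s - 1))

# Weighted Hebbian couplings: J_ij = (1/N) sum_{mu,tau} lam_tau xi_i xi_j.
J = np.einsum('t,mti,mtj->ij', lam, examples, examples) / N
np.fill_diagonal(J, 0.0)

# Zero-temperature synchronous dynamics, started on the first example of concept 0.
sigma = examples[0, 0].copy()
for _ in range(20):
    sigma = np.where(J @ sigma >= 0, 1, -1)

m_example = sigma @ examples[0, 0] / N  # overlap with the trained example
m_concept = sigma @ concepts[0] / N     # overlap with the underlying concept
```

At this small load (α = p/N ≪ 1) the network settles close to the heavily weighted example, so m_example is near 1, while m_concept stays of order b; the categorization regime studied in the paper emerges at larger s and α.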
In
Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics. Melville. Vol. 61, no. 5B (May 2000), p. 4860-4865
Source
Foreign
Collections
Journal Articles; Exact and Earth Sciences
This item is licensed under a Creative Commons License
