Kropff, E.; Treves, A. (2005). The storage capacity of Potts models for semantic memory retrieval. Journal of Statistical Mechanics: Theory and Experiment, 2005(8), P08010, pp. 1-19. ISSN 1742-5468. doi:10.1088/1742-5468/2005/08/P08010
The storage capacity of Potts models for semantic memory retrieval
Kropff, E.; Treves, A.
2005-01-01
Abstract
We introduce and analyze a minimal network model of semantic memory in the human brain. The model is a global associative memory structured as a collection of N local modules, each coding a feature that can take S possible values, with a global sparseness a (the average fraction of features describing a concept). We show that, under optimal conditions, the number c of modules connected on average to each module can range widely, from very sparse connectivity (c/N → 0) to full connectivity (c = N), while maintaining a global storage capacity (the maximum number p of concepts that can be stored and retrieved) that scales like cS²/a, with logarithmic corrections consistent with the constraint that each synapse may store at most a fraction of a bit.
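As a back-of-the-envelope illustration of the scaling p ∝ cS²/a stated above, a minimal sketch in Python. The prefactor is dropped entirely, and the 1/ln(1/a) factor below is only a generic stand-in for the logarithmic corrections mentioned in the abstract, not the paper's actual correction term:

```python
import math

def capacity_estimate(c, S, a):
    """Rough estimate of the maximum number p of retrievable concepts.

    Implements p ~ c * S**2 / a with an assumed 1/ln(1/a) logarithmic
    correction; prefactor and exact log form are illustrative only.
    """
    return c * S**2 / (a * math.log(1.0 / a))

# The scaling is linear in the connectivity c and quadratic in the
# number of feature values S, for fixed sparseness a.
p_sparse = capacity_estimate(c=1000, S=7, a=0.25)
p_dense = capacity_estimate(c=2000, S=7, a=0.25)
```

Doubling c doubles the estimate, while doubling S quadruples it, matching the cS² dependence; lowering the sparseness a raises capacity, consistent with sparser concepts being cheaper to store.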