Efficiency of Local Learning Rules in Threshold-Linear Associative Networks / Schönsberg, Francesca; Roudi, Yasser; Treves, Alessandro. - In: PHYSICAL REVIEW LETTERS. - ISSN 0031-9007. - 126:1(2021), pp. 1-5. [10.1103/PhysRevLett.126.018301]
Efficiency of Local Learning Rules in Threshold-Linear Associative Networks
Schönsberg, Francesca; Treves, Alessandro
2021-01-01
Abstract
We derive the Gardner storage capacity for associative networks of threshold-linear units and show that, with Hebbian learning, they can operate closer to the Gardner bound than binary networks, and even surpass it. This is largely achieved through a sparsification of the retrieved patterns, which we analyze for theoretical and empirical distributions of activity. Since reaching the optimal capacity via nonlocal learning rules such as backpropagation requires slow and neurally implausible training procedures, our results indicate that one-shot, self-organized Hebbian learning can be just as efficient.
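To give a concrete picture of the model class the abstract refers to, the sketch below sets up a generic threshold-linear (ReLU) associative network with a one-shot Hebbian covariance rule and a simple retrieval loop. It is an illustrative assumption-laden example, not the authors' actual model or analysis: the network size `N`, pattern number `p`, sparsity `a`, gain `g`, threshold `theta`, and the activity rescaling used in place of proper threshold regulation are all choices made here for demonstration.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's exact model): a network of
# threshold-linear (ReLU) units storing sparse patterns with a one-shot
# Hebbian covariance rule. All parameter values are arbitrary assumptions.

rng = np.random.default_rng(0)
N, p, a = 500, 20, 0.2                      # units, stored patterns, mean activity

# Sparse binary patterns: each unit is active with probability a.
xi = (rng.random((p, N)) < a).astype(float)

# One-shot Hebbian covariance rule; no self-connections.
J = (xi - a).T @ (xi - a) / (N * a * (1.0 - a))
np.fill_diagonal(J, 0.0)

def retrieve(cue, g=1.0, theta=0.1, steps=30):
    """Iterate V_i <- g * [sum_j J_ij V_j - theta]_+ from a cue."""
    V = cue.astype(float).copy()
    for _ in range(steps):
        V = g * np.maximum(J @ V - theta, 0.0)
        if V.sum() > 0:
            V *= (N * a) / V.sum()          # crude stand-in for threshold regulation
    return V

# Cue with a degraded copy of the first stored pattern and check retrieval.
cue = xi[0] * (rng.random(N) < 0.7)
V = retrieve(cue)
m = np.corrcoef(V, xi[0])[0, 1]
print(f"correlation of retrieved state with stored pattern: {m:.2f}")
```

At the low storage load chosen here the retrieved state should correlate strongly with the cued pattern; studying how this breaks down as the load grows, and how the Hebbian capacity compares with the Gardner bound, is the subject of the paper itself.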
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| Sch+21.pdf | Open access | Publisher's version (PDF) | Creative Commons | 836.12 kB | Adobe PDF |