The main object of this thesis is the design of structured distributed memories for the purpose of studying their storage and retrieval properties in large-scale cortical autoassociative networks. To this end, an autoassociative network of Potts units, coupled via tensor connections, has been proposed and analyzed as an effective model of an extensive cortical network with distinct short- and long-range synaptic connections. Recently, we have clarified in what sense it can be regarded as an effective model. While the fully-connected (FC) and the very sparsely connected, that is, highly diluted (HD) limits of the model have been thoroughly analyzed, the realistic case of intermediate partial connectivity has simply been assumed to interpolate between the FC and HD cases. In this thesis, we first study the storage capacity of the Potts network with such intermediate connectivity. We corroborate the outcome of the analysis by showing that the resulting mean-field equations are consistent with the FC and HD equations in the appropriate limits. The mean-field equations are derived only for randomly diluted connectivity (RD). Through simulations, we also study symmetric dilution (SD) and state-dependent random dilution (SDRD). We find that the Potts network has a higher capacity for symmetric than for random dilution. We then turn to the core question: how to use a model originally conceived for the storage of p unrelated patterns of activity in order to study semantic memory, which is organized in terms of the relations between the facts and attributes of real-world knowledge. To proceed, we first formulate a mathematical model for generating patterns with correlations, as an extension of a hierarchical procedure for generating ultrametrically organized patterns.
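To make the two dilution schemes compared above concrete, here is a minimal sketch (the function names and parameters are illustrative, not the thesis's own code) of how a random (RD) versus a symmetric (SD) connectivity matrix with mean connectivity c can be drawn:

```python
import numpy as np

def random_dilution(N, c, rng):
    """Directed (RD) connectivity: each ordered pair (i, j), i != j,
    is connected independently with probability c, so c_ij and c_ji
    are drawn independently."""
    C = (rng.random((N, N)) < c).astype(int)
    np.fill_diagonal(C, 0)  # no self-connections
    return C

def symmetric_dilution(N, c, rng):
    """Symmetric (SD) connectivity: each unordered pair {i, j} is
    connected with probability c, and the matrix satisfies c_ij = c_ji."""
    upper = np.triu((rng.random((N, N)) < c).astype(int), k=1)
    return upper + upper.T
```

The two schemes have the same mean connectivity; they differ only in whether the existence of the connection i→j constrains that of j→i, which is what the simulations contrast.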
The model ascribes the correlations between patterns to the influence of underlying "factors": if many factors act with comparable strength, their influences balance out and correlations are low; whereas if a few factors dominate, which in the model occurs for increasing values of a control parameter ζ, correlations between memory patterns can become much stronger. We show that the extension allows for correlations between patterns that are neither trivial (as in the random case) nor a plain tree (as in the ultrametric case), but that are highly sensitive to the values of the correlation parameters we define. Next, we study the storage capacity of the Potts network when the patterns are correlated by way of our algorithm. We show that fewer correlated patterns can be stored and retrieved than random ones, and that the higher the degree of correlation, the lower the capacity. We find that the mean-field equations yielding the storage capacity differ from those obtained with uncorrelated patterns only through an additional term in the noise, proportional to the number of learned patterns p and to the difference between the average correlation among correlated patterns and that among independently generated patterns of the same sparsity. Of particular interest is the role played by the parameter we have introduced, ζ, which controls the strength of the influences of different factors (the "parents") in generating the memory patterns (the "children"). In particular, we find that for high values of ζ, such that only a handful of parents are effective, the network exhibits correlated retrieval: though unable to retrieve the pattern cued, the network settles into a configuration of high overlap with another pattern. This behavior can be interpreted as reflecting the semantic structure of the correlations: even after capacity collapse, what the network can still do is recognize the strongest features associated with the pattern.
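To illustrate the parent–child scheme, here is a toy sketch (not the thesis's exact algorithm: the power-law decay of parent strengths and the field-competition rule are illustrative assumptions) of how a dominance parameter zeta turns a set of parents into more or less correlated children:

```python
import numpy as np

def make_children(num_parents, num_children, N, S, a, zeta, rng):
    """Each parent 'prefers' one of S Potts states on each of N units.
    Each unit of each child adopts the preference of the parent exerting
    the strongest (noisy) field on it; a fraction a of units become
    active, the rest stay in the quiescent state S. Parent strengths
    decay as k**(-zeta): large zeta means a few dominant parents and
    hence strongly correlated children."""
    strengths = np.arange(1, num_parents + 1, dtype=float) ** (-zeta)
    parent_states = rng.integers(0, S, size=(num_parents, N))
    children = np.full((num_children, N), S, dtype=int)  # S = quiescent
    for mu in range(num_children):
        # noisy field that each parent exerts on each unit of this child
        fields = strengths[:, None] * rng.random((num_parents, N))
        winner = fields.argmax(axis=0)            # dominant parent per unit
        preferred = parent_states[winner, np.arange(N)]
        active = rng.random(N) < a                # enforce sparsity a
        children[mu, active] = preferred[active]
    return children
```

With zeta = 0 all parents compete on an equal footing and children are nearly uncorrelated; for large zeta the first parent wins almost everywhere, and children share most of their active states.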
This observation is quantified using the mutual information between the pattern cued and the configuration the network settles into after the retrieval dynamics. This information is found to jump abruptly from zero to a non-zero value as the parameter ζ is increased, akin to a phase transition. Two alternative phases are then identified. For ζ < ζ_c, many factors are on an equal footing and there is little structure: in this phase, when the network fails to retrieve the cued pattern, it fails to retrieve any learned configuration. For ζ > ζ_c, memories form clusters, such that while the specifics of the cued pattern cannot be retrieved, some of the structure informing its cluster of memories can still be retrieved. In a final short chapter, we attempt to understand the implications of having stored correlated memories for latching dynamics, the spontaneous behavior that has been proposed to be an emergent property, beyond the simple cued-retrieval paradigm, of large cortical networks. Progress in this direction, studying the Potts network, has so far focused on uncorrelated memories. Introducing correlations, we find a rich phase space of behaviors, from sequential retrieval of memories, to parallel retrieval of clusters of highly correlated memories, to oscillations, depending on the various correlation parameters. The parameters of our algorithm may thus emerge as critical control parameters, corresponding to the statistical features of human semantic memory most important in determining the dynamics of our trains of thought.
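The mutual-information measure referred to above can be estimated, in a simple sketch, from the empirical joint distribution of unit states in the two configurations (a plug-in estimator in bits; finite-size bias corrections used in practice are ignored here):

```python
import numpy as np

def mutual_information(x, y, S):
    """Plug-in estimate (in bits) of the mutual information between two
    Potts configurations x and y, treating each unit's pair of states
    (0..S, with S the quiescent state) as one sample from their joint
    distribution."""
    joint = np.zeros((S + 1, S + 1))
    np.add.at(joint, (x, y), 1.0)        # joint state counts over units
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal of x
    py = joint.sum(axis=0, keepdims=True)  # marginal of y
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px * py)[nz])))
```

Applied to the cued pattern and the final network configuration, this quantity is near zero when retrieval fails completely and positive when cluster-level structure is preserved.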
The storage of semantic memories in the cortex: a computational study / Boboeva, Vezha (2018 Jan 30).

thesis_Boboeva_30Jan2018.pdf (Adobe PDF, 10.3 MB; embargo until 01/02/2019).