Broad distributions appear frequently in empirical data from natural systems, even across seemingly unrelated domains. The emergence of power-law distributions is so ubiquitous that it has puzzled scientists across disciplines; understanding their origin is thus crucial to understanding the mechanisms from which they arise. In this thesis, we present an information-theoretic perspective on the origin of broad distributions. Guided by the principle that learning from data is equivalent to an optimal coding problem, we show, from several viewpoints, that broad distributions manifest when the sample is maximally informative about the underlying data-generating process. Furthermore, working under the Minimum Description Length (MDL) principle, we show that the origin of broad distributions – a signature of statistical criticality – can be understood precisely as a second-order phase transition with the coding cost as the order parameter. This formulation then allows us to identify the neurons in the brain that carry relevant representations during spatial navigation. Taken together, this thesis suggests that statistical criticality emerges from the efficient representation of samples of a complex system, without relying on any specific mechanism of self-organization to a critical point.
|Authors:||Cubero, Ryan John|
|Titolo:||Statistical mechanics of samples, efficient representations and criticality|
|External supervisors:||Marsili, Matteo; Roudi, Yasser|
|Publication date:||29 Oct 2018|
|Appears in collections:||8.1 PhD thesis|