Institut für Mathematik

Talk

Modul:   MAT870  Zurich Colloquium in Applied and Computational Mathematics

Neural networks do not become asynchronous in the large size limit: there is no propagation of chaos

Talk by Prof. Dr. Olivier Faugeras

Date: 07.03.18  Time: 16.15 - 17.45  Room: Y27H25

We have developed a new method for establishing the thermodynamic limit of a network of fully connected rate neurons with correlated, Gaussian-distributed synaptic weights and random inputs. The method is based on the formulation of a large deviation principle (LDP) for the probability distribution of the neuronal activity of a sequence of networks of increasing size. The motivation for using random connections comes from the fact that connections in neural networks are complex, poorly known and heterogeneous. The motivation for introducing correlations is the emphasis in computational neuroscience on the modularity of neural networks; the correlations in the connection distribution reproduce this modularity, unlike in previous work. The limiting probability law is Gaussian, and its mean and covariance functions are computed using a very quickly converging fixed-point algorithm. Our main new result is that in the thermodynamic limit the network does not become asynchronous and there is no propagation of chaos: neurons remain correlated, and the amount of correlation can be computed precisely from the correlation between the synaptic weights.
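The abstract states that the mean and covariance of the limiting Gaussian law are obtained by a quickly converging fixed-point algorithm, but does not give the actual self-consistency equations. As a purely illustrative sketch, the snippet below solves a toy mean-field self-consistency equation of the kind that arises for rate neurons, m = tanh(J·m + I), by damped fixed-point iteration; the function `fixed_point`, the gain `J` and the input `I` are hypothetical stand-ins, not the equations from the talk.

```python
import numpy as np

def fixed_point(f, x0, tol=1e-12, max_iter=200, damping=0.5):
    """Damped fixed-point iteration x <- (1 - d) * x + d * f(x).

    Illustrative only: the talk's algorithm iterates the (unstated)
    self-consistency equations for the limiting mean and covariance.
    """
    x = x0
    for _ in range(max_iter):
        x_new = (1 - damping) * x + damping * f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy scalar self-consistency for the mean activity of a rate neuron:
# m = tanh(J * m + I), a contraction for |J| < 1, so iteration converges fast.
J, I = 0.8, 0.2
m = fixed_point(lambda m: np.tanh(J * m + I), x0=0.0)
```

Because the map is a contraction here, convergence is geometric, which matches the "very quickly converging" behaviour described in the abstract.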