Institute of Mathematics

Talk

Module:   MAT870  Zurich Colloquium in Applied and Computational Mathematics

A deep learning theory for neural networks grounded in physics

Talk by Dr. Benjamin Scellier

Date: 21.04.21  Time: 16.15 - 17.45  Room: Online ZHACM

We present a mathematical framework for machine learning that allows us to train "physical systems with adjustable parameters" by gradient descent. Our framework applies to a very broad class of systems, namely those whose state or dynamics are described by variational equations. This includes physical systems whose equilibrium state is the minimum of an energy function, and physical systems whose trajectory minimizes an action functional (principle of least action). We present a simple procedure to compute the loss gradients in such systems. This procedure, called equilibrium propagation (EqProp), requires only information that is locally available to each trainable parameter. In particular, our framework makes it possible to build and train neural networks in substrates that directly exploit the laws of physics. As an example, we show how to use our framework to train a class of electrical circuits called nonlinear resistive networks. We also sketch a path to applying our framework to spiking neural networks (specifically, spiking electrical circuits) by showing that nonlinear RLC circuits satisfy a principle of least action.
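To give a feel for the EqProp procedure mentioned in the abstract, here is a minimal numerical sketch on a toy scalar system. The specific energy function, cost function, and variable names are illustrative assumptions, not taken from the talk; only the two-phase gradient estimate (free equilibrium vs. weakly nudged equilibrium, difference of energy derivatives divided by the nudging strength) reflects the EqProp idea itself.

```python
# Toy illustration of equilibrium propagation (EqProp) on a scalar system.
# The energy and cost below are illustrative assumptions, chosen so that
# both equilibria have closed forms and the estimate can be checked exactly.
#
# Energy of the "physical system":  E(theta, x, s) = 0.5*s**2 - theta*x*s
#   Free equilibrium (minimizes E):           s_free  = theta*x
# Cost to reduce:  C(s, y) = 0.5*(s - y)**2
#   Nudged equilibrium (minimizes E + beta*C): s_beta = (theta*x + beta*y) / (1 + beta)
#
# EqProp gradient estimate:
#   dL/dtheta  ~=  (1/beta) * ( dE/dtheta(s_beta) - dE/dtheta(s_free) )
# where dE/dtheta = -x*s for this energy.

def eqprop_gradient(theta, x, y, beta):
    s_free = theta * x                                # equilibrium of E alone
    s_nudged = (theta * x + beta * y) / (1 + beta)    # equilibrium of E + beta*C
    dE_dtheta = lambda s: -x * s                      # partial of E w.r.t. theta
    return (dE_dtheta(s_nudged) - dE_dtheta(s_free)) / beta

def true_gradient(theta, x, y):
    # Loss L(theta) = C(s_free, y) = 0.5*(theta*x - y)**2,
    # so dL/dtheta = (theta*x - y) * x.
    return (theta * x - y) * x

theta, x, y, beta = 0.8, 2.0, 1.0, 1e-3
print(eqprop_gradient(theta, x, y, beta), true_gradient(theta, x, y))
```

Note that the estimate uses only quantities measurable at the parameter itself (here, x and the local state s in the two phases); as the nudging strength beta tends to zero, it converges to the true loss gradient.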