Josef Teichmann

Joint work with Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva and Juan-Pablo Ortega

 



Abstract

A new explanation of the geometric nature of the reservoir computing phenomenon is presented. Reservoir computing is understood in the literature as the possibility of approximating input/output systems with randomly chosen recurrent neural systems and a trained linear readout layer. Light is shed on this phenomenon by constructing what are called strongly universal reservoir systems as random projections of a family of state-space systems that generate Volterra series expansions. This procedure yields a state-affine reservoir system with randomly generated coefficients whose dimension is logarithmically reduced with respect to the original system. This reservoir system can approximate any element in the class of fading memory filters simply by training a different linear readout for each filter. Explicit expressions for the probability distributions needed to generate the projected reservoir system are stated, and bounds on the approximation error are provided.
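To make the construction concrete, the following minimal sketch implements a state-affine reservoir system of the kind described above: the state map is affine in the state, with coefficient matrices that depend affinely on the scalar input, all coefficients are frozen after random generation, and only a linear readout is trained per target filter. The Gaussian coefficient distributions, the dimensions, and the example target filter are illustrative assumptions; the paper specifies the actual distributions for the projected system.

import numpy as np

rng = np.random.default_rng(0)

d = 50        # reservoir (state) dimension -- illustrative choice
T = 2000      # length of the input/output sample -- illustrative choice

# Random state-affine reservoir:  x_t = (A0 + z_t*A1) x_{t-1} + (b0 + z_t*b1).
# Generic Gaussian coefficients are an assumption made for illustration only.
A0 = rng.normal(size=(d, d))
A1 = rng.normal(size=(d, d))
b0 = rng.normal(size=d)
b1 = rng.normal(size=d)

# Rescale so that ||A0 + z*A1|| < 1 for all inputs |z| <= 1; the state map is
# then a contraction, which yields the fading memory property.
scale = 0.9 / (np.linalg.norm(A0, 2) + np.linalg.norm(A1, 2))
A0 *= scale
A1 *= scale

def reservoir_states(z):
    """Run the frozen random reservoir over an input sequence z, |z_t| <= 1."""
    x = np.zeros(d)
    states = np.empty((len(z), d))
    for t, zt in enumerate(z):
        x = (A0 + zt * A1) @ x + (b0 + zt * b1)
        states[t] = x
    return states

# Hypothetical fading memory target filter: output depends on recent inputs.
z = rng.uniform(-1.0, 1.0, size=T)
y = np.array([z[t] * z[t - 1] + 0.5 * z[t - 2] for t in range(2, T)])

# Only the linear readout W is trained (ridge regression); the reservoir
# coefficients stay fixed. A different readout is trained for each filter.
X = reservoir_states(z)[2:]
W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(d), X.T @ y)

print("in-sample RMSE:", np.sqrt(np.mean((X @ W - y) ** 2)))

The contraction rescaling plays the role of the echo-state condition: it guarantees that the influence of past inputs decays geometrically, so the reservoir states admit a convergent Volterra-type expansion in the inputs, which is what the trained linear readout exploits.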

Our speaker

Josef Teichmann has been a professor of Mathematical Finance at ETH Zurich since 2009. He received his PhD from the University of Vienna in 1999 in the area of infinite-dimensional geometry and worked as an Associate Professor at TU Vienna from 2002 to 2009. His research interests lie in Stochastic Finance, Stochastic Partial Differential Equations, Rough Analysis and Machine Learning.