Title data
Grüne, Lars:
Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition.
In: IFAC-PapersOnLine. Vol. 54 (2021), Issue 9, pp. 317-322.
ISSN 2405-8963
DOI: https://doi.org/10.1016/j.ifacol.2021.06.152
This is the latest version of this item.
Project information
Project financing: Deutsche Forschungsgemeinschaft
Abstract in another language
We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
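The core idea behind the polynomial growth can be illustrated with a toy sketch: under a small-gain condition the Lyapunov function admits a compositional form V(x) = Σ_i V_i(z_i), with one small subnetwork per low-dimensional subsystem block, so the total parameter count grows linearly in the state dimension d rather than exponentially. The architecture choices below (one hidden layer, softplus activation, block size 2) are hypothetical illustrations, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(in_dim, width):
    # One-hidden-layer subnetwork: in_dim -> width -> 1 (illustrative only).
    return {"W1": rng.normal(size=(width, in_dim)), "b1": np.zeros(width),
            "W2": rng.normal(size=(1, width)), "b2": np.zeros(1)}

def mlp_eval(p, x):
    h = np.log1p(np.exp(p["W1"] @ x + p["b1"]))  # softplus activation
    return float(p["W2"] @ h + p["b2"])

def compositional_V(d, block=2, width=16):
    # V(x) = sum_i V_i(z_i): one small subnetwork per 'block'-dimensional
    # subsystem, as suggested by the small-gain structure.
    nets = [make_mlp(block, width) for _ in range(d // block)]
    def V(x):
        return sum(mlp_eval(n, x[i * block:(i + 1) * block])
                   for i, n in enumerate(nets))
    return V, nets

def n_params(nets):
    # Total stored parameters; grows linearly in the number of blocks,
    # hence linearly in the state dimension d for fixed block size.
    return sum(a.size for n in nets for a in n.values())
```

Doubling the state dimension here doubles the parameter count, whereas a generic grid-based approximation of fixed accuracy would require a number of degrees of freedom exponential in d.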
Further data
Available Versions of this Item

Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition. (deposited 27 Jan 2020 14:16)
 Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition. (deposited 20 May 2020 07:19) [Currently Displayed]