Title data
Grüne, Lars: Computing Lyapunov functions using deep neural networks. In: Journal of Computational Dynamics, Vol. 8, Issue 2 (2021), pp. 131-152.
ISSN 2158-2491
DOI: https://doi.org/10.3934/jcd.2021006
Abstract
We propose a deep neural network architecture and a training algorithm for computing approximate Lyapunov functions of systems of nonlinear ordinary differential equations. Under the assumption that the system admits a compositional Lyapunov function, we prove that the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality. We show that nonlinear systems satisfying a small-gain condition admit compositional Lyapunov functions. Numerical examples in up to ten space dimensions illustrate the performance of the training scheme.
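The abstract describes training a neural network to satisfy the defining inequalities of a Lyapunov function (positive definiteness and a negative orbital derivative along the vector field). As a rough illustration of that idea only, and not of the paper's compositional architecture or its specific training algorithm, the following PyTorch sketch fits a small feed-forward candidate V to a hypothetical two-dimensional system by penalising violations of the two inequalities on randomly sampled points; the system f, the network width, and the penalty margins are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical 2-D nonlinear system dx/dt = f(x); not taken from the paper.
def f(x):
    x1, x2 = x[:, 0:1], x[:, 1:2]
    return torch.cat([-x1 + x1 * x2, -x2], dim=1)

# Small feed-forward candidate Lyapunov function V(x).
class LyapunovNet(nn.Module):
    def __init__(self, dim=2, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        # Subtracting the value at the origin pins the candidate to V(0) = 0.
        zero = torch.zeros(1, x.shape[1])
        return self.net(x) - self.net(zero)

model = LyapunovNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    # Sample training points from a box around the origin.
    x = 4.0 * torch.rand(256, 2) - 2.0
    x.requires_grad_(True)

    V = model(x)
    # Orbital derivative DV(x) . f(x) computed via automatic differentiation.
    gradV = torch.autograd.grad(V.sum(), x, create_graph=True)[0]
    Vdot = (gradV * f(x)).sum(dim=1, keepdim=True)

    r2 = (x ** 2).sum(dim=1, keepdim=True)
    # Penalise violations of V(x) >= c*|x|^2 and of DV(x).f(x) <= -c*|x|^2,
    # with an illustrative margin c = 0.1.
    loss = (torch.relu(0.1 * r2 - V) + torch.relu(Vdot + 0.1 * r2)).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, the learned V can be checked on a dense grid of sample points; a trained candidate with no violations is numerical evidence of, not a proof of, a Lyapunov function, which is the role such approximations play in the paper's numerical examples.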
Available Versions of this Item
- Computing Lyapunov functions using deep neural networks. (deposited 20 May 2020 08:51)
- Computing Lyapunov functions using deep neural networks. (deposited 07 Jan 2021 13:50) [Currently Displayed]