Bibliographic details
Grüne, Lars:
Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition.
In: IFAC-PapersOnLine. Vol. 54 (2021), No. 9, pp. 317-322.
ISSN 2405-8963
DOI: https://doi.org/10.1016/j.ifacol.2021.06.152
This is the current version of this item.
Project information
Project funding: Deutsche Forschungsgemeinschaft
Abstract
We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
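As a rough illustration of the idea stated in the abstract (not the code or the exact architecture from the paper), the sketch below builds a candidate approximation W(x) as a sum of small sub-networks, each of which only sees a low-dimensional block of the state, which is the kind of compositional structure a small-gain condition makes possible. All names (init_mlp, mlp, W_approx), the block size, the widths, and the softplus activation are illustrative assumptions. The point of the sketch is only the parameter count: because each sub-network's size is independent of the overall state dimension n, the total number of weights grows linearly in the number of blocks instead of exponentially in n.

```python
import numpy as np

def init_mlp(dims, rng):
    """Random weights and biases for a small fully connected network."""
    return [(rng.standard_normal((m, k)) / np.sqrt(k), np.zeros(m))
            for k, m in zip(dims[:-1], dims[1:])]

def mlp(params, z):
    """Forward pass with softplus activations on the hidden layers."""
    for i, (W, b) in enumerate(params):
        z = W @ z + b
        if i < len(params) - 1:
            z = np.log1p(np.exp(z))  # softplus
    return z.item()

# Illustrative dimensions (not from the paper): an n-dimensional state
# split into blocks of size d, with one small sub-network per block.
n, d, width = 12, 2, 16
blocks = [list(range(i, i + d)) for i in range(0, n, d)]
rng = np.random.default_rng(0)
subnets = [init_mlp([d, width, width, 1], rng) for _ in blocks]

def W_approx(x):
    """Compositional candidate W(x) = sum_i V_i(x_{I_i})."""
    return sum(mlp(p, x[I]) for p, I in zip(subnets, blocks))

print(W_approx(rng.standard_normal(n)))
```

Each sub-network here has a fixed number of parameters determined only by d and width, so doubling n simply doubles the number of sub-networks; training such a network as an actual Lyapunov function approximation would of course require a suitable loss and data, which this sketch does not include.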
Available versions of this item
- Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition. (deposited 27 Jan 2020 14:16)
- Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition. (deposited 20 May 2020 07:19) [Currently displayed]