Title data
Grüne, Lars:
Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition.
Bayreuth, 2020. - 6 p.
There is a more recent version of this item available.
Project information
Project financing: Deutsche Forschungsgemeinschaft
Abstract
We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
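The scaling claim in the abstract rests on a compositional idea: under a small-gain condition, a Lyapunov function can be built from low-dimensional sub-functions, each storable by a small subnetwork, so the total neuron count grows only polynomially (in the simplest case, linearly) with the state dimension. The following is a minimal illustrative sketch of that counting argument, assuming a hypothetical block decomposition of the state into fixed-size subsystems; all function names, the block size, and the per-block network width are illustrative choices, not taken from the paper.

```python
import numpy as np

def subnetwork(z, W1, b1, w2):
    # One small one-hidden-layer ReLU network approximating a
    # low-dimensional sub-Lyapunov function V_i(z_i).
    h = np.maximum(0.0, W1 @ z + b1)
    return w2 @ h

def compositional_lyapunov(x, params, block_size=2):
    # Hypothetical compositional ansatz W(x) = sum_i V_i(z_i),
    # where each z_i is a fixed-size block of the state vector x.
    # The decomposition into blocks is an assumption for illustration.
    total = 0.0
    for i, (W1, b1, w2) in enumerate(params):
        z = x[i * block_size:(i + 1) * block_size]
        total += subnetwork(z, W1, b1, w2)
    return total

def neuron_count(dim, block_size=2, neurons_per_block=16):
    # Each subnetwork has a fixed width independent of the overall
    # dimension, so the total neuron count grows linearly in the
    # number of subsystems (and hence polynomially in dim).
    return (dim // block_size) * neurons_per_block
```

The point of the sketch is the last function: a fully connected network with accuracy fixed in advance would typically need a neuron count growing exponentially in `dim`, whereas the compositional structure keeps the per-block cost constant.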
Available Versions of this Item
- Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition. (deposited 27 Jan 2020 14:16) [Currently Displayed]