Title data
Grüne, Lars; Sperl, Mario:
Examples for separable control Lyapunov functions and their neural network approximation.
In: IFAC-PapersOnLine. Vol. 56 (2023), Issue 1, pp. 19-24.
ISSN 2405-8963
DOI: https://doi.org/10.1016/j.ifacol.2023.02.004
This is the latest version of this item.
Project information
Project's official title: Curse-of-dimensionality-free nonlinear optimal feedback control with deep neural networks. A compositionality-based approach via Hamilton-Jacobi-Bellman PDEs
Project's id: GR 1569/23-1
Project financing: Deutsche Forschungsgemeinschaft
Abstract
In this paper, we consider nonlinear control systems and discuss the existence of a separable control Lyapunov function. To this end, we assume that the system can be decomposed into subsystems and formulate conditions such that a weighted sum of Lyapunov functions of the subsystems yields a control Lyapunov function of the overall system. Since deep neural networks are capable of approximating separable functions without suffering from the curse of dimensionality, we can thus identify systems where an efficient approximation of a control Lyapunov function via a deep neural network is possible. A corresponding network architecture and training algorithm are proposed. Further, numerical examples illustrate the behavior of the algorithm.
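The idea of a separable control Lyapunov function built as a weighted sum of subsystem Lyapunov functions can be sketched as follows. This is an illustrative toy example only, not the paper's algorithm: the two scalar subsystems dx_i/dt = -x_i + u_i, the quadratic subsystem Lyapunov functions V_i(x_i) = x_i^2, and the weights w are all hypothetical choices made for the sketch.

```python
import numpy as np

# Hypothetical weights for the weighted sum V(x) = sum_i w_i * V_i(x_i).
w = np.array([1.0, 2.0])

def V(x):
    """Separable candidate CLF: weighted sum of V_i(x_i) = x_i**2."""
    return float(np.dot(w, x**2))

def V_dot(x, u):
    """Derivative of V along the toy dynamics dx_i/dt = -x_i + u_i:
       dV/dt = sum_i w_i * 2*x_i*(-x_i + u_i)."""
    return float(np.dot(w, 2.0 * x * (-x + u)))

# For this decoupled toy system even u = 0 makes dV/dt negative for
# x != 0, so the separable V decreases along trajectories.
x = np.array([1.0, -0.5])
print(V(x), V_dot(x, np.zeros(2)))
```

Because each summand depends only on one subsystem's state, such a V is exactly the kind of separable function that compositional deep network architectures can approximate without the curse of dimensionality, which is the setting the paper exploits.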
Further data
Available Versions of this Item
- Examples for existence and non-existence of separable control Lyapunov functions. (deposited 01 Oct 2022 21:00)
- Examples for existence and non-existence of separable control Lyapunov functions. (deposited 16 Nov 2022 11:14)
- Examples for separable control Lyapunov functions and their neural network approximation. (deposited 24 Mar 2023 08:08) [Currently Displayed]