Title information
Sperl, Mario ; Saluzzi, Luca ; Kalise, Dante ; Grüne, Lars:
Separable approximations of optimal value functions and their representation by neural networks.
Bayreuth, 2025. - 20 pages.
DOI: https://doi.org/10.48550/arXiv.2502.08559
Project information
Official project title: Curse-of-dimensionality-free nonlinear optimal feedback control with deep neural networks. A compositionality-based approach via Hamilton-Jacobi-Bellman PDEs
Project ID: 463912816
Project funding: Deutsche Forschungsgemeinschaft
Abstract
The use of separable approximations is proposed to mitigate the curse of dimensionality related to the approximation of high-dimensional value functions in optimal control. The separable approximation exploits intrinsic decaying sensitivity properties of the system, where the influence of a state variable on another diminishes as their spatial, temporal, or graph-based distance grows. This property allows the efficient representation of global functions as a sum of localized contributions. A theoretical framework for constructing separable approximations in the context of optimal control is proposed by leveraging decaying sensitivity in both discrete and continuous time. Results extend prior work on decay properties of solutions to Lyapunov and Riccati equations, offering new insights into polynomial and exponential decay regimes. Connections to neural networks are explored, demonstrating how separable structures enable scalable representations of high-dimensional value functions while preserving computational efficiency.
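The separable structure described in the abstract can be illustrated with a minimal sketch (not the authors' code): a high-dimensional value function is approximated as a sum of localized terms, V(x) ≈ Σ_j V_j(x_{N(j)}), where each term depends only on a small neighborhood N(j) of state variables. Here each localized term is a tiny random-weight MLP; all names, neighborhood sizes, and network widths below are illustrative assumptions, chosen only to show how the evaluation cost grows linearly rather than exponentially in the state dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_local_mlp(in_dim, hidden=8):
    """One localized term V_j: a small two-layer MLP mapping R^in_dim -> R.
    Weights are random, since only the separable structure is illustrated."""
    W1 = rng.standard_normal((hidden, in_dim))
    b1 = rng.standard_normal(hidden)
    w2 = rng.standard_normal(hidden)
    def V_j(z):
        return float(w2 @ np.tanh(W1 @ z + b1))
    return V_j

def separable_value_function(dim, width=2):
    """Sum of localized terms over sliding neighborhoods of `width`
    consecutive state variables (a simple stand-in for spatial,
    temporal, or graph-based locality)."""
    neighborhoods = [list(range(j, j + width)) for j in range(dim - width + 1)]
    terms = [make_local_mlp(width) for _ in neighborhoods]
    def V(x):
        # Global value is a sum of low-dimensional contributions,
        # so the number of terms grows linearly with the dimension.
        return sum(V_j(x[N]) for V_j, N in zip(terms, neighborhoods))
    return V

V = separable_value_function(dim=100)  # 100-dimensional state
x = rng.standard_normal(100)
value = V(x)
```

Under decaying sensitivity, truncating each term to a small neighborhood introduces only a controlled error, which is what makes such sums of small subnetworks a scalable representation compared with one network over all 100 inputs.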