Bases of ReLU Neural Networks
supervisor: | doc. RNDr. Jan Vybíral, Ph.D. |
type of thesis: | bachelor's thesis, master's thesis |
specialization: | MI_MM, MI_AMSM, MINF |
keywords: | ReLU, Neural Networks, Riesz Basis, Frame |
description: | The astonishing performance of neural networks in multivariate problems still lacks a satisfactory mathematical explanation. In a recent work, I. Daubechies and her co-authors proposed a univariate system of piecewise linear functions which closely resembles the trigonometric system and forms a so-called Riesz basis. Moreover, these functions are easily reproducible as ReLU neural networks. In a follow-up work, C. Schneider and J. Vybiral generalized this to the multivariate setting. The task of this thesis will be to investigate further potential improvements of this recent research, on both the theoretical and the practical side. This includes a) optimization of the Riesz constants of the system, b) application of an orthonormalization procedure, and c) numerical implementation of the proposed NN architecture and the study of its performance in the approximation of multivariate functions. |
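A minimal sketch for orientation (an illustration assumed here, not the specific construction of the cited papers): the univariate hat function, the basic piecewise linear building block, is exactly representable by a one-hidden-layer ReLU network with three neurons, as checked numerically below with NumPy.

import numpy as np

def relu(x):
    # ReLU activation, applied elementwise
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function on [0, 1]: zero at x = 0 and x = 1, peak value 1 at x = 1/2,
    # written as a one-hidden-layer ReLU network with three neurons.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

# Quick check against the direct piecewise linear formula max(1 - 2|x - 1/2|, 0).
x = np.linspace(-0.5, 1.5, 1001)
reference = np.clip(1.0 - 2.0 * np.abs(x - 0.5), 0.0, None)
assert np.allclose(hat(x), reference)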
literature: | C. Schneider and J. Vybiral, Multivariate Riesz basis of ReLU neural networks, submitted; I. Daubechies, R. DeVore, S. Foucart, B. Hanin, and G. Petrova, Nonlinear Approximation and (Deep) ReLU Networks, Constr. Approx. 55 (2022), 127-172; P. Beneventano, P. Cheridito, R. Graeber, A. Jentzen, and B. Kuckuck, Deep neural network approximation theory for high-dimensional functions, available at arXiv:2112.14523 |
last modified: | 10.03.2023 14:14:49 |