Bases of ReLU Neural Networks

advisor: doc. RNDr. Jan Vybíral, Ph.D.
type: bachelor thesis, master thesis
branch of study: MI_MM, MI_AMSM, MINF
key words: ReLU, Neural Networks, Riesz Basis, Frame
description: The astonishing performance of neural networks in multivariate problems still lacks a satisfactory mathematical explanation. In recent work, I. Daubechies and her co-authors proposed a univariate system of piecewise linear functions which closely resembles the trigonometric system and which forms a so-called Riesz basis. Moreover, these functions are easily reproducible as ReLU neural networks. In a follow-up work, C. Schneider and J. Vybíral generalized this to the multivariate setting. The task of this thesis is to investigate further potential improvements of this recent research, on both the theoretical and the practical side. This includes a) optimization of the Riesz constants of the system, b) application of an orthonormalization procedure, and c) numerical implementation of the proposed NN architecture and a study of its performance in the approximation of multivariate functions.
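To illustrate why piecewise linear systems are "easily reproducible" as ReLU networks, the following sketch builds the classical hat function as a one-hidden-layer ReLU network and iterates it to obtain sawtooth functions whose oscillations are reminiscent of the trigonometric system. This is the standard composition construction used in ReLU approximation theory, given here only as an illustration; it is not necessarily the exact basis of the cited papers, and all function names are chosen for this example.

```python
import numpy as np

def relu(x):
    """Rectified linear unit, the activation of the networks in question."""
    return np.maximum(x, 0.0)

def hat(x):
    """Hat function on [0, 1], realized as a one-hidden-layer ReLU network:
    hat(x) = 2*ReLU(x) - 4*ReLU(x - 1/2) + 2*ReLU(x - 1).
    It rises linearly from 0 to 1 on [0, 1/2] and falls back to 0 on [1/2, 1]."""
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, k):
    """k-fold composition hat∘...∘hat, i.e. a depth-k ReLU network.
    Each composition doubles the number of linear pieces, producing
    2^(k-1) 'teeth' on [0, 1] -- an oscillation pattern analogous to
    a trigonometric function of frequency 2^(k-1)."""
    y = np.asarray(x, dtype=float)
    for _ in range(k):
        y = hat(y)
    return y
```

For example, `hat(0.5)` equals 1, and `sawtooth(x, 2)` already has two teeth on [0, 1]; deep networks thus reach high-frequency piecewise linear functions with few parameters, which is one starting point for the Riesz basis constructions above.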
references:
C. Schneider and J. Vybíral, Multivariate Riesz basis of ReLU neural networks, submitted.
I. Daubechies, R. DeVore, S. Foucart, B. Hanin, and G. Petrova, Nonlinear Approximation and (Deep) ReLU Networks, Constr. Approx. 55 (2022), 127–172.
P. Beneventano, P. Cheridito, R. Graeber, A. Jentzen, and B. Kuckuck, Deep neural network approximation theory for high-dimensional functions, available at arXiv:2112.14523.
last update: 10.03.2023 14:14:49

administrator for this page: Ľubomíra Dvořáková | last update: 09/12/2011
Trojanova 13, 120 00 Praha 2, tel. +420 770 127 494
Czech Technical University in Prague | Faculty of Nuclear Sciences and Physical Engineering | Department of Mathematics