Abstract: This thesis explores some problems in random matrix theory and high-dimensional statistics motivated by the need to improve our understanding of deep learning. Training deep neural networks involves solving high-dimensional, large-scale, and nonconvex optimization problems that should, in theory, be intractable but are surprisingly feasible in practice. To understand this paradox, we study solvable models that balance practical relevance with rigorous mathematical analysis. Random matrices and high-dimensional statistics are central to these efforts due to the large datasets and high dimensionality inherent in such models. We first consider the random features model, a two-layer neural network with fixed random weights in the first layer and learnable weights in the second layer. Our focus is on the asymptotic spectrum of the conjugate kernel matrix YY* with Y = f(WX), where W and X are rectangular random matrices with i.i.d. entries and f is a nonlinear activation function applied entry-wise. We extend prior results on light-tailed distributions for W and X by considering two new settings. First, we study the case of additive bias Y = f(WX + B), where B is an independent rank-one Gaussian random matrix, which more closely models the neural network architectures encountered in practice. To obtain the asymptotics of the empirical spectral density, we follow the resolvent method via the cumulant expansion. Second, we investigate the case where W has heavy-tailed entries, X remains light-tailed, and f is a smooth, bounded, and odd function. We show that heavy-tailed weights induce much stronger correlations among the entries of Y, resulting in novel spectral behavior. This analysis relies on the moment method through traffic probability theory.
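To make the object of study concrete, the following is a minimal numerical sketch (not taken from the thesis) of the empirical spectrum of the conjugate kernel matrix for the random features model. The dimensions, the choice f = tanh, and the 1/sqrt(d) scaling of the pre-activations are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 300, 400, 500          # hidden units, input dimension, samples

W = rng.standard_normal((n, d))  # fixed random first-layer weights (i.i.d.)
X = rng.standard_normal((d, m))  # data matrix with i.i.d. entries
Y = np.tanh(W @ X / np.sqrt(d))  # nonlinearity f applied entry-wise

# Conjugate kernel matrix (normalized by the number of samples);
# its eigenvalue histogram approximates the empirical spectral density.
K = (Y @ Y.T) / m
eigs = np.linalg.eigvalsh(K)     # symmetric PSD matrix: real eigenvalues
```

In this proportional regime (n, d, m growing at comparable rates), a histogram of `eigs` converges to the limiting spectral density characterized in the light-tailed results the thesis extends.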
Next, we address the tensor PCA (Principal Component Analysis) problem, a high-dimensional inference task that investigates the computational hardness of estimating an unknown signal vector from noisy tensor observations via maximum likelihood estimation. Tensor PCA serves as a ...