23.10.2023 15:00 Rainer Engelken (Columbia):
Lyapunov Spectra of Recurrent Neural Networks: Implications for Machine Learning

We examine the dynamics of recurrent neural networks by calculating their full Lyapunov spectrum. Our results show a size-invariant Lyapunov spectrum and attractor dimensions smaller than the phase space dimensions. Through random matrix theory, we provide analytical approximations for the Lyapunov spectrum near the onset of chaos for strong coupling and discrete-time dynamics. We also uncover a point-symmetry in the Lyapunov spectrum, reminiscent of symplectic structures in chaotic Hamiltonian systems. For trained recurrent networks, our analysis serves as a quantitative measure of error propagation and stability. Based on these findings, we propose to mitigate the vanishing/exploding gradient problem by regularizing Lyapunov exponents, thereby highlighting the potential of dynamical systems theory in machine learning.
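The full Lyapunov spectrum described in the abstract can be computed numerically by evolving a set of orthonormal tangent vectors alongside the network state and re-orthonormalizing them with QR decompositions. The sketch below is a minimal illustration of this standard (Benettin-style) method for a simple discrete-time network h_{t+1} = tanh(W h_t); the network model, parameter values, and function names are illustrative assumptions, not the speaker's actual setup.

```python
import numpy as np

def lyapunov_spectrum(W, h0, T=2000, T_burn=200):
    """Full Lyapunov spectrum of the map h_{t+1} = tanh(W h_t),
    estimated via QR re-orthonormalization of the tangent dynamics.
    (Illustrative sketch; not the speaker's implementation.)"""
    N = W.shape[0]
    h = np.asarray(h0, dtype=float)
    for _ in range(T_burn):          # discard transient
        h = np.tanh(W @ h)
    Q = np.eye(N)                    # orthonormal tangent vectors
    log_r = np.zeros(N)
    for _ in range(T):
        h = np.tanh(W @ h)
        # Jacobian of the update: diag(1 - h_{t+1}^2) @ W
        J = (1.0 - h**2)[:, None] * W
        Q, R = np.linalg.qr(J @ Q)   # re-orthonormalize tangent space
        log_r += np.log(np.abs(np.diag(R)))
    return log_r / T                 # time-averaged stretching rates

rng = np.random.default_rng(0)
N, g = 50, 0.5                       # sub-critical coupling g < 1
W = rng.normal(0.0, g / np.sqrt(N), (N, N))
lyap = lyapunov_spectrum(W, rng.normal(size=N))
# For g < 1 the dynamics settle onto a stable fixed point, so the
# largest Lyapunov exponent should be negative.
print(lyap.max())
```

Increasing the coupling strength g past the transition to chaos would push the largest exponent above zero, which is the regime where the analytical approximations mentioned in the abstract apply.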