12.01.2026 15:00 Christian Meisel:
Phase transitions govern optimal dynamics in deep learning and biological neural networks
MI 03.06.011 (Boltzmannstr. 3, 85748 Garching)

The rapid advances in artificial intelligence (AI) have largely been driven by scaling deep neural networks (DNNs): increasing model size, data, and computational resources. However, performance is ultimately governed by network dynamics. The lack of a principled understanding of DNN dynamics beyond heuristic-based design has contributed to challenges with robustness, suboptimal performance, high energy consumption, and pathologies in continual learning and in learning from AI-generated content. In contrast, the human brain does not seem to suffer from these problems, and converging evidence suggests that these benefits arise from dynamics being poised at a critical phase transition. Inspired by this principle, we propose that criticality provides a unifying framework linking structure, dynamics, and function in DNNs as well. First, by analyzing more than 80 state-of-the-art models, we report that a decade of AI progress has implicitly driven successful networks towards criticality, explaining why certain architectures succeeded while others failed. Second, we demonstrate that incorporating criticality explicitly into training improves robustness and accuracy, preventing key limitations of current models. Third, we show that catastrophic AI pathologies, including performance degradation in continual learning and model collapse (the decline observed when models are trained on AI-generated data), constitute a loss of critical dynamics. By maintaining networks at criticality, we provide a principled solution to this fundamental AI problem, demonstrating how criticality-based optimization mitigates performance degradation. This work highlights criticality as a substrate-independent principle of intelligence, connecting AI advancement with core principles of brain function. It provides theoretical insight along with immediate practical value, addressing major AI challenges to ensure long-term DNN performance and resilience as models grow in scale and complexity.
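
The abstract leaves open how criticality is operationalized during training. As a minimal, hypothetical sketch (not the speaker's actual method): one common proxy for the edge of chaos is keeping each weight matrix's largest singular value near 1, enforced here as a soft penalty added to the task loss. The PyTorch code below is illustrative only; the function names, sigma_target, and the 0.1 weighting factor are assumptions.

import torch
import torch.nn as nn

def top_singular_value(weight: torch.Tensor, n_iter: int = 20) -> torch.Tensor:
    """Estimate the largest singular value of `weight` via power iteration."""
    with torch.no_grad():
        v = torch.randn(weight.shape[1], device=weight.device)
        v = v / v.norm()
        for _ in range(n_iter):
            u = weight @ v
            u = u / (u.norm() + 1e-12)
            v = weight.t() @ u
            v = v / (v.norm() + 1e-12)
    # Differentiable estimate sigma ~= u^T W v, with u and v held fixed.
    return torch.dot(u, weight @ v)

def criticality_penalty(model: nn.Module, sigma_target: float = 1.0) -> torch.Tensor:
    """Penalize deviation of each linear layer's top singular value from sigma_target
    (assumed here as a crude stand-in for 'distance from criticality')."""
    penalty = torch.zeros(())
    for m in model.modules():
        if isinstance(m, nn.Linear):
            penalty = penalty + (top_singular_value(m.weight) - sigma_target) ** 2
    return penalty

# Usage: add the penalty to the ordinary task loss during training.
model = nn.Sequential(nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 10))
x, y = torch.randn(32, 64), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y) + 0.1 * criticality_penalty(model)
loss.backward()

The power-iteration vectors are detached so gradients flow only through the final bilinear form, the same device used in standard spectral-norm regularization; any resemblance to the method actually presented in the talk is not implied.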