Speaker
Description
Recurrent neural networks (RNNs) based on model neurons that communicate via continuous signals have been widely used to study how neural circuits perform cognitive tasks. Training such networks to perform tasks that require maintaining information over a brief period (working memory tasks) remains a challenge. Inspired by the robust information maintenance observed in higher cortical areas such as the prefrontal cortex, despite substantial inherent noise, we investigated the effects of random noise on RNNs across a range of cognitive functions, including working memory. Our findings reveal that random noise not only speeds up training but also enhances the stability and performance of biologically plausible RNNs on working memory tasks. Importantly, the robust working memory performance induced by random noise during training is attributed to an increase in the synaptic decay time constants of inhibitory units, resulting in slower dissipative dynamics of the stimulus-specific activity critical for memory maintenance. These results highlight a mechanistic role for noise in shaping network dynamics that operate near the edge of instability.
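The continuous-signal RNNs described above are typically simulated as rate networks in which each unit's synaptic variable decays with its own time constant. The sketch below, written under assumptions not stated in the abstract (network size, noise amplitude, the specific time-constant values, and a tanh transfer function are all illustrative), shows the kind of Euler-discretized dynamics involved, with noise injected during integration and slower decay assigned to inhibitory units:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                    # number of units (illustrative)
n_inh = int(0.2 * N)       # fraction of inhibitory units (assumption)
dt = 5.0                   # integration step, ms (assumption)

# Per-unit synaptic decay time constants (ms). The abstract's mechanism is
# that noise during training lengthens the inhibitory units' time constants;
# here we simply assign them a slower (larger) tau for illustration.
tau = np.full(N, 20.0)
tau[:n_inh] = 50.0

# Random recurrent weights respecting Dale's law:
# inhibitory columns negative, excitatory columns positive.
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W[:, :n_inh] = -np.abs(W[:, :n_inh])
W[:, n_inh:] = np.abs(W[:, n_inh:])

sigma = 0.1                # amplitude of injected random noise (assumption)

def step(s, ext):
    """One Euler step of tau_i ds_i/dt = -s_i + sum_j W_ij r_j + ext + noise."""
    r = np.tanh(s)                                     # firing rates
    noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
    return s + (dt / tau) * (-s + W @ r + ext + noise)

# Simulate a brief delay period with no external input: larger tau means the
# (dt / tau) leak factor is smaller, so activity dissipates more slowly.
s = rng.normal(0.0, 0.1, N)
for _ in range(200):
    s = step(s, ext=np.zeros(N))
```

The key design point is the vector `tau`: because the leak term is scaled by `dt / tau`, units with larger time constants lose their stimulus-specific activity more slowly, which is the dissipative-dynamics effect the abstract attributes to noise-trained inhibitory units.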
Reference
Rungratsameetaweemana, N., Kim, R., Chotibut, T., Sejnowski, T. Random noise promotes slow heterogeneous synaptic dynamics important for robust working memory computation, Proceedings of the National Academy of Sciences 122, e2316745122 (2025) [link]