Description
Adaptive behavior relies on activity-dependent synaptic plasticity to sculpt internal models of the world. I introduce two complementary frameworks for how the brain encodes abstraction and probability. For abstract representations, we first propose a three-factor plasticity rule that performs nonlinear dimensionality reduction in a three-layer network inspired by the Drosophila olfactory circuit; the rule approximates the t-SNE algorithm and reproduces experimental findings from fly studies. We then describe a dual-pathway hippocampal model, with a dense direct input path and a sparse indirect input path, in which modulation of inhibitory tone toggles recall between abstract categories and concrete exemplars. For probabilistic representations, we exploit chaotic fluctuations in a recurrent network to perform Bayesian posterior sampling: trained with biologically plausible learning on a cue-integration task, the network reliably approximates target posterior distributions despite chaos-induced sensitivity to initial conditions. Together, these models illustrate how synaptic plasticity and neural dynamics could underlie abstract and probabilistic internal representations.
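
To give a flavor of the first framework, here is a minimal sketch of a generic three-factor update, dW = eta * modulator * post * pre, in a toy three-layer network loosely patterned on the fly olfactory circuit (input, expansion, low-dimensional output). This is not the speaker's model: the layer sizes, the top-k sparsification, and the similarity-mismatch form of the modulatory signal are all illustrative assumptions.

```python
# Minimal sketch of a three-factor plasticity update in a three-layer network.
# All sizes, nonlinearities, and the form of the modulator are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 50, 2000, 2     # input layer, expansion layer, 2-D embedding (assumed)
W1 = rng.standard_normal((n_hid, n_in)) / np.sqrt(n_in)    # fixed random expansion weights
W2 = rng.standard_normal((n_out, n_hid)) / np.sqrt(n_hid)  # plastic readout weights
eta = 1e-3

def forward(x):
    h = np.maximum(W1 @ x, 0.0)          # rectified expansion-layer activity
    k = int(0.05 * n_hid)                # keep only the top 5% active (sparse code, assumed)
    thresh = np.partition(h, -k)[-k]
    h = np.where(h >= thresh, h, 0.0)
    y = W2 @ h                           # low-dimensional output
    return h, y

def three_factor_update(x, modulator):
    """One plasticity step: presynaptic * postsynaptic * global modulatory factor."""
    global W2
    h, y = forward(x)
    W2 += eta * modulator * np.outer(y, h)   # third factor gates the Hebbian term
    return y

# Toy usage: the modulator could encode a mismatch between input-space and
# output-space similarity, the quantity a t-SNE-like objective cares about (assumed form).
x_a, x_b = rng.standard_normal(n_in), rng.standard_normal(n_in)
_, y_a = forward(x_a)
_, y_b = forward(x_b)
sim_in = float(x_a @ x_b) / (np.linalg.norm(x_a) * np.linalg.norm(x_b))
sim_out = float(y_a @ y_b) / (np.linalg.norm(y_a) * np.linalg.norm(y_b) + 1e-9)
modulator = sim_in - sim_out             # positive -> pull the two embeddings together
three_factor_update(x_a, modulator)
```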
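
For the probabilistic framework, the sketch below illustrates the basic idea of reading samples off chaotic fluctuations, not the speaker's trained network. A rate network is placed in the chaotic regime (gain g > 1), its readout trace is treated as a stream of samples, and the readout is simply moment-matched to a Gaussian cue-integration posterior; the network size, gain, random readout, and moment matching (standing in for biologically plausible learning) are all assumptions.

```python
# Minimal sketch: chaotic rate-network fluctuations as approximate posterior samples.
# Parameters, the random readout, and the moment-matching step are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, g, dt, tau = 300, 1.5, 0.01, 1.0
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weights, chaotic for g > 1

# Cue-integration target: two Gaussian cues combine into a Gaussian posterior.
mu1, s1, mu2, s2 = 1.0, 0.5, 2.0, 1.0
post_var = 1.0 / (1.0 / s1**2 + 1.0 / s2**2)
post_mu = post_var * (mu1 / s1**2 + mu2 / s2**2)

# Run the autonomous chaotic dynamics and collect readout fluctuations.
x = rng.standard_normal(N)
w_out = rng.standard_normal(N) / np.sqrt(N)        # random readout (not learned here)
samples = []
for t in range(10000):
    x = x + dt / tau * (-x + J @ np.tanh(x))       # Euler step of the rate dynamics
    if t > 1000 and t % 10 == 0:                   # discard transient, thin the trace
        samples.append(w_out @ np.tanh(x))
z = np.array(samples)

# Moment-match the readout samples to the target posterior (a stand-in for learning).
z = (z - z.mean()) / z.std() * np.sqrt(post_var) + post_mu
print(f"target  mean={post_mu:.3f} var={post_var:.3f}")
print(f"sampled mean={z.mean():.3f} var={z.var():.3f}")
```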