A Neural Network Model of Complementary Learning Systems

Abstract

We introduce a computational model capturing the high-level features of the complementary learning systems (CLS) framework. In particular, we model the integration of episodic memory with statistical learning in an end-to-end trainable neural network architecture. On vision and control tasks, we demonstrate that our model’s behavior aligns with a variety of behavioral and neural data. Specifically, our model performs consistently with results indicating that episodic memory systems aid early learning and generalization. We also find qualitative results consistent with findings that neural traces of memories of similar events converge over time. Furthermore, without explicit instruction or incentive, the behavior of our model naturally aligns with evidence that reliance on episodic systems wanes with learning. These results suggest that key features of the CLS framework emerge in a task-optimized model containing statistical and episodic learning components, supporting several hypotheses of the framework.
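The abstract does not specify the architecture, but the core idea of combining a fast episodic store with a slow statistical learner can be sketched as follows. This is a minimal illustration, not the authors' model: the nearest-neighbour memory, the linear parametric learner, the blending gate, and all constants are hypothetical choices made here for concreteness. Note how the gate's decay makes reliance on the episodic component wane as the parametric component is trained, mirroring the behavioral finding the abstract describes.

```python
import numpy as np

class EpisodicMemory:
    """Non-parametric episodic store: keys are input embeddings, values are targets."""
    def __init__(self):
        self.keys, self.values = [], []

    def write(self, key, value):
        # One-shot storage of a single experience.
        self.keys.append(key)
        self.values.append(value)

    def read(self, query, k=3):
        # Distance-weighted average over the k nearest stored experiences.
        if not self.keys:
            return None
        keys = np.stack(self.keys)
        dists = np.linalg.norm(keys - query, axis=1)
        idx = np.argsort(dists)[:k]
        w = 1.0 / (dists[idx] + 1e-8)
        w /= w.sum()
        return np.average(np.stack(self.values)[idx], axis=0, weights=w)

class CLSModel:
    """Blend a slow statistical learner with fast episodic retrieval via a gate."""
    def __init__(self, dim, out_dim, lr=0.01):
        self.W = np.zeros((out_dim, dim))  # slow parametric ("neocortical") weights
        self.memory = EpisodicMemory()     # fast episodic ("hippocampal") store
        self.gate = 0.9                    # initial reliance on episodic memory
        self.lr = lr

    def predict(self, x):
        parametric = self.W @ x
        episodic = self.memory.read(x)
        if episodic is None:
            return parametric
        return self.gate * episodic + (1.0 - self.gate) * parametric

    def update(self, x, y):
        self.memory.write(x, y)                         # one-shot episodic write
        self.W += self.lr * np.outer(y - self.W @ x, x) # slow gradient step
        self.gate *= 0.99                               # episodic reliance wanes
```

Early in training the episodic term dominates (aiding rapid learning from few examples); as the parametric weights improve and the gate decays, behavior shifts toward the statistical component.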
