Learning list concepts through program induction
- Joshua Rule, BCS, MIT, Cambridge, Massachusetts, United States
- Eric Schulz, Psychology, Harvard, Boston, Massachusetts, United States
- Steven Piantadosi, Brain & Cognitive Sciences, University of Rochester, Rochester, New York, United States
- Josh Tenenbaum, Brain and Cognitive Sciences, MIT, Cambridge, Massachusetts, United States
Abstract

Humans master complex systems of interrelated concepts like mathematics and natural language. Previous work suggests that learning these systems relies on iteratively and directly revising a language-like conceptual representation. We introduce and assess a novel concept learning paradigm called Martha's Magical Machines that captures complex relationships between concepts. We model human concept learning in this paradigm as a search in the space of term rewriting systems, previously developed as an abstract model of computation. Our model accurately predicts that participants learn some transformations more easily than others and that they learn harder concepts more easily using a bootstrapping curriculum focused on their compositional parts. Our results suggest that term rewriting systems may be a useful model of human conceptual representations.
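To make the representational formalism concrete: a term rewriting system is a set of rules that repeatedly replace subterms matching a pattern until no rule applies. The sketch below is a minimal, hypothetical illustration (not the authors' implementation) of how a list concept such as `sum` can be expressed as rewrite rules over nested-tuple terms.

```python
# A minimal term rewriting sketch: terms are nested tuples, rules are
# functions that return a rewritten term when they match (else None),
# and rewriting repeats until the term reaches normal form.

def rewrite(term, rules):
    """Rewrite to normal form, assuming the rule system terminates."""
    changed = True
    while changed:
        term, changed = step(term, rules)
    return term

def step(term, rules):
    # Try each rule at the root of the term first.
    for rule in rules:
        result = rule(term)
        if result is not None:
            return result, True
    # Otherwise recurse into the arguments, left to right.
    if isinstance(term, tuple):
        head, *args = term
        for i, arg in enumerate(args):
            new_arg, changed = step(arg, rules)
            if changed:
                return (head, *args[:i], new_arg, *args[i + 1:]), True
    return term, False

# Rewrite rules defining the concept "sum of a list":
#   sum(nil)         -> 0
#   sum(cons(x, xs)) -> x + sum(xs)
def sum_nil(t):
    if t == ("sum", ("nil",)):
        return 0

def sum_cons(t):
    if isinstance(t, tuple) and t[0] == "sum" \
            and isinstance(t[1], tuple) and t[1][0] == "cons":
        _, x, xs = t[1]
        return ("add", x, ("sum", xs))

def add_nums(t):
    if isinstance(t, tuple) and t[0] == "add" \
            and isinstance(t[1], int) and isinstance(t[2], int):
        return t[1] + t[2]

# sum(cons(1, cons(2, nil))) rewrites to 3
term = ("sum", ("cons", 1, ("cons", 2, ("nil",))))
print(rewrite(term, [sum_nil, sum_cons, add_nums]))  # → 3
```

Under this view, learning a new list concept amounts to searching for a small set of rules like the ones above; a curriculum that first teaches compositional parts (here, `add`) can make the search for harder concepts easier.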