A neural dynamic architecture that autonomously builds mental models

Abstract

Reasoning and other mental operations are believed to rely on mental models. Arguments have been made that mental models share representational substrate with perception. Here, we demonstrate that a neural dynamic architecture that perceptually grounds language may also support the building of mental models. Supplied with a sequence of simple premises that specify the colors of object pairs as well as their spatial relation, the architecture builds a mental model of the described scene. We show how the neural processes of the architecture evolve in response to both determinate and indeterminate premises. For indeterminate premises, we demonstrate that the preferred mental models observed in human participants emerge from the underlying neural dynamics.
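To make the task concrete, the sketch below caricatures the premise-integration problem in symbolic form; it is not the authors' neural dynamic architecture. Premises of the hypothetical form (A, relation, B) are folded into a single 1-D layout, and indeterminate placements are resolved by a simple "next free slot" heuristic. Which placements humans actually prefer, and how they emerge from neural dynamics, is the paper's subject.

```python
# Toy sketch (NOT the paper's neural architecture): integrate spatial
# premises like ("red", "left of", "blue") into one 1-D "mental model".
# Indeterminate placements are resolved by scanning for the next free
# integer slot in the stated direction -- a stand-in heuristic only.

def build_model(premises):
    pos = {}  # object -> integer x-coordinate (smaller = further left)
    for a, rel, b in premises:
        step = -1 if rel == "left of" else 1
        if b in pos and a not in pos:
            # place A on the stated side of B, skipping occupied slots
            x = pos[b] + step
            while x in pos.values():
                x += step
            pos[a] = x
        elif a in pos and b not in pos:
            # place B on the opposite side of A
            x = pos[a] - step
            while x in pos.values():
                x -= step
            pos[b] = x
        elif a not in pos and b not in pos:
            pos[a], pos[b] = 0, -step
    return pos

# The second premise is indeterminate: it fixes green relative to blue
# but not relative to red. The heuristic commits to one model.
model = build_model([
    ("red", "left of", "blue"),
    ("green", "left of", "blue"),
])
print(sorted(model, key=model.get))  # left-to-right object order
```

In the neural dynamic account, such commitments are not made by an explicit scan but arise from the attractor dynamics of the underlying neural fields; the sketch only fixes the input-output structure of the task.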
