We explore neural network learning and parallel human learning on an artificial language task. The task generates rich data on human interaction with syntactic systems, including recursive ones. Studying the networks' properties, we argue for a Structured Manifold view of syntactic representation. The Structured Manifold lies in the parameter space (weight space) of the network. It exhibits (1) loci of high order, corresponding to complex rule systems; (2) continuity, which explains how one rule system can morph into another; and (3) recursion approximation, a concept related to symbolic recursion that addresses several puzzles about embedding patterns in human behavior.
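
One way to make the continuity property concrete is to probe the straight-line path between two trained solutions in weight space: if the manifold is continuous, behavior should shift gradually from one rule system to the other along the path. The sketch below is purely illustrative, not the paper's method; the parameter vectors and the `evaluate` function are hypothetical stand-ins.

```python
import numpy as np

def evaluate(theta: np.ndarray) -> float:
    """Hypothetical placeholder: score a network with parameters theta.
    In practice this would run the interpolated network on held-out
    sentences and measure agreement with each rule system."""
    return float(np.tanh(theta.mean()))

rng = np.random.default_rng(0)
theta_a = rng.normal(size=1000)  # stand-in for a network trained on rule system A
theta_b = rng.normal(size=1000)  # stand-in for a network trained on rule system B

# Walk the linear path between the two solutions in parameter space;
# continuity of the Structured Manifold predicts a gradual behavioral shift.
for alpha in np.linspace(0.0, 1.0, 11):
    theta = (1 - alpha) * theta_a + alpha * theta_b
    print(f"alpha={alpha:.1f}  score={evaluate(theta):.3f}")
```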