Whether learning how pressing the gas pedal of a rental car affects its acceleration or how adjusting the volume of a set of speakers affects the perceived loudness of the sound they produce, humans can quickly learn functions from just a few examples. Recent hybrid models (Lucas et al., 2015) combine the structure of rule-based models with the flexibility of similarity-based models by exploiting the equivalence of Bayesian linear regression and Gaussian processes. We expand on these models by taking advantage of the compositional nature of Gaussian processes, imposing a generative grammar over a set of base components in order to build the structured but diverse hypothesis spaces that people appear to represent. Subsequent testing will compare this model's ability to reproduce people's rankings of how difficult different functions are to learn, their extrapolation judgments, and their representations of multiple overlapping functions against that of other hybrid models.
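
To make the compositional construction concrete, the following is a minimal sketch of how a generative grammar over base kernel components can enumerate a structured hypothesis space of Gaussian process kernels. The particular base kernels (LIN, RBF, PER), the operators (+ and *), and the single-step expansion rule are illustrative assumptions; the abstract does not specify the grammar used in the model.

```python
import itertools
import numpy as np

# Assumed base kernel components; the model's actual base set is not specified here.
def linear(x, y, sigma=1.0):
    return sigma**2 * x * y

def rbf(x, y, lengthscale=1.0):
    return np.exp(-0.5 * (x - y) ** 2 / lengthscale**2)

def periodic(x, y, period=1.0, lengthscale=1.0):
    return np.exp(-2.0 * np.sin(np.pi * np.abs(x - y) / period) ** 2 / lengthscale**2)

BASE_KERNELS = {"LIN": linear, "RBF": rbf, "PER": periodic}

def compose(expr):
    """Turn a kernel expression into a callable k(x, y).

    expr is either a base-kernel name (str) or a tuple (op, left, right)
    where op is '+' or '*' and left/right are sub-expressions.
    """
    if isinstance(expr, str):
        return BASE_KERNELS[expr]
    op, left, right = expr
    k1, k2 = compose(left), compose(right)
    if op == "+":
        return lambda x, y: k1(x, y) + k2(x, y)
    return lambda x, y: k1(x, y) * k2(x, y)

def expand(expr):
    """One step of a (hypothetical) generative grammar: combine the current
    expression with every base kernel under every operator."""
    for op, base in itertools.product("+*", BASE_KERNELS):
        yield (op, expr, base)

def gram_matrix(kernel, xs):
    """Evaluate the kernel on all pairs of inputs to form the Gram matrix."""
    return np.array([[kernel(a, b) for b in xs] for a in xs])

if __name__ == "__main__":
    # Hypothesis space after one grammar expansion of each base kernel.
    hypotheses = [k for base in BASE_KERNELS for k in expand(base)]
    xs = np.linspace(0.0, 1.0, 5)
    for expr in hypotheses[:3]:
        K = gram_matrix(compose(expr), xs)
        print(expr, "-> first row of Gram matrix:", np.round(K[0], 3))
```

Because sums and products of valid kernels are themselves valid kernels, each expression produced by the grammar corresponds to a well-defined Gaussian process prior, so the enumerated expressions form a structured yet diverse hypothesis space over functions.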