On the Relationship Between Entropy and Meaning in Music: An Exploration with Recurrent Neural Networks

Abstract

Meyer (1956) postulated that meaning in music is directly related to entropy: high entropy (uncertainty) engenders greater subjective tension, which in turn correlates with more meaningful musical events. Current statistical models of music are often limited to music with a single melodic line, impeding wider investigation of Meyer's hypothesis. I describe a recurrent neural network model that produces estimates of instantaneous entropy for music with multiple parts and use it to analyze a Haydn string quartet. Features found by traditional analysis to be related to tension are shown to have characteristic signatures in the model's entropy measures. Thus, an information-based approach to musical analysis can elaborate on traditional understanding of music and can shed light on the more general cognitive phenomenon of musical meaning.
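
To make the notion of instantaneous entropy concrete, the following minimal sketch (not the author's implementation; the function name, array shapes, and example values are assumptions) computes the Shannon entropy, in bits, of a model's predictive distribution over the next musical event at each timestep. Higher values correspond to greater uncertainty about what comes next.

```python
import numpy as np

def instantaneous_entropy(prob_sequence):
    """Shannon entropy (bits) of a predictive distribution at each timestep.

    prob_sequence: array of shape (timesteps, vocabulary); each row is the
    model's predicted probability distribution over the next event.
    """
    p = np.asarray(prob_sequence, dtype=float)
    p = np.clip(p, 1e-12, 1.0)            # guard against log(0)
    return -(p * np.log2(p)).sum(axis=1)  # one entropy value per timestep

# Illustration: a confident prediction vs. a maximally uncertain one.
probs = np.array([
    [0.90, 0.05, 0.03, 0.02],   # sharply peaked -> low entropy (~0.67 bits)
    [0.25, 0.25, 0.25, 0.25],   # uniform        -> high entropy (2 bits)
])
print(instantaneous_entropy(probs))
```

In the spirit of the analysis described above, such a per-timestep entropy trace could then be aligned with the score to compare moments of high model uncertainty against passages that traditional analysis identifies as tense.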

