Compositionality in emerging multi-agent languages: Marrying Language Evolution and Natural Language Processing

Abstract

The mainstream approach in NLP is to train systems on large amounts of data. Such passive learning contrasts with the way humans learn language: human language is acquired within communities, is culturally transmitted, and changes dynamically. These evolutionary mechanisms have been studied extensively in the field of Language Evolution. Despite limited prior interaction between the two fields, such mechanisms are now increasingly incorporated into NLP systems. The resulting models have the potential both to study the evolution of language in multi-agent simulations with state-of-the-art (deep) learning systems in more naturalistic settings, and to improve NLP systems by having language emerge organically. We examine how findings from the model of Havrylov & Titov (2017) compare to those from traditional Language Evolution models, and we quantify the emerging compositionality using an existing Language Evolution method (Tamariz, 2011). This approach reveals novel insights into the generated data, the applied methodology, and the nature of compositionality.
