Semantic coordination of speech and gesture in young children
- Olga Abramov, Bielefeld University, Bielefeld, Germany
- Stefan Kopp, Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Katharina Rohlfing, Paderborn University, Paderborn, Germany
- Friederike Kern, Bielefeld University, Bielefeld, Germany
- Ulrich Mertens, Paderborn University, Paderborn, Germany
- Anne Németh, Bielefeld University, Bielefeld, Germany
Abstract

People use speech and gesture together when describing an event or action, with each modality offering different expressive possibilities (Kendon, 2004). One question is how the two modalities are semantically coordinated, i.e., how meaning is distributed across speech and the accompanying gestures. While this has so far been studied only for adult speakers, here we present a study of how young children (4 years of age) semantically coordinate speech and gesture, and how this coordination relates to their cognitive and (indirectly) their verbal skills. Results indicate significant positive correlations between the children's cognitive skills and their gesture-speech coordination. In addition, high cognitive skills correlate with the number of semantically relevant descriptions produced by the children, revealing a link between verbal and cognitive skills.