Are You Sure How to Move? Expected Uncertainty Modulates Anticipatory Crossmodal Interactions

Abstract

Theories of event-predictive, anticipatory behavior state that action planning, decision making, and control are realized by activating future goal states. That is, anticipated and desired final event boundaries as well as sensorimotor-grounded event codes are activated before actual motor control unfolds. The involved active inference process thereby focuses sensorimotor processing on those upcoming events and event boundaries in which expected uncertainties need to be resolved. Here, we investigated anticipatory behavior during object interactions, that is, grasping and placing bottles. We investigated whether peripersonal hand space is remapped onto the to-be-grasped bottle during action preparation and whether this remapping depends on (i) the bottle's orientation and (ii) the certainty about upcoming sensorimotor contingencies. To do so, we conducted two experiments in an immersive virtual reality, combining the crossmodal congruency paradigm, which has been used to study selective interactions between vision and touch within peripersonal space, with a grasping task. In both experiments, we observed anticipatory crossmodal congruency between vision and touch at the future finger position on the bottle. Moreover, in the second experiment, a manipulation of the visuo-motor mapping of the participants' virtual hand while approaching the bottle selectively reduced crossmodal congruency at movement onset. Thus, the expected movement uncertainty decreased the anticipatory remapping of peripersonal space. Our results support theories of event-predictive cognition and show how expected uncertainties influence anticipatory, active inference processes.
