Interactive settings for interacting brains: EEG implementation in the Art-Science-Interaction Lab
Presenter and author: Mattia Rosso
Organization: Ghent University
The cross-fertilization of science, technology, and art practice lies at the heart of the innovations targeted in the research program of IPEM (Institute for Systematic Musicology).
In developing this program, we have focused on two aspects. The first pertains to the idea that musical expression, communication and sense-making are active and dynamic processes, rooted in expressive gestures and bodily interactions. The second is social in nature, and pertains to the power of music to connect people and to create strong feelings of togetherness, shared understanding and identity. These two aspects are at the core of IPEM’s pioneering research on embodied music cognition and interaction (Leman, 2008; Lesaffre, Maes, & Leman, 2017), whose theoretical and empirical outcomes provide a solid ground to reflect on how new music technologies can extend the creative-expressive potential of bodily movement and enrich human interaction through music. With the Art-Science-Interaction Lab (ASIL) fully operational, the adoption of neuroscientific methods represents the next step towards a deeper understanding of how people interact with music.
In choosing the technique to adopt, we kept three criteria in mind: 1) grasping neural dynamics at the millisecond scale over the course of interactions, 2) recording two interacting subjects at a time, and 3) allowing a certain degree of mobility during the experimental tasks. We therefore believe that implementing dual-EEG recordings in our research and facilities has the potential to open a window on the intra- and inter-brain dynamics underpinning human interaction with music. A growing body of research in social neuroscience already takes advantage of the first two criteria (Dumas et al., 2019; Varlet et al., 2019; Goldstein et al., 2017; for a review, see Babiloni & Astolfi, 2014), and descriptions of implementation approaches have recently appeared in the scientific literature (Barraza et al., 2019). Starting from the design of tapping experiments, minimalistic yet valid controlled forms of joint sensorimotor tasks (see Repp & Su, 2013, for a review), we aim to take full advantage of ASIL’s technologies and dual eego™ mylab systems in order to bring the interaction into gradually more naturalistic (Müller et al., 2013; Sänger et al., 2013; Lindenberger et al., 2009) and eventually augmented musical scenarios: measurement of human movement and physiology, systematic analysis of musical performance, immersive virtual reality and interactive 3D audio synthesis are the main technologies that will be brought together in this line of EEG research.
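To illustrate the kind of inter-brain analysis that dual-EEG recordings make possible, the sketch below computes a phase-locking value (PLV), a synchrony measure commonly used in hyperscanning studies. The signals, sampling rate, frequency and noise level are illustrative assumptions standing in for band-passed EEG from two interacting subjects, not data from our setup.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equally sampled signals.

    PLV = |mean(exp(i * (phi_x - phi_y)))|, ranging from 0 (no
    consistent phase relation) to 1 (perfect phase locking).
    """
    phi_x = np.angle(hilbert(x))  # instantaneous phase of signal x
    phi_y = np.angle(hilbert(y))  # instantaneous phase of signal y
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Illustrative data: two noisy 2 Hz oscillations with a constant phase
# lag, mimicking coupled rhythmic activity in two brains.
fs = 500                       # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated "recording"
rng = np.random.default_rng(0)
brain_a = np.sin(2 * np.pi * 2 * t) + 0.3 * rng.standard_normal(t.size)
brain_b = np.sin(2 * np.pi * 2 * t + np.pi / 4) + 0.3 * rng.standard_normal(t.size)

print(phase_locking_value(brain_a, brain_b))  # high value: phases are coupled
```

In practice such a measure would be computed per electrode pair and frequency band across the two subjects' recordings; the constant phase lag in the toy signals simply guarantees a high PLV despite the added noise.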
Keywords: music, interaction, movement, EEG, hyperscanning, rhythm, coordination dynamics