Development of neuroimaging-compatible wearable music technology in the visual and somatosensory modalities

Project ID: 2228bd1236 (You will need this ID for your application)

Research Theme: Information and Communication Technologies

UCL Lead department: Division of Psychology and Language Sciences (PALS)

Lead Supervisor: Velia Cardin

Project Summary:

Sound is at the centre of most musical experiences. However, the fact that deaf individuals also perform, dance to, compose, and enjoy music, with reduced or no access to sound, shows that musical information can also be extracted through other senses. These multimodal aspects of music are conveyed mainly by vibrations perceived through touch and by rhythmic patterns in visual cues.

To study the neuroscience of music across modalities, and to produce more inclusive musical experiences, we need technology that can reliably convey the different components of music through vision and touch. This can be done by delivering different musical information to separate parts of the body and of the visual field, so that the segregation of information usually performed by the auditory system is instead achieved by exploiting the spatial dimension of the other senses. Such technology is not currently available in a form that is wearable or compatible with neuroimaging techniques. The aim of this project is to develop it. Specific aims are to:

1) Develop algorithms and technology that convey different aspects of music in the visual and somatosensory modalities, from low-level features such as notes, through mid-level expressive gestures such as phrasing and performance motifs, to larger-scale formal and stylistic aspects.

2) Adapt these technologies to the environment of MRI scanners, which are commonly used in neuroscience research with humans.

3) Incorporate this technology into wearable devices for more accessible musical performances.

The project will use libraries and languages for audio synthesis and algorithmic composition, such as SuperCollider, Pure Data, Ardour, and ZynAddSubFX, as well as the somatosensory piezo stimulation equipment available in Cardin's lab, the wearable movement technology available in Prof. Berthouze's lab, and the neuroimaging facilities in the Division of Psychology and Language Sciences.
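As a purely illustrative sketch, not part of the project itself, the following Python snippet shows one common way the spatial segregation described above could be prototyped for touch: split the audio into frequency bands, extract each band's amplitude envelope, and use each envelope to modulate a tactile carrier driving a separate vibrotactile actuator, so that pitch register maps onto body location. The band edges, filter orders, carrier frequency, and channel count are all illustrative assumptions.

import numpy as np
from scipy import signal

FS = 44_100                          # audio sample rate in Hz (assumed)
BAND_EDGES = [60, 250, 1000, 4000]   # illustrative band boundaries in Hz

def band_envelopes(audio, fs=FS):
    """Return one amplitude envelope per frequency band (bands x samples)."""
    envelopes = []
    for lo, hi in zip(BAND_EDGES[:-1], BAND_EDGES[1:]):
        # 4th-order Butterworth band-pass isolates one pitch register
        sos = signal.butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = signal.sosfiltfilt(sos, audio)
        # rectify, then low-pass at 30 Hz to keep only the slow envelope
        env_sos = signal.butter(2, 30, btype="lowpass", fs=fs, output="sos")
        envelopes.append(signal.sosfiltfilt(env_sos, np.abs(band)))
    return np.stack(envelopes)

def to_actuator_drive(envelopes, carrier_hz=250.0, fs=FS):
    """Amplitude-modulate a tactile carrier with each band's envelope;
    each row is the drive signal for one actuator."""
    t = np.arange(envelopes.shape[1]) / fs
    return envelopes * np.sin(2 * np.pi * carrier_hz * t)

if __name__ == "__main__":
    # two-second test tone sweeping upward through the three bands
    t = np.arange(2 * FS) / FS
    sweep = signal.chirp(t, f0=80, f1=3000, t1=2.0)
    drive = to_actuator_drive(band_envelopes(sweep))
    print(drive.shape)  # -> (3, 88200): three actuators, two seconds

Modulating a carrier near 250 Hz, rather than feeding raw audio to the actuators, is a common design choice in vibrotactile displays because skin sensitivity to vibration peaks in roughly that frequency range.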