Beneficiary: Oticon, Denmark

For people who use cochlear implants, even moderate noise or acoustic scenes with multiple talkers can severely degrade speech understanding. When children with cochlear implants are integrated into schools for normally hearing children, the negative effect of noise on learning as well as on social integration is a serious concern. Recent evidence suggests that a wristband providing vibrotactile input can substantially improve speech understanding in noise as well as appreciation of acoustically complex signals such as music. Although these improvements have been demonstrated in several published studies, the neural mechanisms underlying them are not known, and it is therefore unclear how to maximise the potential benefits that tactile stimulation could provide.

The objective of this project is to disentangle the neural mechanisms underlying audio-tactile integration using non-invasive neuroimaging suitable for eventual use in children. Based on a comprehensive literature review on how tactile information could enhance speech understanding and music appreciation, hypotheses will be formed about the neural correlates of these processes and the neural adaptation to vibrotactile hearing. Research questions will address the impact of cross-modal plasticity across auditory and somatosensory areas on listening performance, as well as the effects of training periods and attentional mechanisms on successful audio-tactile hearing in complex acoustic scenes. To test the study's hypotheses, experimental paradigms involving brain-imaging techniques such as fNIRS or EEG will be developed and implemented. In collaboration with researchers from the Technical University of Denmark, the University of Southampton and the University of Iceland, an effective vibrotactile stimulation strategy and a device for use in this study will be defined. The measurements will be conducted in normal-hearing listeners as well as cochlear implant users. The outcomes of the project will be an increased understanding of how vibrotactile inputs can drive adaptive brain plasticity, and how this plasticity might relate to success in understanding speech.

Supervisors: Hamish Innes-Brown, Andrej Kral, Jeremy Marozeau and Søren Kamaric Riis

ESR 4: Alina Schulte