Beneficiary: Oticon, Denmark
For people who use cochlear implants, even moderate noise or acoustic scenes with multiple talkers can severely degrade speech understanding. When children with cochlear implants attend schools alongside normal-hearing children, the negative effect of noise on learning as well as on social integration is a serious concern. Recent evidence suggests that adding vibrotactile input can substantially improve speech understanding in noise as well as appreciation of acoustically complex signals such as music. These improvements have been demonstrated in a few published studies, but the neural mechanisms underlying them are not known, and therefore neither is how to maximize the potential benefits that tactile stimulation could provide.
The objective of this project is to disentangle the neural mechanisms that underlie audio-tactile integration, using non-invasive neuroimaging suitable for eventual use in children. Based on a comprehensive literature review of how tactile information could enhance speech understanding, hypotheses were formed about the neural correlates of these processes and the neural adaptation to vibrotactile hearing. The research questions address the impact of cross-modal plasticity across auditory and somatosensory areas on listening performance, as well as the effects of training periods and attentional mechanisms on successful audio-tactile hearing in complex acoustic scenes. To test the study’s hypotheses, experimental paradigms involving neuroimaging with functional near-infrared spectroscopy (fNIRS) were developed and implemented. Measurements were conducted in normal-hearing listeners as well as in cochlear implant users, in collaboration with Hannover Medical School and the German Hearing Center. The outcome of the project will be an increased understanding of how vibrotactile inputs are processed together with auditory speech stimuli and how this processing relates to success in understanding speech.
ESR 4: Alina Schulte