Cortical Oscillations in Pre-verbal Infants Track Rhythmic Speech and Non-speech Stimuli
Accepted version
Peer-reviewed
Abstract
The foundations for language acquisition are laid in infancy. A key feature of infant-directed speech (IDS) is that the slowest modulations of its amplitude envelope (~2 Hz) contain more energy than in adult-directed speech. These slow modulations may provide a cross-language rhythmic scaffold for the neural tracking of speech in infancy. To investigate relations between early neural processing of speech and language acquisition in English, the BabyRhythm project followed 113 infants during infancy and toddlerhood. The neural predictor of language development reported here was the cortical tracking of slow, rhythmic audiovisual stimuli, processing of which is known to differ in older children with dyslexia. To find out how such stimuli are tracked early in development, infants were presented with videos of a woman repeating the syllable "Ta" twice per second, and of a ball bouncing on a drum to create a 2 Hz beat. At the ages of six and nine months, infants exhibited a significant peak in EEG power at 2 Hz when listening to these stimuli, indicating that the infant brain was responding to them at the expected frequency. Time-frequency analysis showed increased inter-trial EEG phase coherence at 2 Hz, suggesting that the increase in oscillatory power was driven by the stimuli. There were no differences in how the speech and non-speech stimuli were tracked. These results indicate that the infant brain can track the rhythm of slow auditory stimuli. They lay the foundation for future investigation of how individual differences in tracking might relate to later language acquisition.
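The inter-trial phase coherence (ITPC) measure mentioned in the abstract quantifies how consistently the EEG phase at a given frequency lines up across trials: phases that cluster across trials (ITPC near 1) indicate stimulus-driven phase-locking, whereas phases scattered at random (ITPC near 0) do not. The sketch below, using NumPy, illustrates the standard ITPC formula (the magnitude of the trial-averaged unit phase vectors) at a single frequency; the function name and FFT-based phase estimate are illustrative assumptions, not the paper's actual analysis pipeline.

```python
import numpy as np

def itpc_at_freq(trials, fs, freq):
    """Inter-trial phase coherence at one frequency.

    trials: array of shape (n_trials, n_samples), single-channel EEG epochs.
    fs:     sampling rate in Hz.
    freq:   frequency of interest, e.g. 2.0 for a 2 Hz stimulus rate.
    Returns a value in [0, 1]; 1 means identical phase on every trial.
    (Illustrative sketch; real pipelines typically use wavelets, e.g. via MNE.)
    """
    n = trials.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - freq))          # FFT bin closest to freq
    spectra = np.fft.rfft(trials, axis=1)[:, k]  # complex amplitude per trial
    phases = np.angle(spectra)                   # phase per trial at that bin
    # ITPC = |mean over trials of e^{i*phase}|
    return np.abs(np.mean(np.exp(1j * phases)))
```

With trials that share the same 2 Hz phase, the result approaches 1; with random phases across trials it falls toward 1/sqrt(n_trials).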