Cortical tracking of visual rhythmic speech by 5- and 8-month-old infants: Individual differences in phase angle relate to language outcomes up to 2 years
Accepted version
Peer-reviewed
Abstract
It is known that the rhythms of speech are visible on the face, accurately mirroring changes in the vocal tract. These low-frequency visual temporal movements are tightly correlated with speech output, and both visual speech (for example, mouth motion) and the acoustic speech amplitude envelope entrain neural oscillations. Low-frequency visual temporal information (‘visual prosody’) is known from behavioural studies to be perceived by infants, but oscillatory studies are currently lacking. Here we measure cortical tracking of low-frequency visual temporal information by 5- and 8-month-old infants using a rhythmic speech paradigm (repetition of the syllable “ta” at 2 Hz). Eye-tracking data were collected simultaneously with EEG, enabling computation of cortical tracking and phase angle during visual-only speech presentation. Significantly higher power at the stimulus frequency indicated that cortical tracking occurred at both ages. Further, individual differences in preferred phase to visual speech related to subsequent measures of language acquisition. The difference in phase between visual-only speech and the same speech presented as auditory-visual at 6 and 9 months was also examined. These neural data suggest that individual differences in early language acquisition may be related to the phase of entrainment to visual rhythmic input in infancy.
Journal ISSN
1467-7687