Spoken language is a complex, sequentially arrayed signal containing patterns that can be described in terms of statistical relations among language units. Previous research has suggested that a domain-general ability to learn structured sequential patterns may underlie language acquisition. To test this prediction, we examined the extent to which implicit sequence learning of probabilistically structured patterns in hearing adults correlated with performance on a spoken sentence perception task under degraded listening conditions. Performance on the sentence perception task was correlated with implicit sequence learning, but only when the sequences were composed of stimuli that were easy to encode verbally. Implicit learning of phonological sequences thus appears to underlie spoken language processing and may reflect a hitherto unexplored cognitive factor accounting for the enormous variability in language outcomes among deaf children with cochlear implants. The present findings highlight the importance of investigating individual differences in specific cognitive abilities as a way to understand and explain language in deaf learners and, in particular, variability in language outcomes following cochlear implantation.