Modality-constrained statistical learning of tactile, visual, and auditory sequences

Christopher M. Conway, Morten H. Christiansen

Research output: Contribution to journal › Article › peer-review

339 Scopus citations

Abstract

The authors investigated the extent to which touch, vision, and audition mediate the processing of statistical regularities within sequential input. Few researchers have conducted rigorous comparisons across sensory modalities; in particular, the sense of touch has been virtually ignored. The current data reveal not only commonalities but also modality constraints affecting statistical learning across the senses. To be specific, the authors found that the auditory modality displayed a quantitative learning advantage compared with vision and touch. In addition, they discovered qualitative learning biases among the senses: Primarily, audition afforded better learning for the final part of input sequences. These findings are discussed in terms of whether statistical learning is likely to consist of a single, unitary mechanism or multiple, modality-constrained ones.

Original language: English (US)
Pages (from-to): 24-39
Number of pages: 16
Journal: Journal of Experimental Psychology: Learning, Memory, and Cognition
Volume: 31
Issue number: 1
DOIs
State: Published - Jan 2005
Externally published: Yes

ASJC Scopus subject areas

  • Language and Linguistics
  • Experimental and Cognitive Psychology
  • Linguistics and Language

