Motor, not visual, encoding of potential reach targets

Brandie M. Stewart, Jason P. Gallivan, Lee A. Baugh, J. Randall Flanagan

Research output: Contribution to journal › Letter › peer-review

35 Scopus citations


We often encounter situations in which there are multiple potential targets for action, as when, for example, we hear the request "could you pass the ..." at the dinner table. It has recently been shown that, in such situations, activity in sensorimotor brain areas represents competing reach targets in parallel prior to deciding between, and then reaching towards, one of these targets [1]. One intriguing possibility, consistent with the influential notion of action 'affordances' [2], is that this activity reflects movement plans towards each potential target [3]. However, an equally plausible explanation is that this activity reflects an encoding of the visual properties of the potential targets (for example, their locations or directions), prior to any target being selected and the associated movement plan being formed. Notably, previous work showing spatial averaging behaviour during reaching, in which initial movements are biased towards the midpoint of the spatial distribution of potential targets [4-6], remains equally equivocal concerning the motor versus visual encoding of reach targets. Here, using a rapid reaching task that disentangles these two competing accounts, we show that reach averaging behaviour reflects the parallel encoding of multiple competing motor plans. This provides direct evidence for theories proposing that the brain prepares multiple available movements before selecting between them [3].

Original language: English (US)
Pages (from-to): R953-R954
Journal: Current Biology
Issue number: 19
State: Published - Oct 6 2014
Externally published: Yes

ASJC Scopus subject areas

  • Biochemistry, Genetics and Molecular Biology (all)
  • Agricultural and Biological Sciences (all)


