TY - JOUR
T1 - Under What Conditions Can Recursion Be Learned? Effects of Starting Small in Artificial Grammar Learning of Center-Embedded Structure
AU - Poletiek, Fenna H.
AU - Conway, Christopher M.
AU - Ellefson, Michelle R.
AU - Lai, Jun
AU - Bocanegra, Bruno R.
AU - Christiansen, Morten H.
N1 - Funding Information:
This research was supported in part by a grant from the Human Frontiers Science Program (grant RGP0177/2001-B) to MHC, and by a grant from the Netherlands Organization for Scientific Research (NWO) to FHP. We thank Jelle van Leusden for his assistance in carrying out Experiment 4.
Publisher Copyright:
© 2018 The Authors. Cognitive Science - A Multidisciplinary Journal published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
PY - 2018/11
Y1 - 2018/11
N2 - It has been suggested that external and/or internal limitations may paradoxically lead to superior learning, that is, the concepts of starting small and less is more (Elman; Newport). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b, we found a beneficial effect of starting small using two types of simple recursive grammars: right-branching and center-embedding, with recursive embedded clauses in fixed positions and of fixed length. This effect was replicated in Experiment 2 (N = 100). In Experiments 3 and 4, we used a more complex center-embedded grammar with recursive loops in variable positions, producing strings of variable length. When participants were presented with an incremental ordering of training stimuli, as in natural language, they were better able to generalize their knowledge of simple units to more complex units when the training input “grew” according to structural complexity rather than according to string length. Overall, the results suggest that starting small confers an advantage for learning complex center-embedded structures when the input is organized according to structural complexity.
AB - It has been suggested that external and/or internal limitations may paradoxically lead to superior learning, that is, the concepts of starting small and less is more (Elman; Newport). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b, we found a beneficial effect of starting small using two types of simple recursive grammars: right-branching and center-embedding, with recursive embedded clauses in fixed positions and of fixed length. This effect was replicated in Experiment 2 (N = 100). In Experiments 3 and 4, we used a more complex center-embedded grammar with recursive loops in variable positions, producing strings of variable length. When participants were presented with an incremental ordering of training stimuli, as in natural language, they were better able to generalize their knowledge of simple units to more complex units when the training input “grew” according to structural complexity rather than according to string length. Overall, the results suggest that starting small confers an advantage for learning complex center-embedded structures when the input is organized according to structural complexity.
KW - Artificial grammar learning
KW - Center-embedded structures
KW - Starting small
KW - Statistical learning
UR - http://www.scopus.com/inward/record.url?scp=85053909032&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85053909032&partnerID=8YFLogxK
U2 - 10.1111/cogs.12685
DO - 10.1111/cogs.12685
M3 - Article
C2 - 30264489
AN - SCOPUS:85053909032
SN - 0364-0213
VL - 42
SP - 2855
EP - 2889
JO - Cognitive Science
JF - Cognitive Science
IS - 8
ER -