dc.contributor.author: Poletiek, Fenna
dc.contributor.author: Conway, Christopher
dc.contributor.author: Ellefson, Michelle
dc.contributor.author: Lai, June
dc.contributor.author: Bocanegra, Bruno
dc.contributor.author: Christiansen, Morten
dc.date.accessioned: 2018-09-21T15:22:46Z
dc.date.available: 2018-09-21T15:22:46Z
dc.date.issued: 2018-11
dc.identifier.issn: 0364-0213
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/280658
dc.description.abstract: It has been suggested that external and/or internal limitations may paradoxically lead to superior learning, i.e., the concepts of 'starting small' and 'less is more' (Elman, 1993; Newport, 1990). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b we found a beneficial effect of starting small using two types of simple recursive grammars: right-branching and center-embedding, with recursive embedded clauses in fixed positions and of fixed length. This effect was replicated in Experiment 2 (N = 100). In Experiments 3 and 4, we used a more complex center-embedded grammar with recursive loops in variable positions, producing strings of variable length. When participants were presented with an incremental ordering of training stimuli, as in natural language, they were better able to generalize their knowledge of simple units to more complex units when the training input 'grew' according to structural complexity, compared to when it 'grew' according to string length. Overall, the results suggest that starting small confers an advantage for learning complex center-embedded structures when the input is organized according to structural complexity.
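As an informal illustration of the contrast the abstract describes, the sketch below generates strings from a right-branching and a center-embedded recursive grammar, then orders a training set either by embedding depth (structural complexity) or by surface string length. It is a minimal sketch, not the authors' stimuli: the category labels (a1/b1, ...), the optional filler element, and the depth range are illustrative assumptions.

```python
# Minimal sketch, not the authors' materials: labels (a1/b1, ...), the
# filler element 'x', and the depth range 1-3 are assumptions.
import random

PAIRS = [("a1", "b1"), ("a2", "b2"), ("a3", "b3")]  # dependent A-B word pairs

def right_branching(depth):
    """A1 B1 A2 B2 ...: each dependency closes before the next one opens."""
    out = []
    for _ in range(depth):
        a, b = random.choice(PAIRS)
        out += [a, b]
    return out

def center_embedded(depth):
    """A1 A2 ... B2 B1: dependencies are nested and close in reverse order."""
    opened = [random.choice(PAIRS) for _ in range(depth)]
    out = []
    for a, _ in opened:
        out.append(a)
        if random.random() < 0.5:   # optional filler, so string length varies
            out.append("x")         # independently of embedding depth
    return out + [b for _, b in reversed(opened)]

random.seed(1)
# Training set: (embedding depth, string) pairs with 1-3 levels of recursion.
training = [(d, center_embedded(d)) for d in (1, 2, 3) for _ in range(4)]

# 'Starting small' by structural complexity: input grows by embedding depth.
by_structure = sorted(training, key=lambda t: t[0])
# Comparison regime: input grows by surface string length instead.
by_length = sorted(training, key=lambda t: len(t[1]))

for depth, s in by_structure:
    print(depth, " ".join(s))
```

Because the optional filler makes length vary within each depth level, the two orderings genuinely differ, loosely mirroring the abstract's contrast between input that 'grows' by structural complexity and input that 'grows' by string length.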
dc.description.sponsorship: This research was supported in part by a grant from the Human Frontiers Science Program (grant RGP0177/2001-B) to MHC, and by the Netherlands Organization for Scientific Research (NWO) to FHP.
dc.publisher: Wiley-Blackwell
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Artificial Grammar Learning
dc.subject: Center Embedded Structures
dc.subject: Starting Small
dc.subject: Statistical Learning
dc.title: Under What Conditions Can Recursion Be Learned? Effects of Starting Small in Artificial Grammar Learning of Center-Embedded Structure
dc.type: Article
prism.publicationName: Cognitive Science
dc.identifier.doi: 10.17863/CAM.28024
dcterms.dateAccepted: 2018-07-24
rioxxterms.versionofrecord: 10.1111/cogs.12685
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2018-07-24
dc.contributor.orcid: Ellefson, Michelle [0000-0003-0407-9767]
dc.identifier.eissn: 1551-6709
rioxxterms.type: Journal Article/Review
cam.issuedOnline: 2018-09-27
cam.orpheus.success: Thu Jan 30 10:54:21 GMT 2020 - The item has an open VoR version.
rioxxterms.freetoread.startdate: 2100-01-01

