
General Meeting - Laurel Perkins / Filtering input for grammar learning

[Photo: Laurel Perkins, seated at a table with a cup of tea]


Linguistics · Friday, September 9, 2022, 3:00–4:30 pm · Marie Mount Hall, 1108B

On Friday, September 9th, our department's General Meeting kicks off with a visit from Laurel Perkins *19, Assistant Professor of Linguistics at UCLA, who will discuss recent computational work with her colleague Tim Hunter *09 modeling the acquisition of basic word order given only immature syntactic representations. The abstract follows.


Filtering input for grammar learning

Perkins, Feldman and Lidz (2022) introduced a model for filtering a kind of internal 'noise' in early language acquisition – a mechanism that would allow young learners to draw accurate grammatical generalizations even when they have messy and immature representations of their input. Their test case was verb transitivity learning, i.e., identifying which verbs require direct objects. Tim Hunter and I are now working on scaling up this noise-filtering mechanism to model the learning of more complex grammatical rule systems, such as word order parameters.
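To give a feel for the kind of inference at stake, here is a deliberately simplified sketch – our own toy construction, not the Perkins, Feldman and Lidz model. The hypothesis space, the 50/50 baseline for alternating verbs, and the miss rate are all invented for illustration. The point is only the shape of the reasoning: a transitive verb can surface in the learner's parses without a visible object when the parse is immature, so a learner that budgets for a noise rate can still conclude the verb is transitive.

```python
import math

def transitivity_posterior(n_obj, n_no_obj, miss_rate=0.2, prior=0.5):
    """Posterior that a verb is obligatorily transitive, given how often it
    was parsed with vs. without a direct object. `miss_rate` is the assumed
    probability that a real object goes undetected (a free parameter here)."""
    # If truly transitive: every clause has an object, missed w.p. miss_rate.
    ll_transitive = n_obj * math.log(1 - miss_rate) + n_no_obj * math.log(miss_rate)
    # If alternating: objects genuinely optional; a 50/50 toy baseline.
    ll_alternating = (n_obj + n_no_obj) * math.log(0.5)
    # Bayes' rule over the two hypotheses.
    num = prior * math.exp(ll_transitive)
    return num / (num + (1 - prior) * math.exp(ll_alternating))

# A verb parsed 20 times with an object and 4 times apparently without:
print(round(transitivity_posterior(20, 4), 3))  # ~0.997
```

The objectless parses are absorbed by the noise term rather than counted as evidence that the verb alternates – the essence of filtering internal noise.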

In initial simulations on child-directed English and French, we demonstrate that learners can learn word order from strings of imperfectly identified noun phrases and verbs alone. Our modeled learner chooses among grammars that deterministically produce a canonical word order, while also considering probabilistic processes that introduce noise into the data. We find that successful learning does not require prosodic or semantic cues to sentence structure, but benefits from determinism in the learner's hypothesis space. We think this has intriguing implications for the mechanisms that drive regularization in learning, where tendencies to regularize emerge from the learner's expectation that its data are a noisy realization of a deterministic underlying system.
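To make the learning problem concrete, here is another toy sketch – again ours, not the Perkins and Hunter model; the three-grammar inventory, the N-dropping noise process, and the rates are assumptions for illustration. Each candidate grammar deterministically emits one canonical clause shape over the categories N and V, and a single noise process independently deletes each N; the learner scores grammars by the likelihood of the noisy corpus.

```python
import math
from collections import Counter

# Toy hypothesis space: each grammar deterministically emits one canonical
# transitive clause shape over N (noun phrase) and V (verb).
CANONICAL = {"SVO": "NVN", "SOV": "NNV", "VSO": "VNN"}

def string_probs(canonical, drop):
    """Distribution over observable strings when each N is independently
    missed with probability `drop` (the verb is always detected)."""
    n_pos = [i for i, c in enumerate(canonical) if c == "N"]
    probs = Counter()
    for mask in range(2 ** len(n_pos)):
        kept = {p for b, p in enumerate(n_pos) if (mask >> b) & 1}
        s = "".join(c for i, c in enumerate(canonical) if c == "V" or i in kept)
        probs[s] += (1 - drop) ** len(kept) * drop ** (len(n_pos) - len(kept))
    return probs

def grammar_scores(corpus, drop=0.3):
    """Log-likelihood of the corpus under each grammar (uniform prior)."""
    scores = {}
    for g, canon in CANONICAL.items():
        table = string_probs(canon, drop)
        scores[g] = sum(math.log(table.get(s, 1e-12)) for s in corpus)
    return scores

# Noisy pseudo-English: canonical NVN clauses plus strings with dropped Ns.
corpus = ["NVN"] * 60 + ["NV"] * 15 + ["VN"] * 20 + ["V"] * 5
scores = grammar_scores(corpus)
print(max(scores, key=scores.get))  # -> SVO
```

Note how the determinism of each hypothesis does the work: the SOV grammar assigns (essentially) zero probability to an observed "NVN" string, so even a modest number of clean canonical clauses decisively favors SVO, with the remaining garbled strings explained away as noise. This is the sense in which a deterministic hypothesis space plus a noise model can drive regularization.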
