Linguistics Colloquium: Kyle Gorman, Assistant Professor, Linguistics, The Graduate Center, CUNY

OCT 08, 2020 | 4:15 PM TO 6:00 PM

Details

WHERE: Online Event
WHEN: October 08, 2020, 4:15 PM–6:00 PM
ADMISSION: Free

Description

Kyle Gorman (The Graduate Center, CUNY) presents: 

Abstractness, arbitrariness, and productivity in Polish declension

Two aspects of Polish declension have received a great deal of attention: 1) unpredictable suppletive allomorphy in the masculine gen.sg. suffix (e.g., gen.sg. wołu 'ox' vs. gen.sg. słoika 'jar') and 2) yers, which predictably alternate with zero (e.g., nom.sg. ocet 'vinegar', gen.sg. octu) but which contrast with otherwise-identical non-alternating mid vowels (e.g., nom.sg. facet 'guy, fellow' vs. gen.sg. faceta). These seemingly lexical patterns, which have been studied formally, experimentally, and computationally, raise familiar questions about the abstractness, arbitrariness, and productivity of morphophonological grammar.

In the first portion of the talk I introduce the Tolerance model of productivity and apply it to the problem of -a/-u allomorphy. The Tolerance model makes two testable predictions, both borne out in speech data: first, somewhat counterintuitively, that children acquiring Polish make few errors involving overgeneralization of the masculine gen.sg. suffixes, and second, that for adult speakers some masculine nouns will exhibit gen.sg. paradigm gaps.
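
To make the model's logic concrete, here is a minimal sketch of the Tolerance Principle calculation the model rests on (Yang's threshold: a rule generalizing over N items stays productive only if it has at most N / ln N exceptions). The noun counts below are hypothetical, chosen purely for illustration; they are not figures from the talk.

    import math

    def tolerance_threshold(n: int) -> float:
        """Yang's Tolerance Principle: a rule generalizing over n items
        tolerates at most n / ln(n) exceptions while staying productive."""
        return n / math.log(n)

    def is_productive(n_items: int, n_exceptions: int) -> bool:
        """True if the rule is predicted to be productive under the threshold."""
        return n_exceptions <= tolerance_threshold(n_items)

    # Hypothetical counts, for illustration only: suppose a learner knows
    # 500 masculine nouns, of which 230 take gen.sg. -u rather than -a.
    n_nouns = 500
    n_taking_u = 230

    print(round(tolerance_threshold(n_nouns), 1))  # ~80.5 tolerated exceptions
    print(is_productive(n_nouns, n_taking_u))      # False: -a is not a productive default here

On counts like these neither suffix could serve as a productive default, which is exactly the situation in which a learner is predicted to generalize cautiously (few overgeneralization errors) and in which paradigm gaps can arise.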

In the second portion of the talk I analyze the performance of neural network models trained to inflect Polish nouns. These models include the two top-performing systems from the 2017 CoNLL-SIGMORPHON shared task on morphological reinflection as well as newer models trained on newly collected Polish data. I explain why this task is important for speech and language technology, and show how error analysis can be automated using finite-state transducers. The Tolerance model of productivity and the analysis of yers proposed by Rubach (2013, 2016, etc.) suggest that it should be impossible to accurately predict, for example, the gen.sg. form of a Polish noun from its nom.sg. citation form. Is this correct, or can modern neural nets somehow transcend these limitations? Stay tuned!
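
The talk describes automating this error analysis with finite-state transducers; as a rough illustration of the idea only, the plain-Python sketch below buckets mismatches between gold and predicted gen.sg. forms into coarse categories. The categories, the classifier, and the "predicted" forms are invented for illustration and are not the analysis or the systems evaluated in the talk.

    def categorize(gold: str, predicted: str) -> str:
        """Assign a coarse error category to a (gold, predicted) gen.sg. pair."""
        if predicted == gold:
            return "correct"
        # Wrong -a/-u allomorph: the stems match and only the suffix vowel differs.
        if gold[:-1] == predicted[:-1] and {gold[-1], predicted[-1]} == {"a", "u"}:
            return "wrong allomorph"
        # Yer errors: the two forms differ by a single stem-internal 'e'.
        if predicted.replace("e", "", 1) == gold:
            return "yer error: alternating vowel not deleted"
        if gold.replace("e", "", 1) == predicted:
            return "yer error: stable vowel wrongly deleted"
        return "other"

    # Toy evaluation pairs: the gold forms come from the abstract, the
    # predictions are made up to trigger each category.
    examples = [
        ("octu", "octu"),     # ocet 'vinegar': correct
        ("octu", "ocetu"),    # yer not deleted
        ("faceta", "facta"),  # stable vowel wrongly deleted
        ("wołu", "woła"),     # wrong -a/-u allomorph
    ]
    for gold, predicted in examples:
        print(f"{gold} vs. {predicted}: {categorize(gold, predicted)}")

An FST-based version would express the same checks as composition with small transducers, which scales more gracefully to patterns involving more than one edit.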

Register online now »