Linguistics Colloquium: Kyle Gorman, Assistant Professor, Linguistics, The Graduate Center, CUNY

OCT 08, 2020 | 4:15 PM TO 6:00 PM



Online Event


Kyle Gorman (The Graduate Center, CUNY) presents: 

Abstractness, arbitrariness, and productivity in Polish declension

Two aspects of Polish declension have received a great deal of attention: 1) unpredictable suppletive allomorphy in the masculine suffix (e.g., wołu 'ox' vs. słoika 'jar') and 2) yers, vowels which predictably alternate with zero (e.g., ocet 'vinegar' vs. octu) but which contrast with otherwise-identical non-alternating mid vowels (e.g., facet 'guy, fellow' vs. faceta). These seemingly lexical patterns, which have been studied formally, experimentally, and computationally, raise familiar questions about the abstractness, arbitrariness, and productivity of morphophonological grammar.

In the first portion of the talk I introduce the Tolerance model of productivity and apply it to the problem of -a/-u allomorphy. The Tolerance model makes two testable predictions, both borne out in speech data: first, that, somewhat counterintuitively, children acquiring Polish make few errors involving overgeneralization of the masculine suffixes; and second, that, for adult speakers, some masculine nouns will exhibit paradigm gaps.
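The Tolerance model builds on Yang's Tolerance Principle, under which a rule applying to N lexical items remains productive only if the number of exceptions e satisfies e ≤ N / ln N. A minimal sketch of that threshold check (the noun and exception counts below are invented purely for illustration, not taken from the talk):

```python
import math

def tolerance_threshold(n: int) -> float:
    """Maximum number of exceptions a rule over n items can tolerate (N / ln N)."""
    return n / math.log(n)

def is_productive(n_items: int, n_exceptions: int) -> bool:
    """A rule is productive iff its exceptions fall at or below the threshold."""
    return n_exceptions <= tolerance_threshold(n_items)

# Hypothetical counts: a suffix attested on 500 nouns, 120 of which are exceptions.
print(round(tolerance_threshold(500), 1))  # 500 / ln 500 ≈ 80.5
print(is_productive(500, 120))             # False: too many exceptions, not productive
```

On this view, a rule with too many exceptions fails the threshold and learners store its forms item by item rather than generalizing, which is what makes paradigm gaps and conservative child productions predictable.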

In the second portion of the talk I analyze the performance of neural network models trained to inflect Polish nouns. These models include the two top-performing systems from the 2017 CoNLL-SIGMORPHON shared task on morphological reinflection as well as newer models trained on newly collected Polish data. I explain why this task is important for speech and language technology, and show how error analysis can be automated using finite-state transducers. The Tolerance model of productivity and the analysis of yers proposed by Rubach (2013, 2016, etc.) suggest that it should be impossible to accurately predict, for example, the inflected form of a Polish noun from its citation form alone. Is this correct, or can modern neural nets somehow transcend these limitations? Stay tuned!

Register online now »