CIRCL: Katherine Howitt (Computational Linguistics & Language Acquisition)

MAR 10, 2020 | 6:30 PM TO 8:30 PM

Details

WHERE: The Graduate Center, 365 Fifth Avenue
ROOM: 7102
WHEN: March 10, 2020, 6:30 PM to 8:30 PM
ADMISSION: Free

Description

Katherine Howitt will present on a topic in Computational Linguistics & Language Acquisition.

Gradual Syntactic Triggering: The Gradient Parameter Hypothesis

The principles and parameters (P&P) framework has provided a useful but limited way to model syntax acquisition. In this talk, we propose a re-conceptualization of the P&P framework. We argue that, in lieu of discrete parameter values, a parameter value exists on a gradient plane that encodes a learner’s confidence that a particular parametric structure licenses the utterances in the learner’s linguistic input. Crucially, this gradient parameter hypothesis obviates the need for default parameter values, which are effective from the perspective of linguistic learnability but lack empirical and theoretical consistency. We present findings from a computational implementation of a gradient P&P learner based on Sakas & Fodor (2012) and inspired by Yang (2002). The findings suggest that the gradient parameter hypothesis provides the basis for a viable alternative to existing computational language acquisition models in the classic P&P paradigm.
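To make the idea concrete, the following is a minimal, hypothetical sketch of a gradient parameter learner, not the presenters' actual implementation. It assumes each binary parameter carries a continuous weight in [0, 1] representing the learner's confidence in one setting, and that weights are nudged by a linear reward-penalty style update (in the spirit of Yang 2002) whenever a sampled grammar does or does not license an input utterance. The names `GradientParameterLearner` and `licenses` are illustrative placeholders.

```python
import random

class GradientParameterLearner:
    """Toy gradient P&P learner: confidence weights instead of discrete values."""

    def __init__(self, parameter_names, learning_rate=0.02):
        # Start every parameter at 0.5: no default setting is presupposed.
        self.weights = {p: 0.5 for p in parameter_names}
        self.rate = learning_rate

    def sample_grammar(self):
        # Sample a discrete grammar in proportion to current confidence.
        return {p: int(random.random() < w) for p, w in self.weights.items()}

    def update(self, utterance, licenses):
        # `licenses(grammar, utterance)` stands in for a parser that reports
        # whether the sampled grammar can analyze the utterance.
        grammar = self.sample_grammar()
        success = licenses(grammar, utterance)
        for p, value in grammar.items():
            # On success, move confidence toward the sampled value;
            # on failure, move it away. The weight stays within [0, 1].
            target = value if success else 1 - value
            self.weights[p] += self.rate * (target - self.weights[p])
```

In use, such a learner would be run over a corpus of input utterances, with the per-parameter weights gradually converging toward the settings that most reliably license the input, rather than flipping discretely between values.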