
Noise-Tolerant Learning as Selection Among Deterministic Grammatical Hypotheses

Authors
  • Laurel Perkins (University of California, Los Angeles)
  • Tim Hunter (University of California, Los Angeles)

Abstract

Children acquire their language's canonical word order from data that contains a messy mixture of canonical and non-canonical clause types. We model this as noise-tolerant learning of grammars that deterministically produce a single word order. In simulations on English and French, our model successfully separates the signal from the noise introduced by non-canonical clause types, identifying both languages as SVO. No such preference for the target word order emerges from a comparison model that operates with a fully gradient hypothesis space and an explicit numerical regularization bias. This suggests an alternative general mechanism for regularization across learning domains, whereby tendencies to regularize emerge from a learner's expectation that the data are a noisy realization of a deterministic underlying system.
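The core idea can be illustrated with a minimal sketch (not the authors' implementation; the noise model, noise-rate grid, and toy counts below are illustrative assumptions). Each hypothesis is a deterministic word order; observed clause orders are treated as a mixture of canonical clauses (probability 1 − ε) and noise spread uniformly over all six orders (probability ε). The learner selects the deterministic hypothesis that maximizes the likelihood of the noisy data:

```python
from itertools import permutations
from math import log

# All six logically possible orderings of Subject, Verb, Object.
ORDERS = ["".join(p) for p in permutations("SVO")]

def log_likelihood(counts, canonical, eps):
    """Log-likelihood of observed clause-order counts under one
    deterministic word-order hypothesis with noise rate eps."""
    ll = 0.0
    for order, n in counts.items():
        p_noise = eps / len(ORDERS)  # noise mass spread uniformly over orders
        p = (1 - eps) + p_noise if order == canonical else p_noise
        ll += n * log(p)
    return ll

def best_hypothesis(counts, eps_grid=(0.05, 0.1, 0.2, 0.3, 0.4)):
    """Pick the deterministic order that best explains the data,
    maximizing over a small grid of hypothetical noise rates."""
    return max(ORDERS,
               key=lambda o: max(log_likelihood(counts, o, e) for e in eps_grid))

# Toy English-like counts: mostly canonical SVO, plus non-canonical clause types.
counts = {"SVO": 70, "VSO": 12, "OSV": 8, "SOV": 6, "VOS": 3, "OVS": 1}
print(best_hypothesis(counts))  # SVO
```

The contrast with the comparison model falls out naturally: a gradient hypothesis space would instead fit a probability to each order and simply match the observed proportions, so no single order is ever singled out without an extra regularization bias.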

Keywords: language acquisition, probabilistic grammar, regularization

How to Cite:

Perkins, L. & Hunter, T. (2023) “Noise-Tolerant Learning as Selection Among Deterministic Grammatical Hypotheses”, Society for Computation in Linguistics 6(1), 186-198. doi: https://doi.org/10.7275/fjc8-qw28


Published on
01 Jun 2023