Neural network learning of the Russian genitive of negation: optionality and structure sensitivity

Authors
  • Natalia Talmina (Johns Hopkins University)
  • Tal Linzen (Johns Hopkins University)

Abstract

A number of recent studies have investigated the ability of language models (specifically, neural network language models without syntactic supervision) to capture syntactic dependencies. In this paper, we contribute to this line of work by investigating whether neural networks can learn the Russian genitive of negation. The genitive case can optionally mark direct objects of negated verbs, but it is obligatory in the existential copula construction under negation. We find that the recurrent neural network language model we tested can learn this grammaticality pattern, although it is unclear whether it learns the locality constraint on genitive objects. Our results further provide evidence that RNN models can distinguish between optionality and obligatoriness.
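The abstract describes the standard minimal-pair probing setup used in this line of work: a language model that has learned a grammatical pattern should assign higher probability to the grammatical member of a minimal pair than to its ungrammatical counterpart. The paper does not include code here; the sketch below is an illustrative reconstruction of that setup, not the authors' implementation. The toy LSTM is untrained (standing in for a model trained on Russian text), and the transliterated minimal pair for the negated existential copula, where genitive is obligatory, is invented for illustration.

# Minimal sketch of minimal-pair probing with an LSTM language model.
# All names and example sentences are hypothetical, not from the paper.
import torch
import torch.nn as nn

class TinyLSTMLM(nn.Module):
    """Illustrative word-level LSTM language model (untrained here)."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, ids):
        h, _ = self.lstm(self.embed(ids))
        return self.out(h)  # logits over the next-word distribution

def sentence_logprob(model, vocab, sentence):
    """Sum of log P(w_t | w_<t) over the sentence."""
    ids = torch.tensor([[vocab[w] for w in sentence.split()]])
    with torch.no_grad():
        logits = model(ids[:, :-1])                 # predict each next word
        logps = torch.log_softmax(logits, dim=-1)
        targets = ids[:, 1:]                        # the words actually observed
        return logps.gather(-1, targets.unsqueeze(-1)).sum().item()

# Hypothetical minimal pair (transliterated): negated existential copula,
# where the genitive is obligatory and the nominative is ungrammatical.
gen = "<s> otveta ne bylo </s>"  # 'answer-GEN not was' -- grammatical
nom = "<s> otvet ne bylo </s>"   # 'answer-NOM not was' -- ungrammatical

words = sorted(set((gen + " " + nom).split()))
vocab = {w: i for i, w in enumerate(words)}
model = TinyLSTMLM(len(vocab))  # in practice, a model trained on Russian text

# A model that has learned the pattern should satisfy this inequality.
print(sentence_logprob(model, vocab, gen) > sentence_logprob(model, vocab, nom))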

Keywords: recurrent neural networks, syntactic dependencies, Russian, genitive of negation

How to Cite:

Talmina, N. & Linzen, T. (2020) "Neural network learning of the Russian genitive of negation: optionality and structure sensitivity", Society for Computation in Linguistics 3(1), 199-208. doi: https://doi.org/10.7275/z7np-fx81


Published on
01 Jan 2020