Extended Abstract

Can language models capture syntactic associations without surface cues? A case study of reflexive anaphor licensing in English control constructions

Authors
  • Soo-Hwan Lee (New York University)
  • Sebastian Schuster (New York University)

Abstract

We examine GPT-2 (Radford et al., 2019), a model trained only on surface strings, to test whether it makes correct predictions about the agreement patterns of reflexive anaphors in English control constructions. Our findings show that GPT-2 struggles with transitive subject control constructions but does well on transitive object control constructions. One reason might be that the model associates the anaphor with the closest noun phrase. Moreover, while we find that a model with a larger number of parameters shows higher accuracy on tasks involving subject control constructions, its performance remains below chance.

Keywords: control constructions, reflexive anaphor, GPT-2, long-distance dependencies

How to Cite:

Lee, S. & Schuster, S. (2022) “Can language models capture syntactic associations without surface cues? A case study of reflexive anaphor licensing in English control constructions”, Society for Computation in Linguistics 5(1), 206–211. doi: https://doi.org/10.7275/s1kt-qg26

Published on
01 Feb 2022