Transformer Performance on Case in Balto-Slavic Languages

Author
  • Lorenss Martinsons (Yale University)

Abstract

Recent advances in natural language processing have focused evaluation primarily on English models. This research sheds light on four major yet understudied Balto-Slavic languages: Ukrainian, Russian, Lithuanian, and Latvian. Balto-Slavic languages feature rich morphological systems, including noun case marking, that pose challenges for natural language processing. We test six major multilingual transformer models on targeted case agreement constructions. The findings show that transformer models consistently employ a case agreement heuristic for short-range dependencies, that balanced training data has a significant impact on accuracy, and that certain model types, such as XMOD, improve cross-lingual syntactic processing. These findings underscore the need for continued rigorous evaluation of diverse languages to guide future model development.
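
The abstract describes testing multilingual transformers on targeted case agreement constructions. The sketch below illustrates one common way such minimal-pair evaluations are run; it is not the paper's released code. The model name, the Russian minimal pair, and the pseudo-log-likelihood scoring are illustrative assumptions.

```python
# Minimal-pair evaluation sketch (assumed setup, not the paper's code):
# score a grammatical vs. an ungrammatical case-marked sentence with a
# multilingual masked LM and check which one the model prefers.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL = "bert-base-multilingual-cased"  # assumed; any multilingual MLM works
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL).eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum the log-probability of each token when it is masked in turn."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    for i in range(1, len(ids) - 1):          # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return total

# Hypothetical Russian minimal pair: the verb "гордиться" governs the
# instrumental case, so the first sentence is grammatical, the second is not.
good = "Она гордится сыном."
bad = "Она гордится сын."
print(pseudo_log_likelihood(good) > pseudo_log_likelihood(bad))
```

A higher score for the grammatical member of the pair counts as a correct case-agreement judgment; aggregating such judgments over many constructions yields the per-language accuracies the abstract refers to.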

Keywords: NLP, transformer, Balto-Slavic, language, model, LLM, case, case assignment, case agreement, Ukrainian, Russian, Lithuanian, Latvian, BERT, XLM, XMOD, GPT2, GPT3, LLaMA 2

How to Cite:

Martinsons, L. (2024) “Transformer Performance on Case in Balto-Slavic Languages”, Society for Computation in Linguistics 7(1), 285–288. doi: https://doi.org/10.7275/scil.2163

Published on: 25 Jun 2024
Peer Reviewed