Paper

Do speakers minimize dependency length during naturalistic dialogue?

Authors
  • Meghna Hooda (Indian Institute of Technology Delhi)
  • Mudafia Zafar (Indian Institute of Technology Delhi)
  • Samar Husain (Indian Institute of Technology Delhi)

Abstract

Dependency Length Minimization (DLM) is considered to be a linguistic universal governing word order variation cross-linguistically. However, evidence for DLM from large-scale corpus work is typically based on written (news) corpora, and its effect on sentence production during naturalistic dialogue is largely unknown. Furthermore, Subject-Object-Verb (SOV) languages are known to show a weaker preference for DLM. In this work, we test the validity of DLM using a dialogue corpus of Hindi, an SOV language. We also undertake a quantitative analysis of various syntactic phenomena that lead to DLM and compare the effect of DLM across the spoken and written modalities. Results provide novel evidence supporting a robust effect of DLM in the spoken corpus. At the same time, compared to the written data, DLM was found to be weaker in dialogue. We discuss the implications of these findings for sentence production and for methodological issues with regard to the use of corpus data to investigate DLM.

Keywords: dependency length minimization, dialogue, word order, SOV

How to Cite:

Hooda, M., Zafar, M., & Husain, S. (2024). “Do speakers minimize dependency length during naturalistic dialogue?” Society for Computation in Linguistics 7(1), 139–149. doi: https://doi.org/10.7275/scil.2138

Published on
24 Jun 2024
Peer Reviewed