Natural-Language-Like Systematicity from a Constraint on Excess Entropy

Author
  • Richard Futrell

Abstract

Human language is systematic: parts of form correspond regularly to components of meaning. For example, in the sentences I saw the cat, a cat ate food, etc., the part cat systematically refers to a particular aspect of meaning. Across languages, these parts are usually combined by concatenation. When they are not, the resulting string still usually has subsequences that correspond to components of meaning, and these parts remain fairly contiguous. We call this property locality. Here we argue that local systematicity in natural language arises from minimization of excess entropy, a measure of the complexity of incremental information processing.
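To make the abstract's central quantity concrete: excess entropy is the mutual information between a process's past and its future, equivalently the limit of H(X_1..n) − n·h, where h is the entropy rate. The following sketch (my illustration, not code from the paper) computes it exactly for a stationary first-order Markov chain, where the limit simplifies to H[X_1] − h; the transition matrix is an arbitrary toy example.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def excess_entropy_markov(P):
    """Excess entropy of a stationary first-order Markov chain.

    For an order-1 chain, H(X_1..n) = H(X_1) + (n-1)h, so
    E = lim_n [H(X_1..n) - n*h] = H(X_1) - h.
    """
    P = np.asarray(P, dtype=float)
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi = pi / pi.sum()
    # Entropy rate: average entropy of the next symbol given the current state.
    h = sum(pi[i] * entropy(P[i]) for i in range(len(pi)))
    return entropy(pi) - h

# Toy 2-state chain (hypothetical numbers, for illustration only).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
E = excess_entropy_markov(P)  # ~0.365 bits
```

A process with no temporal structure (i.i.d. symbols) has E = 0; higher E means more information must be carried forward during incremental processing, which is the cost the paper argues language minimizes.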

Keywords: information theory, information locality, systematicity, incremental processing

How to Cite:

Futrell, R. (2024) “Natural-Language-Like Systematicity from a Constraint on Excess Entropy”, Society for Computation in Linguistics 7(1), 336–335. doi: https://doi.org/10.7275/scil.2222

Published on
24 Jun 2024
Peer Reviewed