Paper

How many maximum entropy grammars are predicted by a constraint set when we ignore small differences among grammars?

Author
  • Giorgio Magri

Abstract

All constraint-based probabilistic phonological typologies considered in the recent literature consist of uncountably many different grammars. Yet, what if two grammars that differ only slightly are coarsely counted as a single grammar when assessing the finiteness of a probabilistic typology? This paper formalizes various notions of coarse identity between probabilistic grammars and corresponding notions of coarse finiteness. It then shows that typologies of maximum entropy grammars can remain stubbornly infinite even when their grammars are counted coarsely. A companion paper shows that typologies of noisy or stochastic harmonic grammars are instead always coarsely finite. Coarse finiteness thus provides further evidence that maximum entropy is a richer, less restrictive framework. A minimal sketch of the two notions at stake (a maximum entropy grammar's probability distribution, and a coarse identification of grammars whose distributions differ only slightly) is given below.
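The sketch below is only illustrative, not the paper's own formalization: the function names maxent_probabilities and coarsely_identical, the epsilon threshold, and the toy violation profiles are assumptions introduced here. It shows how a maximum entropy grammar maps constraint weights to candidate probabilities, and one hedged way two grammars with slightly different weights might be counted as coarsely identical.

```python
import math

def maxent_probabilities(weights, violations):
    """Map MaxEnt constraint weights to a probability distribution over
    the candidates of a single underlying form.

    weights    -- nonnegative constraint weights
    violations -- one list of violation counts per candidate
    """
    # Harmony of a candidate: negative weighted sum of its violations.
    harmonies = [-sum(w * v for w, v in zip(weights, cand)) for cand in violations]
    # MaxEnt: exponentiate harmonies and normalize (a softmax).
    exps = [math.exp(h) for h in harmonies]
    total = sum(exps)
    return [e / total for e in exps]

def coarsely_identical(probs_a, probs_b, epsilon=0.01):
    """One hypothetical notion of coarse identity: two grammars count as
    one if their candidate probabilities never differ by more than epsilon."""
    return all(abs(a - b) <= epsilon for a, b in zip(probs_a, probs_b))

# Two weight vectors that differ only slightly...
violations = [[1, 0], [0, 2]]            # two candidates, two constraints
p1 = maxent_probabilities([2.0, 1.0], violations)
p2 = maxent_probabilities([2.1, 1.0], violations)
# ...may or may not be coarsely identified, depending on epsilon.
print(p1, p2, coarsely_identical(p1, p2, epsilon=0.05))
```

The paper's question, in these illustrative terms, is whether counting grammars only up to such epsilon-level differences suffices to make the MaxEnt typology finite; the abstract reports that it need not.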

Keywords: maximum entropy grammars, noisy harmonic grammars

How to Cite:

Magri, G. (2024) “How many maximum entropy grammars are predicted by a constraint set when we ignore small differences among grammars?”, Society for Computation in Linguistics 7(1), 190–204. doi: https://doi.org/10.7275/scil.2143

Published on
24 Jun 2024
Peer Reviewed