
Concurrent hidden structure & grammar learning

Author
  • Adeline Tan (University of California, Los Angeles)

Abstract

The concurrent learning of both unseen structures and grammar is an enduring problem in phonological acquisition. The present study develops a joint model of word-UR-SR triples that incorporates a Maximum Entropy model of SRs conditioned on URs. The learner was presented with word-SR frequencies, and it successfully learned the hidden structures and grammars that enabled it to generalize well on test data withheld during training. When given a choice between a grammar that supported a rich-base analysis and one that did not, the learner always acquired the grammar that supported rich bases. These results suggest that the preference for acquiring a rich-base grammar over a non-rich-base one is an emergent property of the proposed model.
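The Maximum Entropy model of SRs conditioned on URs mentioned above can be sketched as follows. This is a minimal illustration of the general MaxEnt grammar formalism, not the paper's implementation: the constraint names, violation counts, and weights are hypothetical.

```python
import math

def maxent_probs(candidates, weights):
    """P(SR | UR) is proportional to exp(-harmony), where harmony is the
    weighted sum of constraint violations incurred by each candidate SR."""
    harmonies = {sr: sum(w * v for w, v in zip(weights, viols))
                 for sr, viols in candidates.items()}
    z = sum(math.exp(-h) for h in harmonies.values())  # normalizing constant
    return {sr: math.exp(-h) / z for sr, h in harmonies.items()}

# Hypothetical UR /pad/ with two candidate SRs; each candidate lists its
# violations of two constraints, e.g. [*VoicedCoda, Ident(voice)].
candidates = {"pat": [0, 1],
              "pad": [1, 0]}
weights = [3.0, 1.0]  # illustrative weights, not learned values
probs = maxent_probs(candidates, weights)
```

With these weights, the devoiced candidate "pat" receives the higher probability; learning the weights (and, in the joint model, the hidden URs) from word-SR frequencies is the task the paper addresses.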

Keywords: richness of the base, hidden structure, UR learning, maximum entropy

How to Cite:

Tan, A. (2022) "Concurrent hidden structure & grammar learning", Society for Computation in Linguistics 5(1), 55-64. doi: https://doi.org/10.7275/fjh8-ne47


Published on
01 Feb 2022