MaxEnt Learners are Biased Against Giving Probability to Harmonically Bounded Candidates
- Charlie O'Hara (University of Michigan)
Abstract
One of the major differences between MaxEnt Harmonic Grammar (Goldwater and Johnson, 2003) and Noisy Harmonic Grammar (Boersma and Pater, 2016) is that in MaxEnt harmonically bounded candidates can receive some probability, whereas in most other constraint-based grammars they can never be output (Jesney, 2007). The probability given to harmonically bounded candidates is taken from other candidates, in some cases allowing MaxEnt to model grammars that subvert some of the implicational universals that hold in NoisyHG (Anttila and Magri, 2018). Magri (2018) argues that the types of implicational universals that remain valid in MaxEnt are phonologically implausible, suggesting that MaxEnt overgenerates relative to NoisyHG. However, recent work has shown that some of the grammars possible in a constraint-based framework may be unlikely to be observed because they are difficult to learn (Staubs, 2014; Stanton, 2016; Pater and Moreton, 2012; Hughto, 2019; O'Hara, 2021). Here, I show that grammars that give probability to harmonically bounded candidates are harder to learn than other grammars. Once learnability is taken into account, I argue that the typological predictions of MaxEnt and NoisyHG are in fact much more similar than they appear from the grammars alone.
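To illustrate the contrast the abstract draws, here is a minimal sketch (in Python, using hypothetical constraint names, weights, and violation counts) of how MaxEnt assigns probability via a softmax over harmonies, so that even a harmonically bounded candidate receives nonzero probability:

```python
import math

# Hypothetical tableau: two constraints with illustrative weights.
# Candidate "c" is harmonically bounded by "a": it incurs a superset
# of "a"'s violations, so no constraint weighting can prefer it.
weights = {"Con1": 2.0, "Con2": 1.0}
violations = {
    "a": {"Con1": 0, "Con2": 1},   # optimal under these weights
    "b": {"Con1": 1, "Con2": 0},
    "c": {"Con1": 1, "Con2": 1},   # harmonically bounded by "a"
}

def maxent_probs(weights, violations):
    """MaxEnt probabilities: softmax over harmonies, where
    harmony = -(weighted sum of constraint violations)."""
    harmonies = {
        cand: -sum(weights[c] * v for c, v in viols.items())
        for cand, viols in violations.items()
    }
    z = sum(math.exp(h) for h in harmonies.values())
    return {cand: math.exp(h) / z for cand, h in harmonies.items()}

print(maxent_probs(weights, violations))
# The bounded candidate "c" still receives some probability (~0.09 here),
# whereas in NoisyHG it could never be output.
```

The point of the sketch is only that the exponential-normalization step spreads probability over every candidate, including bounded ones; it is not a reproduction of the paper's simulations.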
Keywords: MaxEnt, computational phonology, phonological learning, learning bias, harmonically bounded, noisy harmonic grammar, typological overgeneration
How to Cite:
O'Hara, C. (2022) “MaxEnt Learners are Biased Against Giving Probability to Harmonically Bounded Candidates”, Society for Computation in Linguistics 5(1), 229-234. doi: https://doi.org/10.7275/sc8a-rf84