Information flow, artificial phonology and typology
- Adamantios Gafos (Universität Potsdam)
Abstract
In the context of Artificial Grammar Learning (AGL) experiments, it is possible to quantify how effectively a stimulus has conveyed information, and specifically the information the experimenter intended it to convey. At the most basic level, this can be done if one has access to the variability of independent responses to the same stimulus (or to subparts of the stimulus). The variability of these responses serves as an index of the amount of information that flows from the source of the stimulus to the perceiver. Quantifying information flow in this way, it is shown that, under conditions where participants learn a ‘natural’ but not an ‘unnatural’ rule, corresponding asymmetries arise in the associated entropic quantities.
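The abstract's core idea, using the variability of independent responses to the same stimulus as an index of information flow, can be illustrated with a minimal sketch. The snippet below is not the paper's actual analysis; it simply computes the Shannon entropy of a set of categorical responses, and the example data and condition labels (`natural`, `unnatural`) are hypothetical.

```python
from collections import Counter
from math import log2

def response_entropy(responses):
    """Shannon entropy (in bits) of categorical responses to one stimulus.
    Lower entropy means less response variability, i.e. more information
    successfully conveyed from stimulus to perceiver."""
    counts = Counter(responses)
    n = len(responses)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical responses from independent participants to the same stimulus
# under a 'natural' versus an 'unnatural' rule condition.
natural   = ["A", "A", "A", "B", "A", "A"]   # low variability
unnatural = ["A", "B", "C", "B", "A", "C"]   # high variability

print(response_entropy(natural))    # ~0.65 bits
print(response_entropy(unnatural))  # ~1.58 bits
```

Under this toy setup, the more variable responses in the 'unnatural' condition yield higher entropy, the kind of asymmetry in entropic quantities the abstract refers to.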
Keywords: artificial grammar learning, artificial phonology, information theory, entropy, memory, typology
How to Cite:
Gafos, A. (2021) “Information flow, artificial phonology and typology”, Society for Computation in Linguistics 4(1), 148–157. doi: https://doi.org/10.7275/6zx1-p517