
Using Word Embeddings to Uncover Discourses

Authors
  • Quentin Dénigot (Université de Paris, Paris, France)
  • Heather Burnett (CNRS)

Abstract

Word embeddings are generally trained on very large corpora to ensure their reliability and to perform better on specific sets of tasks. Critical Discourse Analysis usually studies corpora of much more modest sizes, but could make use of the word similarity ratings that word embeddings can provide. In this paper, we explore the possibility of using word embeddings on these smaller corpora and see how the results we obtain can be interpreted when synchronically analysing corpora from different groups.
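The word similarity ratings mentioned above are typically computed as the cosine similarity between embedding vectors. As a minimal sketch (the vectors below are made up for illustration; real embeddings are learned from corpus co-occurrence statistics, and the words chosen are hypothetical):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors,
    ranging from -1 (opposite) to 1 (identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 4-dimensional embeddings, for illustration only.
embeddings = {
    "worker":   [0.9, 0.1, 0.3, 0.2],
    "employee": [0.8, 0.2, 0.4, 0.1],
    "banana":   [0.1, 0.9, 0.0, 0.7],
}

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["worker"], embeddings["employee"]))
print(cosine_similarity(embeddings["worker"], embeddings["banana"]))
```

On a small, discourse-specific corpus, such similarity scores reflect how a particular group uses a word, which is what makes them interesting for discourse analysis despite the reliability concerns raised above.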

Keywords: word embeddings, discourse analysis, semantic variation

How to Cite:

Dénigot, Q. & Burnett, H. (2021) “Using Word Embeddings to Uncover Discourses”, Society for Computation in Linguistics 4(1), 298–312. doi: https://doi.org/10.7275/t4y8-z343


Published on
01 Jan 2021