How Contextual Are Contextual Language Models?

Speaker: Sebastian Schuster

Location: Online

Date: Thursday, December 2, 2021

The interpretation of many sentences depends on context. This is particularly true for pragmatic interpretations, that is, interpretations that go beyond the literal meaning of sentences. For example, consider the sentence "The dog looks awfully happy." Out of context, its interpretation primarily concerns some dog's emotional state. However, if this sentence is given as an answer to "What happened to the turkey?", listeners will additionally infer that the dog likely ate the turkey. In this talk, the speaker will present three studies that investigate to what extent contextual language models (as well as models built on top of them) are able to take context into account when drawing different pragmatic inferences. He will focus on context-dependent scalar inferences (inferring that "some" sometimes, but not always, means "some but not all"), presuppositions, and the tracking of discourse entities, and explain to what extent models like BERT, DeBERTa, and GPT-3 can interpret context-sensitive sentences.