Events
NLP/Text-As-Data: How contextual are contextual language models?
Speaker: Sebastian Schuster
Location: 60 Fifth Avenue, 7th Floor Open Space
Date: Thursday, March 31, 2022
The interpretation of many sentences depends on context. This is particularly true for pragmatic interpretations, that is, interpretations that go beyond the literal meaning of sentences. For example, consider the sentence "The dog looks awfully happy." Out of context, its interpretation primarily concerns some dog's emotional state. However, if this sentence is given as an answer to "What happened to the turkey?", listeners will additionally infer that the dog likely ate the turkey. In this talk, I will present three studies that investigate to what extent contextual language models (as well as models based on contextual language models) are able to take context into account when drawing different pragmatic inferences. I will focus on context-dependent scalar inferences (inferring that "some" sometimes, but not always, means "some but not all"), presuppositions, and tracking of discourse entities, and explain to what extent models like BERT, DeBERTa, and GPT-3 can interpret context-sensitive sentences.
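As an informal illustration of the kind of context sensitivity discussed above, one can probe a masked language model such as BERT with and without the preceding question and compare its predictions. The sketch below is only illustrative and is not the speaker's experimental setup; it uses the Hugging Face transformers fill-mask pipeline with bert-base-uncased, and the specific prompts are assumptions chosen to mirror the dog/turkey example.

```python
# Illustrative sketch (not the speaker's actual method): probe whether a masked
# language model's predictions change when context is added.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The sentence out of context: interpretation mostly concerns the dog's state.
out_of_context = "The dog looks awfully [MASK]."

# The same sentence preceded by a question that, for human listeners, licenses
# the additional inference that the dog likely ate the turkey.
in_context = "What happened to the turkey? The dog looks awfully [MASK]."

for text in (out_of_context, in_context):
    print(text)
    # Compare the top-ranked fillers and their probabilities in both settings.
    for pred in fill(text, top_k=3):
        print(f"  {pred['token_str']:>10}  p={pred['score']:.3f}")
```

If the model were fully context-sensitive in the relevant way, the added question would shift the predicted fillers (and downstream inferences); the studies presented in the talk examine such behavior systematically for scalar inferences, presuppositions, and discourse entity tracking.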