Assistant Professor of Linguistics, Data Science & Computer Science
Co-PI, ML² Group & CILVR Lab
New York University
PhD 2016, Stanford NLP Group & Stanford Linguistics
I study artificial neural network models for natural language understanding, with a focus on building high-quality training and evaluation data and on applying these models to scientific questions in syntax and semantics.
I'm also generally sympathetic toward effective altruism, and I'm a member of Giving What We Can.
You're most likely to have encountered my group by way of our SNLI and MultiNLI datasets, our GLUE and SuperGLUE benchmark competitions, our jiant software toolkit, or our papers on topics like the inductive biases of language models or the viability of transfer learning for NLP.
If we haven't been in contact previously, please look through this FAQ before emailing me. You can reach me at email@example.com.
- I'll be presenting a position paper at NAACL, speaking at the ACL workshop on Benchmarking, and co-presenting a tutorial at EMNLP on crowdsourcing.
- My lab's software toolkit, jiant, just got a from-scratch rewrite, and we think it's a good starting point for most research on NLU, especially for the kinds of tasks that you see in benchmarks like GLUE, SuperGLUE, and XTREME. It builds on the popular Transformers and Datasets libraries.
- I joined the Department of Computer Science at NYU's Courant Institute as an associated faculty member, in addition to my primary appointment in Linguistics and Data Science.
- I'm pleased to announce that our paper on the ULMFit data set for zero-shot learning has been accepted at EMNLP and is available on arXiv.
- I'm designing a new course on Natural Language Understanding for the Information and Computer Science School at NYU. It'll be a hands-on course, using Python and NLTK. We'll cover all of the basics: word and sentence segmentation, part-of-speech tagging, syntactic parsing, and named entity recognition.
- Those last two news items aren't real. They were generated by GPT-3, conditioning on the preceding three news items.