Collaborative, Communal, & Continual Machine Learning
Speaker: Colin Raffel
Location: 60 Fifth Avenue, Room C15
Videoconference link: https://nyu.zoom.us/j/92163375439
Date: Thursday, March 16, 2023
Pre-trained models have become a cornerstone of machine learning because they can provide improved performance on downstream tasks with less labeled data. However, these models are typically created by resource-rich research groups that unilaterally decide how a given model should be built, trained, and released, after which the model is never updated. In contrast, open-source development has demonstrated that a community of contributors can work together to iteratively build complex and widely used software. This kind of large-scale distributed collaboration is made possible by a mature set of tools, including version control and package management. In this talk, I will discuss a research focus in my group that aims to make it possible to build machine learning models the way open-source software is developed. Specifically, I will discuss our preliminary work on merging multiple models while retaining their individual capabilities, patching models with cheaply communicable updates, designing modular model architectures, and tracking changes through a version control system for model parameters. I will conclude with an outlook on how the field will change once truly collaborative, communal, and continual machine learning is possible.