MosaicML

The goal of MosaicML is to make ML training efficient: improving neural network training with algorithmic methods that increase speed, boost quality, and reduce cost. Built by a mosaic of passionate technologists, including MIT principal investigator Michael Carbin, the team was brought together to discover new ways to make machine learning training better, faster, and more efficient in order to accelerate breakthroughs.

The MosaicML Composer is an open-source deep learning library purpose-built to make it easy to add algorithmic methods and compose them into novel recipes that speed up model training and improve model quality. The library includes 20 methods for computer vision and natural language processing, along with standard models, datasets, and benchmarks.
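To illustrate the composability idea, below is a minimal sketch of training a small PyTorch model with Composer's Trainer while enabling two of its algorithmic methods (BlurPool and LabelSmoothing). It assumes Composer's documented `Trainer`, `composer.algorithms`, and `composer.models.ComposerClassifier` interfaces; the toy CNN, dataset wiring, and hyperparameters are illustrative choices, not an official recipe, and exact argument names may differ across Composer versions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

from composer import Trainer
from composer.algorithms import BlurPool, LabelSmoothing
from composer.models import ComposerClassifier


# A tiny CNN for MNIST; any torch.nn.Module classifier would work here.
class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))


# Standard MNIST dataloader (illustrative dataset choice).
train_dataset = datasets.MNIST(
    "./data", train=True, download=True, transform=transforms.ToTensor()
)
train_dataloader = DataLoader(train_dataset, batch_size=128, shuffle=True)

# Wrap the plain PyTorch module so Composer can track loss and metrics.
model = ComposerClassifier(SmallCNN(), num_classes=10)

# Compose algorithmic methods into the training loop by passing them to the Trainer.
trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    max_duration="1ep",  # train for one epoch in this sketch
    algorithms=[
        BlurPool(),                      # anti-aliased pooling/strided convs
        LabelSmoothing(smoothing=0.1),   # smoothed targets for regularization
    ],
)
trainer.fit()
```

The key design point is that each method is an object that hooks into the training loop; adding or removing a technique is just editing the `algorithms` list rather than rewriting the loop itself.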

Detailed information is available from MosaicML.