Will Knight | Wired Magazine

As language models get more complex, they also get more expensive to create and run. Some companies are locked out.

AI has spawned exciting breakthroughs in the past decade: programs that can beat humans at complex games, steer cars through city streets under certain conditions, respond to spoken commands, and write coherent text based on a short prompt. Writing in particular relies on recent advances in computers' ability to parse and manipulate language. Those advances are largely the result of feeding the algorithms more text as examples to learn from, and of giving them more chips with which to digest it. And that costs money.

One option comes from MosaicML, a startup spun out of MIT that is developing software techniques to make machine-learning training more efficient. The approach, developed by Michael Carbin, a professor at MIT, and Jonathan Frankle, one of his students, involves "pruning" a neural network to remove inefficiencies and create a much smaller network capable of similar performance.
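The article doesn't spell out how pruning works in practice, and the sketch below is not MosaicML's actual method. As a rough illustration, here is a minimal magnitude-pruning example using PyTorch's built-in torch.nn.utils.prune utilities; the toy model and the 50 percent sparsity level are assumptions chosen purely for demonstration.

```python
# Illustrative magnitude-pruning sketch (an assumption, not MosaicML's method).
# Requires PyTorch. The architecture and sparsity level are arbitrary choices.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small example network; in practice this would be a trained model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Zero out the 50% of weights with the smallest absolute values in each layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Make the pruning permanent by removing the masking bookkeeping.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")

# Report the resulting overall sparsity (biases are left untouched).
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Sparsity: {zeros / total:.1%}")
```

Magnitude pruning like this simply discards the smallest weights in one pass; Frankle and Carbin's research used a more careful iterative scheme, pruning a little at a time and re-training between rounds, which tends to preserve accuracy at much higher sparsity levels.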

Complete article from Wired.