Will Knight | Wired Magazine
As language models get more complex, they also get more expensive to create and run. Some companies are locked out.

AI has spawned exciting breakthroughs in the past decade: programs that can beat humans at complex games, steer cars through city streets under certain conditions, respond to spoken commands, and write coherent text from a short prompt. Writing in particular relies on recent advances in computers' ability to parse and manipulate language. Those advances are largely the result of feeding the algorithms more example text to learn from, and giving them more chips with which to digest it. And that costs money.
One option comes from MosaicML, a startup spun out of MIT that is developing software techniques designed to make machine-learning training more efficient. The approach, developed by Michael Carbin, a professor at MIT, and Jonathan Frankle, one of his students, involves "pruning" a neural network to remove inefficiencies and produce a much smaller network capable of similar performance.
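MosaicML's specific methods aren't detailed here, but the core idea of pruning is straightforward: remove the weights that contribute least and keep a smaller network that performs about as well. Below is a minimal sketch using PyTorch's built-in pruning utilities; the toy model and the 30 percent sparsity level are illustrative assumptions, not MosaicML's actual recipe.

```python
# Magnitude-pruning sketch (illustrative; not MosaicML's actual method).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small stand-in for a trained model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Zero out the 30% of weights with the smallest L1 magnitude in each
# linear layer. The premise of pruning: much of a network's accuracy
# is carried by a fraction of its weights, so the rest can be dropped.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # fold the pruning mask into the weights

# Measure the resulting sparsity across all parameters.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.1%}")
```

In practice the pruned network is typically fine-tuned, or retrained from early weights as in Frankle and Carbin's lottery-ticket research, to recover accuracy; the payoff is a smaller model that is cheaper to train and run.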
Complete article from Wired.
Explore
AI Tool Generates High-Quality Images Faster Than State-of-the-Art Approaches
Adam Zewe | MIT News
Researchers fuse the best of two popular methods to create an image generator that uses less energy and can run locally on a laptop or smartphone.
Photonic Processor Could Enable Ultrafast AI Computations with Extreme Energy Efficiency
Adam Zewe | MIT News
This new device uses light to perform the key operations of a deep neural network on a chip, opening the door to high-speed processors that can learn in real time.
New Security Protocol Shields Data From Attackers During Cloud-based Computation
Adam Zewe | MIT News
The technique leverages quantum properties of light to guarantee security while preserving the accuracy of a deep-learning model.