Rob Matheson | MIT News Office

A new area in artificial intelligence involves using algorithms to automatically design machine-learning systems known as neural networks, producing designs that can be more accurate and efficient than those developed by human engineers. But this so-called neural architecture search (NAS) technique is computationally expensive.

In a paper presented at the International Conference on Learning Representations, MIT researchers describe a NAS algorithm that can directly learn specialized convolutional neural networks (CNNs) for target hardware platforms. When run on a massive image dataset, the algorithm completed its search in only 200 graphical processing unit (GPU) hours, compared with the 48,000 GPU hours required by other NAS algorithms, which could enable far broader use of these types of algorithms.
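For readers unfamiliar with the idea, the sketch below illustrates the general principle of hardware-aware architecture search with a toy random-search loop. The search space, cost proxy, and scoring rule here are invented purely for illustration; they are not the method described in the paper, which learns architectures directly and far more efficiently than sampling candidates at random.

```python
# Toy, illustrative random-search NAS loop (NOT the MIT algorithm).
# The search space, cost proxy, and scoring rule are assumptions for demonstration only.
import random
import time
import torch
import torch.nn as nn

SEARCH_SPACE = {
    "depth": [2, 3, 4],        # number of conv blocks
    "channels": [16, 32, 64],  # width of each block
    "kernel": [3, 5],          # convolution kernel size
}

def build_cnn(depth, channels, kernel, num_classes=10):
    """Assemble a small CNN from one sampled architecture description."""
    layers, in_ch = [], 3
    for _ in range(depth):
        layers += [
            nn.Conv2d(in_ch, channels, kernel, padding=kernel // 2),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        ]
        in_ch = channels
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, num_classes)]
    return nn.Sequential(*layers)

def hardware_cost(model, input_size=(1, 3, 32, 32)):
    """Crude stand-in for a per-device latency measurement: one timed forward pass."""
    x = torch.randn(*input_size)
    start = time.perf_counter()
    with torch.no_grad():
        model(x)
    return time.perf_counter() - start

best = None
for _ in range(10):  # tiny budget; a real search evaluates far more candidates
    arch = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
    model = build_cnn(**arch)
    params = sum(p.numel() for p in model.parameters())
    cost = hardware_cost(model)
    # Placeholder objective: in practice this would be validation accuracy
    # traded off against latency measured on the target hardware.
    score = -params * 1e-6 - cost * 100
    if best is None or score > best[0]:
        best = (score, arch)

print("best architecture found:", best[1])
```

Even this naive loop makes the cost problem visible: every candidate must be built and evaluated, which is why exhaustive or sampling-based searches can consume tens of thousands of GPU hours, and why a method that learns the specialized architecture directly is significant.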

Complete article from MIT News.