Google is training AI to design the next generation of chips
Researchers at Google have successfully trained a machine learning (ML) algorithm to design chip layouts far faster than human engineers can.
Their efforts, described in a recent paper in the journal Nature, have the potential to save “thousands of hours” of human effort by compressing months of work into mere hours.
What’s even more fascinating is that the method described in the paper isn’t restricted to the lab. The authors note that Google is already putting the research into practice, using it to design the upcoming version of its own Tensor Processing Unit (TPU) chips, which are optimized to handle AI workloads.
“We show that our method can generate chip floorplans that are comparable or superior to human experts in under six hours, whereas humans take months to produce acceptable floorplans for modern accelerators,” write the Google research scientists in their paper.
Chip floorplanning is the engineering task of designing the physical layout of a computer chip. It requires months of effort by engineers, and the authors note that despite five decades of research, it is the one aspect of chip design that has defied automation.
AI for AI
In the paper, the authors note that the automatically generated chip floorplans match or outperform human-created ones on all key metrics, such as power consumption, performance and chip area.
The researchers trained a reinforcement learning algorithm on a dataset of 10,000 chip floorplans. Each design was scored with a “reward” function based on different metrics, which helped the algorithm distinguish good floorplans from bad ones.
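To make the idea concrete, here is a minimal sketch of how a reward signal of that kind could score and rank competing floorplans. The metric names, weights and values below are illustrative assumptions for the sake of example, not Google’s actual implementation:

```python
# Illustrative sketch only: a toy reward function of the sort a
# reinforcement learning agent could use to rank chip floorplans.
# Metric names and weights here are hypothetical, not Google's code.

from dataclasses import dataclass

@dataclass
class FloorplanMetrics:
    wirelength: float  # estimated total wire length (lower is better)
    congestion: float  # routing congestion estimate (lower is better)
    density: float     # placement density penalty (lower is better)

def reward(m: FloorplanMetrics, w_cong: float = 0.5, w_dens: float = 0.5) -> float:
    """Collapse the raw metrics into one scalar the agent can maximize.

    Higher reward means a better floorplan; the weighted penalties
    trade off congestion and density against wire length.
    """
    return -(m.wirelength + w_cong * m.congestion + w_dens * m.density)

# Rank two candidate floorplans by their reward.
candidates = [
    FloorplanMetrics(wirelength=120.0, congestion=0.8, density=0.6),
    FloorplanMetrics(wirelength=105.0, congestion=1.1, density=0.7),
]
best = max(candidates, key=reward)
print(f"best candidate reward: {reward(best):.2f}")
```

During training, a scalar signal like this is what tells the agent which placement decisions led to better layouts, so it can gradually learn to prefer them.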
The Verge notes that the Nature editorial calls the research an “important achievement” that could have major implications for the chip industry.
Via The Verge