Google unveils 4th generation TPU chips for faster machine learning

At its I/O conference tomorrow, Google will unveil a preview of Google Cloud’s latest machine learning clusters, which not only aim for nine exaflops of peak performance, but do so using 90% carbon-free power. It will be the largest publicly accessible machine learning center in the world.

At the heart of the new clusters is the TPU v4 pod. These tensor processing units were announced at Google I/O last year, and AI teams at Meta, LG, and Salesforce have already had access to the pods. TPU v4 allows researchers to use whichever framework they prefer, be it TensorFlow, JAX, or PyTorch, and has already enabled breakthroughs at Google Research in areas such as language understanding, computer vision, and speech recognition.

Housed in Google’s Oklahoma data center, the clusters are expected to handle similar workloads: analyzing data for natural language processing, computer vision algorithms, and recommender systems.

Tensor processing units in a Google data center

(Image credit: Google)

Access to the clusters is offered in slices, ranging from four chips (one TPU virtual machine) to thousands of them. Slices of at least 64 chips use a three-dimensional torus interconnect, providing higher bandwidth for collective communication operations. The v4 chips can also access twice as much memory as the previous generation – 32 GB instead of 16 – and deliver twice the acceleration when training large-scale models.
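The slice-level figures above can be sketched in a few lines. This is purely an illustration, not any official Google API: the 4-chips-per-VM ratio, the 32 GB of memory per chip, and the 64-chip torus threshold come from the article, and the function name `slice_summary` is our own invention.

```python
# Illustrative sketch only (not a Google API): derives slice-level figures
# from the numbers quoted in the article.

CHIPS_PER_VM = 4       # one TPU virtual machine = four chips (per the article)
HBM_PER_CHIP_GB = 32   # TPU v4: 32 GB per chip, double the previous generation's 16
TORUS_THRESHOLD = 64   # slices of at least 64 chips use the 3D torus interconnect


def slice_summary(num_chips: int) -> dict:
    """Summarize a hypothetical TPU v4 slice of `num_chips` chips."""
    if num_chips < CHIPS_PER_VM or num_chips % CHIPS_PER_VM != 0:
        raise ValueError("slice size must be a positive multiple of 4 chips")
    return {
        "vms": num_chips // CHIPS_PER_VM,
        "total_memory_gb": num_chips * HBM_PER_CHIP_GB,
        "torus_interconnect": num_chips >= TORUS_THRESHOLD,
    }


print(slice_summary(4))   # smallest slice: one VM, 128 GB, no torus
print(slice_summary(64))  # torus threshold: 16 VMs, 2,048 GB
```

Running it shows how quickly aggregate memory grows with slice size: the smallest four-chip slice already carries 128 GB, while a 64-chip slice reaches 2 TB and switches to the torus interconnect.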

“In order to make advanced AI hardware more accessible, we launched the TPU Research Cloud (TRC) program a few years ago, which provided free access to TPUs to thousands of ML enthusiasts around the world,” said Jeff Dean, SVP, Google Research and AI. “They’ve published hundreds of papers and open-source GitHub libraries on topics ranging from ‘writing Persian poetry with AI’ to ‘distinguishing between sleep and exercise-induced fatigue using computer vision and behavioral genetics.’ The launch of Cloud TPU v4 is a major milestone for Google Research and our TRC program, and we are very excited about our long-term collaboration with ML developers around the world to use AI for good.”

Google’s commitment to sustainability means the company has matched its data center energy consumption with renewable energy purchases since 2017, and by 2030 it aims to run all of its operations on carbon-free energy. The v4 TPU is also more power-efficient than previous generations, producing three times the FLOPS per watt of the v3 chip.

Access to Cloud TPU v4 pods is offered in evaluation (on-demand), preemptible, and committed-use discount (CUD) options, and is available to all Google Cloud users.
