Anthropic has announced an expansion of its use of Google Cloud’s TPU chips, giving the company access to well over a gigawatt of capacity coming online in 2026. The deal covers up to one million TPU chips, making it the largest expansion of Anthropic’s TPU usage to date.
In an official statement, Anthropic said the expansion was worth “tens of billions of dollars” but did not specify an exact figure. Meanwhile, in a press release, Google Cloud said the capacity and computing resources are required to train and serve the next generations of Claude models.
Krishna Rao, CFO, Anthropic, said, “Anthropic and Google have a longstanding partnership, and this latest expansion will help us continue to grow the compute power we need to define the frontier of AI.”
Thomas Kurian, CEO, Google Cloud, said, “We are continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh-generation TPU.”
Readers may recall that Anthropic and Google Cloud first announced a strategic partnership in 2023, with Anthropic using Google Cloud’s AI infrastructure to train its models and making them available to businesses through Google Cloud’s Vertex AI platform and the Google Cloud Marketplace.
Anthropic also shed light on its compute strategy, which takes a diversified approach across three chip platforms: Google’s TPUs, Amazon’s Trainium, and NVIDIA’s GPUs. It said this multi-platform approach ensures Anthropic can continue advancing Claude’s capabilities while maintaining strong partnerships across the industry.