Google Challenges Nvidia by Adjusting Its TPUs to Meta's PyTorch
By Reuters | 17 Dec, 2025
Google's TorchTPU initiative seeks to make its TPU chips a more broadly accessible rival to Nvidia's GPUs.
Alphabet's Google is working on a new initiative to make its artificial intelligence chips better at running PyTorch, the world’s most widely used AI software framework, in a move aimed at weakening Nvidia's longstanding dominance of the AI computing market, according to people familiar with the matter.
The effort is part of Google's aggressive plan to make its Tensor Processing Units a viable alternative to Nvidia's market-leading GPUs. TPU sales have become a crucial growth engine of Google's cloud revenue as it seeks to prove to investors that its AI investments are generating returns.
But hardware alone is not enough to spur adoption. The new initiative, known internally as “TorchTPU,” aims to remove a key barrier that has slowed adoption of TPU chips by making them fully compatible and developer-friendly for customers who have already built their tech infrastructure using PyTorch software, the sources said. Google is also considering open-sourcing parts of the software to speed uptake among customers, some of the people said.
Compared with earlier attempts to support PyTorch on TPUs, Google has devoted more organizational focus, resources and strategic importance to TorchTPU, as demand grows from companies that want to adopt the chips but view the software stack as a bottleneck, the sources said.
PyTorch, an open-source project heavily supported by Meta Platforms, is one of the most widely used tools for developers who make AI models. In Silicon Valley, very few developers write every line of code that chips from Nvidia, Advanced Micro Devices or Google will actually execute.
Instead, those developers rely on tools like PyTorch, which is a collection of pre-written code libraries and frameworks that automate many common tasks in developing AI software. Originally released in 2016, PyTorch’s history has been closely tied to Nvidia’s development of CUDA, the software that some Wall Street analysts regard as the company’s strongest shield against competitors.
Nvidia’s engineers have spent years ensuring that software developed with PyTorch runs as fast and efficiently as possible on its chips. Google, by contrast, has long had its internal armies of software developers use a different code framework called Jax, and its TPU chips use a tool called XLA to make that code run efficiently. Much of Google’s own AI software stack and performance optimization has been built around Jax, widening the gap between how Google uses its chips and how customers want to use them.
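The division of labor described above — developers writing against a framework while hardware-specific software such as CUDA or XLA supplies the actual kernels — can be sketched in miniature. The code below is a conceptual illustration only, using invented names and a naive pure-Python "kernel"; it does not reflect PyTorch's, CUDA's, or XLA's real internals.

```python
# Conceptual sketch of framework/backend separation. All names here
# (BACKENDS, linear, the "cuda"/"xla" keys) are illustrative, not real APIs.

def matmul(a, b):
    """Naive pure-Python matrix multiply standing in for a hardware kernel."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

# Each backend would, in real life, supply its own optimized kernel:
# CUDA kernels on Nvidia GPUs, XLA-compiled programs on TPUs. Here both
# point at the same toy implementation.
BACKENDS = {
    "cuda": matmul,
    "xla": matmul,
}

def linear(x, weight, device="cuda"):
    """Framework-level op: user code never names a hardware kernel directly."""
    return BACKENDS[device](x, weight)

x = [[1.0, 2.0]]
w = [[3.0], [4.0]]
# The same user code is retargeted by a device string alone -- the kind of
# portability a PyTorch-on-TPU effort aims to deliver for real workloads.
print(linear(x, w, device="cuda"))  # [[11.0]]
print(linear(x, w, device="xla"))   # [[11.0]]
```

The point of the sketch is the last two lines: when the backend layer is complete and fast, switching hardware is a configuration change rather than a rewrite — which is the gap the article says TorchTPU is meant to close.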
A Google Cloud spokesperson did not comment on the specifics of the project, but confirmed to Reuters that the move would provide customers with choice.
"We are seeing massive, accelerating demand for both our TPU and GPU infrastructure," the spokesperson said. "Our focus is providing the flexibility and scale developers need, regardless of the hardware they choose to build on."
TPU FOR CUSTOMERS
Alphabet had long reserved the lion’s share of its own chips, or TPUs, for in-house use only. That changed in 2022, when Google’s cloud computing unit successfully lobbied to oversee the group that sells TPUs. The move drastically increased Google Cloud's allocation of TPUs and as customers' interest in AI has grown, Google has sought to capitalize by ramping up production and sales of TPUs to external customers.
But most of the world's AI developers work in PyTorch, while Google's chips are currently most finely tuned to run Jax. That mismatch means most developers cannot easily adopt Google's chips and get them to perform as well as Nvidia's without significant extra engineering work. Such work takes time and money in the fast-paced AI race.
If successful, Google's “TorchTPU” initiative could significantly reduce switching costs for companies that want alternatives to Nvidia’s GPUs. Nvidia’s dominance has been reinforced not only by its hardware but by its CUDA software ecosystem, which is deeply embedded in PyTorch and has become the default method by which companies train and run large AI models.
Enterprise customers have been telling Google that TPUs are harder to adopt for AI workloads because they historically required developers to switch to Jax, a machine-learning framework favored internally at Google, rather than PyTorch, which most AI developers already use, the sources said.
JOINT EFFORTS WITH META
To speed development, Google is working closely with Meta, the creator and steward of PyTorch, according to the sources. The two tech giants have been discussing deals for Meta to access more TPUs, a move first reported by The Information.
Early offerings for Meta were structured as Google-managed services, in which customers such as Meta installed Google's chips — designed to run Google's own software and models — with Google providing operational support. Meta has a strategic interest in software that makes TPUs easier to run, in a bid to lower inference costs and diversify its AI infrastructure away from Nvidia's GPUs to gain negotiating power, the people said.
Meta declined to comment.
This year, Google has begun selling TPUs directly into customers’ data centers rather than limiting access to its own cloud. Amin Vahdat, a Google veteran, was named head of AI infrastructure this month, reporting directly to CEO Sundar Pichai.
Google needs that infrastructure both to run its own AI products, including the Gemini chatbot and AI-powered search, and to supply customers of Google Cloud, which sells access to TPUs to companies such as Anthropic.
(Reporting by Krystal Hu, Kenrick Cai and Stephen Nellis in San Francisco; Editing by Kenneth Li and Matthew Lewis)
