Google has traditionally partnered with Broadcom to design its Tensor Processing Units (TPUs), the specialized AI accelerator chips used across its data centers and cloud computing services. These chips are deployed throughout parent company Alphabet's infrastructure and are distinct from the Tensor G-series processors found in Pixel phones. According to a new report, however, this is about to change: the tech giant is said to be switching from Broadcom to MediaTek for the design of its seventh-generation TPU, likely in an effort to improve efficiency and cut costs.
Google is now moving from Broadcom to MediaTek to design its seventh-gen AI chip
According to a report from The Information, cited by Reuters, Google is set to collaborate with Taiwan's MediaTek on the next generation of its TPUs, which are expected to enter production next year. The move is likely driven by MediaTek's strong ties with TSMC and its ability to offer more cost-effective production than Broadcom. Broadcom has so far been the tech giant's exclusive TPU design partner, and moving away from it could be part of Google's broader effort to reduce reliance on third-party chipmakers such as NVIDIA for AI computing.
Despite reports that Google is shifting to a new design partner in MediaTek, it is worth noting that the company is not cutting ties with Broadcom entirely and is expected to continue relying on it for certain designs and during the transition. The strategic shift toward MediaTek is driven primarily by the chipmaker's ability to negotiate better manufacturing costs than Broadcom, thanks to its strong relationship with TSMC, the world's leading chip foundry. Google reportedly spent up to $9 billion on TPUs last year, so shaving even a fraction of the per-chip cost could translate into billions in savings.
The switch would also serve Google's broader goal of gaining more control over chip design and reducing reliance on third-party solutions. By partnering with MediaTek, the company could retain greater influence over the TPU architecture and play a larger role in designing its own silicon. Rivals such as OpenAI and Meta, by contrast, rely heavily on NVIDIA's GPUs to train and run their AI models, which could leave them at a major disadvantage in the event of supply shortages.
Google's TPU strategy already puts it in a stronger position thanks to its largely self-sufficient AI hardware ecosystem, and the improved efficiency and lower costs that could come from handing part of the design work to MediaTek would give the company an even bigger edge over the competition. By investing heavily in its own AI chips and diversifying its design partners, Google is hedging against potential supply bottlenecks and strengthening its AI infrastructure.