Is Google The New Leader of the AI Race?


Google’s parent company, Alphabet, is very close to becoming the fourth company to join the $4 trillion market cap club (the current members crossing that threshold are Apple, Microsoft, and Nvidia).

It has arguably been a week of good news for the company, thanks to its AI efforts.

Fellow tech giant and major Nvidia customer Meta is considering supplying some of its data centers with Google chips, according to a report from The Information on Monday. The report claims the potentially billion-dollar deal would kick off in 2027, but Meta could also begin renting chips from Google Cloud as early as next year.

That news was preceded by the reveal of an exciting product. Last week, the tech giant released its latest AI model, Gemini 3, and announced some updates to its viral image generator Nano Banana Pro, both to considerable fanfare.

According to The Verge, Wei-Lin Chiang, co-founder and CTO of AI benchmarking firm LMArena, said that the release of Gemini 3 represents “much more than a leaderboard shuffle.”

Right now, two companies are generally seen as leading the AI industry. On the product side, there is OpenAI, whose ChatGPT has become almost synonymous with the term ‘AI chatbot’. On the hardware infrastructure side, there is Nvidia, the world’s number one supplier of the graphics processing units (GPUs) used to power AI.

But Google, a Silicon Valley giant with deep pockets and decades of institutional knowledge to leverage, is ready to put up a good fight on both fronts.

Many people on the Internet, including Salesforce CEO Marc Benioff, have claimed that Google’s Gemini 3 model is substantially superior to OpenAI’s ChatGPT.

From the outside, OpenAI is still the leading name in AI chatbots. But, according to a report in The New York Times, Nick Turley, the head of ChatGPT, told employees in October that the company was facing “the greatest competitive pressure it has ever seen.”

On the AI chips front, Nvidia is still in the lead, but Google could score a big win in its catch-up efforts if The Information’s report is true.

Nvidia’s GPUs are the AI chips of choice right now, but Google’s custom tensor processing units (TPUs) are at least providing some competition.

While GPUs are considered versatile, like a Swiss Army knife with the flexibility to accommodate a wide range of tasks, Google’s TPUs are specialized and considered more efficient for specific AI workloads. A TPU is a type of application-specific integrated circuit (ASIC). One industry expert told CNBC last week that he sees custom ASICs growing “faster than the GPU market over the next few years.”

In addition to GPUs purchased from Nvidia, Google has been using its own TPUs to power its cloud computing business for several years. The tech giant rents its TPUs to AI companies like Anthropic, which uses the chips to power its chatbot Claude alongside Nvidia GPUs and Amazon’s Trainium chips.

There’s no doubt that Meta would be a significant addition to that customer list, and perhaps give Google’s custom chips business a greater competitive edge in a market dominated by giants.
