Gemini 3 gives Google a boost in the AI race against OpenAI and Nvidia

Google has added another twist to the fast-changing AI race. And its biggest competitors are taking notice.

“We are pleased with Google’s success – they have made great strides in AI and we continue to supply Google,” Nvidia wrote in a Nov. 25 post on X.

OpenAI CEO Sam Altman also wrote on X, “Congratulations to Google on Gemini 3! It looks like a great model.”

These posts come just days after a growing buzz about Google’s Gemini 3 model and the Google-made chips that help power it. Marc Benioff, CEO of Salesforce, wrote on X that he is not going back to ChatGPT after trying out Google’s new model. He wrote, “The jump is insane – the logic, the speed, the images, the video… everything is faster and faster. It’s like the world has changed again.”

Meta is now said to be in talks with Google about buying its Tensor processing units, or TPUs, and Anthropic reportedly said in October that it plans to significantly expand its own use of Google’s technology.

Google shares were up nearly 8% last week, while Nvidia shares were down a little more than 2%.

There is more at stake than just bragging rights or a few sales contracts. As the tech industry claims AI will reshape the world — including the investment portfolios of everyone from billionaires to 401(k)-holding retirees — which company and which approach comes out on top could affect nearly every American.

At face value, Nvidia’s post says the company isn’t worried about Google encroaching on its territory. And for good reason – Google’s chips are fundamentally different from Nvidia’s offerings, which means they aren’t one-to-one substitutes.

But the fact that OpenAI and Nvidia felt the need to acknowledge Google at all is telling.

“Right now they’re the leader, let’s call it, until someone else comes out with the next model,” CFRA senior vice president and technology chief Angelo Zino told CNN.

Google and Meta did not immediately respond to requests for comment. Nvidia declined to comment.

Google is hardly an AI underdog. Along with ChatGPT, Gemini is one of the world’s most popular AI chatbots, and Google is one of the few large cloud providers known as a “hyperscaler,” a term for a handful of tech giants that rent massive amounts of cloud-based computing resources to other companies. Google services like search and translation have used AI since the early 2000s.

Still, Google was left largely behind when OpenAI’s ChatGPT arrived in 2022. According to The New York Times, Google management issued a “code red” in December 2022 following the overnight success of ChatGPT. According to its creator, OpenAI, ChatGPT now has at least 800 million weekly active users, while Google’s Gemini app has 650 million monthly active users.

Attendees try out new Gemini AI model features at the Made by Google event in Mountain View, California on August 13, 2024.

But Gemini 3, which debuted on November 18, now tops benchmark leaderboards for tasks like text generation, image editing, image processing and converting text to images, putting it ahead of rivals like ChatGPT, xAI’s Grok and Anthropic’s Claude in those categories.

Google said more than one million users tried Gemini 3 in the first 24 hours, through both the company’s AI coding program and tools that allow digital services to connect to other apps.

But people use different AI models for different purposes, says Ben Barringer, global head of technology research at investment firm Quilter Cheviot. For example, xAI’s and Perplexity’s models ranked higher than Gemini 3 in search performance in benchmark tests.

“This doesn’t mean that (Google parent) Alphabet is going to be everything when it comes to AI,” Zino said. “They are another part of this AI ecosystem that continues to grow larger.”

Google started making its own Tensor processing units long before the recent AI boom. But Nvidia still dominates in AI chips: the company reported a 62% year-on-year sales increase in the October quarter, with profits rising 65% from a year earlier.

The main reason is that Nvidia’s chips are powerful and can be used more widely. Nvidia and its main rival, AMD, specialize in chips known as graphics processing units, or GPUs, which can perform large numbers of complex calculations quickly.

Google’s Tensor chips are application-specific integrated circuits, or ASICs — chips that are custom-built for specific purposes.

Components of the Nvidia Corp. GB300 GPU on display during the Hon Hai Tech Day conference in Taipei, Taiwan, on Friday, Nov. 21, 2025.

While both GPUs and Google’s chips can be used for training and running AI models, ASICs are typically designed for “narrower workloads” than GPUs, Jacob Feldgoise, senior data research analyst at Georgetown University’s Center for Security and Emerging Technology, told CNN in an email.

In addition to differences in the types of chips, Nvidia offers complete technology packages for use in data centers that include not only GPUs, but other critical components such as networking chips.

It also offers a software platform that lets developers customize their code so that their apps make better use of Nvidia’s chips — a major selling point that helps lock in long-term customers. Even Google is an Nvidia client.

“If you look at the magnitude of Nvidia’s offerings, no one can really touch them,” said Ted Mortenson, a technology sector strategist at Baird.

Chips like Google’s won’t replace Nvidia’s any time soon. But the increase in ASIC adoption, coupled with growing competition from AMD, may suggest companies are looking to reduce their reliance on Nvidia.

And Google won’t be Nvidia’s only AI chip competitor, said Quilter Cheviot’s Barringer, though it’s doubtful any challenger will overcome Nvidia’s dominance.

“I think it’s part of the balance,” he said.


