Google to Announce AI Chips Designed for Inference Processing

1 hour ago | Artificial Intelligence


Jakarta, INTI - Google's artificial intelligence (AI) chips are now among the most sought-after commodities in the global technology industry, and several major AI developers have begun securing supplies of them. Google's parent company, Alphabet, is reportedly preparing to launch a new chip focused on inference, the stage at which an AI model is run after the training process is complete.

Google’s AI chips are projected to challenge other leading companies in the rapidly growing AI chip market. Google's Chief Scientist, Jeff Dean, stated that the growing need for high-speed AI processing makes chip specialization increasingly important, and that it now makes sense to build separate chips for training and for inference.

Google is scheduled to announce the next generation of Tensor Processing Units (TPU) at the Google Cloud Next event in Las Vegas this week. While the company has not revealed details of its latest inference chip, Google's Head of AI Infrastructure, Amin Vahdat, has signaled that a further announcement will be made soon.

Google AI Chip Development

Currently, Nvidia’s GPUs remain the standard for large-scale AI model training. However, competition is shifting to inference, where response speed is a crucial factor, especially for chatbots and AI agents.

Google has several advantages, such as over a decade of experience in chip design, substantial resource support from its search business, and close integration between AI development and hardware. Google also produces its own chips on a large scale.

Google’s long-standing investment in the TPU is another advantage: the chips are increasingly seen as important for supporting the computational needs of a new generation of AI agents capable of performing complex tasks automatically.

Demand for Google's chips took off last October, when Anthropic announced a major deal for access to up to one million TPUs. Shortly after, Google launched its Gemini 3 AI model, which received widespread positive feedback. Further demand came from Meta Platforms, which signed a deal to use TPUs through Google Cloud.

However, Google also faces structural challenges in chip development. Its production cycle can take three years, too slow for the fast pace of AI model development. To address this, Google is considering more flexible chip designs and developing several variants simultaneously.

Conclusion 

Google's AI chips are gaining traction in the global industry, including among its competitors. The company is preparing to launch a new chip focused on inference, designed to accelerate AI performance after training. This move has the potential to strengthen Google's position against Nvidia's dominance, especially as competition shifts toward inference workloads. With its extensive experience, tight integration of AI development and hardware, and in-house chip production, Google is considered to have an advantage, while interest in TPUs continues to grow to support the development of next-generation AI agents.

Read more: Google Updates AI Mode, Now Enables Simultaneous Web Search and AI Access

 

Indonesia Technology & Innovation