    Google launches training and inference TPUs in latest shot at Nvidia

April 22, 2026


    After years of producing chips that can both train artificial intelligence models and handle inference work, Google is separating those tasks into distinct processors, its latest effort to take on Nvidia in AI hardware.

    Google said Wednesday that it’s making the change for the eighth generation of its tensor processing unit, or TPU. Both chips will become available later this year.

    “With the rise of AI agents, we determined the community would benefit from chips individually specialized to the needs of training and serving,” Amin Vahdat, a Google senior vice president and chief technologist for AI and infrastructure, said in a blog post.

    In March, Nvidia talked up forthcoming silicon that can enable models to rapidly respond to users’ questions, thanks to technology obtained in its $20 billion deal with chip startup Groq. Google is a large Nvidia customer, but offers TPUs as an alternative for companies that use its cloud services.

Most of the world’s top technology companies are pursuing custom semiconductor development for artificial intelligence, both to maximize efficiency and to build for specialized use cases. Apple has included neural engine AI components in its in-house iPhone chips for years. Microsoft announced a second-generation AI chip in January. Last week, Meta said it’s working with Broadcom to develop multiple versions of AI processors.

    Google was early to the trend. In 2015, the company started using processors it had designed for running AI models, and began renting them to cloud clients in 2018. Amazon Web Services announced the Inferentia chip for handling AI requests in 2018, and unveiled the Trainium processor for training AI models in 2020.

    DA Davidson analysts estimated in September that the TPU business, coupled with the Google DeepMind AI group, would be worth about $900 billion.

None of the tech giants are displacing Nvidia, and Google isn’t even comparing the performance of its new chips with those from the AI chip leader. Google did say the training chip delivers 2.8 times the performance of the seventh-generation Ironwood TPU, announced in November, at the same price, while the inference processor’s performance is 80% better.
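Since both new chips are priced the same as Ironwood, those multiples translate directly into performance-per-dollar gains. A quick sanity check on the article’s figures (these are Google’s cited numbers, not independent benchmarks):

```python
# Normalize Ironwood's performance and price to 1 and apply the cited figures.
ironwood_perf = 1.0
train_perf = 2.8 * ironwood_perf   # training chip: "2.8 times the performance"
infer_perf = 1.8 * ironwood_perf   # inference chip: "80% better"

# At an unchanged price, perf-per-dollar improves by the same factors.
price_ratio = 1.0
train_perf_per_dollar = train_perf / price_ratio
infer_perf_per_dollar = infer_perf / price_ratio

print(f"training: {train_perf_per_dollar:.1f}x perf/dollar vs. Ironwood")
print(f"inference: {infer_perf_per_dollar:.1f}x perf/dollar vs. Ironwood")
```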

    Nvidia said its upcoming Groq 3 LPU hardware will draw on large quantities of static random-access memory, or SRAM, which is used by Cerebras, an AI chipmaker that filed to go public earlier this month. Google’s new inference chip, dubbed TPU 8i, also relies on SRAM. Each chip contains 384 megabytes of SRAM, triple the amount in Ironwood.
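Taking the stated figures at face value, "triple the amount in Ironwood" implies the previous generation carried 128 megabytes of SRAM per chip. A trivial back-of-the-envelope check:

```python
# TPU 8i SRAM capacity as stated in the article, in megabytes.
tpu8i_sram_mb = 384

# "Triple the amount in Ironwood" implies the Ironwood figure below.
ironwood_sram_mb = tpu8i_sram_mb // 3
print(ironwood_sram_mb)  # 128
```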

    The architecture is designed “to deliver the massive throughput and low latency needed to concurrently run millions of agents cost-effectively,” Sundar Pichai, CEO of Google parent Alphabet, wrote in a blog post.

Adoption of Google’s AI chips is ramping up. Citadel Securities built quantitative research software that draws on Google’s TPUs, and all 17 U.S. Energy Department national laboratories use AI co-scientist software built on the chips, Google said. Anthropic has committed to using multiple gigawatts’ worth of Google TPUs.
