META Custom AI Inference Chips MTIA for a META AI Personal Assistant
In 2023, Meta revealed that it was designing AI inference accelerators in-house specifically for its own AI workloads, in particular the deep learning recommendation models that improve a variety of experiences across Meta products.
At yesterday's earnings call, Meta highlighted the Meta Training and Inference Accelerator (MTIA) family of custom chips designed for Meta's AI workloads. The custom MTIA silicon, built with Broadcom ($AVGO), now targets training workloads and ranking systems, with the aim of reducing dependence on NVIDIA GPUs. The custom ASICs support inference, and Meta expects cost efficiencies by deploying MTIA silicon in areas where it can achieve a lower cost of compute by optimizing the chip for its unique workloads.
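To make the target workload concrete, below is a minimal, illustrative PyTorch sketch of a DLRM-style ranking model: embedding lookups for sparse (categorical) features feeding an MLP that scores each candidate item. The table sizes, layer widths, and feature counts are assumptions for illustration only, not Meta's actual architecture; the point is to show the kind of embedding-heavy inference that a chip like MTIA is tuned to run at lower cost than a general-purpose GPU.

```python
# Illustrative sketch only: a tiny DLRM-style ranking model of the kind Meta's
# MTIA accelerators are optimized for. All sizes below are assumptions for
# illustration, not Meta's production configuration.
import torch
import torch.nn as nn

class TinyRankingModel(nn.Module):
    def __init__(self, num_sparse_features=4, embedding_rows=1000,
                 embedding_dim=16, num_dense_features=8):
        super().__init__()
        # One embedding table per sparse feature (e.g. user ID, ad ID).
        self.embeddings = nn.ModuleList(
            nn.Embedding(embedding_rows, embedding_dim)
            for _ in range(num_sparse_features)
        )
        # Bottom MLP processes dense (continuous) features.
        self.bottom_mlp = nn.Sequential(
            nn.Linear(num_dense_features, embedding_dim), nn.ReLU()
        )
        # Top MLP turns the concatenated features into a single ranking score.
        top_in = embedding_dim * (num_sparse_features + 1)
        self.top_mlp = nn.Sequential(
            nn.Linear(top_in, 32), nn.ReLU(), nn.Linear(32, 1)
        )

    def forward(self, dense, sparse):
        # sparse: (batch, num_sparse_features) integer IDs
        # dense:  (batch, num_dense_features) floats
        sparse_vecs = [emb(sparse[:, i]) for i, emb in enumerate(self.embeddings)]
        dense_vec = self.bottom_mlp(dense)
        features = torch.cat(sparse_vecs + [dense_vec], dim=1)
        return torch.sigmoid(self.top_mlp(features))

# Example inference pass over a small batch of candidate ads.
model = TinyRankingModel().eval()
dense = torch.randn(4, 8)
sparse = torch.randint(0, 1000, (4, 4))
with torch.no_grad():
    scores = model(dense, sparse)
print(scores.squeeze(1))  # one ranking score per candidate
```

At production scale the embedding tables, not the MLPs, dominate memory and bandwidth, which is why a chip specialized for this access pattern can undercut a general-purpose GPU on cost per inference.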
Meta's revenue increased 21% year over year to a record $48.4 billion. Net income increased 49% YoY to a record $20.8 billion. Operating margin expanded to 48% from 41% a year ago.
META will still spend $60-65 billion on CapEx, most of it on AI infrastructure. Success in AI requires heavy investment in infrastructure to deliver quality products at scale.
AI is driving revenue growth. Four million advertisers are using Meta's generative AI tools, up from one million six months ago. AI is benefiting the business, and the returns on these advancements and investments are becoming evident as Meta takes a thoughtful approach.
AI is also expected to enhance margins: META plans to develop an AI agent capable of coding at the level of a mid-level engineer.
META will focus on AI monetization after its AI products reach billion-user scale; it wants to get the products to scale first and look at monetization later.
The market read this as a strategic push against its suppliers: $NVDA fell about 5% as Meta's in-house silicon was seen as a threat to NVIDIA's AI dominance, a shift that could weigh on future datacenter GPU sales.