
Meta's Building an In-House AI Chip to Compete with Other Tech Giants

Saturday, May 20, 2023, 06:34 PM, from Slashdot
An anonymous reader shared this report from the Verge:
Meta is building its first custom chip specifically for running AI models, the company announced on Thursday. As Meta increases its AI efforts — CEO Mark Zuckerberg recently said the company sees 'an opportunity to introduce AI agents to billions of people in ways that will be useful and meaningful' — the chip and other infrastructure plans revealed Thursday could be critical tools for Meta to compete with other tech giants also investing significant resources into AI.

Meta's new MTIA chip, which stands for Meta Training and Inference Accelerator, is its 'in-house, custom accelerator chip family targeting inference workloads,' Meta VP and head of infrastructure Santosh Janardhan wrote in a blog post... But the MTIA chip is seemingly a long way off: it's not set to come out until 2025, TechCrunch reports.
Meta has been working on 'a massive project to upgrade its AI infrastructure in the past year,' Reuters reports, 'after executives realized it lacked the hardware and software to support demand from product teams building AI-powered features.'

As a result, the company scrapped plans for a large-scale rollout of an in-house inference chip and started work on a more ambitious chip capable of performing training and inference, Reuters reported...

Meta said it has an AI-powered system to help its engineers create computer code, similar to tools offered by Microsoft, Amazon and Alphabet.
TechCrunch calls these announcements 'an attempt at a projection of strength from Meta, which historically has been slow to adopt AI-friendly hardware systems — hobbling its ability to keep pace with rivals such as Google and Microsoft.'

Meta's VP of Infrastructure told TechCrunch: 'This level of vertical integration is needed to push the boundaries of AI research at scale.'

Over the past decade or so, Meta has spent billions of dollars recruiting top data scientists and building new kinds of AI, including AI that now powers the discovery engines, moderation filters and ad recommenders found throughout its apps and services. But the company has struggled to turn many of its more ambitious AI research innovations into products, particularly on the generative AI front. Until 2022, Meta largely ran its AI workloads using a combination of CPUs — which tend to be less efficient for those sorts of tasks than GPUs — and a custom chip designed for accelerating AI algorithms...

The MTIA is an ASIC, a kind of chip whose circuits are designed for a specific set of tasks, allowing it to carry out one or many of them in parallel... Custom AI chips are increasingly the name of the game among the Big Tech players. Google created a processor, the TPU (short for 'tensor processing unit'), to train large generative AI systems like PaLM-2 and Imagen. Amazon offers proprietary chips to AWS customers both for training (Trainium) and inferencing (Inferentia). And Microsoft, reportedly, is working with AMD to develop an in-house AI chip called Athena.
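The split between training chips and inference chips reflects a real difference in the work each must do. A toy sketch (illustrative only, not Meta's or any vendor's code) makes the distinction concrete: inference is a single forward pass, while training adds a gradient computation and a weight update on top of that same forward pass.

```python
# Toy single-neuron model: why inference and training are distinct
# workloads that can justify separate accelerators.

def forward(w, b, x):
    """Inference: one multiply-add per weight."""
    return w * x + b

def train_step(w, b, x, target, lr=0.1):
    """Training: forward pass, plus gradient computation and weight update."""
    pred = forward(w, b, x)        # forward (same cost as inference)
    err = pred - target            # gradient of squared-error loss w.r.t. pred
    w -= lr * err * x              # backward + update for w
    b -= lr * err                  # backward + update for b
    return w, b

w, b = 0.0, 0.0
for _ in range(100):
    w, b = train_step(w, b, 2.0, 10.0)   # learn to output 10 when x = 2

print(round(forward(w, b, 2.0), 2))       # → 10.0
```

Even in this toy, the train step does everything the inference step does plus extra arithmetic and memory traffic for gradients, which is why a chip tuned purely for inference can be simpler and cheaper.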

Meta says that it created the first generation of the MTIA — MTIA v1 — in 2020, built on a 7-nanometer process. It can scale beyond its 128 MB of internal memory to as much as 128 GB, and in a Meta-designed benchmark test — which, of course, has to be taken with a grain of salt — Meta claims that the MTIA handled 'low-complexity' and 'medium-complexity' AI models more efficiently than a GPU. Work remains to be done in the memory and networking areas of the chip, Meta says, which present bottlenecks as the size of AI models grows, requiring workloads to be split up across several chips. (Not coincidentally, Meta recently acquired an Oslo-based team building AI networking tech at British chip unicorn Graphcore.) And for now, the MTIA's focus is strictly on inference — not training — for 'recommendation workloads' across Meta's app family...
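The memory bottleneck mentioned above comes down to simple arithmetic: once a model's weights no longer fit in one chip's memory, the workload has to be sharded across chips, which is where networking becomes critical. A back-of-the-envelope sketch (the model size and byte-per-parameter figures below are illustrative assumptions, not Meta's numbers):

```python
# Rough estimate of how many accelerators are needed just to hold a
# model's weights, ignoring activations and optimizer state.

def chips_needed(num_params, bytes_per_param, chip_memory_bytes):
    """Minimum chips required to hold all weights (ceiling division)."""
    total = num_params * bytes_per_param
    return -(-total // chip_memory_bytes)

GB = 1024 ** 3

# A hypothetical 175-billion-parameter model stored in fp16 (2 bytes per
# parameter), against the 128 GB per-chip ceiling cited for MTIA.
print(chips_needed(175_000_000_000, 2, 128 * GB))   # → 3
```

At that scale every forward pass must shuttle intermediate results between chips, so the interconnect (the networking work Meta says remains to be done) can matter as much as raw compute.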

If there's a common thread in today's hardware announcements, it's that Meta is desperately attempting to pick up the pace on AI, specifically generative AI... In part, Meta's feeling increasing pressure from investors concerned that the company's not moving fast enough to capture the (potentially large) market for generative AI. It has no answer — yet — to chatbots like Bard, Bing Chat or ChatGPT. Nor has it made much progress on image generation, another key segment that's seen explosive growth.

If analysts' predictions are right, the total addressable market for generative AI software could be $150 billion. Goldman Sachs predicts that generative AI could raise GDP by 7%. Even a small slice of that could erase the billions Meta's lost on investments in 'metaverse' technologies like augmented reality headsets, meetings software and VR playgrounds like Horizon Worlds.

Read more of this story at Slashdot.
https://tech.slashdot.org/story/23/05/20/0432211/metas-building-an-in-house-ai-chip-to-compete-with-...