The first generation of Meta’s AI chips, revealed last year, was called the Meta Training and Inference Accelerator v1 (MTIA v1). In a blog post, the company reveals that the newer chips are simply titled “next-generation” MTIA.
“The next generation of MTIA is part of our broader full-stack development program for custom, domain-specific silicon that addresses our unique workloads and systems”, the company states.
Meta claims its latest chip offers “double the compute and memory bandwidth” of its predecessor. It has more internal memory (124MB, up from 64MB) and a higher clock speed (1.35GHz, up from 800MHz). The new chips are reportedly running in 16 of Meta’s data center regions. Although the chips are not meant exclusively for training generative AI models, the company believes they will pave the way for superior infrastructure and AI experiences.
Meta also indicates that it will continue to improve these chips, stating, “We currently have several programs underway aimed at expanding the scope of MTIA, including support for GenAI workloads.”