The first generation of Meta's AI chips, revealed last year, was called the Meta Training and Inference Accelerator v1 (or MTIA v1). In a blog post, the company reveals that the newer chip is simply titled the "next generation" MTIA.

"The next generation of MTIA is part of our broader full-stack development program for custom, domain-specific silicon that addresses our unique workloads and systems," the company states.

See Related: Meta Apes Launches on BNB Application Sidechain to Give Gamers the Best of Both Web2 and Web3 Gaming

Meta claims its latest chip has "double the compute and memory bandwidth" of the previous version. It offers more internal memory (124MB compared to 64MB) and a higher clock speed (1.35GHz compared to 800MHz). The new chips are reported to be running in 16 of Meta's data center regions. Although the chips are not meant exclusively for training generative AI models, the company believes they will pave the way for better AI infrastructure and experiences.
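For a rough sense of what those figures imply, here is a minimal sketch in Python that computes the relative uplift from the numbers quoted above (the article's figures, not an official spec sheet); real-world gains also depend on workload and on the memory-bandwidth doubling, which these two metrics do not capture.

# Back-of-envelope comparison using only the figures quoted in this article.
specs = {
    "internal_memory_mb": {"mtia_v1": 64, "next_gen": 124},
    "clock_speed_mhz": {"mtia_v1": 800, "next_gen": 1350},
}

for metric, values in specs.items():
    ratio = values["next_gen"] / values["mtia_v1"]
    print(f"{metric}: {values['mtia_v1']} -> {values['next_gen']} ({ratio:.2f}x)")

# Prints roughly a 1.94x increase in internal memory and a 1.69x higher clock speed.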

Meta also indicates that it will continue to improve these chips, stating, "We currently have several programs underway aimed at expanding the scope of MTIA, including support for GenAI workloads."
