Meta’s AI supercomputer
AI Research SuperCluster. © Meta.
Meta (formerly known as Facebook) says it has built an “AI supercomputer” dedicated exclusively to machine learning workloads. According to Meta’s engineers, it will be the fastest of its kind in the world once a planned expansion is completed in mid-2022. The RSC (AI Research SuperCluster) is designed to eventually train neural networks with trillions of parameters, a scale that ever-growing models are approaching; the GPT-3 natural language model, for example, has 175 billion parameters.
Currently, the system comprises an impressive cluster of 760 Nvidia DGX A100 systems, for a total of 6,080 GPUs (eight per system), linked by an Nvidia 200-gigabit-per-second InfiniBand network. Storage includes 46 petabytes (46 million billion bytes) of cache storage and 175 petabytes of flash storage. When the build-out is complete in mid-2022, the system will feature 16,000 GPUs. “RSC will help Meta’s AI researchers build new and better AI models that can learn from trillions of examples; work across hundreds of different languages; seamlessly analyze text, images, and video together; develop new augmented reality tools; and much more,” write researchers Kevin Lee and Shubho Sengupta in the Meta AI blog post announcing the news.
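The headline numbers hang together once you recall that each DGX A100 system ships with eight A100 GPUs. A minimal back-of-the-envelope sketch in Python, assuming the mid-2022 expansion keeps the same eight-GPU node type (an inference, not something Meta has stated):

```python
# Back-of-the-envelope check of the RSC figures quoted above.
# Known: each Nvidia DGX A100 system contains 8 A100 GPUs.
GPUS_PER_DGX_A100 = 8

# Phase 1, as announced: 760 systems.
phase1_systems = 760
phase1_gpus = phase1_systems * GPUS_PER_DGX_A100
print(f"Phase 1: {phase1_systems} systems x {GPUS_PER_DGX_A100} GPUs = {phase1_gpus} GPUs")
# -> 6,080 GPUs, matching the figure in the post.

# Assumption: the mid-2022 target of 16,000 GPUs uses the same node type,
# which would imply roughly 2,000 DGX A100 systems in total.
target_gpus = 16_000
implied_systems = target_gpus // GPUS_PER_DGX_A100
print(f"Full build-out: {target_gpus} GPUs / {GPUS_PER_DGX_A100} per system = ~{implied_systems} systems")

# Storage tiers, expressed in bytes (1 PB = 10**15 bytes).
PB = 10**15
cache_bytes = 46 * PB   # 46 PB cache tier
flash_bytes = 175 * PB  # 175 PB flash tier
print(f"Cache: {cache_bytes:.2e} bytes, flash: {flash_bytes:.2e} bytes")
```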
⇨ YouTube, “Introducing the AI Research SuperCluster — Meta’s cutting-edge supercomputer for AI research”
⇨ The Verge, James Vincent, “Meta has built an AI supercomputer it says will be world’s fastest by end of 2022.”
⇨ IEEE Spectrum, Samuel K. Moore, “Meta aims to build the world’s fastest AI supercomputer.”
2022-01-24