
Microsoft Azure Becomes the First Hyperscaler to Feature NVIDIA’s Blackwell System with GB200

In March, NVIDIA unveiled its Blackwell platform, which it says cuts the cost and energy consumption of training large language models by up to 25 times compared to its previous generation. Major cloud service providers and prominent AI companies, including Amazon Web Services, Dell Technologies, Google, Meta, Microsoft, OpenAI, Oracle, Tesla, and xAI, are expected to integrate Blackwell into their operations once it ships.

However, the launch of the Blackwell platform was delayed by up to three months due to a design issue. Microsoft has now announced on X that it has begun receiving NVIDIA’s GB200 Blackwell chips and is optimizing its servers to take full advantage of them, using NVIDIA’s InfiniBand networking and closed-loop liquid cooling.

Microsoft Azure is the first cloud platform to deploy @nvidia’s Blackwell architecture with GB200-driven AI servers. We’re enhancing performance at every level to support the world’s leading AI models, utilizing Infiniband networking and cutting-edge closed loop liquid cooling. Discover more at MS Ignite. pic.twitter.com/K1dKbwS2Ew

— Microsoft Azure (@Azure) October 8, 2024

Microsoft CEO Satya Nadella also shared an update on the GB200 deployment:

Our enduring collaboration with NVIDIA and ongoing innovation are setting the industry pace, empowering the most complex AI workloads. https://t.co/qaEoSv8dm5

— Satya Nadella (@satyanadella) October 8, 2024

NVIDIA has also delivered one of the first engineering units of the DGX B200 to the OpenAI team:

Check out what just arrived at our office. Thank you, @nvidia, for sending us one of the first engineering units of the DGX B200. pic.twitter.com/vy8bWUEwUi

— OpenAI (@OpenAI) October 8, 2024

Given the extensive interest in NVIDIA’s Blackwell platform from potential customers, it makes sense that Microsoft and OpenAI are among the first to receive these chips. Unlike other major cloud providers such as Google and AWS, which run their own AI training infrastructure (Google with its Tensor Processing Units, AWS with its custom-designed chips), Microsoft and OpenAI rely entirely on NVIDIA’s technology, making them some of NVIDIA’s largest customers.

Further details on the deployment of NVIDIA’s GB200 are expected from Microsoft at its Ignite conference in November.
