Micron Unveils Next-Gen HBM4 Memory with 36 GB Capacity and Over 2 TB/s Bandwidth for Key Customers

Micron Technology has begun sampling its next-generation HBM4 memory, which pairs higher bandwidth with substantially greater capacity for artificial intelligence (AI) platforms.

Micron Introduces HBM4 Memory to Key Clients: Trailblazing 12-Hi Solution at 36 GB Capacity & Over 2 TB/s Throughput

Press Release: Today, Micron announced the distribution of samples for its 36 GB HBM4 memory configured in a 12-high stack to a select group of key customers.

This achievement reinforces Micron’s status as a leader in memory performance and energy efficiency for AI-driven requirements. Built on its advanced 1β (1-beta) DRAM process, its proven 12-high advanced packaging, and a highly capable memory built-in self-test (MBIST) feature, Micron HBM4 is designed for seamless integration by clients and partners developing cutting-edge AI applications.

Significant Advancements in Performance

As the popularity of generative AI surges, efficiently managing inference processes becomes crucial. Micron’s HBM4 incorporates an expansive 2048-bit interface, delivering data speeds exceeding 2.0 TB/s per memory stack and showcasing over 60% enhanced performance when compared to its predecessor. This broader interface design fosters swift data communication, thus optimizing the inference capacity of large language models and sophisticated reasoning systems. In essence, HBM4 enables AI accelerators to operate with increased speed and enhanced reasoning capabilities.
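The quoted per-stack figure follows directly from the interface width and per-pin signaling rate. The sketch below checks the arithmetic; the 2048-bit width is stated in the release, while the 8 Gb/s per-pin rate is an assumption of a plausible HBM4-class data rate, not a figure from Micron.

```python
# Back-of-envelope check of the ">2.0 TB/s per stack" claim.
INTERFACE_WIDTH_BITS = 2048  # 2048-bit HBM4 interface (stated in the release)
PIN_RATE_GBPS = 8            # assumed per-pin data rate in Gb/s (not stated)

# bits/s across the interface -> bytes/s -> TB/s (decimal TB)
bandwidth_tbps = INTERFACE_WIDTH_BITS * PIN_RATE_GBPS / 8 / 1000
print(f"{bandwidth_tbps:.3f} TB/s per stack")  # ≈ 2.048 TB/s
```

Under that assumed pin rate, the 2048-bit interface lands at roughly 2.05 TB/s, consistent with the "over 2.0 TB/s" figure.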

Moreover, Micron HBM4 demonstrates an impressive 20% improvement in power efficiency relative to the earlier HBM3E models, which were already recognized for setting new standards in industry power efficiency. This enhancement ensures that data centers can achieve maximum throughput while minimizing energy consumption, further driving operational efficiency.

The Growing Influence of Generative AI

With the proliferation of generative AI applications, this innovative technology is poised to offer substantial societal advancements. HBM4 serves as a pivotal enabler by promoting rapid insights and discoveries, ultimately nurturing innovation across various sectors, including healthcare, finance, and transportation.

Micron’s Pioneering Role in AI Transformation

For nearly five decades, Micron has been at the forefront of memory and storage technology advancements. Today, the company continues to fuel AI progress by providing a comprehensive range of solutions designed to transform data into actionable intelligence, thereby empowering innovations from centralized data centers to edge computing environments.

By launching HBM4, Micron solidifies its commitment as a key driver of AI innovation and remains a trusted partner for clients facing complex technological challenges. The company anticipates ramping up HBM4 production in the year 2026, aligning with the rollout of its customers’ next-generation AI platforms.
