SK Hynix Unveils World’s First 12-Layer HBM4 Samples with 36 GB Capacity and 2 TB/s Data Rate, Plus 12-Hi HBM3e and SOCAMM Demonstrations

SK Hynix Unveils Next-Generation Memory Solutions at GTC 2025

SK Hynix has made headlines with the announcement of its innovative 12-Hi HBM3E and SOCAMM memory products, alongside the world’s first samples of 12-Hi HBM4 memory. These developments come as part of the company’s ongoing commitment to push the envelope in high-performance memory solutions for advanced computing hardware, particularly in the AI and data center sectors.

Scheduled for showcase during the GTC 2025 event, which runs from March 17 to March 21 in San Jose, California, SK Hynix's latest offerings promise to reshape the landscape of high-performance computing.

12-Hi HBM Memory Models
Image Source: SK Hynix

Advancements in AI Memory Technology

In an increasingly competitive market dominated by giants like Samsung and Micron, SK Hynix has stepped up its game by producing the SOCAMM (Small Outline Compression Attached Memory Module) for NVIDIA's powerful AI processors. Leveraging technology derived from the CAMM memory standard, the new SOCAMM offers a low-power DRAM solution designed to significantly enhance memory capacity and performance for AI workloads.

The partnership with NVIDIA, particularly concerning the GB300 AI chip, positions SK Hynix strongly against its competitors, elevating both companies' capacity to handle intensive AI workloads efficiently.

SK Hynix HBM4 Memory Samples
Image Source: SK Hynix

Key Leadership Present at GTC 2025

At the GTC event, SK Hynix's executive team will present the company's latest advancements, including notable figures such as CEO Kwak Noh-Jung, President Juseon Kim, and Head of Global S&M Lee Sangrak.

SK Hynix aims to complete preparations for mass production of 12-layer HBM4 products within the second half of the year, strengthening its position in the next-generation AI memory market.

The 12-layer HBM4 samples on display boast industry-leading capacity and speed, both crucial for AI memory products.

This groundbreaking product is the first to achieve a bandwidth capable of processing over 2 terabytes of data per second, equivalent to transmitting more than 400 full-HD movies (5GB each) in a single second. That exceeds the speed of its predecessor, HBM3E, by more than 60%.
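The quoted figures can be sanity-checked with simple arithmetic. In the sketch below, the HBM3E baseline of roughly 1.2 TB/s per stack is an assumed reference figure (not stated in the article), used only to illustrate why the claimed generational gain lands above 60%:

```python
# Sanity-check of the bandwidth figures quoted above.
hbm4_bw_gb_s = 2000        # HBM4: "over 2 TB/s" ≈ 2000 GB/s
movie_size_gb = 5          # full-HD movie size as cited

movies_per_second = hbm4_bw_gb_s / movie_size_gb
print(movies_per_second)   # 400.0 — matches "over 400 movies per second"

# Assumed HBM3E 12-Hi peak of ~1.2 TB/s per stack (illustrative baseline).
hbm3e_bw_gb_s = 1200
speedup_pct = (hbm4_bw_gb_s - hbm3e_bw_gb_s) / hbm3e_bw_gb_s * 100
print(round(speedup_pct))  # 67 — consistent with "more than 60%"
```

Under that assumed baseline, the numbers in the announcement are internally consistent.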

Furthermore, SK Hynix utilizes the advanced MR-MUF process to achieve an unprecedented capacity of 36GB, the highest among 12-layer HBM products. This process not only enhances product stability but also alleviates chip warpage and promotes better heat dissipation.

Future Outlook: Mass Production and Development Plans

In addition to showcasing its state-of-the-art 12-Hi HBM4 memory currently under development, SK Hynix plans to supply leading clients, including NVIDIA, which will integrate this memory into its Rubin series GPUs. The 12-Hi HBM4 is set to deliver up to 36GB per stack with data rates reaching 2TB/s.

Mass production for this advanced memory solution is anticipated to commence in the latter half of 2025, employing TSMC’s 3nm process node to achieve cutting-edge performance and efficiency.

