SK hynix has unveiled the world's first 16-Hi HBM3E memory solution, which packs up to 48 GB of capacity per stack.
SK hynix Unveils Next-Gen HBM3E Memory With 16-Hi 48 GB Stacks, Sampling To Commence In Early 2025
SK hynix CEO Kwak Noh-Jung announced the 16-Hi HBM3E memory during the SK AI Summit 2024. The 48 GB stacks offer both the highest capacity and the highest number of layers in the industry for an HBM product. The first samples of this expanded memory solution are expected to be provided in early 2025.
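As a quick sanity check on the stack arithmetic (an illustrative calculation, not from the press release): a 48 GB, 16-high stack works out to 3 GB, i.e. 24 Gb, per DRAM die.

```python
# Illustrative arithmetic for a 16-high, 48 GB HBM3E stack.
stack_capacity_gb = 48   # GB per stack, from the announcement
layers = 16              # 16-Hi stack

per_die_gb = stack_capacity_gb / layers
per_die_gbit = per_die_gb * 8   # DRAM die densities are usually quoted in gigabits

print(f"{per_die_gb:.0f} GB ({per_die_gbit:.0f} Gb) per die")  # → 3 GB (24 Gb) per die
```

The same 24 Gb die density underpins the existing 12-high 36 GB stacks, which is consistent with the capacity jump coming purely from the added layers.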
Press Release: SK hynix CEO Kwak Noh-Jung, during his keynote speech titled “A New Journey in Next-Generation AI Memory: Beyond Hardware to Daily Life” at the SK AI Summit in Seoul, announced the development of the industry’s first 48GB 16-high HBM3E – the world’s highest number of layers, following the company’s 12-high product.
Summary of Mr. Kwak’s comments:
- The market for 16-high HBM is expected to open up from the HBM4 generation, but SK hynix has been developing 48GB 16-high HBM3E in a bid to secure technological stability and plans to provide samples to customers early next year.
- SK hynix applied the Advanced MR-MUF process, which enabled mass production of 12-high products, to produce 16-high HBM3E, while also developing hybrid bonding technology as a backup.
- 16-high products deliver performance improvements of 18% in AI training and 32% in inference versus 12-high products. With the market for AI accelerators for inference expected to expand, 16-high products are forecast to help the company solidify its leadership in AI memory in the future.
- SK hynix is also developing an LPCAMM2 module for PCs and data centers, along with 1c-nm-based LPDDR5 and LPDDR6, taking full advantage of its competitiveness in low-power, high-performance products.
- The company is also readying PCIe Gen6 SSDs, high-capacity QLC-based eSSDs, and UFS 5.0.
- SK hynix plans to adopt a logic process on the base die from HBM4 generation through collaboration with a top global logic foundry to provide customers with the best products.
- Customized HBM will be a product with optimized performance that reflects various customer demands for capacity, bandwidth, and function and is expected to pave the way for a new paradigm in AI memory.
- SK hynix is also developing technology that adds computational functions to memory to overcome the so-called memory wall. Technologies such as Processing Near Memory (PNM), Processing in Memory (PIM), and Computational Storage, essential for processing enormous amounts of data in the future, are expected to transform the structure of next-generation AI systems and the future of the AI industry.