
SK hynix Provides Samples of HBM3E, Next-Gen DRAM for AI Applications

Can process data at up to 1.15TB/s, equivalent to processing more than 230 full-HD movies of 5GB each in one second.

Summary:

  • Product to drive AI tech innovation with top performance, to be produced in volume from 1H24
  • Launch of HBM3E to solidify firm’s leadership in AI memory market following success of HBM3
  • Expansion of HBM3E supply following industry’s largest scale of mass production of HBM to help accelerate business turnaround

SK hynix Inc. has developed HBM3E (1), the next generation of the highest-spec DRAM for AI applications currently available, and said a customer's evaluation of samples is underway.

It said that the development of HBM3E, the extended version of HBM3 that delivers the world's best specs, comes on top of its experience as the industry's sole mass provider of HBM3. With its experience as the supplier of the industry's largest volume of HBM products and its mass-production readiness, the firm plans to mass produce HBM3E from 1H24 and solidify its leadership in the AI memory market.

According to the company, the latest product meets the industry's highest standard for speed, the key specification for AI memory products, as well as in all other categories including capacity, heat dissipation and user-friendliness.

In terms of speed, the HBM3E can process data at up to 1.15TB/s, which is equivalent to processing more than 230 full-HD movies of 5GB each in one second.
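
As a back-of-the-envelope check of that figure, assuming decimal units (1TB = 1,000GB), which is how the comparison appears to be derived: 1.15TB/s ÷ 5GB per movie = 1,150GB/s ÷ 5GB = 230 movies per second.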

In addition, the product delivers a 10% improvement in heat dissipation by adopting Advanced Mass Reflow Molded Underfill, or MR-MUF (2), technology. It also provides backward compatibility (3), enabling adoption of the latest product even on systems prepared for HBM3 without a design or structure modification.

“We have a long history of working with SK hynix on High Bandwidth Memory for leading edge accelerated computing solutions,” said Ian Buck, VP, hyperscale and HPC computing, Nvidia Corp. “We look forward to continuing our collaboration with HBM3E to deliver the next generation of AI computing.”

Sungsoo Ryu, head, DRAM product planning, SK hynix, said that the company, through the development of HBM3E, has strengthened its market leadership by further enhancing the completeness of its HBM product lineup, which is in the spotlight amid the development of AI technology: “By increasing the supply share of the high-value HBM products, SK hynix will also seek a fast business turnaround.”

(1) HBM (High Bandwidth Memory): A high-value, high-performance memory that vertically interconnects multiple DRAM chips, enabling an increase in data processing speed in comparison to earlier DRAM products. HBM3E is the extended version of HBM3 and the 5th gen of its kind, succeeding the previous generations HBM, HBM2, HBM2E and HBM3.
(2) MR-MUF: a process of attaching chips to circuits and filling the space between stacked chips with a liquid material, instead of laying a film, to improve efficiency and heat dissipation.
(3) Backward compatibility: the ability of an older and a newer system to interoperate without modification to the design, especially in information technology and computing. A new memory product with backward compatibility allows continued use of existing CPUs and GPUs without design modifications.
