SK hynix has become the first in the world to mass produce its new HBM3E 12-layer product implementing 36GB (gigabytes), the largest capacity of any existing HBM, actively leading the market with its overwhelming technological prowess in the HBM field.
Single DRAM chips in the highest-performance, highest-capacity HBM are now 40% thinner
SK Hynix is introducing the world's first 12-layer stacked HBM3E and is actively leading the market with its overwhelming technological prowess in the HBM field.
SK Hynix announced on the 26th that it has begun mass production of the world's first HBM3E 12-layer product implementing 36GB (gigabytes), the largest capacity of any existing HBM.
The company plans to supply mass-produced products to customers within the year, once again proving its overwhelming technological prowess just six months after it was the first in the industry to deliver HBM3E 8-layer products to customers in March.
SK Hynix emphasized, “We are the only company that has developed and supplied the full lineup of every HBM generation to the market, from the world’s first HBM (first generation) launched in 2013 to HBM3E (fifth generation).” It added, “We were the first to successfully mass-produce the new 12-layer product that meets the growing requirements of AI companies, and we are continuing our unrivaled position in the AI memory market.”
The company explained that the HBM3E 12-layer product meets the world's best standards in all areas essential for AI memory, including speed, capacity, and stability.
First of all, the company increased the operating speed of this product to 9.6 Gbps, the fastest speed of any memory currently available. This means that a single GPU equipped with four of these products can read all 70 billion parameters of the large language model (LLM) 'Llama 3 70B' 35 times per second.
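As a rough check of that figure, the sketch below works through the bandwidth arithmetic. The 1024-bit interface width per HBM3E stack and the 2-byte (FP16/BF16) weight size are illustrative assumptions, not numbers stated in the article.

```python
# Back-of-the-envelope check of the "35 reads per second" claim.
# Assumptions (not from the article): 1024-bit interface per HBM3E stack,
# 2-byte (FP16/BF16) weights for Llama 3 70B.

PIN_SPEED_GBPS = 9.6          # per-pin data rate of the HBM3E 12-layer product
BUS_WIDTH_BITS = 1024         # assumed interface width per HBM stack
STACKS_PER_GPU = 4            # four HBM3E packages on one GPU, per the article

PARAMS = 70e9                 # Llama 3 70B parameter count
BYTES_PER_PARAM = 2           # assumed 16-bit weights

stack_bandwidth_gbs = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8   # GB/s per stack
total_bandwidth_gbs = stack_bandwidth_gbs * STACKS_PER_GPU  # GB/s per GPU

model_size_gb = PARAMS * BYTES_PER_PARAM / 1e9              # ~140 GB of weights
reads_per_second = total_bandwidth_gbs / model_size_gb

print(f"Per-stack bandwidth : {stack_bandwidth_gbs:.1f} GB/s")   # 1228.8 GB/s
print(f"Total bandwidth     : {total_bandwidth_gbs:.1f} GB/s")   # 4915.2 GB/s
print(f"Weight reads/second : {reads_per_second:.1f}")           # ≈ 35
```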
The company also increased capacity by 50% by stacking 12 DRAM chips of 3GB each within the same overall thickness as the existing 8-layer product. To achieve this, the individual DRAM chips were made 40% thinner than before and stacked vertically using TSV (Through-Silicon Via) technology.
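A minimal sketch of the capacity arithmetic behind those figures, using only the chip counts and the 3GB die capacity quoted above:

```python
# Capacity arithmetic: 12 stacked 3GB DRAM dies vs. the previous 8-layer stack,
# at the same overall package thickness.

DIE_CAPACITY_GB = 3

capacity_12hi = 12 * DIE_CAPACITY_GB         # 36 GB
capacity_8hi = 8 * DIE_CAPACITY_GB           # 24 GB
increase = capacity_12hi / capacity_8hi - 1  # 0.5 -> the quoted 50% gain

# Height budget per die if 12 dies must fit in the space formerly used by 8.
# (A simplification: real stacks also include bonding layers, which is why the
# article's "40% thinner" is a larger reduction than this bare ratio implies.)
per_die_height_ratio = 8 / 12                # ≈ 67% of the previous slot

print(f"Capacity: {capacity_8hi} GB -> {capacity_12hi} GB (+{increase:.0%})")
print(f"Per-die height budget: {per_die_height_ratio:.0%} of the 8-layer slot")
```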
It also solved the structural problems that arise when thinner chips are stacked higher. The company applied its core Advanced MR-MUF (Mass Reflow-Molded Underfill) process to this product, improving heat dissipation performance by 10% compared to the previous generation, and secured product stability and reliability through enhanced warpage control.
SK hynix President and CEO Kim Joo-sun (in charge of AI Infrastructure) said, “Our company has once again proven itself as an unrivaled AI memory leader that breaks through technological limitations and leads the era,” and added, “We will continue to prepare next-generation memory products to overcome the challenges of the AI era and maintain our status as the ‘global No. 1 AI memory provider.’”