SK hynix has announced the development of a 12-layer HBM3 product with 24GB of memory capacity, currently the largest in the industry. The company also said that customers' performance evaluations of samples are underway. HBM (High Bandwidth Memory) is a high-performance memory that vertically interconnects multiple DRAM chips to increase data processing speed compared with conventional DRAM products. HBM3 is the fourth-generation product, succeeding the previous generations HBM, HBM2, and HBM2E.

The company improved process efficiency and performance stability by applying Advanced Mass Reflow Molded Underfill (MR-MUF) technology to the latest product, while Through Silicon Via (TSV) technology reduced the thickness of a single DRAM chip by 40%, achieving the same stack height as the 16GB product. SK hynix's HBM3, which integrates these technologies, can process up to 819GB of data per second, meaning 163 Full-HD (FHD) movies can be transmitted in a single second.
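The movie-transfer comparison above is simple arithmetic; a minimal sketch, assuming the commonly cited figure of roughly 5GB per Full-HD movie file (the article itself does not state the per-movie size):

```python
# Back-of-the-envelope check of the bandwidth claim.
# Assumptions (not stated in the article): a Full-HD movie is ~5 GB.
HBM3_BANDWIDTH_GB_PER_S = 819  # peak bandwidth per HBM3 stack, per the article
FHD_MOVIE_SIZE_GB = 5          # assumed typical Full-HD movie file size

# Whole movies transferable in one second at peak bandwidth
movies_per_second = HBM3_BANDWIDTH_GB_PER_S // FHD_MOVIE_SIZE_GB
print(movies_per_second)  # 163
```

At the assumed 5GB per movie, 819GB/s works out to 163 complete movies per second, matching the figure quoted in the announcement.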

HBM, first developed by SK hynix in 2013, has drawn broad attention from the memory chip industry for its crucial role in implementing generative AI that operates on high-performance computing (HPC) systems. The latest HBM3 standard is considered the optimal product for rapid processing of large volumes of data, and its adoption by major global tech companies is therefore on the rise.

SK hynix has provided samples of its 24GB HBM3 product to multiple customers, who have expressed high expectations for the latest product, and performance evaluation is in progress. The company plans to complete mass-production preparations for the new product within the first half of the year, further solidifying its leadership in the cutting-edge DRAM market in the era of AI.