SK Hynix, the world’s second-largest memory chipmaker, has announced that its high-bandwidth memory (HBM) chips, a key component of AI chipsets, are almost completely sold out for 2025. CEO Kwak Noh-Jung revealed that after fully booking its 2024 output, the company is now preparing to ship samples of its latest 12-layer HBM3E chips in May, with mass production scheduled for the third quarter.
Kwak emphasized SK Hynix’s plans to advance AI chip technology, citing the growing demand for high-speed, high-capacity, and low-power memory chips tailored for AI applications. He noted that while AI services currently revolve around data centers, an expansion into on-device AI in smartphones, PCs, and automobiles is anticipated, which will drive a surge in demand for specialized memory chips.
Kwak also highlighted the company’s technological strengths across product areas including HBM, TSV-based high-capacity DRAM, and high-performance eSSD, which it aims to combine with strategic partnerships with global collaborators to deliver industry-leading memory solutions. The surge in demand for HBM chips, propelled by the global proliferation of generative AI services such as OpenAI’s ChatGPT, has led SK Hynix to fully book its HBM production for this year, with next year’s volumes also nearing full capacity.
During a recent press conference in South Korea, Kwak outlined SK Hynix’s AI memory technology capabilities, market status, and investment plans for its major production sites in South Korea and the United States, anticipating the rapid expansion of AI technology into on-device applications such as smartphones, PCs, and automobiles.
Kwak attributed SK Hynix’s strong presence in the AI memory chip sector to the investment decisions and global networking abilities of SK Group Chairman Chey Tae-won, emphasizing that competitiveness in this field results from continuous technological development supported by substantial investment.
Looking ahead, SK Hynix President Justin Kim highlighted the projected exponential growth in global data volume and the corresponding rise in revenue from AI memory technologies. Memory solutions tailored for AI applications, including HBM and high-capacity DRAM modules, are forecast to comprise a large portion of the memory market by 2028. The company aims to deepen collaboration with leading partners in the system semiconductor and foundry sectors to ensure the timely development and delivery of products.
Additionally, SK Hynix emphasized its advanced packaging technology capabilities, particularly its MR-MUF technology. This technology has enhanced production efficiency and heat dissipation in high-layer stacking scenarios, positioning it as a key solution for future memory chip advancements.
Building on this, the company plans to apply advanced MR-MUF technology to its upcoming HBM4 chips, enabling 16-layer stacking and enhancing overall performance. SK Hynix has also confirmed plans to construct an advanced packaging production facility for AI memory in Indiana, USA, with mass production of next-generation HBM products targeted for the latter part of 2028.