HBM4 (High Bandwidth Memory 4) is an ultra-high-throughput memory standard designed for advanced computing systems such as AI accelerators, supercomputers, and servers. It offers significantly higher data transfer speeds than previous generations, which is crucial for handling large language models and other compute-intensive AI workloads.
A source familiar with the supply chain told Reuters that Samsung has successfully completed HBM4 qualification with Nvidia as well as other chipmakers, and that initial batches are already being prepared for shipment starting in February. The company has not disclosed details on production volumes or delivery schedules.
News of Samsung’s production plans triggered a market reaction: Samsung shares rose 2.2% on the Seoul stock exchange, while shares of its main domestic rival, SK Hynix, fell 2.9% in morning trade. SK Hynix is currently one of the largest suppliers of HBM to Nvidia and other technology companies, but Samsung has been working for some time to expand its presence in this highly profitable segment.
HBM4 is widely seen as a critical component in the ecosystem of advanced AI processors, particularly those designed to run ever-larger language and deep-learning models. A broader base of HBM suppliers could help reduce risks linked to tight supply and surging demand in the high-end memory market, which has been under strain due to the rapid expansion of AI infrastructure.
Samsung, SK Hynix, and other memory manufacturers are expected to provide more details on their HBM4 strategies in upcoming earnings reports and investor presentations, which should offer clearer insight into the pace of development in this segment and potential shifts in the semiconductor market.
This could also signal that Samsung, like Micron, may increasingly shift its focus toward AI-oriented memory. Micron's consumer brand Crucial recently exited the consumer DRAM module business to concentrate on higher-margin solutions for data centers and AI accelerators. If Samsung likewise prioritizes HBM for AI over traditional DRAM and NAND for PCs and mobile devices, the consumer memory market could feel the impact in the form of tighter supply and higher prices, especially amid growing demand driven by the build-out of AI infrastructure.