
    Samsung begins HBM4 memory production. A key AI component to be delivered to Nvidia as early as February

By Mikolaj Laszkiewicz, January 26, 2026

    HBM4 (High Bandwidth Memory 4) is an ultra-high-throughput memory standard designed for advanced computing systems such as AI accelerators, supercomputers, and servers. It offers significantly higher data transfer speeds than previous generations, which is crucial for handling large language models and other compute-intensive AI workloads.
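To put the generational jump in rough numbers, the sketch below computes peak per-stack bandwidth from interface width and per-pin data rate. The figures used are assumptions based on publicly reported JEDEC targets (a 2048-bit interface at up to 8 Gb/s per pin for HBM4, versus 1024 bits at 6.4 Gb/s for HBM3); shipping products may deviate from these numbers.

```python
def peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one memory stack in GB/s (decimal gigabytes).

    bus_width_bits: interface width of the stack in bits
    pin_rate_gbps:  data rate per pin in Gb/s
    """
    return bus_width_bits * pin_rate_gbps / 8  # divide by 8: bits -> bytes

# Assumed JEDEC-target figures, not confirmed product specs:
hbm3 = peak_bandwidth_gbps(1024, 6.4)  # ~819 GB/s per stack
hbm4 = peak_bandwidth_gbps(2048, 8.0)  # ~2048 GB/s (~2 TB/s) per stack

print(f"HBM3: {hbm3:.0f} GB/s, HBM4: {hbm4:.0f} GB/s per stack")
```

Under these assumptions, a single HBM4 stack would deliver roughly 2.5 times the peak bandwidth of an HBM3 stack, which is what makes it attractive for bandwidth-bound AI workloads.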

    A source familiar with the supply chain told Reuters that Samsung has successfully completed HBM4 qualification with Nvidia as well as other chipmakers, and that initial batches are already being prepared for shipment starting in February. The company has not disclosed details on production volumes or delivery schedules.

News of Samsung’s production plans triggered a market reaction: Samsung shares rose 2.2% on the Seoul stock exchange, while shares of its main domestic rival, SK Hynix, fell 2.9% in morning trading. SK Hynix is currently one of the largest suppliers of HBM to Nvidia and other technology companies, but Samsung has been working for some time to expand its presence in this highly profitable segment.

    HBM4 is widely seen as a critical component in the ecosystem of advanced AI processors, particularly those designed to run ever-larger language and deep-learning models. A broader base of HBM suppliers could help reduce risks linked to tight supply and surging demand in the high-end memory market, which has been under strain due to the rapid expansion of AI infrastructure.

    Samsung, SK Hynix, and other memory manufacturers are expected to provide more details on their HBM4 strategies in upcoming earnings reports and investor presentations, which should offer clearer insight into the pace of development in this segment and potential shifts in the semiconductor market.

    This could also signal that Samsung, like Micron’s brand Crucial, may increasingly shift its focus toward AI-oriented memory. Crucial recently exited the consumer DRAM module business to concentrate on higher-margin solutions for data centers and AI accelerators. If Samsung likewise prioritizes HBM for AI over traditional DRAM and NAND for PCs and mobile devices, the consumer memory market could feel the impact in the form of tighter supply and higher prices, especially amid growing demand driven by the build-out of AI infrastructure.
