
SK Hynix is forecasting explosive growth in the high-bandwidth memory (HBM) market, projecting a 30% annual expansion through 2030 as AI infrastructure demand surges globally. The South Korean memory giant’s optimistic outlook comes as it solidifies its position as Nvidia’s primary HBM supplier, with the custom HBM sector expected to reach tens of billions of dollars by decade’s end.

What you should know: SK Hynix anticipates sustained AI demand will drive unprecedented growth in specialized memory technology through the end of the decade.

  • The company projects 30% annual growth rates for HBM through 2030, fueled by expanding AI infrastructure requirements from major cloud providers like Amazon, Microsoft, and Google.
  • HBM technology stacks memory chips vertically like floors in a building, reducing power consumption and physical space while improving data-processing speed—making it essential for advanced AI applications.
  • The specialized memory technology, first introduced in 2013, has become increasingly critical as AI workloads demand higher performance and energy efficiency.
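The performance case for stacking can be made concrete with a back-of-the-envelope bandwidth calculation. The sketch below assumes publicly quoted figures for current-generation parts (a 1024-bit interface at roughly 9.6 Gbps per pin for HBM3E, versus a 32-bit interface at roughly 21 Gbps for a single GDDR6X chip); the function name and exact rates are illustrative, not drawn from this article.

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: total pins x per-pin rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# HBM3E stack: vertical stacking lets one package expose a very wide 1024-bit bus.
hbm3e = peak_bandwidth_gbs(1024, 9.6)   # roughly 1,229 GB/s per stack

# Conventional GDDR6X chip: narrow 32-bit bus, even at a higher per-pin rate.
gddr6x = peak_bandwidth_gbs(32, 21.0)   # roughly 84 GB/s per chip

print(f"HBM3E stack: {hbm3e:.1f} GB/s, GDDR6X chip: {gddr6x:.1f} GB/s")
```

The order-of-magnitude gap per package, achieved at lower per-pin signaling rates (and therefore lower power), is why stacked memory has become the default choice for AI accelerators.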

Tariff concerns addressed: South Korean chipmakers appear insulated from potential US semiconductor tariffs due to their American investment commitments.

  • While President Donald Trump proposed approximately 100% tariffs on semiconductor chips from nations without US manufacturing operations, South Korean officials indicate both SK Hynix and Samsung Electronics would be exempt.
  • SK Hynix is investing in US manufacturing capacity, including an advanced chip packaging plant and an AI research facility in Indiana, which could help safeguard against trade disruptions.
  • South Korea’s chip exports to the US were valued at $10.7 billion last year, with HBM shipments to Taiwan for packaging increasing sharply in 2024.

Competitive dynamics: The HBM market is becoming increasingly customized, with companies developing harder-to-substitute products.

  • SK Hynix and competitors including Samsung and Micron Technology are developing HBM4 products that integrate a “base die” for memory management, making it more difficult for rivals to replace their solutions.
  • Larger clients like Nvidia receive highly customized solutions, while smaller customers often rely on standardized designs.
  • Samsung recently cautioned that near-term HBM3E production could exceed market demand growth, potentially pressuring prices and highlighting the cyclical nature of semiconductor markets.

What they’re saying: SK Hynix executives express confidence in meeting diverse customer requirements as AI demand remains robust.

  • “AI demand from the end user is pretty much, very firm and strong… Each customer has different taste,” said Choi Joon-yong, head of HBM business planning at SK Hynix.
  • “We are confident to provide, to make the right competitive product to the customers,” he added.
  • Choi suggested capital spending from major cloud service providers could be revised upward, reflecting the direct correlation between AI infrastructure expansion and HBM demand.

Why this matters: The HBM market’s explosive growth trajectory reflects the broader AI infrastructure boom, but success will depend on companies’ ability to navigate customization demands and cyclical pricing pressures in an increasingly competitive landscape.
