SK Hynix posted record quarterly profits and announced plans to boost spending this year as the Nvidia supplier works to ease investor concerns that demand for AI chips could slow. The South Korean memory chipmaker reported a 69% jump in operating profit to 9.2 trillion won ($6.69 billion) for Q2, driven by strong demand for the high-bandwidth memory (HBM) chips essential for AI processing.
What you should know: SK Hynix is doubling down on AI chip production despite market uncertainties, positioning itself as the dominant player in advanced memory technology.
- The company plans to double HBM chip sales for the full year compared to 2024, though it didn’t quantify its new investment spending for 2025.
- Revenue climbed 35% to 22.2 trillion won during the quarter, while operating profit topped analyst expectations of 9.0 trillion won.
- Shares jumped more than 3% in early trading and are up 55% year-to-date, significantly outperforming the broader Korean market’s 32.7% rise.
The big picture: SK Hynix has emerged as the world’s top memory chipmaker, overtaking Samsung Electronics in Q1 due to its leadership in HBM technology crucial for AI applications.
- The company’s quarterly profit is double what Samsung Electronics expects to report, as Samsung faces a projected 56% plunge in operating profit due to weak AI chip sales.
- “SK Hynix foresees that increasing competition among big tech companies to enhance inference of AI models would lead to higher demand for high-performance and high-capacity memory products,” the company said in a statement.
In plain English: HBM chips are specialized memory components that act like super-fast workspaces for AI systems. When companies like OpenAI or Google train their AI models, these chips let the processors access massive amounts of data quickly—think of them as the high-performance engine that keeps AI running smoothly.
Why this matters: The memory chip industry sits at the center of the AI boom, with companies like SK Hynix supplying the specialized components that enable training and running large language models.
- HBM chips are essential components in the AI processors designed by companies like Nvidia, a leading AI hardware maker, which handle the vast amounts of data involved in AI model training.
- The company said talks with a major customer about next year's sales are "proceeding as planned," though it didn't elaborate on specifics.
Tariff tensions: U.S. trade policy uncertainty is creating both opportunities and risks for the Korean chipmaker.
- SK Hynix said earnings were helped by customers building up inventory ahead of potential semiconductor tariffs threatened by President Trump.
- A much-anticipated meeting between U.S. and South Korean officials to discuss tariffs was cancelled, even though hopes had risen after Japan and the U.S. reached a tariff deal this week.
- While SK Hynix said in April that its U.S. export proportion wasn’t high, analysts warn the company could face pricing pressure from customers squeezed by tariffs.
Competitive pressures: Despite record results, SK Hynix faces headwinds from both market dynamics and rival competition.
- Goldman Sachs, a major investment bank, downgraded the stock to “neutral” last week, expecting HBM prices to decline for the first time next year.
- The company is bracing for rising competition from rivals in supplying advanced chips to Nvidia, according to analysts.
- Senior analyst Ryu Young-ho from NH Investment & Securities noted that SK Hynix’s proactive investment response reflects confidence as it faces “major competition from Samsung Electronics.”