Samsung Electronics reported a preliminary operating profit of 57.2 trillion won — approximately $37.9 billion — for the first quarter of 2026, a 755% jump year-on-year and the highest quarterly result in the company’s history. The Device Solutions division, which houses Samsung’s semiconductor business, accounted for an estimated 95% of that total. As KED Global noted, the figure exceeds Samsung’s earnings for the whole of 2025.
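The reported figures hang together arithmetically. A quick back-of-the-envelope check (note that the prior-year base and the won/dollar rate below are inferred from the reported numbers, not stated in the filing):

```python
# Back-of-the-envelope check on Samsung's reported Q1 2026 figures.
q1_2026_profit_won = 57.2e12       # preliminary operating profit, KRW

# A 755% year-on-year jump implies the Q1 2025 base was roughly:
yoy_jump = 7.55                    # 755% increase means ~8.55x the prior year
q1_2025_base_won = q1_2026_profit_won / (1 + yoy_jump)   # ~6.7 trillion won

# The ~$37.9 billion conversion implies an exchange rate of about:
usd_reported = 37.9e9
implied_krw_per_usd = q1_2026_profit_won / usd_reported  # ~1,509 KRW/USD

# The DS division's estimated 95% share of the total:
ds_share = 0.95
ds_profit_won = q1_2026_profit_won * ds_share            # ~54.3 trillion won
```

The ~54 trillion won Device Solutions figure cited later in this piece is that last line, rounded.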
The headline number is striking. But what it reveals about the structure of the AI economy is more important than the number itself.
HBM Is the Bottleneck Everyone Pays Through
High-bandwidth memory chips are the critical component enabling AI accelerators. Every GPU cluster from NVIDIA, AMD, and Google’s TPU line requires HBM to achieve the memory bandwidth that training and inference at scale demand. There is no substitute. Production capacity is constrained. And the two companies that can make HBM at volume — Samsung and SK Hynix — sit at the centre of a structural supply shortage that every AI company, hyperscaler, and cloud provider is managing around.
Samsung’s HBM revenue is expected to triple year-on-year in Q1 2026. The company had spent 2024 catching up to SK Hynix, which had a head start qualifying its HBM3E chips for NVIDIA’s H200 and Blackwell systems. By late 2025, Samsung had qualified its own HBM4 chips and secured design wins for the next generation of NVIDIA accelerators. The Q1 2026 results reflect that qualification arriving in revenue.
The AI Boom’s Real Address
The public narrative around the AI economy centres on model developers. OpenAI passed $25 billion in annualised revenue. Anthropic is growing. Google, Microsoft, Meta, and Amazon are all racing to build foundational AI capabilities. These are the companies generating the press coverage, the regulatory debate, and the cultural conversation about artificial intelligence.
Samsung’s results are a reminder that the economic value capture in this cycle is happening somewhere else. The DS division’s estimated 54 trillion won in operating profit is not coming from subscription fees for a chatbot. It is coming from selling memory at elevated margins into a market where demand persistently exceeds supply. The AI companies that generate the headlines are, at this moment, net payers to the memory supply chain. They buy Samsung’s chips. Samsung does not buy their models.
This is not a new pattern in technology. During the PC boom, component suppliers — Intel above all — captured a disproportionate share of the value compared to the PC manufacturers assembling and selling the final product. During the smartphone era, suppliers of display panels, processors, and camera modules extracted reliable margin while handset makers competed aggressively on price. The AI era appears to be replicating that structure, with HBM chips playing the role that Intel processors played in the 1990s.
Samsung’s Position in the HBM Race
SK Hynix still holds the lead in HBM market share, having shipped the first commercially viable HBM3E chips and retaining preferred supplier status with NVIDIA for the Blackwell architecture. Samsung is in second place but closing. Its HBM4 qualification secures it a meaningful position in the next generation of accelerators, and its scale in standard DRAM and NAND flash — where it remains the global leader — gives it leverage that an HBM-only competitor would lack.
Micron Technology, the only non-Korean HBM manufacturer at scale, is expanding its production but remains a distant third in volume. Its presence is strategically important to Western governments seeking supply chain diversification away from Korea, but it is not yet a material constraint on Samsung’s pricing power.
The POV
Every quarter in which HBM demand exceeds supply is a quarter that Samsung and SK Hynix effectively levy a tax on the entire AI economy. The 755% earnings jump is that tax, rendered visible in a quarterly filing. It will moderate as supply capacity comes online and as the extraordinary demand spike from the initial AI infrastructure buildout stabilises into more predictable refresh cycles. But for now, the AI boom’s most reliable economic beneficiary is not the company whose name is on the product your colleagues are arguing about. It is the company making the memory that makes the product possible.
Samsung’s results also illuminate how AI value is being distributed. Software companies command the headlines, but the memory and semiconductor manufacturers sitting at the base of every AI transaction are capturing the most reliable margins. Every inference call, every training run, and every model deployment runs through HBM. Samsung now makes a chip so critical to AI that the major cloud providers have little choice but to pay its price, and that leverage, built over decades of manufacturing discipline, is proving more durable than any single software platform.
Sources
- Samsung makes history with record-breaking Q1 profit, higher than full 2025 earnings — KED Global, April 7, 2026
- Samsung Electronics Q1 2026 Record Profit $38B: AI Chips Drive Best-Ever Results — IBTimes AU
- Samsung’s profit surged 8x in Q1 2026, driven by AI data centre boom — SamMobile
- Samsung’s profit triples, beating estimates as AI chip demand fuels memory shortage — CNBC