Entry-Level GPUs Are In Trouble As DRAM Prices Go Nuclear
DRAM has evolved from a boring background commodity into the component that dictates who even bothers to build affordable GPUs. AI data centers are consuming vast amounts of memory, manufacturers are converting lines to HBM, and DDR5 prices are climbing so rapidly that 2020-era GPU pricing looks like a happy accident. The result is simple. Entry-level graphics cards are on the chopping block, and the people who lose out are budget PC builders and anyone who just wants a sensible 1080p box.
DRAM is no longer cheap and boring
For most of the last decade, DRAM has been a dull line item in a bill of materials. Prices spiked at times, collapsed at others, and OEMs rode the cycle. That world is gone for now. Contract DRAM prices have jumped by well over 100 percent year on year, with some reports putting the figure around 170 percent as AI demand takes priority over everything else. At the retail end, DDR5 kits that were a mid-range impulse buy a few months ago now sit in the “do I really need this” bracket, and there is no obvious sign that the curve is turning any time soon.
The core problem is not just demand. It is where the capacity is going. Samsung, SK hynix, Micron, and the rest are pouring capex into high-bandwidth memory, because every serious AI accelerator now ships with large slabs of HBM stacked on a package. The same fabs and engineers that used to build mountains of standard DRAM are increasingly allocated to HBM and high-value server parts. Consumer DDR5 and GDDR are not the priority. When you have finite wafers and the choice is between high-margin HBM for hyperscalers or cheap VRAM for a £199 graphics card, you do not need a spreadsheet to see which wins.
AI training is eating the DRAM world
AI data centers are memory hogs. A single rack full of accelerators can consume terabytes of DRAM between HBM on the packages and DDR5 in the hosts. Multiply that by thousands of racks across cloud providers, national AI projects, and startups trying to look credible, and you get a demand curve that looks like a rocket launch, not a business cycle. Memory makers see that and respond the only way they know how. Allocate more production to the parts that ship on premium contracts and raise prices for everything else.
The irony is that the AI boom arrived just as DRAM had finished one of its classic troughs. After years of weak PC demand and oversupply, prices in early 2025 were cut to the bone. Now they are sprinting in the other direction. That makes manufacturers even less willing to invest in raw capacity. No one wants to be the one who built a multi-billion-dollar fab just as the party ends. So instead of expanding, they are tuning what they already have and converting existing lines from commodity DRAM to higher-margin products. That is great for their balance sheets and terrible for anyone who wanted cheap VRAM for GPUs.
Where GPUs fit into this mess
GPUs are not just silicon. A mid-range or high-end graphics card has a large, expensive die on an advanced node, a non-trivial PCB, power stages, cooling, and eight to thirty-six gigabytes of VRAM strapped around it. On an entry-level card, the balance looks very different. The GPU die is smaller, the cooler is simpler, the board is cheaper, and the cost of VRAM is a much larger share of the total bill.
Take a very rough example. An 8 GB card built around GDDR6 used to carry perhaps forty dollars of memory at contract prices. If DRAM pricing pushes that to fifty or sixty dollars, you have just added a big chunk to the cost of a product that maybe retailed for two hundred. Even if you are conservative and say the increase is ten or fifteen dollars in cost, by the time that passes through the chain, with everyone taking a cut, you are looking at twenty-five to forty dollars on the shelf. On a low-end card, that is the difference between a price point builders will tolerate and a product that sits on shelves.
On a fifteen-hundred-dollar halo card, that same VRAM hike is a rounding error absorbed by margin and marketing budget. On a budget model, it breaks the equation. That is why DRAM inflation matters much more at the bottom of the stack than at the top. AMD and Nvidia can hide it in this year’s flagship. They cannot hide it in a £200 card without making the card £250 or worse.
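The pass-through arithmetic above can be sketched in a few lines. The markup figures here are illustrative assumptions, not industry data, and the non-memory BOM cost is a made-up round number for a roughly two-hundred-dollar card:

```python
# Illustrative sketch of how a VRAM cost increase compounds through the
# supply chain. All markups are assumed placeholders, not real margins.

def shelf_price(bom_cost, markups=(1.15, 1.10, 1.08)):
    """Apply stacked markups (e.g. board partner, distributor, retailer)."""
    price = bom_cost
    for m in markups:
        price *= m
    return price

base_bom = 160.0            # assumed non-memory BOM for a ~$200 card
old_vram, new_vram = 40.0, 60.0

old_shelf = shelf_price(base_bom + old_vram)
new_shelf = shelf_price(base_bom + new_vram)
delta = new_shelf - old_shelf   # a $20 BOM bump lands as roughly $27 on the shelf
```

With these assumed markups, a twenty-dollar memory increase grows into a shelf-price increase in the high twenties, which is why a hike that is trivial on a flagship is fatal at the bottom of the stack.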
Why entry-level GPUs are first on the chopping block
When margins get squeezed, vendors start from the bottom. You do not cancel the products that fund your datacenter push or keep your brand in the headlines. You cancel the products that make the least money yet require the same support as everything else. Entry-level GPUs are exactly that. They are complex to launch, requiring driver work, board-partner coordination, validation, and years of support, and after all that they return relatively little profit per unit.
The incentives line up brutally. Board partners can sell fewer SKUs, focus on higher-margin designs, and keep inventories smaller. GPU vendors avoid cannibalising previous generations at the low end and push buyers up the stack. Retailers avoid stocking products that swing from “great value” to “why is this so expensive” in one quarter because DRAM moved again. Everyone in the chain benefits from pruning the bottom. The only group that loses is the end user, especially those looking for a cheap upgrade.
Add in the fact that older cards and integrated graphics have become “good enough” for a lot of casual play, and you have a convenient excuse. If a Ryzen APU or an Intel Core Ultra iGPU can handle 1080p eSports with settings nudged down, why waste DRAM on a low-end discrete board that competes with your own processors or last year’s mid-range?
AMD’s position: APUs to the rescue, but not for everyone
AMD is accidentally well-positioned here. Their APUs have been quietly getting more capable every generation. A modern Ryzen with a decent integrated GPU can handle 1080p in lighter games and 720p or low settings in more demanding titles without a discrete card at all. As DRAM costs rise and low-end GPU margins shrink, AMD can lean into that story. Buy the chip, skip the card, and use the integrated graphics until you can afford something better.
That works for new builds and small form factor systems where a discrete card was always a squeeze. It works less well for people with older platforms who want a drop-in graphics upgrade without replacing their entire PC. For them, the disappearance of affordable boards is a direct loss. AMD could fill that gap with cut-down Navi parts, but the economics argue against it. Those wafers can become midrange cards with higher margins, or even get diverted to custom silicon for consoles and handhelds that sell in big batches. In a world of expensive DRAM, spending it on low-margin parts looks like charity.
Nvidia’s position: all in on margin
Nvidia has already spent most of this cycle signalling that it is not especially interested in the bottom of the gaming market. It is chasing AI money, workstation money, and what remains of the high-end consumer space. The company will still ship something that looks like an entry-level card, but it is easy to imagine a future where the cheapest new GeForce with modern features is priced where old midrange once lived, and anything below that is left to old stock and integrated graphics.
That is even more likely when VRAM costs climb. Nvidia’s margins are famously high. There is no appetite to cut them just to keep a cheap board in the catalogue. If DRAM pricing says an 8 GB card needs to cost £250 to keep everyone happy, then Nvidia can simply move the floor to £250. It hurts, but it hurts the buyer more than anyone else. Plenty of people will either pay up or wander into the used market instead.
Consoles, handhelds, and why PC gamers are squeezed from both sides
Consoles and handheld PCs are not immune to DRAM inflation, but they have better leverage. Sony, Microsoft, Valve, and others sign very long contracts and buy memory in bulk. They can plan capacity with suppliers years ahead, and they ship relatively few SKUs compared to the chaos of the PC market. If DRAM stays expensive, expect new console revisions and handheld refreshes to bump prices gently or cut costs in other places rather than vanish.
PC gamers, especially those building on a budget, do not have that protection. They buy retail DRAM and retail GPUs at whatever price the market decides that week. If entry-level cards quietly disappear and DDR5 kits keep climbing, many people will find it cheaper and simpler to buy a console or a handheld that “just works” than to piece together a PC that feels compromised from the start. That is not a great outcome if you care about the health of the open PC ecosystem.
The used market pressure valve
One obvious escape route is the used GPU market. As older midrange cards filter down, a lot of them offer better performance than any theoretical new “entry level” board that manages to survive this cycle. If DRAM costs keep rising, expect the gap between new and used to grow. A manufacturer cannot reprice a three-generation-old card that it no longer makes. A seller on an auction site can, and will, undercut whatever is on the shelves.
The downside is obvious. Used hardware comes with no warranty worth talking about, power efficiency can be mediocre, and supply depends on previous cycles of overbuying. If mining booms and AI spinoffs keep absorbing cards at the high end, there may be fewer clean midrange boards dropping into the second-hand market than in the past. You also have the simple fact that not everyone wants to gamble on used gear, especially for a machine that has to last several years.
What this means for PC builders in practical terms
For anyone building or upgrading on a budget, the practical picture looks like this. Expect the price floor for new discrete GPUs to move up. Expect 8 GB boards to become less generous at the low end and more common in what used to be the midrange. Expect 4 GB and 6 GB cards to hang around too long or vanish entirely, depending on how the vendors want to spin the marketing. Expect prebuilt systems to use APUs and integrated graphics more aggressively, especially in office and light gaming machines.
The smart move for builders is to think in total system terms. If you cannot justify the cost of a discrete card, consider a Ryzen APU build and spend the money on more DRAM and a better SSD instead. If you can stretch to a midrange GPU, accept that you are paying a DRAM tax and pick something that will last more than one cycle. It is better to buy slightly above your comfort zone once than to buy twice in quick succession, as memory keeps getting more expensive.
How this could be resolved, and why it will take time
DRAM is a cyclical market, but this cycle is different because HBM and AI workloads distort the usual patterns. Manufacturers are not simply waiting for demand to fall. They are actively retooling lines and chasing long-term contracts for high-margin products. That means the old “just wait and prices will crash” advice may not hold this time, or at least not on the same timeline. Analysts are already talking about tight DRAM supply into 2026 and 2027, and that assumes the AI sugar rush does not get worse.
In theory, sustained high prices should eventually justify new fabs and capacity expansions. In practice, the risk is that if demand falls back to more normal levels just as new factories come online, we end up with another brutal crash. No executive wants to be the one who greenlit that. So the industry is cautious, and in the meantime, everyone downstream pays the price.
My take
I do not think entry-level GPUs vanish entirely, but I do think the definition of “entry level” is changing in a way that does not favour PC gamers. The days of a cheap new card with modern features, decent VRAM, and a price tag that feels friendly are numbered, at least while DRAM remains expensive. Vendors will keep a token low-end part around so no one can say they abandoned the segment, but the real focus is going to be midrange and up, where the extra cost of memory hides inside the margins.
If you are building on a budget, it is time to adjust expectations. Look harder at APUs and integrated graphics. Treat used GPUs as a serious option rather than a last resort. When you do pay for a new card, buy something that will last several years rather than chase small generational gains. The DRAM market will calm down eventually. Until it does, the cheapest part of the stack will continue to look fragile, and entry-level GPUs will remain the easiest line for AMD and Nvidia to quietly erase.
