I wonder if this will result in more memory-efficient software? The trend for the last couple of decades has been that nearly all consumer software outside of gaming has moved to browsers or browser-based runtimes like Electron. There's been a vicious cycle of heavier software -> more RAM -> heavier software, but if this RAM shortage is permanent, the cycle can't continue.
Apple and Google seem to be working on local AI models as well. Will they have to scale that back due to lack of RAM on the devices? Or perhaps they think users will pay a premium for more RAM if it means they get AI?
Or is this all a temporary problem due to OpenAI's buying something like 40% of the wafers?
What do you mean it can't continue? You'll just have to deal with worse performance is all.
Revolutionary consumer-side performance gains like multi-core CPUs and switching to SSDs will be a thing of the distant past. Enjoy your 2-second animations, peasant.
It would be nice if it were at least creeping up generation to generation. But if this keeps up, I fear the opposite.
The promised AI metaverse is still a long way off and in the meantime people still want the best smartphone.
Nah. The marginal utility of more smartphone RAM is near zero at this point. The vast majority of people wouldn't even notice if the memory in their phone tripled overnight.
And as if anybody buys an iPhone because they compared the specs with an Android :)))))
"What do you mean my status flagship iPhone costs only half as much as a flagship Android???"
> PC market to contract by 4.9%, compared with the 2.4% year-on-year decline in the November forecast. Under a more pessimistic scenario, the decline could deepen to 8.9%.
The deal was inked on October 1, 2025, and rumors of it started swirling in September. Take a look at the RAM price charts. Anyone who attributes this just to "AI growth" has no idea what they're talking about. AI has been growing rapidly for three years, and yet this price spike happened exactly when Altman signed this deal.
https://pcpartpicker.com/trends/price/memory/
It's also worth noting that IDC, which published this report, is wholly owned by Blackstone, which is also heavily invested in OpenAI. It would be prudent to be cautious about who you believe.
The wafers are not DRAM. This is more like burning oil wells so your enemy can't use them. Wafers are to chips what steel blanks are to engines. You basically need clean rooms just to accept delivery and entire fabs to do anything with them. Someone who doesn't own a fab buying the wafers is essentially buying them to destroy them.
Boomers might be out there consuming those AI YouTube videos that are just a TikTok voiceover with a generated slideshow, but Millennials think that because they can identify this as slop, they are not affected. That is incorrect, and just as bad.
Edit: It's similarly frustrating with the zoomers. Parents are derelict in their duty by not defending their kids and preparing them for the world they are in.
Just wait until the next great collapse, a disaster big enough to force change. Hopefully we'll have the right ideas lying around at the time to restructure our social communication system.
Until then, it's slow decline. Embrace it.
I find it very odd when people proudly proclaim they used, say, Grok to answer a question. Their identity is so tied up in it that if you start talking about the quality of the information, they get incredibly defensive. In contrast, I have never felt protective of my Google search results, which is basically the same thing given how most people use these tools currently.
It’s kind of wild how hostile some people get if you attempt to open the discussion up at all.
Every feature will be subscription-based. You'll own nothing and you'll be happy.
Or consuming 2 GB of RAM to have Teams running in the background doing nothing?
Yeah, if we got rid of that as a result of RAM shortages, that’d be great.
The economy says nothing about requiring humans to exist.
How scarce does memory have to get before it makes health care half as expensive?