----------------
1. Within a few months, these manufacturers will likely raise desktop mobo and CPU prices with the justification that "volumes are too low".
2. If you're upgrading from an older machine, it likely has a format of RAM that's not compatible with newer boards. Upgrading the cheap parts now and waiting for the expensive bits to come down is simply not an option. It's all or nothing.
Game and application developers should be paying close attention to this. You're used to the average user's system spec going up every year. That's stopped for now. The average memory in new systems may actually retreat!
I think we will see an abandonment of consumer-grade PC components, with individuals either pushed towards closed hardware like PlayStation, MacBooks, and Android devices, or pushed towards server-grade components. I already have a home server rack, and would recommend it to other people.
I just want to warn people who haven't heard server-grade hardware in person before: this is only for people who can put a server rack somewhere unpopulated like a garage or basement. Servers will make you think "wow, leaf blowers sure are quiet". They are not suitable for apartment dwellers such as myself. When I was setting up my 1U before shipping it off to a colo, I wrote scripts and had detailed plans of the things I needed to run so I could minimize the time it was making my ears bleed.
I have a 4U NAS with a supermicro board and an i3 chip with 6 WD Red NAS drives and it’s very quiet. The chassis came without fans so I installed the brand I like.
1Us have the most compromised ventilation and compensate with loud fans running at high speeds.
This is all built to be put in a place where noise is not an issue.
It was a learning experience, and I think everyone should experience that kind of industrial noise at least once to appreciate how quiet consumer hardware is.
In the whole history of computing, the PC is the only platform where buying a computer means a crazy number of options and configuration mixes to choose from, with the reasonable expectation that it will all work, and the warranty will cover it too! You can run any OS of your choice on it, and that's also a reasonable expectation.
On any other platform (Sun, Be, Amiga, NeXT, Apple), it was always about buying from one company, from its list of products. And even running a different version of the OS meant the warranty didn't cover it.
At least, that's what I hope happens. What will probably happen is people will continue to migrate away from the PC platform and towards closed platforms for the convenience, if history is any indication.
Is this for people trying to start the next Netflix out of their garage before they have any money to put the servers in a colo?
Most home users need a small amount of compute, and are sensitive to noise and power use.
Realistically a Mac Mini will probably blow a lot of things out of the water on price / performance. Even an older one.
I'm interested to know: WHY is the PC so open? What led to that?
At a high level, the IBM PC platform was very well documented and sold well, producing tons of software and peripheral add-ons. This led some other computer companies to reverse engineer the proprietary IBM BIOS so their machines ("PC compatibles") could run the same software and use the same peripherals. Because these were clean-room reimplementations, IBM didn't have a legal case to prevent their sale.
Fast forward a bit: IBM's attempt at a new, closed platform, the PS/2, flopped. People wanted the more open hardware. Windows became dominant enough that all the demand was for x86-based hardware that could run Windows, and Microsoft was happy to work with many vendors.
The PC is very open today, but Apple survived. Atari ST and Amiga probably survived longer than you think as well.
An actual rack with noisy 1U or 2U servers may be a bit overkill, but on the plus side there's a guaranteed endless supply of such used servers.
Now there's a happy middle ground: used workstations with ECC memory, that you then use as servers.
People would be really wise not to underestimate what a 12-year-old dual-Xeon workstation (14 cores each, 56 threads in total) can do, for example. And such a complete workstation can basically be found for less than what it takes to fill my car's gas tank (granted, it's got a big tank and it's a fancy car whose manufacturer recommends only using 98+ octane).
A single Xeon workstation with a shitload of memory in a tower form factor is basically silent. Mine is. Dead quiet, sitting next to the vacuum cleaner and the cat's food in a tiny room. I use it as a headless server.
And that's with the default PSU and fans. There are, of course, people modding these with adapters for regular consumer PSUs and then putting ultra-quiet PSUs in those. Same with Noctua fans etc.
And as for the usual complaint: "but a server that is on 24/7 consumes too much electricity"... I only turn on my servers at home when I begin to work: I don't need them to be on 24/7.
So yeah: "server CPU + ECC" doesn't imply noise. And "server CPU + ECC" doesn't imply it has to be on 24/7 either.
I like my Dell Precision T7910 (dual-socket Xeon FTW) a lot.
What are you using?
ok....
> PC in general is becoming more open than it's been in a long time as heavy MacOS/Android/iOS competition is creating a focus on open standards ...
I'm so confused by what you're trying to say here.
But RAM prices went to the moon, so I instead opted to repair the desktop. (It's only ~15 years old.) It's alive, again, and performs well enough.
The HDD in it is pretty old (not as old as the rest of it, it's on its second drive; 15 years would be quite impressive!), and still works for now, but there too, prices are silly and well above inflation. (I looked it up again: the same HDD is 50% more expensive today than when I bought it, in real, accounting-for-inflation dollars.)
Since this mess started, I've bought dozens of unused and like-new systems for clients. All with modern hardware - in the $250-$600 range.
Entry level motherboards are still $100.
$300+ is a very high end motherboard.
The existence of very high end products is confusing because it can give the impression that you have to buy a $300 motherboard because it exists. If you compare features side by side you're rarely missing anything important for the entry level motherboards.
Some people really want the best of the best and feel the need to buy motherboards with Thunderbolt 4 and other future-proofing measures just in case they might need them, but it's premium and luxury territory.
It's smarter to buy a cheap motherboard that meets your needs now. If in the future you find the need for USB4 or some other feature, upgrade the motherboard.
More often than not, builders will try to future proof for eventualities that don't arrive before it's time to upgrade to the next CPU socket anyway. There are a lot of people with expensive, outdated "futureproofed" builds who would have been better off saving the money on the original purchase so they could upgrade sooner instead.
Wanna guess how many times I've used that USB-C port? Maybe once or twice in the 9 years I've owned it. Never needed it. I also couldn't tell you what X370 is getting me that B350 wouldn't have gotten me.
"$1600 is too much for a video card" - me a few years ago on not buying an RTX4090 from nvidia's website.
"I only need 32Gb of RAM. If I want more later, I'll just updgrade" - Me a year ago.
Both mistakes, with hindsight. I will always future proof from here on out.
"$100 is a reasonable amount for a video card, I know this is on the budget side but at least I have a card this way" — me 12 years ago.
"I guess it's worth it to spring for 8GB of RAM..." — me 12 years ago.
Still using the same machine, with no regrets (just the occasional bit of envy).
Different people have different expectations and requirements.
Then you get a new board designed for the new features instead of something several years old and you come out $100 on top.
Futureproofing is nonsense. PCs just don't work that way, and haven't for decades.
Right, but the problem is that by now your $100 new motherboard requires a new CPU and new RAM. Which is very much not $100.
In the past we got away with PCI cards to add features without changing the motherboard, but we still ended up changing everything every 2 years anyway…
Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP. Probably same or better on Linux.
I don't remember Win10 being particularly lean (although I'm sure 11 is worse). And the M4 is definitely a much more powerful CPU. Can you not run Firefox and uBO on that? Or have they really weighed things down that much with the OS somehow?
> Probably same or better on Linux.
Even with the Cinnamon desktop environment I can vouch it uses considerably less RAM for just the desktop (ordinary applications are probably about the same) and offers much faster filesystem access by default. I'm sure this is at least partly due to not being weighed down by built-in anti-malware (that would do basically nothing for people who are comfortable using Linux in the first place).
I would also say that most consumers, who are almost exclusively buying gaming-oriented boards, do not need anything high end. They can pretty much buy the cheapest board available.
I am shopping around for a mini ITX board and the difference between something at $180 and something at $400 is basically one to two faster USB ports, which are pretty much irrelevant on desktop computers, and a few minor conveniences that I imagine most people can do without.
The higher-end chipsets add no discernible advantage and there are no CPUs that are unsupported by the lower end chipsets (on the AMD side, at least).
The high end stuff is just available for people with a lot of money.
This is obnoxiously difficult to shop for in the desktop/workstation space.
Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
Last year I said I should have upgraded my 1060 last year.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
7 years ago it was the same price, but then again, the last 7 years have involved accelerated inflation. So, the same price is actually a lower price.
If you're looking for a card in the sane $300 area, the Intel Arc B580 (12GB) or the RX 9060 XT (8GB) are a reasonable value. If you want 12GB+ from Nvidia or AMD, the used market in previous generations is a good place to look: maybe something like an RTX 3060 (12GB) or RX 6800 XT (16GB).
I personally don't think the GPU market is incredibly miserable. Maybe I am just used to the pain or something? Nvidia has a bit of a tax, but something like the RX 9070 XT is basically the 3rd fastest gaming GPU money can buy and it's around $700. (I'm not sure why the 5070 Ti costs $200 more even given Nvidia's software advantages; it performs almost identically, so it just doesn't make sense as a purchase.)
It may sound like pseudo-Buddhist claptrap, but it's also true. Or, I suppose, Fight Club claptrap. It's still true.
The choice is "do you want to participate in society, its benefits and drawbacks". You can't have only one side of that.
I used to think the plateau was here when the Xbox 360 and PS3 came out.
I've kept playing games and upgrading my GPU every other generation, and they're still fully utilized, but I can't really see where the additional compute and money is going. My biggest visual upgrade during that time was actually going from LED to HDR OLED which is something that requires virtually no additional processing power.
I don't mind that graphics have plateaued, because they aren't the important bit. If anything, I would rather that devs stop trying to chase graphics and make more games with shorter dev cycles.
Partially this is because there was usually an overlap in sales for the early PS4 and late PS3, etc. If you have to support both console generations, you won't truly be able to take advantage of the newer-gen stuff.
13th/14th-gen Intel Core chips are still more than enough for your average home gamer, Zen 5 shows only marginal improvement over Zen 4 except for a very narrow range of workloads, getting wider than a 128-bit memory bus is prohibitively expensive while relatively cheap consumer boxes like the Mac Mini run circles around dual-channel DDR5 setups, and so on.
Sure, presenting this as a consequence of the AI boom is convenient for a news outlet, but even before the craze both Intel and AMD were dragging their feet.
I'm not buying it. Both the premise and the new motherboard, that is.
If you really want to see a radical shakeup that would have some very exciting effects, could I interest you in a little Total Atomic Annihilation?
Technofascism
It’s bad, but it’s not “literally own nothing”.
people will own an increasing number of dumb terminals connected to rented services.
does that reduce the number of computers? well, no..
so, imo : the trick isn't to reduce physical ownership of devices, the trick is to make it so that you need Big Iron in order to do anything.
One way that might be achieved is by forming social and cultural dependence on models so large that no one individual could possibly run them...
But in a way I do agree with you; I doubt it is as organized as you imply. Yes, companies and governments do not want anyone on a General Computing Device at all. They want to see exactly what content you are viewing and responding to.
Microsoft and Apple have been slowly adding various forms of spyware and locking down what applications you can use. And cell phones? Those are the Holy Grail of what Microsoft and Apple want to move your laptop/PC to.
Right now Linux and BSD are the only games in town for non-spyware systems. But the new age verification laws seem to be a first attempt to lock down even Linux :( Since the Linux Foundation is owned by large corporations, I feel that will succeed. As for the BSDs? Right now it seems they are flying under the radar.
Why, when emails from discovery in the labor disputes between Google and Apple in the 2010s revealed they engage in exactly the sort of manipulation you disbelieve?
The mega-rich are 100% decoupled from physical reality. May as well treat them more like tribal shamans, priests, preachers, and rabbis.
Just parroting memes the likewise idiot politicians believe are the magic chants that keep gravity itself pulling together the Earth.
"Omg he said the thing! Cut his taxes! Give him welfare!"
Our generation of leaders was raised in a pre-science, pre-information-age world. They rely entirely on cult of personality, as their meat suit never sees itself engage in the labor it relies on to live. It's well aware intuitively how fucked it is. Must continue to stand in the pulpit!
You can see the divide everywhere. People with lots of money think supply and demand, congestion pricing, etc. are great tools because it doesn't impact them at all compared to people on the bottom. Those are only good solutions if you're not the one falling off the bottom rung of the ladder.
Is it really shocking that people are upset to see the supply of resources being cornered and hoarded by the ultra rich with the most likely outcome being the only way to get access to those goods will be to pay forever?
The possibility of AI becoming a must-have knowledge repository or memory assistant is scary if you couple it with the idea of never being able to own it. How much is your memory worth? What if you can't compete in terms of productivity without having access to AI? What about the people that can't afford the "first month of rent"?
People come in and make angry posts like the GP because they know they're getting disenfranchised and don't have the power to do anything to change it.
It still is, but nobody gives a shit anymore, we are in the financialization and rent-seeker world now.
Now we are just playing with fire.
Why associate them with roles that have a degree of positive association and human connection?
Treating them as faeries, vampires, or demons seems more accurate.
I think you conflate informed consent with "brainwashed as children into fealty via allegory of the end times, and threats of violence if they don't comply."
Computers were incredibly more expensive when I was growing up. People bought them anyway.
Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
Maybe it's different in the US. In Canada, the median income for 25-54 years old was just under $60k / year in 2024. When you're talking about a $3k USD computer, it's pushing 10% or more of the median after tax income. My gut reaction to that is that most people don't even end up with that much disposable income in total, let alone for a single purchase.
HN is skewed with people way at the top end of income earners, especially on a global scale. Imagine getting $30k / year to spend on everything you need and then consider how much $3k on a computer is.
My dad had to take a loan to buy our first computer. Who wants that? It's dumbfounding to see the number of people cheering on backwards progress where we end up where we were 3+ decades ago.
If it lasts for 10 years, it's more like 1% of the after tax income of a median individual earner over that period.
I think a computer is clearly valuable enough that people will entirely rationally spend 1% of their income on it if that's what it costs. (I'm not "cheering it on"; I'm just observing and predicting that lots of normal people will still buy computers.)
If the cheapest useful computer ends up costing $3K, it will still be purchased and will still be worth it at around $1/day of useful life.
[0] https://web.archive.org/web/20160306232450/http://www.pageta... [1] https://www.gartner.com/en/newsroom/press-releases/2026-1-20...
Starbucks' revenue was almost $10B in the last quarter. Most people can clearly afford $1/day for something as useful as a computer.
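For what it's worth, the "$1/day" and "~1% of income" figures above are easy to sanity-check; a quick sketch, where the after-tax income number is an assumption for illustration rather than a sourced statistic:

    # Sanity check of the "$1/day" and "~1% of income" claims; inputs are assumptions.
    price = 3000            # USD, hypothetical "cheapest useful computer"
    years = 10              # assumed useful life
    print(price / (years * 365))                 # ~0.82 -> roughly $1/day

    after_tax_income = 45000                     # USD/yr, assumed median for illustration
    print((price / years) / after_tax_income)    # ~0.0067 -> roughly 1% of income per year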
Now, per-unit costs are rising faster than inflation. The WD HDD I bought in 2017 for $65 real ($49 nominal) is now $95 real, 50% more expensive after inflation.
Trust me when I say my income has not increased by 50% post-inflation since then! (Also … I really should not have checked that number. Needless to say, it's not positive.)
That has never before happened in the history of computing, and it violates long-held, fundamental assumptions.
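The inflation adjustment in that comparison is just a CPI deflator applied to the nominal price; a minimal sketch, with the cumulative-inflation factor assumed rather than pulled from actual CPI tables:

    # Real-price comparison for the HDD example; the CPI factor is an assumption.
    nominal_2017 = 49        # USD, what the drive cost in 2017
    cpi_factor = 1.33        # assumed cumulative inflation, 2017 -> today
    real_2017 = nominal_2017 * cpi_factor    # ~65 in today's dollars
    price_today = 95
    print(price_today / real_2017)           # ~1.46 -> roughly 50% more in real terms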
Maybe we'll get a Chinese hardware black market.
[1] https://www.reuters.com/sustainability/boards-policy-regulat...
The napkin math came out to renting being around 27 times cheaper than owning (not including power). I think we're really screwed when it comes to having owned access to AI, unless Intel comes out swinging with a C-series card that has 128GB of VRAM so we can run them in a 4x128GB configuration, but that seems unlikely since Nvidia has a large share in them.
This was calculated expecting around 30 tok/s; of course you can get 2-5 tok/s much, much cheaper, but that's unusable for my workflow.
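For anyone who wants to redo that napkin math, the structure of the rent-vs-own comparison is roughly the sketch below. Every input (hardware cost, lifetime, hours of use, API price) is an assumption of mine rather than the figures behind the 27x above, and the ratio swings wildly with how many hours a day the box is actually generating:

    # Rent-vs-own napkin math; all inputs are illustrative assumptions.
    hw_cost = 15000.0          # assumed price of a local rig sustaining ~30 tok/s
    lifetime_years = 3.0       # assumed useful life
    tok_per_s = 30.0
    gen_hours_per_day = 2.0    # assumed time spent actually generating

    total_mtok = tok_per_s * 3600 * gen_hours_per_day * 365 * lifetime_years / 1e6
    own_per_mtok = hw_cost / total_mtok      # ignores electricity
    api_per_mtok = 2.00                      # assumed hosted price per million tokens
    print(own_per_mtok, api_per_mtok, own_per_mtok / api_per_mtok)   # ~63, 2.0, ~32x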
Everyone else charges a ridiculous amount, but DeepSeek's API is $0.003625 / M tok.
I'm surprised no one talks about this, because of how significant it is. GPT 5.5, for example, costs a ridiculous $0.50 / M tok cached. DeepSeek is literally almost 140 times cheaper, which matters a lot for tool calls.
> The deepseek-v4-pro model is currently offered at a 75% discount, extended until 2026/05/31 15:59 UTC.
However, even when the discount ends it's still very cheap. It will go back to $0.0145 / M tok for a cache hit. That's still 34x cheaper than GPT 5.5.
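The ratios do check out against the per-token prices quoted in this thread:

    # Ratios implied by the prices quoted above.
    gpt_cached = 0.50               # $/M tok, cached, as quoted
    deepseek_discounted = 0.003625  # $/M tok
    deepseek_regular = 0.0145       # $/M tok cache hit, after the discount ends
    print(gpt_cached / deepseek_discounted)   # ~138x ("almost 140 times cheaper")
    print(gpt_cached / deepseek_regular)      # ~34.5x ("still 34x cheaper")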
If you serve a single user you'll never get your electricity price back, nevermind hardware costs.
What's that? Egg prices are back down after suppliers cranked up their output? Surely nothing like that is possible with hardware... Personal computing is dead forever...
https://www.reuters.com/world/asia-pacific/sk-hynix-invest-a...
> SK Hynix has reportedly broken ground on a new advanced memory packaging facility in West Lafayette, Indiana, that should boost the supply of US-made high-bandwidth memory (HBM)
https://www.theregister.com/on-prem/2026/04/22/sk-hynix-brea...
> Samsung to advance mega-fab expansion by 6 months to get ahead in capacity race; SK Hynix follows suit
https://www.kedglobal.com/korean-chipmakers/newsView/ked2026...
On modern systems (all 64 bit AMD, and Intel Core "i" onwards, so quite old now) the memory controller is integrated into the CPU, so what the CPU supports is what you get, and the latest CPUs are DDR5 only. Intel did have a transitional phase of CPUs that can do both DDR4 or 5 depending on motherboard, but AMD it's AM4 = DDR4, AM5 = DDR5.
A Rolex Daytona today is known as a very fancy and even hard-to-get watch. In the 70s they were practically giving them away with other watch purchases because electronic watches were taking their lunch.
The bigger takeaway, I think, is that the destruction and folding eventually led to the Swatch Group. People forget that Rolex, Omega, et al. were tool watches that were expensive but fairly attainable. Even into the 90s you could walk into a Rolex store and walk out with the watch you wanted. Nowadays you basically have to buy a watch to prove you're good enough to get the one you want.
I foresee a similar thing happening with computing hardware. There will be a small, high-end side industry for non-datacenter customers.
The digital watch user will be renting time for a thin client via a datacenter provider. The wealthy or high status user will be able to purchase the expensive boutique home computing hardware they want.
The only reason you have those watch brands to mention is because they are non-functional status symbols. People that want a watch buy something else.
The same way, people that want a computer will buy from whoever is actually selling them. Manufacturers that want to sell only to datacenters won't last for long.
Even if volume and hype decrease from the general population, there doesn't seem to be much of a cap on model requirements -- so at least one sector will be pushed into purchases one way or another.
It's brutal. I've just built a workstation with DDR4 and a two-generations-old CPU. I paid more for the DDR4 than it originally cost four years ago. The same amount of RAM for the latest motherboard would have been 10x ($10,000). So used DDR4 has gone through the roof, which impacts hobbyists who used to rely on "hand-me-downs".
My high-end HEDT would now be +$2300 to build mostly due to memory and SSD pricing. 96GB of memory going from $430 -> $1800 is wild. One community member literally wouldn’t be able to buy their Mac Mini configuration anymore, plus the self-upgrade SSD would be price hiked.
Where I blanch most is my storage server running TrueNAS. Built it 3.5 years ago with future-proofing in mind: a strong SSD cache layer, plus two HDDs as spares. It wasn't cheap then, but I think between disks, storage, ECC memory, etc. it's +$7,000 now to rebuild it again, +$9,000-$10,000 on last-generation hardware.
It's really wrong that the common people have access to things like PCs. It leaves a lot of money on the table the corporations can extract, and makes control much harder. PCs should cost at least as much as a car, so only the right people can afford them.
Own nothing and be happy.
Those who earn their living from their labor, and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
If any of these people don't work, or don't work enough, they are undeserving, immoral moochers and should be miserable and in pain.
> and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
It's totally fine if these people never lift a finger in their lives. In fact, they deserve it. NEVER question that. N-E-V-E-R! It's great! Capitalism is great! Capitalism is fair!
That confident "will" in that prognosis may ultimately stimulate a consensus "why?" response in the population to explore alternative outcomes ..
I spent the last half a century making sure they have no leverage and I am not interested in being coerced.
It's called security.
if you are living mobile, you probably need gas or batteries for warmth or cooling. if your climate is currently comfortable, temperatures can be raised.
or maybe you are a nomad hunting and gathering your own food? the wilderness can be pillaged and sold and "secured" until there's nothing left to eat.
there is no perfect security.
No lairs necessary. You can read up on people who do FIRE.
Did you intend to say "because reasons"? There should be a long chain of reasoning connecting "LLMs will never be able to strictly follow instructions written in natural language (as agreed by a 90% consensus of experts or some such, because you can't formally verify adherence to informal natural-language instructions)" and "physics doesn't allow that." And I can't find it anywhere, neither in your comment history nor in the literature.
But the fact is that there's plenty of literature out there on hallucination and unreliability of LLMs already. If you know otherwise, let us inform Dario before next funding round.
So what are you ranting against?!
> Own nothing and be happy.
Ah, here it is. Only governments can confiscate our property and force us into that. Governments and politicians that keep telling us how evil corporations are…
It isn’t only conspiracy theorists who should be disturbed by whatever politico-corporate freemasonry that goes on in Davos.
... Do you want corporations to have that power too or something? What are you saying here?
Why did we listen to the Worldcoin guy again?
They might be 20% of the price (because they don't have to invest that much in training), but they're probably not 20% of the resources (i.e. inference), considering they take more tokens to do the same task and have slower inference speeds.
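In other words, the fair comparison is effective cost per task, not list price per token; a toy sketch with assumed numbers:

    # Effective cost per task; the relative prices and token counts are assumptions.
    cheap_price, cheap_tokens = 0.20, 3.0   # 20% of the price, ~3x the tokens (assumed)
    big_price,   big_tokens   = 1.00, 1.0   # normalized baseline
    print(cheap_price * cheap_tokens)       # 0.6 -> cheaper per task, but nowhere near 5x cheaper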
No one really resists or pushes back. When I resist I hear "that's what consumers want", "it's for security", or that I'm the problem. There is no one to complain to even, except to low paid kiddos in customer service.
Market forces will probably bring the price of hardware down in the next decade. Whether it is in a form that is useful for regular people/hobbyists is another question. If not, then hopefully the "cloud" starts to look a lot different.
Of course in 20 years we'll be using more compute than today (99% likely).
EDIT: Of course, cryptocurrencies provide a floor on compute pricing.