https://fabiensanglard.net/the_beautiful_machine/index.html
[Edit] Maybe not completely off-topic since it would be my dream PC.
I love it. It's beautifully engineered. Top quality. It sits at the corner of my desk, proudly silent.
I'm likely about to upgrade the PC within, but the case will remain a strong feature of my desk.
Edit: I guess this is a senseless question if the case really only uses passive cooling. I was assuming there would still be fans somewhere.
I despise my current PC's fan noise and I'm always on the lookout for a quieter solution.
Currently inside is an i7-9600 which I limit to 3.6 GHz, and a cheap 1050 Ti.
The CPU is technically over the TDP limit of the case, but with the frequency limit in place I never exceed about 70 °C, and due to my workloads I'm rarely maxing the CPU anyway.
There is zero noise under any load. There are no moving parts inside the case at all: no spinning HDD, no PSU fan, no CPU fan, no GPU fan.
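If you want to replicate that kind of frequency cap on Linux, here's a minimal sketch using the cpufreq sysfs interface (assumes a standard cpufreq setup, needs root; the 3_600_000 kHz value just mirrors the 3.6 GHz limit above):

    # Sketch: cap max CPU frequency via Linux cpufreq sysfs (run as root).
    # The setting resets on reboot.
    from pathlib import Path

    CAP_KHZ = 3_600_000  # cpufreq expects kHz, so this is 3.6 GHz

    for policy in Path("/sys/devices/system/cpu/cpufreq").glob("policy*"):
        target = policy / "scaling_max_freq"
        target.write_text(str(CAP_KHZ))
        print(policy.name, "capped at",
              int(target.read_text()) / 1_000_000, "GHz")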
Are there senseless questions?
It can be used for gaming if your demands are met by an Nvidia 1650.
MonsterLabo built passive cases that could cool hotter components; sadly, they seem to be defunct now.
Now my CPU idles at ~35 °C usually, which is just 5 degrees above the ambient temperature (because of summer...), and hardly ever goes above 70 °C even under load, and it's still super quiet. I realize now I should have done the upgrade years ago.
Now if I could only get water cooling for the radiator/GPU I'm using. Unfortunately there are no water blocks available for it (yet), but I can't wait to change that too; it should have a huge impact as well.
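For watching those temperatures, a small sketch with the third-party psutil library (sensor names are platform assumptions: Intel shows up as "coretemp", AMD usually as "k10temp"):

    # Sketch: poll CPU temperature with psutil (pip install psutil;
    # sensors_temperatures() is Linux/FreeBSD only).
    import time
    import psutil

    for _ in range(12):  # one minute at 5 s intervals
        readings = psutil.sensors_temperatures().get("coretemp", [])
        for r in readings:
            print(f"{r.label or 'cpu'}: {r.current:.0f} degC")
        time.sleep(5)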
The CPU is AIO (and the radiator fans are loud). The GPU has very loud fans too, but is not AIO.
It's four years old at this point and I might just build something else rather than try to retrofit this one to sanity (which I doubt is possible without dumping the GPU anyway).
The GPU still gets kind of loud during intense graphics gaming sessions but when I'm not gaming the GPU fans often aren't even spinning.
The CPU fan is rarely an issue (it mostly just goes bananas when IntelliJ gets its business on with Gradle on a new project XD).
The GPU is the main culprit and I'm not sure there's any solution there that doesn't involve just replacing it.
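If you want to confirm the fans really do stop at idle, a hedged sketch with NVIDIA's NVML Python bindings (pip install nvidia-ml-py); the zero-RPM behaviour itself is decided by the card's vBIOS, not by anything here:

    # Sketch: read GPU fan speed and core temperature via NVML.
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # percent of max; 0 = stopped
    temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
    print(f"fan {fan}%, core {temp} degC")
    pynvml.nvmlShutdown()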
https://news.ycombinator.com/item?id=44021824 May, 2025 (86 comments)
https://news.ycombinator.com/item?id=44023088 May, 2025 (0 comments)
https://news.ycombinator.com/item?id=44026363 May, 2025 (1 comment)
Reading through the post, sadly nothing worked the first time round (bravo to the poster for his perseverance), and while things got slightly better, IT "stuff" is still surprisingly fiddly and fragile.
The quality of the build and the technical detail of the handbooks are areas where things got remarkably worse - how could we let that happen? How can children learn how stuff works without schematics of the devices they own and love?
The thing is, I never played those consoles after purchasing them. I don't have any nostalgic feelings towards them, except for the NES. I actually felt sorry for myself because, when I tried to wake up my inner kid, I discovered he died a long time ago.
I'll probably give them to a friend's kid if he so wishes, or donate them to some local museums.
I often look fondly at the hardware I have.
I recently built one PC for each PC generation of the 90s (486, Pentium 1-2, Athlon).
Still love them even after having built them.
Getting back into DOS is quite interesting, since it's so different from PCs today.
If the latter ("the highest end CPU tech of the particular day"), I think it's going to keep getting harder and harder, with more top end options like the M4 Max being "prebuilt only", but I don't think it'll go to 0 options in as short as 10 years from now.
If the former ("newer and better CPU tech than before") I think it'll last even longer than the above, if not indefinitely, just because technology will likely continue to grow consistently enough that even serving a small niche better than before will always eventually be a reasonable target market despite what is considered mainstream.
NVIDIA not selling cutting-edge parts other than in bulk is a phenomenon of the AI bubble, which will eventually deflate. (I'm not saying it will go away, just that the massive training investments are unsustainable without revenue eventually catching up.)
If the product succeeds and the market starts saying that this is acceptable for desktops, I could see more and more systems going that way to get either maximum performance (in workstations) or space/power optimisation (e.g. N100-based systems). Then other manufacturers not optimising for either of these things might start shipping soldered-together systems just to get the BoM costs down.
No need to pick on Framework here; AMD could not make the chip work with replaceable memory. How many GPUs with user-replaceable (slotted) memory are there? Zero snark intended.
There are high-speed memory module form factors. They just add thickness and cost, and they’re not widely available yet.
Most use cases need the high speed RAM attached to the GPU, though. Desktop CPUs are still on 2-channel memory and it’s fine. Server configs go to 12-channel or more, but desktop hasn’t even begun to crack the higher bandwidth because it’s not all that useful compared to spending the money on a GPU that will blow the CPU away anyway.
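Rough numbers behind that point, assuming example parts (a dual-channel DDR5-5600 desktop, a 12-channel DDR5-4800 server, and the RTX 4090's spec-sheet bandwidth):

    # Back-of-the-envelope memory bandwidth: MT/s x channels x 8 bytes.
    def ddr_gbps(mt_per_s, channels):
        return mt_per_s * channels * 8 / 1000  # GB/s

    print("desktop, 2ch DDR5-5600 :", ddr_gbps(5600, 2), "GB/s")   # ~90
    print("server, 12ch DDR5-4800:", ddr_gbps(4800, 12), "GB/s")   # ~461
    print("RTX 4090 GDDR6X       :  ~1008 GB/s (spec sheet)")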
Maybe some modularization will survive for slow storage. But other than that demand for modular desktops is dead.
Cases will probably survive since gamers love flashy rigs.
If you're compiling code, you generally want as much concurrency as you can get, as well as great single core speed when the task parallelism runs out. There aren't really any laptops with high core counts, and even when you have something with horsepower, you run into thermal limits. You can try and make do with remoting into a machine with more cores, but then you're not really using your laptop, it might as well be a Chromebook.
I've historically built my own workstations. My premise is that my most recent build may be my last or second to last. In ten years, I will still have a workstation - but not one that I build from parts.
So the desktop developer market is for those who are not willing to use cloud. And this is a very small minority.
(FYI I am not endorsing cloud over local development, I'm just stating where the market is.)
This doesn’t contradict your minority point, but it really does make me appreciate local-first.
I disagree. My premise isn't that desktops are going away. It's that DIY custom-build desktops are destined for the trash heap of history since you'll no longer be able to buy CPUs and memory. We will be buying desktops like the HP Z2 Mini Workstation - or the 10 years from now equivalent.
>Cases will probably survive since gamers love flashy rigs
But only as a retro theme thing? Would enthusiasts just put a Z2 Mini, for example, inside the case, wire up the lights, and call it a day?
There's literally no reason for shareholders not to demand this from every computer manufacturer. Pay up, piggie.
Good luck then. Some of us build our own computers to be upgradeable.
Got kicked right in the nostalgia I guess
and yes: the supplied PC docs back then >>>>>>>> supplied PC docs today
My first "PC" was a Sinclair ZX80. I got my soldering iron out.
Much later on (1986ish) my Dad bought a Commodore 64; unfortunately he plugged the power lead into the video socket when my brother and I arrived home for Chrimbo. Dad got it repaired and it served us very well for several years.
I still have that C64 and it was repaired again a few years ago (re-capped). It now has a USB interface etc. I also have an original Quickshot II joystick and it still works fine.
My first "real" PC was a 80286 based thing. A maths co pro (80287) was a Chrimbo prezzie too and costed something like £110. It had a whole 1MB RAM and the co processor enabled me to run a dodgy copy of AutoCAD. Yes, AutoCAD used to run in 1MB of RAM! The next version needed something mad like 32MB minimum.
I have a modern mouse and mechanical keyboard, but I tried to make everything as beige as possible...
My first ever build was a 386 though.
What fond memories.
With no latency, of course, because USB hadn't been invented yet.
>My SC-55ST came without a power supply. That was the opportunity to understand better the power requirement marking on the back. Voltage and Amperage are obvious but one must also pay attention to the polarity sign. The SC-55ST uses a negative center[7].
This is the "standard" for guitar effects pedals due to the ordinary switching power socket component on their PCB. The outer connector of the barrel jack does the switching by pushing the conductor away from the internal battery pole and over to the external supply when it is plugged in. This would switch the same way physically whether it was positive or negative, except these are often very sensitive or high-gain audio circuits and every bit of earth ground integrity can be essential for the metal enclosures and coaxial cables to shield the inner audio signal properly.
This SC-55ST may not have an internal 9V battery like a guitar pedal would, but it was designed to run on a Roland "Boss" AC adapter anyway, which is the top-shelf wall wart with highly regulated clean power for studio use. Roland set the standard for center ground with their Boss pedals and adapters, which basically steamrolled everyone else. For this application it's not the power supply that needs any shielding at all; it's the audio that needs as much shielding as it can get.
USB is a shitshow. The idea is good, but the latency for small files is terrible.
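A crude way to see it for yourself; /mnt/usb is a placeholder mount point, and it's the per-file overhead, not raw throughput, that dominates the small-file case:

    # Sketch: time 1 MiB written as one file vs. 1024 small files on a
    # USB-mounted path. Needs a Unix-like OS for os.sync().
    import os, time

    DEST = "/mnt/usb"                 # assumption: your USB mount point
    PAYLOAD = os.urandom(1 << 20)     # 1 MiB of random data

    def timed(label, write):
        t0 = time.perf_counter()
        write()
        os.sync()                     # flush caches so the device is timed
        print(label, round(time.perf_counter() - t0, 2), "s")

    def one_big():
        with open(os.path.join(DEST, "big.bin"), "wb") as f:
            f.write(PAYLOAD)

    def many_small():
        d = os.path.join(DEST, "small")
        os.makedirs(d, exist_ok=True)
        for i in range(1024):         # 1 KiB per file
            with open(os.path.join(d, f"{i}.bin"), "wb") as f:
                f.write(PAYLOAD[i * 1024 : (i + 1) * 1024])

    timed("1 x 1 MiB   ", one_big)
    timed("1024 x 1 KiB", many_small)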