That's no longer true since Windows 11 went 64-bit only (with no in-place upgrade path like the 16->32 bit transition had). E.g. the linked video starts from a clean slate at XP in order to switch to 64-bit from that point on.
Still damn impressive that one can make it all the way to Windows 10 (32-bit), but equally crazy IMO that the upgrade story finally ended with 11.
Windows 95 was really the one that made the story interesting (the important bridge from 16- to 32-bit), so no help to those who preferred NT or other OSs at the time and wanted an excuse to bash it :p.
If you really want to buy something, there is https://www.tuxedocomputers.com/ - they've been around for a long time, though personally I never used them since I prefer to build my own desktops.
I have owned an AMD APU whose 3D capabilities and hardware video decoding were only fully supported by the Windows drivers; the Linux drivers took a couple of years to reach parity, offering OpenGL 3.3 instead of 4.1 and no VAAPI in the meantime.
Thanks for the typical Linux forum answer, some things never change.
Other things, however, do change. It is pointless to base current decisions on the state of software and drivers from years - let alone decades - ago. Back in the 2000s and early 2010s AMD's OpenGL story on Linux was pathetic; if you really wanted any sort of decent OpenGL on Linux you had to use Nvidia.
Things are very different nowadays, and I'm saying that as someone who used to buy Nvidia because of their better OpenGL drivers and especially their Linux support.
More or less every Ryzen motherboard is guaranteed to work (as is Intel, they're just not the best bang for the buck)?
Meaning Worten, FNAC, Public, MediaMarkt, Saturn, Dixons, CoolBlue,...
I am no stranger to UNIX, having started with Xenix in 1993, gone through all the major ones, and known Linux since Slackware 2.0, but I have better things to do in 2025 with my spare time than installing Linux distros.
Hence why I'd rather stick with VMware Workstation, WSL, or cloud instances nowadays.
It's pretty straightforward to re-image Linux onto any common x86-64 or arm64 hardware, not sure why one would shy away from this - nowadays it's actually easier than installing Windows.
As long as hardware with GNU/Linux is something special, not available in general stores, or, when it is, only in the form of Raspberry Pi kits for kids, adoption will stay as it is.
Anyway, the major distros nowadays aren't Slackware. They work out of the box, sans Nvidia (but unless you need CUDA, that's just a reason not to go with Nvidia) and maybe Broadcom's WiFi (the firmware is in "non-free" repos because of licensing). An installation wouldn't take more than ten minutes, of which eight minutes is just watching the progress bar. Then, of course, if you hate GNOME 3 as much as I do it will take a bit more - but preinstalled wouldn't save you there either.
You had OS/2 not doing that well, the Amiga not doing great, NeXT hurting, RISC OS kind of floundering, and the failures of Windows 1 and 2.
The risk of it flopping was immensely high. A huge amount of the company was bet on it.
They were risk-averse all over the place with the design; it's clear to see when you do a deep dive.
And even then, people balked at things like the 8 MB RAM "recommendation", which was seen as expensive at the time.
Even after the release and into 1996, large vendors were still shipping some computers with Windows 3.1 - it was an option.
It was really both a risky product and had to be done.
[1] We had one T1 for internet and one T1 PRI for the modem pool, outsourced to MegaPath for out-of-area dialup, and then shunted all the customers to that when the PRI stopped working.
What was ever wrong with these? I never actually used them but everything I know about them sounds fantastic.
It ran Windows applications natively and crashed less than Win 3.1 — but still had some hardware compatibility issues.
It had a fancy scripting language and a lot of neat stuff already built in — unlike Windows at the time.
I really wanted to switch to it, but Win95 won…
Rexx mentioned! https://en.wikipedia.org/wiki/Rexx
However, that price tag was horrendous for an OS for home computers.
For example with a Cobra expansion card, https://amiga.resource.cx/exp/cobra
Yes it was amazing, but not worth the extra money for many of us.
I did happen to have a machine with a monster CPU (for the time), but I knew many people with lesser CPUs. It really wasn't the CPU but the RAM. You had to have 4 MB, better to have 8 MB.
In those days, RAM was the most expensive thing on any computer. Also in those days, a lot of the inexpensive clone machines like the Gateways and Dells were still using 30-pin (8-bit wide) RAM, so you had to use 4 sticks to get 32-bit width, and there were only 8 total slots (two banks). 1 MB SIMMs were at least obtainable, which meant your practical limit was an 8 MB machine. 4 MB SIMMs were incredibly expensive, almost unobtainable, if your system board even supported them.
OS/2 would run very comfortably on an 8 MB machine, meaning all you really had to do was come up with the scratch for some 1 MB SIMMs and have a machine with the full 8 sockets. It was slightly upmarket for 1992 and 1993, but very far from high end. A ton of people in the BBS scene used OS/2 because it allowed you to run your own BBS in the background, or to connect to a BBS and be downloading files while still being able to use the computer, say with a word processor to write a paper.
By the time Windows 95 came out in August 1995, 4 MB was considered the minimum and 8 MB was considered better. By then the Pentium had been released, but 486SX systems were pervasive and cheap. If you slapped more RAM in them, they would indeed run either OS/2 or Windows 95 just fine. Software rarely needed an FPU. System requirements between the two were basically the same.
The failure of OS/2 came down to software compatibility. The killer feature of OS/2 was that it could run all your DOS programs and all your Windows programs, and unlike real DOS or Windows you could have multiple programs open at the same time without bogging down or crashing the system. Heck, you could even run full-screen VGA games like Doom, task-switch out of them, and return. You could be gaming while downloading.
But Windows 95 came out with an even better feature: the ability to run Win32 software that was formerly limited to Windows NT. And that turned out to be a far more important feature than being able to run lots of older software simultaneously. And as far as stability goes, if you only ran Win32 software on Windows 95 it was actually incredibly stable. As long as all the applications themselves are reasonably well behaved, the inherently unsafe Windows 95 architecture, with a large amount of globally shared unprotected memory hosting critical system data structures, isn't a big problem.
So what did I do in 1996? Well, I got a true monster machine, a Pentium Pro 200 with I think 64 MB of RAM, and I ordered it with Windows NT 4. By then, Windows NT needed 32 MB minimum, but RAM was getting cheaper so it wasn't as much of a barrier.
So the irony of saying that you needed notably higher-end hardware for OS/2 is that notably higher-end hardware becoming the norm is what really killed OS/2 even among die hard fans.
Cheap RAM, cheap enough to run the even more stable Windows NT, was the last straw. OS/2 was mortally wounded when IBM failed to deliver Windows 95-on-OS/2. I thought at the time they should have done that, and I know now they could have done it. If they had, I think OS/2 could have competed with Windows 95. Instead it only limped along among die-hard fans like me. But once hardware caught up and I was able to run Windows NT, there really wasn't much point in OS/2 anymore.
1 MB SIMM: $30-50.
4 MB SIMM: $150 in January 1992; the lowest it went was $100 in December 1992, and it was back to $130 in December 1994.
Sane combinations were 4x1, 8x1, 4x4, and the quite insane 8x4. 1992 was also when non-IBM vendors started using 72-pin SIMMs; for example, the Dell Precision 386DX/33 had 4x 72-pin sockets while the 25 MHz model shipped with 8x 30-pin SIMM sockets. Edit: Looks like Dell started switching to 72-pin SIMMs in early 1990 with the 325P/333P/433P models, and in 1992 committed an unconscionable abomination by releasing the 333s/L 386SX with 72-pin SIMMs :o
NeXT was way expensive, OS/2 was way too business-oriented until too late, Amiga was mismanaged financially, Apple's Macintosh was too expensive, etc.
It's sad to see people looking for technological reasons in the replies to your question, because it means that subsequent history has been forgotten. The technology turned out, later in the decade, never to have been the deciding issue, when things like the "Halloween documents" came to light. It was business agreements and marketing that sank such products (albeit one can argue that traces of NeXT remain even to this day), not technology. There were exclusionary pre-loading agreements with Microsoft, infighting inside IBM between two divisions, some utterly self-destructive litigiousness by some companies, and a whole bunch of Apple politics.
Windows 3.x could also run in “standard mode”, where it used a 16-bit only DPMI server. The Windows GUI system could run in real mode or as a DPMI client.
As far as “nested virtualisation”… Windows 3.1’s DOS windows could actually in turn run more instances of Windows 3.1, because Windows/386 relied on virtual 8086 mode only and Windows 3.x used DPMI which can be nested.
The former's VMM and VxDs were largely a superset of the latter's. The only real trick is that there was a secret handshake between the DOS program WIN and the Windows KRNL386 program that had to be satisfied by whatever tried to start KRNL386 in that second VM; otherwise KRNL386 would just exit immediately. One cannot use WIN to start KRNL386 in the second VM - WIN does too much besides running KRNL386. It has to be something else that performs the secret handshake and only runs KRNL386.
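For anyone poking at this, the only publicly documented piece of that story is the INT 2Fh AX=1600h "enhanced-mode Windows installation check", which is how a DOS program can tell it is already running inside an enhanced-mode VM before it even thinks about trying to start a second KRNL386. Below is a minimal sketch, assuming a 16-bit DOS compiler such as Borland/Turbo C with dos.h; the private WIN <-> KRNL386 startup handshake itself is undocumented and not reproduced here:

    /* Sketch: detect whether we are already inside 386 enhanced-mode
     * Windows before trying to launch anything in a second VM.
     * Uses the documented INT 2Fh AX=1600h installation check; the
     * private WIN <-> KRNL386 handshake is NOT shown here.
     * Assumes a 16-bit DOS compiler (e.g. Borland/Turbo C) with dos.h.
     */
    #include <dos.h>
    #include <stdio.h>

    int main(void)
    {
        union REGS r;

        r.x.ax = 0x1600;   /* enhanced-mode Windows installation check */
        int86(0x2F, &r, &r);

        if (r.h.al == 0x00 || r.h.al == 0x80)
            printf("No enhanced-mode Windows is running.\n");
        else if (r.h.al == 0x01 || r.h.al == 0xFF)
            printf("Running inside Windows/386 2.x.\n");
        else
            printf("Running inside enhanced-mode Windows %d.%d.\n",
                   r.h.al, r.h.ah);
        return 0;
    }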
This article is 3-4 years old - any hint of Part 2? I'm assuming those nested copies of 3.1 are stuck in Standard Mode.