My parents didn't have a lot of money, but my great-grandfather passed away and they used some of the inheritance to buy the computer. I was instantly hooked. In hindsight I see how much of a gift my family gave me.
The announcement reminded me of an article John Dvorak wrote around the same time. 1GB hard drives had just come out, and he asked what all the extra space would be used for. Even as a young teenager, I remember thinking how short-sighted that comment was. That was before I realized how the tech press tends to get stuck in local optimizations and can't understand the bigger picture.
It's all a good reminder that cutting edge today doesn't stay cutting edge very long, and the world figures out how to squeeze every ounce of power out of hardware. (Also, yes, that leads to bloat...)
True for many, many of us, I suspect. My family bought a 286 in the early 90s and it cost something like $2000 CAD then, which is nearly $4000 now. But salaries were lower then, so that would have been something like 5-6% of my single-income family's yearly post-tax earnings, and as a share of "disposable" income it was probably more like 60% for the year.
Obviously it paid off in that it set me on the path for my career; it's hard to make any other investment as good as that, but who would have known at the time? I'm glad that there were so many ads positioning computers as educational and not just game machines, even though in reality I think it was learning about the computer to make the games work that taught me way more than any educational software ever did.
I’ve been thinking a lot about these inflation-adjusted prices due to the big Apple Computer anniversary — an Apple // cost $5000 in 2026 dollars, while a $600 Macbook Neo would cost $150 in 1980 cash!
What helped me reconcile this was an observation that we’ve inverted the prices of necessities and luxury goods. Rent and mortgage in particular were a much smaller slice of income back then, but luxury goods were very expensive, so one would save up for a year or two to buy a new TV or a computer for the kids.
Now the necessities take a much larger slice of our income, but TVs and computers are incredibly cheap. It takes very little money to get a nice computer, and not-buying it barely makes a dent in the bills. This isn’t a good thing.
I do disagree a little with your observation regarding the industry “squeezing every ounce of power out of hardware”. Beyond local LLM stuff, there’s basically nothing a modern computer can comfortably do that any laptop since the mainstreaming of SSDs can’t.
Gotta tack on to this thread showing appreciation for parents. We could never afford new computers in the 90s, but luckily my dad could bring home obsolete equipment from work. We were thus always at least a generation behind. I remember my friend's Pentium feeling like sci-fi compared to our 386, but my goodness it completely molded my life!
Later, towards the end of the 90s, those sci-fi Pentiums were obsolete, so I got a few to run "that weird Linux stuff" on. Since it was considered junk, nobody cared what I did with it. To this day, if I happen to hear Metallica play and there's early winter's first smell of snow in the air, my mind will be transported back to that school night I secretly stayed up wayyy too late and discovered SSH for the first time. Haven't looked back.
Thank you, dad! I just hope general computing devices owned by regular people are still normal by the time my children come of age.
Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.
[1]: https://docs.redhat.com/en/documentation/red_hat_enterprise_...
https://ftp.gwdg.de/pub/gnu/www/directory/all/rhide.html
:-)
And you could use VESA linear framebuffer above 256KB - this was a breakthrough back then :-))
While emulating an FPU results in a huge performance penalty, it is only required in certain domains. In the world of IBM PCs, it was also possible to upgrade your system with an FPU after the fact. I don't recall seeing this option outside IBM compatibles. While I have seen socketed MMUs on other systems, I don't know whether they were intended as upgrade options.
The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, to stay at the top that long.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
And sometimes the DX/4 100MHz would be the slowest of all of those, on a 25MHz bus.
The full name on the chip on some of them is "Am5x86-P75 DX5-133", which implies a lot of things, some of which are flat-out misleading (it does not get very close to "P75" performance).
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32's Akiko chip actually had byte-per-pixel (chunky) graphics support, but its omission from the 1200 was fatal.
Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they had seen the need for 3D rendering is tantalizing.
The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU chunky to planar algorithms. I don't think it was ever even used in any serious way by any released games (though it might have been used to help with FMV).
1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...
This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and the graphics and audio processing.
All-in-all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs about 5 megabytes/s).
If the A1200 had had at least some extra 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested memory with a throughput of about 20-40 megabytes/s.
Imagine the difference it would have made if the machine had just a little extra memory.
That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.
The bigger problem was that Commodore as a company was aimless.
I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…
Had the Amiga retained relevance for longer, and without the push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.
Maybe if these theoretical new 68k Amigas became a huge market hit they could have taken the arch further and it could have remained competitive, but all the other 68k shops had pretty much given up or moved on already (Apple was already going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc), so I don't know that the market would have been there to support development against the vast amount of competition from the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.
https://en.wikipedia.org/wiki/Amiga_Hombre_chipset
It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.
Edit: I don't mean that their success was certain if they executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very little resources to work with.)
Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.
By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.
(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)
Second gen Pentiums, starting with the 75 MHz, were great.
Now we have these amazing displays and graphics cards and there's literally no way to make my Mac have different window titlebars or anything. So boring
Another factor for the later P1s being better IIRC was improved chipsets.
I think my next computer came with an AMD Duron 900MHz, entry level at the time, but the jump from the Pentium 100MHz was such a huge gap it still felt like a Formula 1.
From a 486 with VLB to a Pentium with PCI everything became a lot nicer.
I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.
That's really long compared to the 1-year refresh cycles we have today with phones etc.
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.
• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)
Chromium browsers launch pretty fast. If you're talking about memory usage, Ladybird isn't aimed at minimal memory usage from what I've seen.
The CPU I'm working with is a Celeron M 900MHz, single core, no HT, struggling to build wheels for Python (several hours).
- tinkered for HOURS to get enough EMS/XMS memory by tweaking CONFIG.SYS & Co. to get whatever game running (and having dedicated boot options configured, because you could unload some drivers from memory and could then run other games)
:-D
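For anyone who missed that era: the "dedicated boot options" trick used the multi-config blocks MS-DOS 6 added to CONFIG.SYS. A sketch from memory (driver names and paths are illustrative, not from any real machine):

```
[MENU]
MENUITEM=GAMES, Games (max conventional memory, no CD-ROM)
MENUITEM=FULL, Full setup (CD-ROM + extras)
MENUDEFAULT=FULL, 10

[COMMON]
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH,UMB

[GAMES]
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM No CD-ROM driver here: every KB of conventional memory counts

[FULL]
DEVICE=C:\DOS\EMM386.EXE RAM
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001
```

Picking "GAMES" at boot skipped the CD-ROM driver entirely, freeing the conventional memory that fussy games demanded.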
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing a textured renderer is a PITA due to the planar video memory layout, with separate bitplanes spreading the bits of each pixel across different memory locations (the bitplane design that kept memory bandwidth manageable in 1985 with 5 or 6 planes became a fatal bottleneck at this point).
It wasn't really always full framerate though, and the 2D chipsets did help in "classic" action games that were still all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from price (and comfort/ease).
To a significant degree, the 486DX2 was the primary computing platform that laid the foundation for me to learn computing in depth, enabled my later career, and set many of the formative moments in my life. Thanks, Intel; even though you suck now, a shadow of your former self, you were a beast in the 90s.
In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag -- and WordStar fit on one side of a 5.25" floppy.
Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-on, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real-time.
Yes, programs have become bloated, but it is worth it to compare apples to apples.
One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.
It was a life-changing machine.
Ordered, I believe, from the depths of a Computer Shopper magazine.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
sigh
Through the magic of actually saying something different, which really ended up being proven incorrect. From the blog post above, verbatim, italicizing the relevant bits:
> Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.
I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.
The 386 SX was crap: 16-bit wide data bus, IIRC.
If you were running 16-bit software, they were little slower than a 386DX at the same clock and significantly faster than a 286. That was partly because of higher clocks (286s usually topped out at 12MHz, though there were some 16MHz options, while the slowest 386s ran at 16MHz and some as fast as 40MHz), but also, when not stalled by instruction ordering issues, because of the (albeit small by modern standards) instruction pipeline that the 286 lacked.
32-bit software was a lot slower than on a DX, because 32-bit data reads and writes took two trips over the 16-bit data bus, but you could at least run the code, as it was a full 386 core otherwise (full enhanced protected mode, page-based virtual memory, v8086 mode, etc).
The SX also only used 24 bits of the address bus, limiting it to 16MB of RAM compared to the DX's 4GB range, though this was not a big issue for most at the time.
I know you could pair my 286 with a 287 next to it... not sure if it really made a difference you could discern outside of hyper-specific uses, though.
Very little, if any, "home" or small-business software would make use of a floating-point unit though (maybe some spreadsheet apps did?). The most common use for them was CAD/CAM, plus scientific modelling by those without a budget for less consumer-grade kit.
Played some awesome games, like DOOM and Wolfenstein. Later Duke3D was the shit. But I can't remember if I ran it on the same setup or something newer.
The lack of imagination is just disturbing.
In the 2000s through now we've mostly had incremental improvements: 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing, and it's somewhat awoken a similar spirit to the 80s/90s, but not with the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you finding it can run DooM or even play music.
All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.
The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.
And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?
The Pentium is the first one, I think, where this didn't happen, because by then it had turned out that people need a computer that can do what they are currently doing, but faster, much more often than they need servers.