It's good to see that the latest GPUs can still be used in "dumb framebuffer" mode, are mostly VGA-compatible, and have VESA VBE support. I suspect AMD / NVIDIA might still have some sort of DOS-based factory tooling when bringing up new GPUs for the first time. In sadder news, I've read that the latest Intel integrated GPUs no longer have a VBIOS and are UEFI-only; although it might only be a matter of time before someone vibe-codes (vibe-ports?) one based on those from an older model.
People from the time would be astonished by the hardware we have now, yet bloated software gobbles up every ounce of performance. What a waste! </granny mode=off>
Someone will no doubt explain the business and economic reasons to me, but it all flies over my caveman brain, which just asks "why does bashing rock feel slower?"
I can imagine that your particular workload doesn't require all those bells and whistles, and it's probably true that running only the bare minimum of software, like you would back in the day, is horribly inefficient on a modern operating system. But at the same time, kernels don't crash as often, disk encryption is actually a thing now, file downloads are no longer expressed in kilobits per second, and much prettier screens render much smoother media for a fraction of the performance impact.
Of course there are inefficiencies that could be fixed (like how chat apps are skins around browsers now), but a lot of the efficient software from back in the day cost an arm and a leg to build. In the end, the software industry found out that customers are happier to pay when you deliver new features sooner than when you deliver them later, optimized to still run on the old hardware (hardware the customer may well have replaced by the time your feature ships).
With current prices for RAM and other system components, I hope companies will once again feel the pressure to build for limited hardware. Then again, when I look at the hardware developers are lugging around, I highly doubt things will change quickly enough.
Animated GIF is a format that was designed for playback on late 1980s PCs with a 20 MHz 386 and VGA graphics…
If anything, this example proves the point that we’ve made the simple stuff much too complex. The GIF format hasn’t changed, but somehow getting those indexed color frames to screen on time now requires a GHz core.
About twenty years ago I was generating long animated GIFs. They worked fine in Firefox. In Internet Explorer they started fine but became jankier as playback progressed. I realised that every time IE displayed a frame, it was rereading the entire file from the beginning to get to the current frame, which took longer and longer as playback advanced.
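A toy model makes the cost difference obvious. The function names and the per-frame "unit of work" accounting below are my own invention for illustration; the point is that restarting the decode for every displayed frame is quadratic in the frame count, while keeping decoder state is linear:

```python
def frames_decoded_restart(n_frames: int) -> int:
    """Re-read the file from the start for every displayed frame,
    as the IE behaviour described above appeared to do."""
    total = 0
    for current in range(n_frames):
        total += current + 1  # must re-decode frames 0..current
    return total  # O(n^2)

def frames_decoded_incremental(n_frames: int) -> int:
    """Keep the decoder state between frames; each frame is read once."""
    return n_frames  # O(n)

# For a 500-frame animation, the restart strategy decodes 125,250
# frames in total, versus 500 for the incremental one.
print(frames_decoded_restart(500), frames_decoded_incremental(500))
```

That 250x blow-up is why the jank grew steadily worse instead of staying constant.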
It's just so easy to squander performance without noticing.
GIF is an awful format for its modern usage: it will easily waste tens of megabytes on even a short, small clip. That's why many services secretly convert GIF files and serve them as video files, or as more efficient animated formats such as WebP.
The difference in opinion between "the simple stuff" and "missing the bare basics" seems to come down to what year you were born and what kind of services you grew up with. I don't need 90% of what Discord offers, but when reading along with discussions of Discord users looking for alternative platforms, fleeing its age verification and such, I find that most Discord users absolutely demand features I didn't even know chat apps supported.
I use two editors now: VS Code as a full IDE when I want to code heavily, and a homemade FLTK-based editor with just basic syntax coloring for writing notes and doing quick things.
Here's a related article: https://news.ycombinator.com/item?id=16001407
In terms of apparent responsiveness, Win 3.1x, NT <4, and 2k felt the fastest.
Someone with connections to NVIDIA support should really file a bug about this!