At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
I know this because I wrote a UE texture repacking tool with a "DXT detection" feature, so that I wouldn't be responsible for stripping DXT compression from a texture that had already paid the quality price, only to find that this situation was already rampant in the ecosystem.
Many games could have their size robotically halved just by re-enabling DXT compression in every case where doing so causes zero pixel difference. This was before Steam, when game downloads routinely took a day, so I was very excited about the discovery. But the first few developers I emailed all reacted with hostility, so I lost interest in pushing it and it went nowhere. Ah well.
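For the curious, the detection itself can be cheap: DXT1 decodes every 4x4 texel block from a 4-entry palette, so a texture that was compressed once and later stored raw keeps that fingerprint. Below is a minimal sketch of the idea in Python with numpy; it's my own illustration, not the original tool, and a real checker would also verify that each block's palette is two RGB565 endpoints plus their 1/3 and 2/3 mixes, then confirm losslessness with an actual encode/decode round trip through the engine's codec.

    import numpy as np

    def looks_dxt1_decoded(img):
        # img: H x W x 3 uint8 array, with H and W multiples of 4 (assumed).
        # DXT1 reconstructs each 4x4 block from a 4-color palette, so a
        # previously compressed texture can never show more than 4
        # distinct colors in any block.
        h, w, _ = img.shape
        for y in range(0, h, 4):
            for x in range(0, w, 4):
                block = img[y:y+4, x:x+4].reshape(-1, 3)
                if len(np.unique(block, axis=0)) > 4:
                    return False
        return True

If that returns True, re-enabling compression should be the "zero pixel difference" case above: you get the size win without changing a single texel.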
It seems to have helped path tracing a lot.
I remember the main noticeable difference being ray-traced reflections. However, that was mostly on immovable objects in extremely simple scenes (an office building). Old techniques could have gotten 90% of the way there using cubemaps, screen-space reflections, and/or rasterized overlays for dynamic objects like player characters. Or maybe just rasterize the reflections completely, since the scenes are so simple and everything is flat surfaces with right angles anyway. It might even have looked better, because you avoid the issues that arise when shaders written for a rasterized world are applied to reflected objects.
Games that heavily advertise raytracing typically don't use traditional techniques properly at all, making it seem like a bigger graphical jump than it really is. You're not comparing to a real baseline.
Overall, that was pretty much the worst way to advertise the new tech. It's much more impressive in situations where traditional techniques struggle, such as reflections off irregular surfaces or geometry without right angles.
The "office building" setting meant resticted areas, sure, but it features TONS of reflections - especially transparent reflections (which are practically impossible to decently approximate with screen space techniques).
Oh, and: the Northlight Engine already did more than most other engines at the time to get "90% there" with a ton of hybrid techniques, not least as one of the pioneers of realtime software GI.
Released before the Voodoo 1, with GLQuake and GL support for Tomb Raider.
Not that it was an awesome product, but it was certainly flexible.
A good (albeit tiny) demo of that flexibility is that vQuake has the same wobbling water distortion as software-rendered Quake, but rendered entirely through the GPU. With some interpretation, this could be called the "caveman discovers fire" moment of the pixel shading era.
And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a VIA UniChrome, which worked well enough on the OpenChrome driver that I could edit open-source games to get them rendering (FreeSpace only needed a few constants changed). Some of the image was smeared and so on, but I could play!
I've been running the worst gaming setup I can get away with, which at the moment is a 3080 10 GB with random DDR3 RAM, a budget WD 512 GB SSD, and an i5 on the same socket as the i7-4790K that doesn't even support hyperthreading, so it can't run more than 4 threads in parallel.
It's absolutely laughable at this point, but I'm unironically looking for a deal on that CPU, lmao. It would be a huge upgrade.
I'd put the 5700 XT at #2 for being the longest-lived GPU I've owned, by a very wide margin. It's still in use today.
One day one of my friends from school wanted to optimize the airflow in our computer and redid the cabling, but he managed to block the CPU fan from spinning. I am not sure how, but we didn't realise it for a couple of months.
When I got my own PC, it had an AMD Barton chip, and it allowed me to play Half-Life 2.
I'm on a 3060 currently and the changes in the 4xxx and 5xxx just aren't appealing to me. As soon as iGPUs get 3060 performance I'll probably switch. And they aren't far off.
It was a good budget option all those decades ago.
If I can at least tell myself that our technological achievements come with efficiency gains instead of just ramping up power throughput, I can rest a little better.
About a decade ago, I discovered that the HD 530 iGPU included with my budget-oriented i3-6300 CPU outperformed the physically impressive SLI pair of 9800 GTs I had been using, at something like 1/10th the power consumption.
(It didn't do PhysX, but nobody cared.)
Also, the GPU did not exist until 1999.
Looks like this was created for engagement.
I have to say that this site is complete low-effort slop.
Combined with the color scheme of this site, this might be a cleverly disguised Nvidia ad.
Edit: Clicking through to their main page [1]: yeah, that's definitely an Nvidia ad.
There is strong evidence. Click on the link above. It was posted by a viral marketing company. They even feature the GPU story on their website: https://sheets.works/data-viz
> I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list.
Yes, because otherwise the ad would be too obvious.
I can't remember the last time I saw such a confused design.
So no, the most important AI card isn't an AI card; it's the gaming GPUs that funded that mess.