Font rendering with ClearType on Windows used to be so awful you had to install an external library[0] to get something closer to a Mac experience.
Now, some technical comments:
It is physically impossible to render size-accurate characters on a bitmap display without blurry text. If the font size means the stem of an "l" should be 1.5 pixels wide, there are only two ways to render it sharply: make it 1 px wide or make it 2 px wide. Neither choice respects the requested size (1.5), and any other rendering method means at least one pixel column ends up as some shade of gray or with color fringing. On LCDs with vertically aligned sub-pixels, even the 1 px version has fringing, but at least it's sharp.
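To make that concrete, here is a minimal sketch (assuming a plain box-filter rasterizer, not any particular font engine; the stem position of 10.25 px is made up for illustration) that computes how much of each pixel column a 1.5 px stem covers:

    # Per-pixel coverage of a vertical stem that is 1.5 px wide.
    # Assumed box-filter antialiasing; stem position chosen arbitrarily.
    def column_coverage(stem_left, stem_width, column):
        """Fraction of the pixel column [column, column+1) covered by the stem."""
        stem_right = stem_left + stem_width
        overlap = min(column + 1, stem_right) - max(column, stem_left)
        return max(0.0, min(1.0, overlap))

    stem_left, stem_width = 10.25, 1.5
    for col in range(9, 13):
        cov = column_coverage(stem_left, stem_width, col)
        print(f"column {col}: coverage {cov:.2f}")
    # -> 0.00, 0.75, 0.75, 0.00: no column is fully black, so both edges come out gray.

However you shift the stem, 1.5 px of ink has to land somewhere: either two columns are partially covered (gray), or the stem gets snapped to 1 or 2 full columns and the size is no longer accurate.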
Traditionally, Windows fonts were manually hinted and came with restrictions on which sizes could be rendered. That way, all characters kept the same stem thickness regardless of stroke angle. And to prevent ugly rendering, fonts simply weren't offered at every size: sizes without hinting info weren't available.
For example, there was no Arial at size 13, 15 or 17. Glyph stems were 1 px wide for sizes 8 to 14 and then suddenly jumped to 2 px at size 16. That prevented blurry text and characters where stroke thickness varies within the same character or between characters of the same size.
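A toy illustration of that jump (the 10% stem-to-size ratio below is made up, not Arial's real metrics):

    # Hypothetical: stem width is ~10% of the nominal size, snapped to whole
    # pixels by full hinting. The jump from 1 px to 2 px lands between 14 and 16.
    STEM_RATIO = 0.10

    for size in (8, 9, 10, 11, 12, 14, 16, 18):
        ideal = size * STEM_RATIO           # what the outline actually asks for
        hinted = max(1, round(ideal))       # what a fully hinted rasterizer draws
        print(f"size {size:2d}: ideal {ideal:.1f} px -> hinted {hinted} px")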
Manual font hinting takes DAYS for a single character! It's almost like making bitmap fonts. Linux didn't (and still doesn't) have hinted fonts. Because of that, Linux doesn't restrict font sizes, not even for fonts that are hinted: all sizes are available. This leads to either 1) blurry fonts with anti-aliasing and sub-pixel rendering, or 2) ugly text where stroke thickness varies randomly between characters or even within the same character.
There is no solution, and there can't be one, because restricting font sizes to just those with hinting info would exclude the vast majority of Linux fonts, which have no hinting at all, at any size. The only way to make text look decent is to enable antialiasing, which causes blurry text, and/or sub-pixel rendering, which causes color fringing.
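If you want to see the trade-off on your own machine, here is a hedged sketch using the freetype-py bindings (pip install freetype-py); it renders one glyph with and without the autohinter and counts the antialiased gray pixels. The font path is just an assumption, point it at any TTF you have installed:

    import freetype

    FONT = "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"  # assumed path

    def gray_pixels(load_flags, char="l", px=13):
        """Render one glyph and count pixels that are neither black nor white."""
        face = freetype.Face(FONT)
        face.set_pixel_sizes(0, px)
        face.load_char(char, freetype.FT_LOAD_RENDER | load_flags)
        buf = face.glyph.bitmap.buffer
        return sum(1 for v in buf if 0 < v < 255), len(buf)

    for name, flags in (("no hinting", freetype.FT_LOAD_NO_HINTING),
                        ("autohinter", freetype.FT_LOAD_FORCE_AUTOHINT)):
        gray, total = gray_pixels(flags)
        print(f"{name}: {gray}/{total} pixels are some shade of gray")

Roughly: the fewer gray pixels, the sharper the glyph looks, at the cost of the size accuracy described above.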
Hardware will, once again, save us from the crippled, sorry state of the software. Once hi-DPI displays become common, this won't be a problem anymore: character stems can be several pixels wide, and a variation of one extra pixel here and there in stroke thickness won't matter.
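Some back-of-the-envelope arithmetic on why density helps (the 0.4 mm stem width is hypothetical):

    # Same physical stem, different pixel densities: the relative error from
    # snapping to whole pixels shrinks as the stem covers more device pixels.
    STEM_MM = 0.4  # hypothetical physical stem width

    for dpi in (96, 160, 220, 280):
        px = STEM_MM / 25.4 * dpi            # stem width in device pixels
        snapped = max(1, round(px))
        err = abs(snapped - px) / px * 100   # relative error from snapping
        print(f"{dpi:3d} dpi: {px:.2f} px -> {snapped} px ({err:.0f}% off)")

At ~96 dpi the snap is off by roughly a third of the stem width; at 220 dpi and above it drops to around 10-13% and keeps shrinking, small enough that you stop noticing.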
At least we can still simulate it when we get nostalgic.
tldr: FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"
Also this blog post is 5 years old and woefully out of date anyway.
wtf...? I know it's only their opinion, but jeez that is one hot take.
I've been using OSX/macOS since ~2005, and in comparison, Windows (and heaven forbid you had to use Linux) had the absolute worst font rendering ever. Even today it looks horrible.
Update:
Hah, just read their FAQ, quote:
"Q: Why bother, just buy a HiDPI screen?
A: In my opinion and experience, HiDPI is a niche gimmick similar to 3D movies."
Maybe that's just me b/c I always found the color fringes irritating, but I am very happy that 4K displays are now more or less standard.
If HiDPI is what made fonts on macOS what they are, then there's no way around it: we have to rid ourselves of non-HiDPI displays.