16 points by neveroddoreven 9 months ago | 12 comments
  • tadfisher 9 months ago
    Almost all of this is out of date with newer freetype/fontconfig/toolkits. Subpixel positioning is a default thing on most distros, grayscale AA is the sane distro default (due to the proliferation of alternative subpixel layouts), and Harfbuzz shaping has become near-universal.
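
    If you want to check what a particular machine actually resolves these days, one rough probe (assuming the fontconfig command-line tools are installed; the exact fields printed vary by distro and version) is to ask fc-match for the rasterisation properties of the default sans font:

      # Print the pattern fontconfig would hand a renderer for the generic
      # "sans" family, keeping only the anti-aliasing/hinting/subpixel entries.
      fc-match --verbose sans | grep -iE 'antialias|hinting|hintstyle|rgba|lcdfilter'
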
  • politelemon 9 months ago
    I don't agree that this should be flagged. Put the year in the title, that's all. Blog posts go out of date all the time; the OSes simply have different issues now.
  • zzo38computer 9 months ago
    I have some problems with fonts on Linux, but none of the stuff mentioned there is the stuff that I care much about (compared with the other problems I have). The problems I have are that some programs are not capable of using bitmap fonts (some, such as Firefox, can use bitmap fonts in some places but not in others), and some programs are incapable of using non-Unicode fonts. (Newer programs might also support colourful emoji, which I also don't want, but fortunately I don't have that problem because I do not have any such programs on my computer.)
  • DDayMace 9 months ago
    I'm sorry to hear about some of the problems with Linux font rendering in this article; they go beyond my knowledge. But I want to say, as a user who has moved between Linux, Mac and Windows machines over the years, that I have literally always preferred Linux font rendering over ClearType, and now even over the Mac, which stopped caring about lower resolutions and non-Mac screens. You shouldn't need a 4K screen to get crisp and smooth fonts, and sometimes even when you have one they still look like shit, especially with ClearType.
  • rasse 9 months ago
    I know it's subjective but...

    Font rendering with ClearType on Windows used to be so awful you had to install an external library[0] to get something closer to a Mac experience.

    [0]: https://github.com/CoolOppo/GDI-PlusPlus

  • M95D 9 months ago
    The author focuses on ClearType and other methods of rendering size-accurate and style-accurate fonts at the cost of blurry text. I hate blurry text, with or without color fringing. I am willing to sacrifice size and glyph style for text clarity/sharpness, while the article's author apparently is not. We all have different preferences.

    Now, some technical comments:

    It is physically impossible to render size-accurate characters on a bitmap display and not have blurry text. If the font size means the stem of an "l" should be 1.5 pixels wide, there are only two ways to render sharp text: make it 1 px wide or make it 2 px wide. Neither choice respects the requested width (1.5 px), and any other rendering means at least one pixel column is some shade of gray or has color fringing. For example, a 1.5 px stem aligned to a pixel edge fills one column completely and the next only halfway, so antialiasing has to paint that half-covered column gray. On LCDs with vertically aligned sub-pixels, even a 1 px stem has fringing, but at least it's sharp.

    Traditionally, Windows fonts were manually hinted and limited in which sizes they could be rendered at. That way, all characters kept the same line thickness regardless of stroke angle, and, to prevent ugly rendering, sizes without hinting information simply weren't offered.

    For example, there was no Arial at size 13, 15 or 17. Glyph lines would be 1 px wide for sizes 8 to 14 and then suddenly jump to 2 px at size 16. That prevented blurry text and characters whose line thickness varies within the same character or between characters of the same size.

    Manual font hinting takes DAYS for a single character! It's almost like making bitmap fonts. Linux didn't (and still doesn't) have hinted fonts. Because of that, Linux doesn't restrict font sizes, not even for fonts that are hinted; all sizes are available. This leads to either 1) blurry fonts with anti-aliasing and sub-pixel rendering, or 2) ugly text where line thickness varies randomly between characters or even within the same character.

    There is no solution, and there can't be one, because restricting font sizes to just those with hinting info would exclude the vast majority of Linux fonts, which have no hinting at all, at any size. The only ways to make text look decent are anti-aliasing, which causes blurry text, and/or sub-pixel rendering, which causes color fringing. (A sketch of configuring the opposite, sharpness-first trade-off follows at the end of this comment.)

    Hardware will, once again, save us from the crippled sorry state of the software. Once hi-dpi displays become common, this won't be a problem anymore. Character lines can be several px wide and a variation of one extra px here and there in line thickness won't matter anymore.
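
    For what it's worth, readers who share the sharpness-over-accuracy preference above can push fontconfig toward that side of the trade-off. A minimal per-user sketch (the path is the standard XDG location; running applications need a restart, and whether a given toolkit honours every property is its own question):

      # Disable anti-aliasing and force full hinting for every font, i.e. the
      # "sharp but not size/shape accurate" side of the trade-off above.
      mkdir -p ~/.config/fontconfig
      printf '%s\n' \
        '<?xml version="1.0"?>' \
        '<!DOCTYPE fontconfig SYSTEM "fonts.dtd">' \
        '<fontconfig>' \
        '  <match target="font">' \
        '    <edit name="antialias" mode="assign"><bool>false</bool></edit>' \
        '    <edit name="hinting" mode="assign"><bool>true</bool></edit>' \
        '    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>' \
        '  </match>' \
        '</fontconfig>' \
        > ~/.config/fontconfig/fonts.conf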

  • init2null 9 months ago
    This is giving me flashbacks to the days of bad fonts and bad rendering. Back when using the web on Linux was a totally different and grossly inferior experience.

    At least we can still simu late it whenwe get n0staIgic.

  • daoistmonk 9 months ago
    After following the advice in this article [1], to my eyes my Linux fonts look better than my MacBook's.

    tldr: FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"

    [1] https://news.ycombinator.com/item?id=41643573
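
    For anyone who wants to try that tip, a minimal sketch of applying it (FreeType reads FREETYPE_PROPERTIES at load time in typical distro builds, so applications have to be restarted; ~/.profile is just one of several places it can be persisted, depending on how your session starts):

      # Enable stem darkening for the current shell session first...
      export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"

      # ...then, if you like the result, persist it for future logins.
      echo 'export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"' >> ~/.profile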

  • greenthrow 9 months ago
    This is a terrible post. Most professional typesetters/publishers use macOS for a reason and have for decades.

    Also this blog post is 5 years old and woefully out of date anyway.

    • cosmotic 9 months ago
      This article points a finger at macOS for having bad rendering too.
      • bee_rider 9 months ago
        I think that is what they are commenting on: it seems odd that typesetters all use macOS if it has some terrible font rendering issue.
  • nixosbestos 9 months ago
    Oh come on, put the date in the headline. I have none of these kerning issues on my boxes.
  • butz 9 months ago
    Font rendering on GTK3 was pretty decent; I'd say on par with Windows and better than the Mac (without a Retina screen). Then GTK4 was released and fonts looked ugly for quite a while. They have since improved, but still aren't at the GTK3 level. But hey, now you can rotate text, for whatever reason.
  • slater 9 months ago
    "For decades OS X remained a very ugly baby"

    wtf...? I know it's only their opinion, but jeez that is one hot take.

    I've been using OSX/macOS since ~2005, and in comparison, Windows (and heaven forbid you had to use Linux) had the absolute worst font rendering ever. Even today it looks horrible.

    Update:

    Hah, just read their FAQ, quote:

    "Q: Why bother, just buy a HiDPI screen?

    A: In my opinion and experience, HiDPI is a niche gimmick similar to 3D movies."

    https://pandasauce.org/get-fonts-done/

    • krona 9 months ago
      Doesn't the HiDPI "Retina display" prove the point that ClearType won when it comes to rendering? Due to patents, the solution was to quadruple the number of pixels, making subpixel hinting and anti-aliasing mostly redundant.
      • iSnow 9 months ago
        I am a bit confused, as I don't think it proves the point. HiDPI is so much easier on the eye than subpixel rendering. HiDPI screens look closer to print than to 96 DPI Windows 7.

        Maybe that's just me b/c I always found the color fringes irritating, but I am very happy that 4K displays are now more or less standard.

    • mrweasel 9 months ago
      That's my experience. I strongly dislike ClearType: it always looked blurry, and even with all the tweaking Microsoft allowed you to do, it never became really clear and sharp like on the Mac. Sadly, Linux seems to have gone the Windows route and smudged the fonts rather than making them clearer and sharper.

      If HiDPI is what made fonts on macOS what they are, then there's no way around it: we have to rid ourselves of non-HiDPI displays.

    • amluto 9 months ago
      And yet current Safari on current macOS renders text in the HN comment box very poorly.