259 points | by dmalcolm | 5 days ago | 15 comments
  • munificent5 days ago
    Using a hierarchy to show template errors is brilliant and I'm sort of surprised compilers haven't always done that.

    I was investigating C++-style templates for a hobby language of mine, and SFINAE is an important property to make them work in realistic codebases, but it leads to exactly this problem. When a compile error occurs, there isn't a single cause, or even a linear chain of causes, but a potentially arbitrarily large tree of them.

    For example, it sees a call to foo() and there is a template foo(). It tries to instantiate that but the body of foo() calls bar(). It tries to resolve that and finds a template bar() which it tries to instantiate, and so on.

    The compiler is basically searching the entire tree of possible instantiations/overloads and backtracking when it hits dead ends.

    Showing that tree as a tree makes a lot of sense.
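    A minimal sketch of the backtracking described above (my own illustrative example, not from the article): the compiler silently discards the first `describe` overload whenever substituting `T` fails, and nested expressions like the `decltype` here are exactly where realistic codebases grow a whole tree of such attempts.

    ```cpp
    #include <cassert>
    #include <string>

    // Preferred overload: SFINAE on the trailing return type removes it from
    // the candidate set whenever T has no .size() member.
    template <typename T>
    auto describe(const T& t, int) -> decltype(t.size(), std::string()) {
        return "sized, length " + std::to_string(t.size());
    }

    // Fallback overload (a worse match for the literal 0: int -> long conversion),
    // chosen only when substitution into the overload above fails.
    template <typename T>
    std::string describe(const T&, long) {
        return "opaque value";
    }

    int main() {
        assert(describe(std::string("hello"), 0) == "sized, length 5"); // substitution succeeds
        assert(describe(42, 0) == "opaque value");                      // int has no .size(): fallback
        return 0;
    }
    ```

    The key point is that the failed substitution is not an error at all; it is a quietly abandoned branch of the search, which is why a final failure can carry so many "notes".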

    • pjmlp4 days ago
      Only if stuck in C++17 and earlier.

      Past that point SFINAE should be left for existing code, while new code should make use of concepts and compile time execution.
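      For comparison, a small C++20 concepts sketch (my own illustration, not from the thread): the requirement is stated once, at the definition site, so a bad call is rejected with a single "constraint not satisfied" note rather than an instantiation tree.

      ```cpp
      #include <cassert>
      #include <concepts>
      #include <cstddef>
      #include <string>
      #include <vector>

      // The requirement is stated once, at the definition site.
      template <typename T>
      concept Sized = requires(const T& t) {
          { t.size() } -> std::convertible_to<std::size_t>;
      };

      // A call with a non-Sized argument fails here, up front,
      // rather than deep inside an instantiation stack.
      std::size_t length(const Sized auto& c) { return c.size(); }

      int main() {
          std::vector<int> v{1, 2, 3};
          assert(length(v) == 3);
          assert(length(std::string("hi")) == 2);
          // length(42) would be rejected immediately: int does not satisfy Sized.
          return 0;
      }
      ```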

    • eddd-ddde4 days ago
      I really wish templates didn't work in a dumb "replace at call site until something compiles" manner.

      All template requirements should be verified at the function definition, not at every call site.

      There are concepts. But they are so unwieldy.

      • munificent4 days ago
        It's honestly a seriously hard problem.

        Yes, it's definitely nice to be able to typecheck generic code before instantiation. But supporting that ends up adding a lot of complexity to the type system.

        C++-style templates are sort of like "compile-time dynamic types" where the type system is much simpler because you can just write templates that try to do stuff and if the instantiation works, it works.

        C++ templates are more powerful than generics in most other languages, while not having to deal with covariance/contravariance, bounded quantification, F-bounded quantification, traits, and all sorts of other complex machinery that Java, C#, etc. have.

        I still generally prefer languages that do the type-checking before instantiation, but I think C++ picks a really interesting point in the design space.

      • trashburger4 days ago
        Please no. Templates being lazy is so much better than Rust's eager trait evaluation, the latter causing incredible amounts of pain beyond a certain complexity threshold.
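        A toy example of what "lazy" means here (mine, not trashburger's): the template body is only type-checked against a concrete `T` at instantiation, so the definition compiles even though most types wouldn't satisfy it.

        ```cpp
        #include <cassert>

        // Lazy checking: this body is only validated once frob() is instantiated
        // with a concrete T, not when the template definition itself is compiled.
        template <typename T>
        int frob(const T& t) {
            return t.frob_count(); // no requirement stated; any T with frob_count() works
        }

        struct Widget {
            int frob_count() const { return 3; }
        };

        int main() {
            Widget w;
            assert(frob(w) == 3); // instantiation succeeds: Widget has frob_count()
            // frob(42) would fail only here, at the call site, not at the definition.
            return 0;
        }
        ```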
        • eddd-ddde4 days ago
          How so? I'd really like to see an example. If you can't explain what requirements your function has before calling it then how do you know it even does what you expect?
      • jenadine4 days ago
        C++0x concepts tried to achieve that but that didn't work.

        (But Rust traits works like that)

        • pjmlp4 days ago
          There were lots of politics involved, more so than technical issues.

          While not perfect, concepts lite, alongside compile-time evaluation, does the job.

  • mwkaufma5 days ago
    Usability improvement request for this article: don't hijack the browser back button :P
    • bigstrat20035 days ago
      Browser makers should straight up remove the JS API for interacting with history. There are legitimate uses for it, but the malicious actors far outweigh the good ones at this point. Just remove it.
      • jart4 days ago
        That's a biased thing to say, since you're never going to notice the times when the history api is being used appropriately. Just as often I find myself raging when a webpage doesn't rewrite history at times when it should. Good taste is hard to come by.
      • nightfly4 days ago
        This type of thinking is what doomed uBlock Origin. I strongly disagree.
        • jenadine4 days ago
          The difference is that uBlock Origin is an extension you intentionally trust and install, while the JS API we talk about are something any websites (untrusted) can use.
        • matheusmoreira3 days ago
          To be fair, uBlock Origin has always been a special case. It's so good and so important and so trusted that it should have access to browser internals that normal extensions can't access.

          Honestly, uBlock Origin shouldn't be an extension to begin with, it should literally be a built-in feature of all browsers. The only reason it's not is that we can't trust ad companies to maintain an ad blocker.

        • goku124 days ago
          Perhaps the users should be given an option to opt out (enabled by default) for such APIs on a per-site basis. That way, users can intervene when they're abused, while their fair use will remain transparent.
          • nightfly4 days ago
            This seems like a good compromise. Similar to requesting location information, and/or denying popups after a few have been spawned
        • PaulDavisThe1st4 days ago
          How is uBlock Origin "doomed" ?
          • nightfly4 days ago
            • PaulDavisThe1st4 days ago
              So UBO isn't doomed, just UBO on Chrome. While that's significant given Chrome's market share, I and everyone else on the planet have the option to use something else, and will continue to do so.
          • encom4 days ago
            An advertising company controls the user agent everyone uses to access the internet, and wants to shove more ads into your eyeballs. uBlock exists as long as they allow it. Anyone who disagrees with this, works for them or own shares in the company.
      • Dracophoenix4 days ago
        Ditto for unprompted redirects.
        • bonzini4 days ago
          Redirects are used for stuff like POST->GET or canonicalizing URLs (adding slashes on directories), would you get rid of that too?
          • Calavar4 days ago
            That's done with HTTP headers though, right? Not JavaScript.
            • bonzini4 days ago
              Ah, I see what you mean. The canonicalization is, whereas redirects after processing forms could be done in JavaScript from an onclick or onsubmit handler.
      • TZubiri4 days ago
        But wouldn't you be able to replicate the same issue by using a redirect?
      • nullc4 days ago
        Ditto for rewriting anchor destinations onclick, allowing sites to show one destination for hover but send you somewhere else.
      • panzi4 days ago
        I mean, without `history.pushState()` and `window.onpopstate` things wouldn't be as nice. OK, I guess one could do about everything with `location.hash = ...` and `window.onhashchange`, like in the before times. But the server will not get the hash part of the URL, so a link to such a page can't be server-side rendered and has to fetch the actual page content in JavaScript, evaluating the hash. When I browse things like image search, it is really handy that you can use the browser's back button to close an opened image without losing any dynamic state of the page, and that the x button on the page will close the image and remove the history entry just the same way, so a later back won't re-open that image.

        For me the back button wasn't hijacked.

        But I am for disallowing the use of `history.go()` or any kind of navigation inside of `onpopstate`, `onhashchange`, `onbeforeunload` or similar or from a timer started from one of those.

        • bigstrat20034 days ago
          Like I said: I recognize there are legitimate uses. But unfortunately, they are majorly outnumbered by people doing things like overwriting my history so that when I hit "back", it stays on the site instead of going back to my search engine. I would love to live in the world where malicious dark patterns didn't exist, and we could have nice things. But we don't, and so I would rather not have the functionality at all.
        • mort964 days ago
          How about we just navigate to new pages by .. navigating to new pages? Browsers have perfectly functional history without javascript shenanigans
      • spookie4 days ago
        same for clipboard events.
    • optionalsquid5 days ago
      In Firefox, you can prevent this by setting `browser.navigation.requireUserInteraction` via about:config. I've been told that it breaks some stuff, but to date I haven't noticed any downsides
      • teddyh5 days ago
        > you can prevent this by setting `browser.navigation.requireUserInteraction`

        Setting it to what?

      • budmichstelk5 days ago
        Doesn't answer your question, but in Tor Browser it would decrease your anonymity set.
    • llm_nerd5 days ago
      It's already egregious when a site adds history pushState entries for just clicking through a gallery or something, but wow adding them just for scrolling down on a page is simply bizarre, especially on a page about usability.
      • nottorp5 days ago
        It's in the spirit of adding emojis to compiler output...
        • ender3413415 days ago
          I actually quite like the emojis they put in the output; it helps balance providing enough context with giving a clear visual indicator for the actual error message.

          They aren't going overboard on it, they just put a warning emoji in front of the error message.

          • nottorp5 days ago
            How do you grep for it without taking your hands off the keyboard?
            • perching_aix4 days ago
              Windows key + dot, then type in "warning" (or, if it was among the last ones you used, use the arrow keys), and hit Enter to insert it.

              If you use whatever OS other than Windows, I'm sure there are similar flows available if you search for it. And since it's just Unicode, I'm sure there are numpad based keybinds available too.

              • nottorp4 days ago
                why shouldn't i just type grep warning instead of bringing up some GUI that possibly takes forever to initialize?

                how does it work over ssh? :)

                • alisonatwork4 days ago
                  Emoji selector is fast and works perfectly fine over SSH, it's no different to any other input method that needs to use characters beyond 7-bit ASCII.

                  grep is a bit more iffy. UNIX command line tools seem to be a bit of a crapshoot in how or if they support Unicode, especially if you switch between different systems like Linux, BSD, Cygwin etc. You might need a bit of experimenting with the LANG variable to get it to work (e.g. Git Bash on Windows needs LANG=C.UTF16 to match an emoji). I've also had cases where grep or sed works, but awk doesn't, or vice versa. On the whole it works a lot better nowadays than it used to, though, and that's a win for non-English users of the command line as well as emoji fans.

                  • nottorp4 days ago
                    > that's a win for non-English users of the command line

                    I am non-English. I use ăâîșț. When writing a text document for humans.

                    I also use :) but not .

                    I'm still of the opinion that anything not 7-bit ascii doesn't belong in something that may be machine processed. Which includes compiler output.

                    Edit: hey, HN erased my emoji. The dot was supposed to be the :) emoji style.

                    • alisonatwork4 days ago
                      Didn't you ever grep a text document written in a language other than English? No processing of CSV files in different charsets? Not even encountered a file with a non-English character in the name? Not to mention the folks who deliberately set LANG so that their compiler and everything else will give them localized error messages. This stuff was all much worse 25 years ago, even 15 years ago. Like them or not, I do think emojis have helped drive forward much better Unicode handling across the whole stack, including the command line.
                      • nottorp4 days ago
                        > Didn't you ever grep a text document written in a language other than English?

                        Yes but it was for humans originally? Not for machine processing.

                        > Not to mention the folks who deliberately set LANG so that their compiler and everything else will give them localized error messages.

                        The horror! Who even works on translations for compiler error messages ?!? It makes absolutely no sense!

                        Next they'll want to localize programming language keywords. I wonder how well that will work at this current project of mine that has people native to 3 countries, none English-speaking ...

                        • perching_aix4 days ago
                          It's not a next, but more a previously :) Localized development tooling is an old timer thing.
                          • nottorp4 days ago
                            How old? I'm old enough that my first Linux experience included a stack of floppy disks and recompiling a 0.9x kernel in late high school, and my first C++ program was done with Borland C++ 3.1. It never occurred to me to look for a localized UI for either of them.

                            Older than that?

                • perching_aix4 days ago
                  I don't know, you wanted to grep for the emoji.
                  • nottorp4 days ago
                    That's what you think?

                    I wanted something I could grep for easily. Which doesn't seem to be the emoji, since it needs extra software to get on the grep command line...

                    • perching_aix4 days ago
                      It's not a "what I think" thing, these were your literal words:

                      > How do you grep for it

                      And then how badly or well this works will depend on your build of grep and your environment variables, as the other user noted. I did not consider this, because I'd expect grep to just work with Unicode symbols like this when my stdin is set to UTF-8, which I'd further expect to always be the case in 2025, but it appears that's not an expectation one can reasonably have in the *nix world.

                      It was and continues to be unclear to me why you'd want to grep for the warning emoji though, since according to the article these are inserted somewhere deep in the console-visual explanations. They do not replace the slug denoting the compiler message type at the start of these, which as you said, can (still) be found by just grepping for "warning".

                      • nottorp4 days ago
                        > xpect to always being the case in 2025

                        Oh but in the real world you vpn into a server that privately tailscales to some boxes that are hard to reach inside a factory and no one has physically touched them since 2018 at best ...

                        What's this 2025 you speak of? Not in production.

                        • perching_aix4 days ago
                          Don't remind me lol, even worse when it's a fresh VM but the software is still ancient.
                • johnisgood4 days ago
                  It makes too much sense.
              • yazantapuz4 days ago
                TIL that winkey+dot in KDE launches the emoji selector ... but it is copied into the clipboard, so more keystrokes
                • nottorp4 days ago
                  At least I didn't get a link to a YouTube tutorial as an answer :)
    • perching_aix5 days ago
      Not an issue with JS disabled ;)
    • DyslexicAtheist5 days ago
      Usability improvement request for this site: don't promote a comment that isn't related to the content to the top :P
    • TZubiri4 days ago
      works fine for me (MS Edge)
  • porphyra5 days ago
    With GCC error messages usually the only part I wanna see is "required from here" and yet it spams zillions of "note: ...". With the nesting enabled, it's way easier to read --- just look at the top indentation level. Hooray!!!
  • jmclnx5 days ago
    I hope gcc remains the default in Linux due to the GPL. But I expect someday clang will become the default.

    Plus I heard COBOL was merged in with the compiler collection, nice!

    • pxc4 days ago
      > I expect someday clang will become the default [compiler for the Linux kernel].

      Why? I don't personally use GCC except to compile other people's projects in a way that's mostly invisible to me, but it seems like it's still widely used and constantly improving, thanks in part to competition with LLVM/Clang. Is the situation really so dire?

      • wahern4 days ago
        > Is the situation really so dire?

        I for one don't think so. From my perspective, there's at least as much momentum in GCC as clang/LLVM, especially on the static analysis and diagnostics front over the past 5 or so years. That was originally one of the selling points for clang, and GCC really took it to heart. It's been 10 years since GCC adopted ASAN, and after playing catchup GCC never stopped upping their game.

        Perhaps the image problem is that LLVM seems to be preferred more often for interesting research projects, drawing more eyeballs. But by and large these are ephemeral; the activity is somewhat illusory, at least when comparing the liveliness of the LLVM and GCC communities.

        For example, both clang/LLVM and GCC have seen significant work on addressing array semantics in the language, as part of the effort to address buffer overflows and improve static analysis. But GCC is arguably farther along in terms of comprehensive integration, with a clearer path forward, including for ISO standardization.

        More importantly, the "competition" between GCC and clang/LLVM is mutually beneficial. GCC losing prominence would not be good for LLVM long-term, just as GCC arguably languished in the period after the egcs merger.

        • pxc4 days ago
          > More importantly, the "competition" between GCC and clang/LLVM is mutually beneficial. GCC losing prominence would not be good for LLVM long-term, just as GCC arguably languished in the period after the egcs merger.

          You're right to note that "competition" here is more like inspiration than a deathmatch. But I vaguely remember two things that seem similar to motivation via competitive pressure to me: (1) when GCC 5 came out, it had way nicer error messages, and I immediately thought "Oh, they wanted to make GCC nice like Clang" and (2) IIRC the availability of a more modular compiler stack like LLVM/Clang essentially neutralized Stallman's old strategic argument against a more pluggable design, right?

    • ahartmetz4 days ago
      Clang:

      - has about the same quality of error messages as GCC now

      - is now almost exactly as slow (/fast) at compiling as GCC

      - sometimes produces faster code than GCC, sometimes slower, about the same overall

      I see no reason why the default would change.

      • almostgotcaught4 days ago
        well for one clang uses way less memory (RAM). also ld.lld is wayyyyyyyy faster than ld (and also uses way less memory).
        • ahartmetz4 days ago
          ld.lld works with any compiler, and anyway, mold is even faster and also works with any compiler.
    • jklowden3 days ago
      Yup, COBOL is that overnight sensation 4 years in the making. GCC COBOL is foremost an ISO COBOL compiler, with some extensions for IBM and MicroFocus syntax. We also extended gdb to recognize COBOL, so the GCC programmer has native COBOL compilation and source-level debugging.
    • enasterosophes4 days ago
      Unless you're talking about a compiler built into the kernel, I don't see that anyone is in a position to dictate to each distro what compilers they package.
    • mort965 days ago
      GCC can honestly only blame itself for its inevitable increasing obsolescence. LLVM only has the attention it has because it can be used as a building block in other compilers. GCC could've made a tool and library which accepts IR, performs optimizations and emits machine code, but the project avoided that for ideological reasons, and as a result created a void in the ecosystem for a project like LLVM.
      • aengelke5 days ago
        I'd add code quality as a reason. I find it much easier to understand and modify code in LLVM compared to GCC. Both have a fairly steep learning curve and not too much documentation, but often I (personally) find LLVM's architecture to be more thought out and easier to understand. GCC's age shows in the code base and it feels like many concepts and optimizations are just bolted on without "required" architectural changes for a proper integration.
      • pjmlp5 days ago
        The embedded compiler vendors, UNIX and consoles are quite happy with it.

        How much do you think they contribute back upstream regarding ISO compliance outside LLVM backend for their hardware and OS?

        • mort965 days ago
          Embedded compiler vendors and UNIXes want a possibly slightly patched C or C++ compiler, maybe with an extra back-end bolted on. I'm talking about use-cases like Rust and Zig and Swift, projects which want a solid optimizing back-end but their own front-end and tooling.
          • pjmlp5 days ago
            And FOSS folks most likely would like to enjoy those patches as well on their installations.
            • umanwizard5 days ago
              And they do! You can choose not to contribute your changes back to permissively-licensed software, but in actual practice most people do contribute them. It's not like the Rust compiler is proprietary software with their own closed-source fork of LLVM...
              • trelane4 days ago
                > You can choose not to contribute your changes back to permissively-licensed software, but in actual practice most people do contribute them.

                They contribute some things, sure. But the also don't contribute some things. It is hard to know how much because it's kept secret from all of us, even their own customers.

                > "It's not like the Rust compiler is proprietary software with their own closed-source fork of LLVM..."

                Rust, no, but there are a lot of semi-incompatible proprietary forks out there.

            • mort965 days ago
              I haven't advocated for re-licensing GCC to be permissively licensed. And patching GCC is necessarily going to be much easier for vendors than to build a new C front-end which generates GCC IR. So I'm not sure what difference you think what I'm proposing would make with regard to your concerns.
          • vbezhenar5 days ago
            gccrs is a rust implementation for gcc. Just because Rust developers don't want their users to be fully free doesn't mean there are any problems with gcc. And clang is developed by Apple which is a huge warning sign by itself.
            • LeFantome4 days ago
              LLVM/Clang is evolving more quickly than GCC and is a much richer base for innovation. LLVM spawned Rust, Swift, and Zig. The most recent GCC languages are Spark and COBOL.

              One of the reasons that LLVM has been able to evolve so quickly is because of all the corporate contribution it gets.

              GCC users want Clang/LLVM users to know how dumb they are for taking advantage of all the voluntary corporate investment in Clang/LLVM because, if you just used GCC instead, corporate contributions would be involuntary.

              The GPL teaches us that we are not really free unless we have taken choice away from the developers and contributors who provide the code we use. This is the “fifth freedom”.

              The “four freedoms” that the Free Software Foundation talks about are all provided by MIT and BSD. Those only represent “partial freedom”.

              Only the GPL makes you “fully free” by providing the “fifth freedom”—-freedom to claim ownership over code other people will write in the future.

              Sure, the “other people” are less free. But that is the price that needs to be paid for our freedom. Proper freedom (“full freedom”) is always rooted in the subjugation of others.

              • vbezhenar4 days ago
                GPL gives more freedom to software users. BSD gives more freedom to software developers.

                I think that's the core difference.

              • pjmlp4 days ago
                Amazon and the like appreciate indeed their freedom not to contribute back, while pumping the cash register.
              • wolvesechoes4 days ago
                Some slaves were treated well, fed nicely, and overall had a rather good life. Yet they remained slaves, so the master could take all of that from them at any moment.

                But hey, if you try hard to be nice to your master and do not demand anything, for sure they will always treat you well!

              • orlylola day ago
                [flagged]
            • viraptor4 days ago
              It's an early implementation of a part of Rust. It's not even close to being usable as a daily compiler.
            • mort964 days ago
              And gccrs is not really very widely used and is not the "official" Rust compiler. The Rust project chose to base their compiler on LLVM, for good technical reasons. That's bad news for the GCC project.
              • tmiku4 days ago
                I thought that most FOSS projects took multiple compiler implementations as a sign of a healthy language environment, without much prestige associated with being the "premier" compiler and instead having more of an it-takes-a-village attitude. Granted, I'm mostly extrapolating from Go and Python here - is it a sharper divide in the Rust community?
                • mort964 days ago
                  No, and I think the gccrs project is great and a sign that the language is maturing. I'm just saying that LLVM was chosen instead of GCC to build the official Rust compiler for very good technical reasons, and that ought to worry GCC.
            • bigstrat20035 days ago
              Rust developers don't "not want their users to be fully free", they disagree with you on which license is best. Don't deliberately phrase people's motivations in an uncharitable way, it's obnoxious as hell.
        • umanwizard5 days ago
          What exactly do you mean by "UNIX"? Commercial UNIX vendors other than Apple have basically a rounding error from 0% of market share.
          • encom4 days ago
            Apple is not a UNIX vendor. They checked off enough of a compliance list to pass in the legal and marketing sense, but in any real practical sense it's not usable as a UNIX system. It's not designed to serve anything. Most of it is locked down and proprietary. No headless operation, no native package manager, root is neutered... and so on. It's not UNIX.
          • pjmlp4 days ago
            Yet IBM, Oracle, HP and POSIX RTOS vendors keep enough customers happy with that rounding error.
          • anthk4 days ago
            Are you from 2001? Sorry, but Apple in the server market is non-existent. RH and Canonical make tons of money thanks to being THE platform of the ubiquitous internet. Support is not free.
            • mort964 days ago
              Are you calling Linux a UNIX? I mean GNU/Linux is certainly a UNIX-like system, but aren't "UNIXes" typically used to refer to operating systems which stem from AT&T's UNIX, such as the BSDs and Solaris and HP-UX and the like? GNU's Not UNIX after all
              • anthk4 days ago
                GNU isn't Unix as NT isn't Win32.

                Gnu promotes Unix, but also promotes Emacs on top. NT can run Win32 on top of it, but there's far more than Win32 with NT systems. Just get ReactOS and open the NT object explorer under explorer.exe.

                Far more advanced than Windows 95/98.

                • mort964 days ago
                  Win32 is an API, while UNIX, NT and GNU are all operating systems made by AT&T, Microsoft and the GNU Project respectively. I agree that the GNU operating system isn't the UNIX operating system, and I agree that the NT operating system isn't the Win32 API, but I have no idea what comparison you're trying to draw.

                  It almost sounds like you think UNIX is an API like Win32, and that GNU is an operating system which "implements UNIX" like NT is an operating system which "implements Win32"? Are you confusing UNIX with POSIX?

                  GNU was made to replace UNIX, not to promote it.

                  • anthk4 days ago
                    No, I'm not confusing Unix with POSIX, but POSIX today superseded old Unix. What I mean is that both NT and GNU enhanced and modernized their own subsystems.

                    Even OpenBSD is not as 'pure Unix' as Unix V7.

              • johnisgood4 days ago
                I have fond memories associated with OpenSolaris. Oracle ruined it.

                I have not tried OpenIndiana, however.

            • umanwizard4 days ago
              Gnu/Linux and UNIX are not the same thing. Yes I understand that RHEL and Ubuntu are popular.
        • bobmcnamara5 days ago
          Apple, Arm, OpenBSD, FreeBSD have all switched to Clang.
          • johnisgood4 days ago
            Why did OpenBSD switch to clang? Licensing issues?
            • bobmcnamara3 days ago
              Could be. They stuck with GPL2 GCC 4.x for a while. IIRC, they never switched to GPL3 GCC5+
          • pjmlp4 days ago
            Switching wasn't the point.
        • duped5 days ago
          It's not like GCC or the embedded toolchains are a shining beacon of ISO compliance... and if you mean video game consoles, are any of them using GCC today? Sony and Nintendo are both LLVM and Microsoft is Microsoft
          • uecker4 days ago
            GCC has very good ISO compliance, and ISO C23 is better supported than in Clang (which I expect to catch up soon, though).
            • throwaway20374 days ago
              Knowing the history of GCC (more focused on C) and Clang (more focused on C++), it makes sense to me that GCC has better ISO C compliance.

              Clang has a very nice specific page for ISO C version compliance: https://clang.llvm.org/c_status.html#c2x

              I could not find the same for GCC, but I found an old one for C99: https://gcc.gnu.org/c99status.html

              CppRef has a joint page, but honestly, I am more likely to believe a page directly owned/controlled/authored by the project itself: https://en.cppreference.com/w/c/compiler_support/23

              Finally, is there a specific feature of C23 that you need that Clang does not support?

              • uecker3 days ago
                I would not say that GCC is more focused on C. Also in GCC there is a lot more effort going into C++ than C. C is generally neglected, which is sad given its importance.

                GCC's support for C23 is essentially complete. Clang is mostly catching up, but features I need that I am missing are storage class in compound literals and tag compatibility. It is also sad that Clang does not implement strict aliasing correctly (it applies C++'s rules also in C).

            • bobmcnamara3 days ago
              "substantially complete" C99 even
          • pjmlp5 days ago
            The difference being that vendors forks not compliant with GPL can be legally asked for the changes they haven't sent upstream.
            • duped5 days ago
              With LLVM they don't need to fork it in the first place. But still, it doesn't matter because ISO compliance is a frontend problem.

              The one vendor who forks LLVM and doesn't contribute their biggest patches back is Apple, and if you want bleeding edge or compliance you're not using Apple Clang at all.

              If you say "isn't it great vendor toolchains have to contribute back to upstream?" I'm going to say "no, it sucks that vendor toolchains have to exist"

              • mort964 days ago
                Forcing vendors to contribute back upstream is an attempt at making vendor toolchains not exist.

                If a company makes a new MCU with some exciting new instruction set, they need to make a compiler available which supports that instruction set and make that compiler available to their customers.

                With LLVM as the base, the vendor could make their toolchain proprietary, making it impossible to integrate it back into LLVM, which means the vendor toolchain will exist until the ISA gets wide-spread enough for volunteers to invest the time required to make a separate production-quality LLVM back-end from scratch.

                With GCC as the base, the vendor must at least make their GCC fork available to their customers under the GPL. This, in theory, allows the GCC community to "just" integrate the back-end developed by the vendor into GCC rather than starting from scratch.

                Now I don't know how effective this is, or how much it happens in practice that the GCC project integrates back-ends from vendor toolchains. But in principle, it's great that vendors have to make their toolchains FOSS because it reduces the need for vendor toolchains.

                • LeFantome4 days ago
                  Myth: GPL means more companies contribute to GCC

                  Previous reality: companies write fully proprietary code to avoid GCC

                  Current reality: companies choose Clang over GCC because of the license and then contribute many of their changes back.

                • dzaima4 days ago
                  gcc requires that contributors assign copyright to FSF (or at least "certify the Developer Certificate of Origin")[0], so a fork isn't gonna get upstreamed without the approval of the fork's authors. So that limits the benefit from gcc-forks to individuals keeping the fork alive.

                  [0]: https://gcc.gnu.org/contribute.html#legal

                • charcircuit4 days ago
                  >making it impossible to integrate it back into LLVM

                Code getting open-sourced is not impossible. Companies do it all the time because it's expensive to rebase.

                  • mort964 days ago
                    Dude you can't just quote a fraction of a sentence and then argue against that fragment. Read the whole sentence. The part about how "the vendor could make their toolchain proprietary" is kinda important.
                    • charcircuit4 days ago
                      My comment was about proprietary software being open sourced.
                      • mort964 days ago
                        My comment was about forcing companies to not make proprietary software.
              • pjmlp4 days ago
                So are you aware how many contributions have actually been upstreamed by Sony, Nintendo, IBM, ARM, Green Hills, Codegear, TI, Microchip, Codeplay, NVidia, AMD, Intel, HP,.... ?

                Because Apple certainly isn't alone.

                • trelane4 days ago
                  The question is less what's upstream than what's not.
                  • pjmlp4 days ago
                    The question is who is leeching clang while pumping the cash register.
            • bluGill5 days ago
              Vendors who use LLVM quickly discover that the cost of maintaining their own fork is too high and end up contributing back.
              • pjmlp4 days ago
                The point was the whole clang package, and no they don't, plenty of examples.
                • LeFantome4 days ago
                  Except they do, plenty of examples (many already provided)
                  • pjmlp4 days ago
                    Examples yes, I provided the examples of everyone taking advantage while not contributing.

                    I am still waiting for the counter examples regarding clang contributions.

                    • bluGill4 days ago
                      Clang is in git with good tracking of who each contributor is - there are more than 4000.
      • codeshaunted5 days ago
        Is it still the case that the Linux kernel cannot be compiled using clang, or can you do that now?
        • bombcar5 days ago
          I believe clang has worked for years now.
          • codeshaunted5 days ago
            Why would anyone bother with GCC then? :P
            • jart4 days ago
              GCC is still marginally superior at both code size and performance in my experience.

              The main thing I like about clang is it compiles byzantine c++ code much faster.

              With some projects, like llama.cpp, it's like eating nails if you're not using clang.

              So with projects like llamafile I usually end up using both compilers.

              That's why my cosmocc toolchain comes with -mgcc and -mclang.

            • bombcar5 days ago
              I believe GCC still wins by far on support for weird CPUs and embedded systems.
              • codeshaunted5 days ago
                With LLVM I doubt this will remain true. LLVM makes custom code generation far too easy for GCC to remain viable imo.
                • SubjectToChange4 days ago
                  The problem is that "weird CPUs and embedded systems" are not mainstream platforms, nor do they have a lot of money behind them (particularly for open source development). Hence, there is little motivation and/or resources for anyone to develop a new backend for LLVM when a mature GCC backend already exists. Moreover, the LLVM developers are wary of accepting new backends for niche platforms when there is no guarantee that they will be maintained in the future.
                  • bombcar4 days ago
                    Exactly - often the GCC branch is barely maintained by whoever is selling the CPU/chip, and it is a set of patches against a specific version that you must use, or it goes haywire.

                    At least with GCC they can sometimes merge the patches.

                • mort965 days ago
                  Hm, is it so much easier to write an LLVM back-end than a GCC back-end? I haven't looked into GCC's code gen infrastructure that much, but I looked into making a new back-end to LLVM some time ago and it seemed extremely complex and entirely undocumented. All guidance I found on it basically amounted to, "copy an existing back-end and bang it against a rock until you've made it generate the machine code you need". And it certainly wasn't a case of having clean and simple enough code to make documentation unnecessary. It would honestly surprise me if GCC is more difficult to add a back-end to.
                • jcranmer5 days ago
                  I don't have much experience with writing a custom gcc backend, but my experience with LLVM is that its model ends up being a somewhat poor fit for smaller or weirder architectures. For example, LLVM has this annoying tendency to really aggressively promote the size of everything--if you have a 32-bit core with 64-bit memory indexing, LLVM tends to make anything that eventually becomes a memory index a 64-bit computation, even if it would be more efficient to keep everything as 32-bit.
              • bobmcnamara3 days ago
                It's still a crapshoot.

                GCC on xtensa adds extra instructions in several places.

                GCC on Arm Cortex-M0 has awful register allocation, and M0 has half as many registers as most ARMs...

            • SubjectToChange5 days ago
              GCC is still needed for Linux distributions using glibc or any other library with many "GCCisms". Also, I'm not sure whether or not Clang is ABI compatible enough for enterprise customers with some rather extreme backwards compatibility requirements. Still, I can imagine a future where glibc can be built with Clang, possibly even one where llvm-libc is "good enough" for many users.
            • IshKebab4 days ago
              Mainly because it was the default for so long, and Clang isn't so much better that distros really need to switch.
        • homebrewer5 days ago
          https://static.lwn.net/kerneldoc/kbuild/llvm.html

          I do remember reading about LTO not working properly, you're either unable to link the kernel with LTO, or get a buggy binary which crashes at runtime. Doesn't look like much effort has been put into solving it, maybe it's just too large a task.

        • LeFantome4 days ago
          There are a few distros that use Clang as the system compiler (SerpentOS and Chimera Linux for two).
      • Tpt5 days ago
        There is now libgccjit, which aims to allow embedding gcc: https://gcc.gnu.org/onlinedocs/jit/

        There is an alternative backend to rustc that relies on it.

        • aengelke5 days ago
          libgccjit is, despite its name, just another front-end for GIMPLE. The JIT-part is realized through compiling the object file to a shared library and using dlopen on this.

          One big problem with libgccjit, beyond its fairly bad compile-time performance, is that it's GPL-licensed and thereby makes the entire application GPL, which makes it impossible to use not just in proprietary use-cases but also in cases where incompatible licenses are involved.

      • dralley5 days ago
        Yet another reason why I'm not a fan of Richard Stallman.

        Most of the decisions he made over the past 25 years have been self-defeating and led directly to the decline of the influence of his own movement. It's not that "the GCC project" avoided that for ideological reason, Stallman was personally a veto on that issue for years, and his personal objection led to several people quitting the project for LLVM, with a couple saying as much directly to him.

        https://gcc.gnu.org/legacy-ml/gcc/2014-01/msg00247.html

        https://lists.gnu.org/archive/html/emacs-devel/2015-01/msg00...

        (both threads are interesting reading in their entirety, not just those specific emails)

        • kstrauser5 days ago
          I think that's an unreasonable lens for viewing his work. Of course he values purity over practicality. That's his entire platform. His decision making process always prioritizes supporting Free Software over proprietary efforts, pragmatism be damned.

          Expecting Stallman to make life easier for commercial vendors is like expecting PETA to recommend a good foie gras farm. That's not what they do.

          • dralley5 days ago
            >Expecting Stallman to make life easier for commercial vendors is like expecting PETA to recommend a good foie gras farm. That's not what they do.

            He threw open-source developers under the bus in the process. As a result approximately nobody writes GCC plugins, open source or otherwise.

            • kstrauser5 days ago
              Alternatively, if you join a Stallman-led Free Software project and hope he'll accept your ideas for making life easier for proprietary vendors, you're gonna have a bad time. I mean, the GNU Emacs FAQ for MS Windows (https://www.gnu.org/software/emacs/manual/html_mono/efaq-w32...) says:

              > It is not our goal to “help Windows users” by making text editing on Windows more convenient. We aim to replace proprietary software, not to enhance it. So why support GNU Emacs on Windows?

              > We hope that the experience of using GNU Emacs on Windows will give programmers a taste of freedom, and that this will later inspire them to move to a free operating system such as GNU/Linux. That is the main valid reason to support free applications on nonfree operating systems.

              RMS has been exceedingly clear about his views for decades. At this point it's hard to be surprised that he’ll make a pro-Free Software decision every time, without fail. That doesn't mean you have to agree with his decisions, of course! But to be shocked or disappointed by them is a sign of not understanding his platform.

              • umanwizard5 days ago
                But the person you're responding to never said he was shocked or disappointed by the views; he just said the views are bad/counterproductive.
                • kstrauser4 days ago
                  But they're only bad/counterproductive when assessed against their own goals. They're a lot more reasonable if you assess them against Stallman's, where you could say "yeah, these actions kept GCC from becoming a handy tool for proprietary interests who don't want to share their work back with us".
                • Keyframe4 days ago
                  I found out the older I get the more sense rms makes.
                  • kstrauser4 days ago
                    On the subject of software freedom, I hate how accurately prophetic he is.

                    RMS: Here’s how they'll get ya!

                    Me: Nice, but that'd never happen.

                    Vendor: Here’s how we got ya!

                    Me: Dammit.

                    Seriously, he must have a working crystal ball.

                    Now, my agreement with him starts and ends on that subject. He says plenty of other things I wholly disagree with. But his warnings about proprietary software lock-in? Every. Single. Time.

                    • Keyframe2 days ago
                      for the record, I meant software freedoms and sometimes technical stuff only!
              • johnisgood5 days ago
                I mean, I think he is right. If everyone stopped supporting Windows, it would have long died out in favor of easy-to-install Linux distributions... probably.
                • codedokode4 days ago
                  In reality, though, Microsoft just added WSL; not bad: Linux code can run on Windows now.

                  Also, if you want Windows to die you need to work with OEMs: I assume that most users simply use whatever OS is pre-installed.

                  • pjmlp20 hours ago
                    As a gray dog around the prairie: had Microsoft actually been serious about the POSIX subsystem on Windows NT/2000, instead of treating it as marketing material and a low-effort port, GNU/Linux adoption would never have taken off, at least not at a level that would have mattered.

                    With OS X on one side, and POSIX subsystem on Windows NT/2000 side, everyone would be doing their UNIX like workflows without thinking once to try out GNU/Linux.

                    At my university we only cared about Linux in its early days, the Slackware 2.0 days, because naturally we couldn't have DG/UX at home, and NT's POSIX support was really unusable beyond toy examples.

                  • johnisgood4 days ago
                    I agree.
                • 5 days ago
                  undefined
          • Analemma_4 days ago
            > His decision making process always prioritizes supporting Free Software over proprietary efforts, pragmatism be damned.

            No, this is giving him too much credit. His stance on gcc wasn't just purity over pragmatism, it was antithetical to Free Software. The entire point of Free Software is to let users modify the software to make it more useful to them, there is no point to Free Software if that freedom doesn't exist - I might as well use proprietary software then, it makes no difference.

            Stallman fought tooth and nail to make gcc harder for the end user to modify; he directly opposed letting users make their tools better for them. He claims it was for the greater good, but in practice he was undermining the whole reason for free software to exist. And for what? It was all for nothing anyway; proprietary software hasn't relied on the compiler as the lynchpin of its strategy for decades.

        • JoshTriplett5 days ago
          Don't forget that the LLVM folks actually went to GNU and offered it to them, and they failed to pay attention and respond. (It's not even that they responded negatively; they just dropped it on the floor completely.) There's an alternate history where LLVM was a GNU project.

          With the benefit of hindsight, I'm glad that that didn't happen, even though I have mixed feelings about LLVM being permissively licensed.

          • alienthrowaway4 days ago
            Gcc is the flagship of the GNU Project - allowing an end-run around the spirit of the GPL in gcc was never going to happen. The project paid more attention than you give them credit for, because allowing closed-source plugins and improvements that use gcc as a frontend is anathema to Free software.

            There's an impedance mismatch between people who think gcc should have maximized user utility and the actual GNU philosophy. The actions of the gcc project make a lot of sense if you consider that the FSF/GNU are monomaniacal about maximizing users' freedoms, not popularity, momentum, or other ego-stroking metrics.

            • JoshTriplett4 days ago
              Whether you care about number of users or not, there's value in considering whether what you're doing is actually advancing the cause of Free Software or not.

              GCC today has a very interesting license term, the GCC Runtime Library Exception, that makes the use of runtime libraries like libgcc free if-and-only-if you use an entirely Free Software toolchain to compile your code with; otherwise, you're subject to the terms of the GPL on libgcc and similar. That is a sensible pragmatic term, and if they'd come up with that term many years ago, they could have shipped libgccjit and other ways to plug into GCC years ago, and the programming language renaissance that arose due to LLVM might have been built atop GCC instead.

              That would have been a net win for user freedoms. Instead, because they were so afraid of someone using an intermediate representation to work around GCC's license, and didn't do anything to solve that problem, LLVM is now the primary infrastructure people build new languages around, and GCC lost a huge amount of its relevance, and people now have less software freedom as a result.

            • pjmlp20 hours ago
              Also note that GCC adoption only took off when Sun introduced the concept of the UNIX developer SDK, where developer tooling required an additional license.
            • steveklabnik4 days ago
              > allowing an endrun of the the spirit of the GPL in gcc was never going to happen.

              You're right:

              https://gcc.gnu.org/legacy-ml/gcc/2005-11/msg00888.html

              > If people are seriously in favor of LLVM being a long-term part of GCC, I personally believe that the LLVM community would agree to assign the copyright of LLVM itself to the FSF and we can work through these details.

            • orlylola day ago
              [flagged]
        • cherryteastain5 days ago
          Without GNU, GPL and Richard Stallman, FOSS would not exist in the first place. The GPL forced companies to make FOSS a thing, whether they liked it or not.
          • bluGill4 days ago
            BSD was long a thing, proving you wrong. Yes, the GPL got more press, but the other options existed for a long time.
            • pjmlp20 hours ago
              Yes, and UNIX vendors and MS were happily taking pieces of it without upstreaming.

              I think without GNU/Linux, most likely all UNIX vendors would have a better market share today.

          • azinman24 days ago
            And then the world started moving on, with other licenses that gave even more freedom and flexibility.

            It was a great start, but you need to adapt or you perish.

            • trelane4 days ago
              > other licenses that gave even more freedom and flexibility

              That gave vendors more freedom and flexibility (to lock their software away from their customers.)

              As usual, customers got less freedom and flexibility.

              • azinman24 days ago
                I disagree. They got software that’s open under a different license that otherwise would have just been purely proprietary, or not created at all (due to the compounding effects of open source)
                • trelane4 days ago
                  We are currently talking about GCC, where they would have gotten the source prior to LLVM.

                  > They got software that’s open under a different license that otherwise would have just been purely proprietary

                  This is not a given, even outside of compilers. It's a heck of a cope there.

                  • azinman24 days ago
                    > Without GNU, GPL and Richard Stallman, FOSS would not exist in the first place. The GPL forced companies to make FOSS a thing, whether they liked it or not.

                    I was responding to this, which is more widespread than GCC (although it was one of the first wins of the GNU).

                    There were various companies who wanted to add on backends and other bits to GCC, but wouldn’t due to the license. That’s one of the reasons LLVM is so popular.

            • wolvesechoes4 days ago
              As with any other achievement of civilization, younger generations will at some point find out why previous ones fought for something and how much it sucks to lose it.

              But when this realization comes, it will be too late.

            • anthk4 days ago
              When VSCode et al. begin shipping DRM and who knows what in their extensions, then we'll see what happens with these half-shareware, semi-libre projects.

              Especially when proprietary dependencies kill thousands of projects at once.

              • spookie4 days ago
                Well, you are already unable to get most of the functionality out of vscode without Microsoft's extension store.

                I, for one, am happy that there are still a couple of people here and there that you can really trust on this stuff.

          • IshKebab4 days ago
            I doubt that. People often say "without <person that started thing> we wouldn't have <thing>!" but it's nonsense. Someone else would have done it just slightly later.
            • azinman24 days ago
              The question is if that person would have dedicated their life in the same way.
        • knowknow5 days ago
          Kind of like how GPL 3 makes it infeasible for most companies to use/support free software. At least Stallman gets to feel morally superior though.
          • dralley5 days ago
            And Free Software does not benefit from a morass of mutually-incompatible copyleft licenses that may as well have been proprietary since you can't use them together.

            None of the permissive licenses have this problem.

            • gkbrk5 days ago
              Which open-source licenses can't be used together?
              • dralley5 days ago
                GPLv2 and GPLv3, for one example... It's sad when two licenses from the same organization are incompatible.

                GPLv2 and Apache.

                • teddyh5 days ago
                  The number of significant software projects which are licensed as GPLv2-only, and which are therefore incompatible with GPLv3, can probably be counted on one hand. Normal GPLv2 projects are licensed – as instructed by the text in the license itself – as “GPLv2 or later”. Which means that any normal GPLv2-licensed software can be relicensed to GPLv3, and can therefore be combined with any other GPLv3-licensed program without any issue whatsoever.
                  • gus_massa4 days ago
                    For GPL v2 only, let's start the list with Linux and Git...

                    The "or later" clause has been used in creative ways, like relicensing all the Wikipedia content, or the Affero to AGPL transition. Nothing shady, but unexpected.

                    Do you trust RMS to avoid doing shady things in a later GPL licence? I do, but he is no longer in the FSF.

                    Do you trust the current members of the FSF to avoid doing shady things in the later GPL licence? I don't know them.

                    Do you trust the future members of the FSF to avoid doing shady things in the later GPL licence???

                    • CorrectHorseBat4 days ago
                      >Do you trust RMS to avoid doing shady things in a later GPL licence? I do, but he is no longer in the FSF.

                      Yes he is: https://www.fsf.org/about/staff-and-board

                    • ndiddy4 days ago
                      Section 14 of the GPL says "The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns." Given that the preamble to the GPL explicitly says "the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users", I don't think that a judge would find that a hypothetical GPLv4 that was basically MIT or something is "similar in spirit" to the present version.

                      If you're worried about the other direction (i.e. a hypothetical GPLv4 that had some bizarre restriction like "all users must donate to the FSF"), the "or any later version" means that as long as you don't decide to update the license you use yourself, people can continue to use it under the GPLv2 or v3 indefinitely.

          • raverbashing5 days ago
            Honestly at least Linus has his head in the right place
  • codeshaunted5 days ago
    Big fan of all of this except for the emojis in my console
    • Karliss4 days ago
      A few versions ago GCC added `-fdiagnostics-text-art-charset=[none|ascii|unicode|emoji]`, so feel free to disable them.

      https://godbolt.org/z/bYc7EqYWE

    • staplung5 days ago
      On the one hand I feel like the #WARNING_SIGN is a welcome addition. On the other, I'm sensitive to the idea that it will be totally obnoxious if you need to grep/search for it in whatever tool you're using.

      Was a little surprised to learn that the warning sign is generally considered an Emoji; I guess I don't think of it that way. Was even more surprised to learn that there is no great definition for what constitutes an Emoji. The term doesn't seem to have much meaning in Unicode. The warning sign - U+26A0 - goes all the way back to Unicode version 4.0 and is in the BMP, whereas most Emoji are in the SMP.

    • pxc4 days ago
      If you're using an OS that honors your font choices (i.e., not macOS), you can use a font like Symbola to provide emoji so that they're stylized, monochrome, and vectorized, rather than incongruous color bitmap images. That helps them play nice with terminal colorschemes and things like that.

      I'm not aware of any emoji fonts like Symbola which provide a monospace typeface, though. That would be a great option.

      • lyorig3 days ago
        By honoring font choices, do you mean the ability to overwrite emojis as well? I’ve never had issues using custom/nerd fonts, but it’s true that the emojis have stayed true to the Apple style so far.
        • pxc2 days ago
          Indeed. Emoji are just characters, rendered via some font just like any text character. On non-Apple operating systems, you can select emoji sets via font configuration.

          You can do it on macOS as well, but you have to disable SIP and modify/replace the files for the Apple Color Emoji font, because some widely used GUI libs are hardcoded to use it.

          Idr the situation on Windows except that emoji glyphs are inherited from your other font choices, if your chosen font includes emoji. But on Linux it's generally easy to configure certain font substitutions only for some groups of characters, like emoji.

    • guenthert5 days ago
      I actually appreciate a #\WARNING_SIGN, but then, why is there no #\STOP_SIGN ?
    • dlachausse5 days ago
      It should be easy to patch out or possibly add a command line switch to disable it if desired.
      • cassepipe4 days ago
        Already exists: -fdiagnostics-text-art-charset=[none|ascii|unicode|emoji]
    • high_na_euv5 days ago
      Why?
      • codeshaunted5 days ago
        Makes me feel like I'm writing JavaScript, if that makes any sense. Also I hate fun.
      • umanwizard5 days ago
        Well, other than the fact that it's hard to search for, things should be kept as simple as possible to improve reliability/interoperability. Are you confident that multi-column-wide non-ASCII characters work reliably without causing rendering issues on every possible combination of terminal/shell/OS, including over SSH? I'm certainly not.
        • gpderetta5 days ago
          You can already put multi-column-wide non-ASCII characters in your source code, so the horse has left the barn already.
          • umanwizard5 days ago
            Of course you can. I'd argue you shouldn't, but that's beside the point. There is clearly a difference between people being able to do something, and the compiler forcing it on them.
            • IshKebab4 days ago
              But the compiler isn't forcing it on anyone?
              • jcelerier3 days ago
                How isn't it? It's enabled by default, which is equivalent to forcing it for the huge majority of users who won't even know that such a thing as a flag to disable these emojis exists.
        • aseipp4 days ago
          They are quite easy to search for actually, because they are sparse and you tend not to get so many spurious results due to hitting things that contain your query as a substring. Even then, the glyphs appear inline in the warning message, they are not the anchor; you could still just search for "warning:" like you or your editor have been doing for years. Every operating system comes with an emoji picker; it takes like 2 seconds to use. I'm not sympathetic, tbqh.
        • dzaima5 days ago
          If anything, an emoji is easier to search for, as fewer things would be using it than a general "warning" (or whatever would make sense, as the emoji'd thing isn't the warning itself). You do have to get a copy from somewhere to search for, though. It's also much easier to "visually" search for it.

          At least in the blog there are two spaces after the emoji, so it can freely draw past its boundaries rightwards without colliding with anything for a good bit; and nothing to its right is used assuming monospace alignment. So at worst you just get a half-emoji.

      • javier_e064 days ago
        I started putting emojis in my bash scripts.

        gitk broke! Can't parse emojis

        I want emojis in my code. They are a superior form of communication (non-linear)

  • levodelellis4 days ago
    Please gcc, let me have a `~/.config/gcc` config file or an ENV variable so I can ask for single-line error messages.

    I literally do not need ascii art to point to my error, just tell me line:col and a unique looking error message so I can spend no more than 1 second understanding what went wrong

    Also allow me to extend requires with my own error messages. I know it'll be non standard but it would be very nice tyvm

  • meisel5 days ago
    The template error messages look great. I wonder if it’s worth writing a translator from clang/gcc messages to these ones, for users of clang and older gcc (to pipe one’s error messages to)
    • charcircuit5 days ago
      I mean why not show a proper image instead of doing fancier ASCII art. Or skip it entirely and have an LLM describe the issues and fix it for you.
  • kazinator4 days ago
    I can't imagine a piece of software easier to use than gcc 3.4.6, sorry.

    We are now entering a Rococo period of unnecessarily ornate compiler diagnostics.

    Getting these elaborate things to work is a nice puzzle, like Leetcode or Advent of Code --- but does it have to be merged?

  • Night_Thastus5 days ago
    I'm all up for better error messaging. :)

    Can't use Clang where I'm at, but I do get to use fairly cutting-edge GCC, at least for Windows development. So I may get to see these improvements once they drop into MSYS.

  • bmn__4 days ago
    The writer David Malcolm tells a story in his article about compiling C17 code and pretending it's C23. Duh, no wonder it breaks!

    You C standard authors have it bass-ackwards. The version of the code must accompany the code, not the compiler invocation.

  • taschenorakel4 days ago
    Why all this waste on Unicode art that nobody will ever see and that confuses your IDE?

    Why not spend the time helping IDEs understand error messages? That would be a billion times more useful.

  • jart4 days ago
    Here's the pending GCC 15 release notes: https://gcc.gnu.org/gcc-15/changes.html (since the link in the article points to GCC 14)

    - I'd love to see godbolt examples of the sort of optimizations [[unsequenced]] and [[reproducible]] can do.

    - GCC has always been in my experience smart enough to know how to optimize normal C code into ROL and ROR instructions. I've never had any issues with it. So what's the point of __builtin_stdc_rotate_left()? Why create a builtin when it is not needed? What I wish GCC and Clang would do instead, is formally document the magic ANSI C89 incantations that trigger the optimization, e.g. `#define ROL(x, n) (((x) << (n)) | ((x) >> (64 - (n))))`. That way we can have clean portable code with less #ifdef hell along with assurances it'll go fast when -O is passed.

    - What is "Abs Without Undefined Behavior (addition of builtins for use in future C library <stdlib.h> headers)."?

    - What is "Allow zero length operations on null pointers"?

    - Re: "Introduce complex literals." How about some __int128 literals?

    - "The "redzone" clobber is now allowed in inline assembler statements" wooo I've wanted something like this for a while.

    Great work from the greatest compiler team!

    • fuhsnn4 days ago
      > How about some __int128 literals

      There are _BitInt literals (wb and uwb); they look adequate https://godbolt.org/z/xjEEM5Pa4 despite clang's "is a C23 extension" noise.

      • jart3 days ago
        Wait I missed that. They finally implemented _BitInt? YESSS
  • brcmthrowaway5 days ago
    What is the status of GCC plugins?
    • o11c4 days ago
      Is there any reason to ask this? GCC plugins have been good since 4.8. 4.5 lacked some essential features; 4.6 would be fine if not for the annoyance of trying to support a plugin that works across the C++ transition. Of course, you can just use the Python plugin to save yourself a lot of sanity ...
    • aseipp4 days ago
      They work. You can use them? They haven't gone anywhere and the Linux kernel relies on them extensively, among other things.
  • wiseowise5 days ago
    Already looking forward to `grep`ing warning emojis in output. What's next? Poop emoji for errors?
    • db48x4 days ago
      Why would you grep for the emoji? The actual pattern you want to match is the standard format of the error message that precedes it, which hasn’t changed in eons:

          my $line = "infinite-loop-linked-list.c:30:10: warning: infinite loop [CWE-835] [-Wanalyzer-infinite-loop]";
          grammar gcc {
              token filename { <-[:]>+ };
              token linenumber { \d+ };
              token colnumber { \d+ };
              token severity { info|warning|error };
              token message { .* $$ };
              regex diagnostic {        <filename>
                                 [\:]   <linenumber>
                                 [\:]   <colnumber>
                                 [\:]\s <severity>
                                 [\:]\s <message>
                               };
              token TOP { <diagnostic> };
          }
          
          say gcc.parse($line);
      
      which when run produces the obvious output:

          「infinite-loop-linked-list.c:30:10: warning: infinite loop [CWE-835] [-Wanalyzer-infinite-loop]」
           diagnostic => 「infinite-loop-linked-list.c:30:10: warning: infinite loop [CWE-835] [-Wanalyzer-infinite-loop]」
            filename => 「infinite-loop-linked-list.c」
            linenumber => 「30」
            colnumber => 「10」
            severity => 「warning」
            message => 「infinite loop [CWE-835] [-Wanalyzer-infinite-loop]」
      
      I’ll leave handling filenames containing colons as an exercise for the reader.

      The emoji just focuses the reader’s eye on the most critical line of the explanation.

    • johnisgood5 days ago
      I do not want any emojis from my compiler's output, heck, in my terminal in general. Please, do not make it the default behavior.

      (BTW my terminal does not even support emojis.)

    • OptionOfT2 days ago
      Same with their ‘’ quotation marks. I've had cases where I was searching for "'something'", and it didn't find anything, because it was printed as "‘something’".
    • aseipp4 days ago
      Why wouldn't you just keep searching for the "warning:" or "error:" anchor, like people (and editors) have been doing forever? Even if that weren't an option, it's not like searching for emojis is hard anyway. If it takes you longer than 2 seconds to open a picker, you should ask for a refund for your computer.
  • budmichstelk5 days ago
    I like these modernization efforts in GCC; a lot of old niggles are gone now, and they've made a lot of new improvements I didn't think were possible/easy!