36 points by ashleyn 6 hours ago | 11 comments
  • longfacehorrace 5 hours ago
    Looked at repos of the two loudest users in that thread; either they have none or it's all forks of other projects.

    Non-contributors dictating how the hen makes bread.

    • ronsor 5 hours ago
      In general, caving to online mobs is a bad long-term strategy (assuming the mob is not the majority of your actual target audience[0]). The mob does not care about your project, product, or service, and it will not reward you for your compliance. Instead it only sees your compliance as a weakness to further target.

      [0] While this fact can be difficult to ascertain, one must remember that mobs are generally much, much louder than normal users, and normal users are generally quiet even when the mob is loud.

      • lich_king 5 hours ago
        Yes, but also... that's like 90% of the interactions you get on the internet?

        I don't want to be too meta, but isn't that a description of most HN threads? We show up to criticize other people's work for self-gratification. In this case, we're here to criticize the dev caving in, even though most of us don't even know what Stoat is and we don't care.

        Except for some corner cases, most developers and content creators mostly get negative engagement, because it's less of an adrenaline rush to say "I like your work" than to say "you're wrong and I'm smarter than you". Many learn to live with it, and when they make a decision like that, it's probably because they actually agree with the mob.

        • ronsor 5 hours ago
          I don't actually care what the dev does. That's their prerogative, and it doesn't affect whether or not I'll use the software (I will if it's useful). I think that's the difference between here and a "mob", assuming other commenters think similarly.

          I do think it's harmful to cave in, but that doesn't make me think less of the maintainer's character. On the other hand, some of the commenters in the issue might decry them as evil if they made the "wrong" decision.

          It's fine to have opinions on the actions of others, but it's not fine to burn them at the stake.

        • longfacehorrace 4 hours ago
          Not just online: priests, CEOs, celebrities, politicians; if you don't make them happy, you're a sinner, a bad employee, a hater of freedom, etc.

          Anyone with a rhetorical opinion who otherwise contributes little to getting cars off assembly lines, homes built, or network cables laid.

          In physical terms, the world is full of socialist grifters, in that they have only a voice and no skill. They are reliant on money because they're helpless on their own.

          Engineers could rule the world if we acted collectively rather than starting personal businesses. If we sat on our hands until our demands were met, the world would stop.

          A lot of people in charge fear tech unions, because we control whether the world gets shit done.

    • throawayonthe 5 hours ago
      aren't forks of other projects how you usually contribute code on github?
    • rurban 3 hours ago
      Makes a good block list: people vehemently arguing to block AI.
    • blibble 4 hours ago
      most of the anti-AI community have already migrated their repos from Slophub
  • singularfutur 5 hours ago
    Reverting a few trivial commits because of purity tests is a bad precedent. It rewards the loudest commenters and punishes maintainers.
    • imsofuture 5 hours ago
      It will be a painful decade before those who have already lost this weird ideological war finally realize it.
      • rsynnott 4 hours ago
        And which side is that? I mean, from my point of view, it seems like it’s probably the ones who are having a magic robot write a thousand lines of code that almost, but not quite, does something sensible, rather than using a bloody library.

        (For whatever reason, LLM coding things seem to love to reinvent the square wheel…)

        • spankalee 4 hours ago
          > the ones who are having a magic robot write a thousand lines of code that almost, but not quite, does something sensible

          Gee, I wonder which "side" you're on?

          It's not true that all AI generated code looks like it does the right thing but doesn't, or that all human-written code does the right thing.

          The code itself matters here. So given code that works, is tested, and implements the features you need, what does it matter if it was completely written by a human, an LLM, or some combination?

          Do you also have a problem with LLM-driven code completion? Or with LLM code reviews? LLM assisted tests?

          • rsynnott 3 hours ago
            Oh, yeah, I make no secret of which side I’m on there.

            I mean I don’t have a problem with AI driven code completion as such, but IME it is pretty much always worse than good deterministic code completion, and tends to imagine the functions which might exist rather than the functions which actually do. I’ve periodically tried it, but always ended up turning it off as more trouble than it’s worth, and going back to proper code completion.

            LLM code reviews, I have not had the pleasure. Inclined to be down on them; it’s the same problem as an aircraft or ship autopilot. It will encourage reduced vigilance by the human reviewer. LLM assisted tests seem like a fairly terrible idea; again, you’ve got the vigilance issue, and also IME they produce a lot of junk tests which mostly test the mocking framework rather than anything else.

        • kingstnap 4 hours ago
          Dependencies aren't free. If you have a library with fewer than a thousand lines of code total, that is really janky. Sometimes it makes sense, as with PicoHTTPParser, but it often doesn't.

          Left-pad isn't a success story to be reproduced.

          • rsynnott 3 hours ago
            Not saying left pad is a good idea; I’m not a Javascript programmer, but my impression has always been that the ecosystem desperately needs something along the lines of boost/apache commons etc.

            EDIT: I do wonder if some of the enthusiastic acceptance of this stuff is down to the extreme terribleness of the javascript ecosystem, tbh. LLM output may actually beat leftpad (beyond the security issues and the absurdity of having a library specifically to left pad things, it at least used to be rather badly implemented), but a more robust library ecosystem, as exists for pretty much all other languages, not so much.

        • GorbachevyChase 4 hours ago
          I’m not sure where you’ve been the last four years, but we’ve come a long way from GPT 3.5. There is a good chance your work environment does not permit the use of helpful tools. This is normal.

          I’m also not sure why programmatically generated code is inherently untrustworthy, but code written by some stranger whose competence and motives are completely unknown to you is inherently trustworthy. Do we really need to talk about npm?

        • ronsor 4 hours ago
          Not once in history has new technology lost to its detractors, even if half its proponents were knuckleheads.
          • latexr 4 hours ago
            Web3, Google Glass, Metaverse, NFTs…
          • rsynnott 3 hours ago
            Ah, yes. That’s why we all have our meetings in the metaverse, then go back home on the Segway, to watch 3d TV and order pizza from the robotic pizza-making van (an actual silly thing that SoftBank sunk a few hundred million into). And pay for the pizza in bitcoin, obviously (in fairness, notoriously, someone did do that once).

            That’s just dumb things from the last 20 years. I think you may be suffering from a fairly severe case of survivorship bias.

            (If you’re willing to go back _30_ years, well, then you’re getting into the previous AI bubble. We all love expert systems, right?)

          • BJones12 4 hours ago
            Nuclear power disagrees
            • raincole 4 hours ago
              Nuclear power will win (obviously). Unless you're talking about nuclear weapons.
          • hubertdinsk 4 hours ago
            The latest counter-example is NFTs.
            • ronsor 4 hours ago
              NFTs lost because they didn't do anything useful for their proponents, not because people were critical of them. They would've fizzled out even without detractors for that reason.

              On the other hand, normal cryptocurrencies continue to exist because their proponents find them useful, even if many others are critical of their existence.

              Technology lives and dies by the value it provides, and both proponents and detractors are generally ill-prepared to determine such value.

              • hubertdinsk 4 hours ago
                oh it's "because of this and that" now?

                The original topic was "not once blah blah...". I don't have to entertain you further, and won't.

              • blibble 4 hours ago
                moving the goalposts
    • Seattle3503 4 hours ago
      This sort of purity policing happens to other mission-driven open source projects. The same thing happens to Firefox. Open source projects risk spending all their time trying to satisfy a fundamentally extreme minority, while the big commercial projects act with impunity.

      It seems like it is hard to cultivate a community that cares about doing the right thing, but is focused and pragmatic about it.

    • Palomides 4 hours ago
      what if the users legitimately don't want AI written software?
      • raincole 4 hours ago
        You have to think twice about whether you really want to cater to these 'legitimate users', then. In Steam's review sections you can find people giving negative reviews just because the game uses Unity or Unreal. Should devs cater to them and develop an in-house engine?
        • Palomides 4 hours ago
          maybe? devs should weigh the feedback and decide what they think will best serve the project. open source is, especially, always in conversation with the community of both users and developers.
          • Seattle3503 3 hours ago
            > open source is, especially, always in conversation with the community of both users and developers

            Not necessarily. sqlite doesn't take outside contributions, and seems to not care too much about external opinion (at least, along certain dimensions). sqlite is also coincidentally a great piece of software.

      • minimaxir 4 hours ago
        Then they have the right to not use it: Stoat does not have a monopoly on chat software.
    • minimaxir 4 hours ago
      And then you have the "Alas, the sheer fact that LLM slop-code has touched it at all is bound to be a black stain on its record" comments.
    • blibble 4 hours ago
      maybe a preview of what's to come when the legal system rules the plagiarism machine's output is a derivative work?
      • spankalee 4 hours ago
        Since a human can also be a "plagiarism machine" (it's a potential copyright violation for both me and an LLM alike to create images of Mickey Mouse for commercial uses) it'll matter exactly what the output is, won't it?
  • pythonaut_16 5 hours ago
    Wastes of time like this are exactly why Stoat/Revolt is unlikely to ever be a serious Discord alternative
    • argee 5 hours ago
      Could you elaborate on this? I can’t tell whether you mean to say that open source projects run into user-initiated time sinks that detract from their productivity (which is arguably the case for any public facing project), or whether private repositories bypass this type of scrutiny by default which affords them an advantage, or whether this is about the Stoat/Revolt devs specifically and how they choose to spend their time.
      • ronsor 4 hours ago
        I think the parent comment is referring to the fact that even focusing on whether ~100 lines of code across 3 commits should/should not be generated by an LLM is meaningless bikeshedding which has no place in a serious project.
    • Palomides 4 hours ago
      why? I think having a stated policy on LLM use is increasingly unavoidable for FOSS projects
  • stavros 4 hours ago
    I love how people in the thread are like "if I'm going to ask my group of friends to switch to this, I need to know it's not written by security-issue-generator machines", meanwhile at Discord LLMs go brrr:

    https://discord.com/blog/developing-rapidly-with-generative-...

    • ronsor 4 hours ago
      To be fair, many of them are already fleeing Discord over the ID surveillance, so it makes sense that they would be pickier this time.
    • latexr 3 hours ago
      No one on the thread is advocating for Discord, so I don’t understand what argument you are making.
      • stavros 3 hours ago
        What non-LLM using service do you think the people saying "I can't switch to Stoat if it uses LLMs" are switching from?
        • latexr 3 hours ago
          You pointed out Discord are using LLMs, so by definition that can’t be the “non-LLM using service” they are switching from.

          But if they are switching from Discord, then that means they are unhappy with it too, thus they are not advocating for it.

          So, again, what’s your point?

          • stavros 2 hours ago
            My point is there is no non-LLM service. The commenters simply focus on the thing they saw, and didn't even bother comparing against their existing alternative.

            It's just the perfect world fallacy.

            • latexr 2 hours ago
              > My point is there is no non-LLM service.

              Considering Stoat just (supposedly) removed all LLM code from their code base, there is at least one. I’d expect, based on Meredith Whittaker’s stance regarding LLMs, that Signal also doesn’t have LLM code, though I haven’t verified.

              > The commenters simply focus on the thing they saw, and didn't even bother comparing against their existing alternative.

              I mean, how do you know? There is one mention of Discord in that thread. Making sweeping statements about “the commenters” doesn’t seem right.

  • sodality2 5 hours ago
    If only the average open source project got this level of scrutiny actually checking for vulnerabilities. I get that you don't want your private chats leaked by slopcode, but this was a few dozen lines of scaffolding in a large piece of software created before LLM coding; it would have been better to register your discontent without making demands, then continue to watch the repo for vulnerabilities. This feels like fervor without any work behind it.
  • philipwhiuk 4 hours ago
    The fun part is this only happens because Claude Code commits its changes.

    If you use, for example, GitHub Copilot's IDE integration, there's no evidence.

    • zihotki 27 minutes ago
      There is also `git commit --amend`; there are many ways to hide the evidence if one needs to.
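
      For instance, a rough sketch, assuming the assistant left a Co-Authored-By trailer in the commit message (the example commit message below is hypothetical):

        # Rewrite the most recent commit with a fresh message, dropping any co-author trailer
        git commit --amend -m "fix: handle empty payloads in the message handler"

        # Or interactively reword/squash/drop the last few commits before pushing
        git rebase -i HEAD~3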
  • ronsor 5 hours ago
    It seems the thread was brigaded by militant anti-AI people upset over a few trivial changes made using an LLM.

    I encourage people here to go read the 3(!) commits reverted. It's all minor housekeeping and trivial bugfixes—nothing deserving of such religious (cultish?) fervor.

    • raincole 5 hours ago
      At this point, perhaps not disclosing AI usage is the right thing to do. Transparency only feeds the trolls, unfortunately.
      • ronsor 5 hours ago
        I have been saying this for a few years at this point. Transparency can only exist when there is civility, and those without civility deserve no transparency[0].

        [0] As a corollary, those with civility do deserve transparency. It's a tough situation.

  • cat_plus_plus 4 hours ago
    "it's worth considering that there are many people with incredibly strong anti-LLM views, and those people tend to be minorities or other vulnerable groups."

    I have pretty low expectations for human code in that repository.

    • Seattle3503 3 hours ago
      Is that claim even empirically true?
    • ronsor 4 hours ago
      The response mentioning minorities is obviously bad faith. Even if true, it's not really relevant, and most likely serves as a way to tie LLM use to slavery, genocide, or oppression without requiring rational explanation.
      • latexr 3 hours ago
        I just read it, and found no bad faith in it. It was polite, not pushy, explained the argument well (though of course you may disagree with it), gave a business reason, and even ended with “thank you for reading and considering this, if you do”.

        > and most likely serves as a way to tie LLM use to slavery, genocide, or oppression without requiring rational explanation.

        Assuming and ascribing nefarious motivations to a complete stranger can be considered bad faith, though. Probably not your intention, but that’s how it came across.

        • ronsor 2 hours ago
          I have observed this pattern before. Usually minority groups are mentioned in an attempt to shift a debate toward values (which basically means no meaningful debate if you disagree) and away from technical considerations (which arguably deserve the most attention in a software product).

          Aside from that, the statement is not empirically true (from my perspective at least). Evidence isn't provided either. I'm not saying that the commenter consciously wanted to tie LLM use to those negative things, but it could be done subconsciously, because I have genuinely seen those arguments before.

          • latexr 2 hours ago
            I understand your point and believe you believe it, which is why I mentioned I don’t think you were arguing in bad faith. What I am saying is that I don’t think the commenter in question was acting in bad faith either, because that requires deception. In other words, it seems to me that the commenter—like yourself—was arguing genuinely. Whether one agrees with their argument (or yours) is a different matter altogether, but bad faith it doesn’t seem to be.

            Hope that clarifies what I’m getting at.

  • logicprog 5 hours ago
    What a shame
  • deadbabe 4 hours ago
    If you find yourself having to use LLMs to write a lot of tedious code, you have bad architecture. You should use patterns and conventions that eliminate the tedium, by making things automagically work. This means each line of code you write is more powerful, less filler stuff. Remember the days when you could create entire apps with just a few lines of code? So little code that an LLM would be pointless.
  • ragthr 5 hours ago
    Nice move! It is fun to watch the copyright thieves and their companies go into intellectual contortions (militant, purity tests, ideology) if their detrimental activities get any pushback.
    • 875765465609068 3 hours ago
      Nice move smashing those stocking frames! It is fun to watch the knitting pattern thieves and their companies go into intellectual contortions (militant, purity tests, ideology) if their detrimental activities get any pushback.