18 points by Nash0x7e2 8 hours ago | 9 comments
  • cyberrock 7 hours ago
    I think my experience with Python has been a lot worse than OP's. Random Python projects on GitHub always lacked polish and documentation. If anything, my enjoyment of Python has skyrocketed with uv, because I don't need to spend an hour guessing which Python 3.x version is compatible with your library.
  • slowcache 7 hours ago
    This post is like six months late. I share the same concerns that others in the thread do, but the talking point is pretty tired by now.
  • dryarzeg 7 hours ago
    I don't know if my point is valid or not, but...

    Stop blaming "AI" - whatever you mean by this. Whether it's an LLM, an LLM-based agent or something else - stop blaming AI and "AI" and LLMs and... you get the point.

    It's not the AI that makes the decision to, sorry for being straightforward, write the worthless code which feels like a piece of useless bloated trash. It's not the AI who makes decisions to do something without even understanding the topic - no matter how exactly you define "understanding" in this context. It's not AI who is responsible for this. Because whatever AI truly is right now - an autocomplete tool, advanced chatbot or, maybe, agent - whatever it is, the decisions are made by humans. AI is not responsible for anything that is happening right now.

    Humans and humans only are responsible for what's happening. It's their choice. It's their qualities that are clearly visible now. It's their behaviour.

    Stop blaming kitchen knives for murders.

    • FeteCommuniste 7 hours ago
      AI has made it exceptionally easy to generate "compiles/runs and looks plausible but is still fundamentally flawed" code at a much greater scale than ever before. Maybe the analogy should be a machine gun rather than a knife.
      • dryarzeg 7 hours ago
        I was just discussing this with a friend, and he told me (direct quote):

        Well, yeah, stop blaming the knives. Blame the cooks ("vibecoders") who think they can manage a kitchen because the knife cuts everything in half automatically. But also don't forget to blame the knife manufacturer ("AI" companies) who markets automated knives to people who don't know you shouldn't cut toward yourself.

        I kind of agree. Some people don't understand how to code because they're lazy or have other issues, while others are trying to make a profit from it. I suppose you can tell who's who. But AI is directed by humans anyway. Instead of copy-pasting, a human could choose to try and write the code themselves, and then ask AI to review it and highlight areas for improvement. A human could choose to ask AI how to do things and then try to do it themselves. But if a human chooses to do things the other way, that's their choice. AI is not to blame here. It's still a human choice, and the person making it is the one who is actually responsible.

        Some people smoke. Smoking kills, and not only can smokers die from it, but other people can be harmed by passive smoking as well. It's very easy to start smoking. But blaming cigarettes themselves, as objects/entities/etc. isn't the answer, I guess. It was a certain person's choice to try smoking. It was also the choice of another person to advertise smoking in one way or another, however...

        • FeteCommuniste 6 hours ago
          Sure, it's not really the AI's fault ultimately. But you can still ask the question of whether a given codebase (or the Python ecosystem, to take the Reddit post example) would be better off if LLMs didn't exist.
    • orwin 7 hours ago
      This isn't a good analogy though. It's not blaming a kitchen knife, it's blaming a voice activated auto turret.

      Or rather, blaming a car. Yes, a bad driver is way more dangerous than a good driver, but even the best driver can make a mistake. Like cars, it's an inherently flawed piece of technology, and like cars, its benefits are too high for most of us to ignore. Way better analogy than my auto turret one.

      • dryarzeg 6 hours ago
        > but even the best driver can make a mistake

        Well, if you put it this way... even the best programmer in the world, who doesn't use AI at all, can also make a mistake. Of course, their mistakes would probably be less frequent, but I guess they wouldn't blame the IDE for poor syntax highlighting (if it's good enough, of course), or the compiler or interpreter for failing to spot a logical error unrelated to syntax rules. They would say "it was my mistake". The problem with AI-generated code, though, is that those who generate it almost never take responsibility for it. They'll say something like, "AI made a mistake here and there." I have never seen someone who generated flawed code with AI take responsibility for it. And that's the main problem.

        It doesn't matter whether you're a bad driver or the best driver. If you cause an accident, you must be held responsible. As simple as that.

        > Like cars, it's an inherently flawed piece of technology

        Sorry, but what exactly do you mean? I'm just curious to know what you mean when you say that cars are "an inherently flawed piece of technology".

  • Redoubts 3 hours ago
    Why can't you just ignore the bad projects? AI is probably super annoying if you're a maintainer getting slop PRs; but if you're a professional I don't see how this can vex you that much.
  • beej71 7 hours ago
    Imagine a world where npm and all the other library repositories are 99% AI slop. And the posts about which library to choose are 99% AI-generated.

    It'll be so hard to find anything in the chaff you might get your old job as a dev back. :)

    • FeteCommuniste 7 hours ago
      We'll have AIs doing our chaff-sifting for us, too, right?
  • OutOfHere 6 hours ago
    It's game over, guys. With newer AI models and coding agents that you will see in the next year or two, resistance is futile. Move on to something else that is not purely a programming job. If you only move to a different programming language, the same outcome will inevitably follow you. Wages are expected to diminish. As a case in point, almost no one needs Assembly experts anymore. From a management pov, professional coding now comes down to specifying requirements precisely and having exhaustive tests for them, all AI generated.
  • lmm 7 hours ago
    Python was already like that. A cascade of beginners cargo-culting other beginners, because it's easy enough to get started that everyone thinks they're an expert and will blog about it. Switch to a language with a bit of a barrier to entry and you avoid the problem.
    • tayo42 7 hours ago
      People make the same posts in the Rust subreddit.

      Unless we need to go to even harder languages? Haskell?

      • smitty1e 6 hours ago
        I'd estimate that Haskell + AI could reach cuneiform levels of inscrutability.