48 points by mltvc 2 hours ago | 17 comments
  • stephenlf 2 hours ago
    > The cost of turning written business logic into code has dropped to zero

    Didn’t realize this was science fiction.

    • geetee an hour ago
      I appreciate the author making that the first sentence.
    • bopbopbop7 25 minutes ago
      I think the author forgot that code has to compile and be useful.

      And how much is technical debt worth?

      • simonw 3 minutes ago
        What coding agent are you using where the code doesn't even compile!?
        • bopbopbop7 2 minutes ago
          The one that Cursor used to build their famous browser.
      • canadiantim 16 minutes ago
        Depends if I can bundle the technical debt, get a triple AAA rating on it and then sell it
    • AstroBen an hour ago
      I swear all of these are coming from the prompt "hey chatgpt rewrite this article that got a lot of views"

      I've seen non-technical people vibe code with agents. They're capable of producing janky, very basic CRUD apps. Code cost definitely ain't zero

    • heliumtera an hour ago
      But it is true, the cost is effectively zero. There will be, for a long time, free models available and any one of them will give you code back, always!

      They never refuse. Worst case scenario the good models ask for clarification.

      The cost for producing code is zero and code producers are in a really bad spot.

      • dt3ft an hour ago
        I beg to differ. Let's say you're right. Code producers should turn to agriculture and let their managers and product owners prompt AI to produce code. How about code maintainers? Ever heard the mantra "You build it, you run it"? Let's say that AI can build it. Can it run it, though? All alone, safely, securely, and reliably? No. It can't. We can keep dreaming, though. And when will AI code production services turn profitable? Is there a single one which has turned profitable?
        • heliumtera 38 minutes ago
          Calm down buddy, maybe you're confusing code producers with something else. It's 2026 we don't bother with maintenance no more, we /new to keep context clean and start over. Just don't forget to comment - never delete - old code. Always keep dead code around to please shareholders, line numbers up always. We produce code, that is the main thing, never forget.

          One could argue we could achieve the same goals by appending \n to a file in a loop, but this is inefficient nowadays with generous token offerings (though that could change in the future, in which case I highly suggest just outputting \n to a file and calling it a productivity increase)

          I didn't understand your point about product owners. Who the fuck would ever need one when code produces itself?

          • dullcrisp a minute ago
            Right but memory is expensive now so where do I keep all of this new code that I’ve produced??
      • bopbopbop7 an hour ago
        Because who cares about correct and compilable code, any code will do!
      • autoexec an hour ago
        > The cost for producing code is zero

        Zero as long as your time is worth nothing, and bad code and security issues cost you nothing maybe.

        "Getting code" has always been dead simple and cheap. Getting actually good code that works and doesn't turn into a problem for you down the road is the expensive part

        • chasd00 an hour ago
          > Zero as long as your time is worth nothing

          I can't remember who said it, but a long time ago I read "Linux is free if your time is worthless". Now we all use Linux one way or the other.

          • autoexec 34 minutes ago
            That's still very much true, but at least in the case of Linux the cost is getting lower and lower all the time. For many, the time investment has reached about the same as the cost of Windows, and as a result we see more and more people using Linux. At this point it's a perfectly viable gaming platform!

            Maybe LLMs will eventually make good code at a low cost, and that will allow non-programmers to write programs with few problems, but the cost will never be zero, and I think we're a long, long way from making human programmers obsolete.

            All of the intelligence that LLMs mimic came directly from the work of human minds which got fed into them, but what LLMs output is a lossy conversion filled with error and hallucination.

            My guess is that the LLMs producing code will improve for a short time, but as they start to slurp up more and more of their own slop they'll start performing worse.

  • mohsen1 an hour ago
    I am thinking about this a lot right now. Pretty existential stuff.

    I think builders are gonna be fine. The type of programmer whom people would put up with just because they could go into their cave for a few days and come out with a bug fix that nobody else on the team could figure out is going to have a hard time.

    Interestingly AI coding is really good at that sort of thing and less good at fully grasping user requirements or big picture systems. Basically things that we had to sit in meetings a lot for.

    • ericpauley an hour ago
      This has been my experience too. That insane race condition inside the language runtime that is completely inscrutable? Claude one-shots it. Ask it to work on that same logic to add features and it will happily introduce race conditions that are obvious to an engineer but a local test will never uncover.
    • wiseowise an hour ago
      > The type of programmer whom people would put up with just because they could go into their cave for a few days and come out with a bug fix that nobody else on the team could figure out is going to have a hard time.

      Amen. It was a good time while it lasted.

    • falloutx an hour ago
      Meetings hardly reach anywhere. Most of the details are eventually figured out by developers when interacting with the code. If all the ideas from PMs were implemented in a piece of software, it would turn into bloatware before even reaching the MVP stage.
    • oytis an hour ago
      All software engineers become pretty much the same in this world though. Anyone can sit in the meetings.
  • ossa-ma 2 hours ago
    With all due respect to the author, this is a lot of words for not much substance. Rehashing the same thoughts everyone already thinks but not being bold enough to make a concrete prediction.

    This is the time for bold predictions, you’ve just told us we’re in a crucible moment yet you end the article passively….

    • YZF an hour ago
      Predictions

      - Small companies using AI are going to kick the sh*t out of large companies that are slow to adapt.

      - LLMs will penetrate more areas of our lives. Closer to the ST:TNG computer. They will be agents in the real-life sense, and possibly in the physical world as well (robots).

      - ASICs will eat nVidia's lunch.

      - We will see an explosion of software and we will also see more jobs for people who are able to maintain all this software (using AI tools). There is going to be a lot more custom software for very specific purposes.

      • falloutx an hour ago
        > Small companies using AI are going to kick the sh*t out of large companies that are slow to adapt.

        Big companies are sales machines and their products have been terrible for ages. Microsoft enjoys the top spot in software sales only due to their sales staff pushing impossible deals every year.

        • YZF 38 minutes ago
          It's true the big company products have been terrible but they also enjoyed a moat that made it harder for competitors to enter.

          With this moat reduced I think you'll find this approach doesn't work any more. The smaller companies will also hire the good sales people away.

    • jdjdndbdhsjsb 2 hours ago
      Here is my bold prediction: 2026 is the year where companies start the layoffs.

      2026 is the year where we all realise that we can be our own company and build the stuff in our dreams rather than the mundane crap we do at work.

      Honestly I am optimistic about computing in general. LLMs will open things up for novices and experts alike. We can move into the fields where we can use our brain power... But all we need is enough memory and compute to control our destiny....

      • IhateAI_2 an hour ago
        The one-shotted mind is truly hilarious.
        • jdjdndbdhsjsb an hour ago
          I'm human?
          • jdjdndbdhsjsb 34 minutes ago
            Oytis: I can't reply to you directly, but yes I am sure I am human.

            Not sure how to prove it to you.

          • oytis 43 minutes ago
            Are you sure?
      • Muromec an hour ago
        >Here is my bold prediction: 2026 is the year where companies start the layoffs.

        Start? Excuse moi

        • jdjdndbdhsjsb 43 minutes ago
          Yeah, fair... But now it is different, i.e. they won't regret it.
      • AIorNot an hour ago
        I don't know, it's a bit of a hellscape in tech right now, as thousands of people with deep domain knowledge, people knowledge, and business knowledge (i.e. experienced engineers, managers, and product owners) were laid off by C-suites desperate to keep the AI-funded mandates going.

        Do you know how hard it is to make a successful company, or even make money? It's like saying any actor can go to Hollywood and be a star.

        VCs won't fund everyone.

        Nobody is sure of anything

        • jdjdndbdhsjsb 44 minutes ago
          Yes it is. But I am an optimist for human nature. I personally believe smaller companies doing different things is the future... Scaling as they need. It is a hellscape but people can and will adapt.

          > Do you know how hard it is to make a successful company or even make money?

          Yes I have failed to do it before. I get this.

          > VCs won't fund everyone

          And? Do you need VCs? Economics means that scale matters, but what if we don't need it? What if we can make efficient startups with our own funding?

      • falloutx an hour ago
        Except it started in 2023, we are in the middle of layoff waves.
  • pvtmert 33 minutes ago
    > The cost of turning written business logic into code has dropped to zero

    It hasn't. Large enterprises are currently footing the bill, essentially subsidizing AI for now.

    I constantly see comparisons between the $200/month Claude Code Max subscription and the six-figure ($100k) salary of an engineer.

    The comparison here is, first of all, not apples-to-apples. Let's correct the CC subscription to a yearly amount first: 12 × $200 = $2,400. That's still a more than 40x difference compared to the human engineer's salary.
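
    The arithmetic above can be sketched out quickly (both figures are the comment's stated assumptions, not real pricing data):

```python
# Illustrative cost comparison, using the assumed figures from the
# comment above: a $200/month coding-agent subscription vs. a $100k salary.
monthly_subscription = 200     # USD per month (assumed)
annual_salary = 100_000        # USD per year (assumed)

annual_subscription = 12 * monthly_subscription
ratio = annual_salary / annual_subscription

print(f"Annual subscription: ${annual_subscription:,}")  # Annual subscription: $2,400
print(f"Salary is {ratio:.1f}x the subscription")        # Salary is 41.7x the subscription
```

    Even at roughly 40x, the raw ratio ignores experience, liability, and IP ownership, which is the point the rest of the comment makes.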

    With a human engineer, though, you also pay for experience and responsibility, and you transfer some liability (especially when regulations come into play).

    Moreover, what a human engineer creates, unless it was stolen IP or plagiarized, is owned by you as the employer/company. Meanwhile, whatever an AI generates is guaranteed to be somewhat plagiarized in the first place. The ownership of the IP is questionable.

    This is like a layman complaining when an electrician comes to their house, identifies the breaker problem, replaces the breaker (which costs $5), and charges $100 for a 10-minute job. That is a complete underestimation of skill, experience, and safety. The wrong breaker may cause constant circuit breaks, causing malfunctions in a multitude of electronic devices in the household. Or worse, it may cause a fire. When you think you paid $100 for 10 minutes, in fact you paid for years of education, exams, certification, and experience, and for your future safety.

    The same principle applies to AI. It seems like it has accumulated more and more experience, but it fails at the first prompt injection. It seems like it is getting better at benchmarks because they are now part of its dataset. All of these are hidden costs that 99% of people do not talk about. All these hidden costs are liabilities.

    You may save an engineer's yearly salary today, at the cost of losing ten times more to civil lawsuits tomorrow. (Of course, depending on the field/business.)

    If your business was not critical enough to attract a civil lawsuit in the first place, then you probably didn't need to hire an engineer yourself. You could hire an agency/contractor to do it in a much cheaper way, while still sharing liability...

  • ausbah 33 minutes ago
    Blogs like this seem like they're in the right direction about LLMs being "here to stay" and a near-indispensable part of people's daily toolkit, but the near certainty that programming as a job or skillset is dead in the water seems just wrong?

    Like, OK, the cost for anyone to generate almost-always-working code has dropped to zero, but how does a layperson verify the code satisfies the business logic? Asking the same set of tools to generate tests for it just seems to move the goalposts.

    Or what happens when the next few years of junior engineers (or whatever replaces programming as a field) who've been spoon-fed coding through LLMs need to actually decipher LLM output and pinpoint something the machine can't get right after hours of prompting? A whole generation blindly following a tool they can't reliably control?

    but maybe I am just coping because it feels like the ladder on the rest of my already short career , but some humility m

  • bopbopbop7 43 minutes ago
    Has there been any good and useful software created with LLMs or any increase in software quality that we can actually look at?

    So far it's just AI doom posting, hype bloggers that haven't shipped anything, anecdotes without evidence, increase in CVEs, increase in outages, and degraded software quality.

    • oytis 38 minutes ago
      Software quality has been degrading for decades without LLMs though.

      I only have anecdotal evidence from some engineers I know that they don't write software by hand any more. Provided the software they are working on was useful before, we can say that LLMs are writing useful software now.

  • lefrenchy 20 minutes ago
    Lost me at the first sentence. That is an insanely large claim to make with no evidence.
  • Xiol an hour ago
    > The cost of turning written business logic into code has dropped to zero

    Tokens are free now?

    • TheCoreh an hour ago
      > Or, at best, near-zero.
      • blamarvt an hour ago
        I mean, lots of numbers are near zero depending on your definition of near.
    • heliumtera an hour ago
      Yes. Every platform offers free tokens generously.

      That is a true statement. Might not be much, but it is enough for you to produce some code, shit out a readme, and then show on Hacker News that you're capable of pushing to git with the help of LLMs.

  • falloutx an hour ago
    > The cost of turning written business logic into code has dropped to zero. Or, at best, near-zero.

    Zero if you don't consider Anthropic's API pricing, the prompter's hourly rate, and the verification bottleneck.

    • heliumtera an hour ago
      Everybody offers a generous free tier.

      Verification? LoL, lmao even. Your vibes are low.

      If you're a professional code producer, you shit out code as fast as possible. Don't give anyone time to analyze the disgusting pile of shit you generated, just shit out code as fast as you can and call it a win! Who would prove you wrong?

      Would someone waste their precious biological resources reviewing machine-generated slop, when your cadence is superhuman? Would someone use the same machine you used to evaluate itself? Lol

  • rc-1140 25 minutes ago
    Saying it again, I think we're in need of a moratorium on "AI Has Changed The World Forever" posts. All of them read the same and offer nothing past "I asked an LLM to make a midsize feature, I haven't looked at the code but it compiles on my machine and that should terrify you" - buddy, we've long had people pushing code that compiles on their machine and gets a quick read, or none at all, in PRs; that terrifies me now.
  • aleda145 an hour ago
    I personally love this development. Sure, I find some pleasure in writing code. But what I love most is mapping out a gnarly problem on pen and paper. Then the code is "just" an implementation detail. Guess I'm an ideas guy as per the author?
  • PostOnce an hour ago
    I personally believe (and so far, my evidence suggests) that AI doesn't work anywhere near as well as claimed for almost any of its use cases.

    However, let's suppose the alternate case:

    If AI works as claimed, people in their tens of millions will be out of work.

    New jobs won't be created quickly enough to keep them occupied (and fed).

    Billionaires own the media and the social media and will use it to attempt to prevent change (i.e. apocalyptic taxation)

    What, then, will those people do? Well, they say "the devil makes work for idle hands", and I'm curious what that's going to look like.

  • zb3 18 minutes ago
    > The cost of turning written business logic into code has dropped to zero

    Then go and throw your $0 at fixing some real bugs on GitHub... Really, if AI works so well, why are all those issues still open?

    Look, almost 2K issues open here: https://github.com/emscripten-core/emscripten/issues

    If AI really works like non-technical people think it does, why doesn't Google just throw their AI tool to fix them all?

  • monero-xmr an hour ago
    I own (cofounded) a medium-sized SaaS business with hundreds of employees. I retain final say on everything technical and still code every day because it's important. All of the engineers use LLM tools; you'd be stupid not to. But I need good engineers, I replace good engineers when someone leaves, and the business itself is so much larger than just programming. The system is just so huge and complex that the LLM does not replace the need for engineers, nor my own expertise as the benevolent dictator who architected it and maintains the core design decisions.

    Furthermore, if we were truly in the utopia the author describes, why do all the LLM companies employ (and pay top dollar for) so many engineers? Why does OpenAI pay for Slack when they could just vibe code a chat app in an hour?

    The challenge of building a real, valuable software business (or any business) is so much harder than using LLM to prompt “build me a successful software business”

  • kgraves an hour ago
    When these vibe-coded projects realise that maintenance, security updates, and API changes are still needed, get ready for a massive swing back to senior software developers being in demand.

    Many vibe-coded web apps aren't built with proper software architecture or practices, so someone will have to play software maintainer for them, which only makes the swing back to senior engineers being in demand more likely.

    Good luck to those who are building 600K-LOC vibe coded web apps with 40+ APIs stitched together.

    • pvtmert 25 minutes ago
      Honestly, "vibe-code-cleaners" are already out there and in demand!

      I even expect "vibe-code-scalers" will come soon, to fix and scale up the spaghetti the AI agents plopped out in the first place.

      The author seems to be an Amazonian; it also seems that they are good at the "Invent" part, but not the "Simplify" bit.

      Big Tech invented LLMs; that is great. But Big Tech hasn't been great when it comes to "Simplify"-ing things. Actually, it's notoriously bad at it.

      That is the opportunity here: "Simplify"-ing these workflows, making AaaS (Agents as a Service, or AI as a Service).

  • jongjong an hour ago
    I predict it's going to be a bloodbath. People who worked for Big Tech have no idea what's coming. Some of us software engineers who have been outside have been experiencing issues for almost a decade. The industry is extremely anti-competitive.

    Whatever you produce, nobody is going to use unless you produce it under the banner of Big Tech. There are no real opportunities for real founders.

    The problem is spreading beyond software. The other day, I found out there is a billion dollar company whose main product is a sponge... Yes, a sponge, for cleaning. We're fast moving towards a communist-like centrally planned economy, but with a facade of meritocracy where there is only one product of each kind and no room for competition.

    This feeling of doom that software engineers started to feel after LLMs is how I was feeling 5 years earlier. People are confused because they think the problem is that AI is automating them but reality is that our problems arise from a deeper systemic issue at the core of our economic system. AI is just a convenient cover story, it's not the reason why we are doomed. Once people accept that we can start to work towards a political solution like UBI or better...

    We've reached the conclusion of Goodhart's Law "When a measure becomes a target, it ceases to be a good measure" - Our economic system has been so heavily monitored and controlled in every aspect that it has begun to fail in every aspect. Somebody has managed to profit from every blindspot, every exploit exposed by the measurement and control apparatus. Everything is getting qualitatively worse in every way that is not explicitly measured, and the measurement apparatus is becoming increasingly unreliable... Most problems we're experiencing are what people experienced during the fall of communism, except filter bubbles are making us completely blind to the experience of other people.

    I think if we don't address the root causes, things will get worse for everyone. People's problems will get bigger, become highly personalized, pervasive, inexplicable, unrelatable. Everyone will waste their energy trying to resolve their own experience of the symptoms but the root causes would remain.

  • IhateAI_2 an hour ago
    I feel like YC has a bunch of these optimistic blog posts ready to throw up on the front page anytime something goes viral about how LLMs are bad.

    Software isn't going to become more economically valuable; it's going to be used to replace economic inputs of labor with units of compute.

    It's entirely intended to take humans out of the equation, or to devalue human labor, and it always has been. Don't be a fool.

    • tptacek an hour ago
      That's literally what automation is. You could make the same argument against the power loom. People did!
      • IhateAI_3 an hour ago
        *THE word "Luddite" continues to be applied with contempt to anyone with doubts about technology, especially the nuclear kind. Luddites today are no longer faced with human factory owners and vulnerable machines. As well-known President and unintentional Luddite D. D. Eisenhower prophesied when he left office, there is now a permanent power establishment of admirals, generals and corporate CEO's, up against whom us average poor bastards are completely outclassed, although Ike didn't put it quite that way. We are all supposed to keep tranquil and allow it to go on, even though, because of the data revolution, it becomes every day less possible to fool any of the people any of the time. If our world survives, the next great challenge to watch out for will come - you heard it here first - when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy. It will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to if, God willing, we should live so long. Meantime, as Americans, we can take comfort, however minimal and cold, from Lord Byron's mischievously improvised song, in which he, like other observers of the time, saw clear identification between the first Luddites and our own revolutionary origins.

        It begins: As the Liberty lads o'er the sea / Bought their freedom, and cheaply, with blood, / So we, boys, we / Will die fighting, or live free, / And down with all kings but King Ludd!*

        Your homework:

        https://archive.nytimes.com/www.nytimes.com/books/97/05/18/r...

    • kykat an hour ago
      To me it seems like the big question for the future will be how to achieve political relevance as "the little guy". It seems like with LLMs the typical "get educated" pathway for the lower class is closing quick. I dread to think of a world where large portions of society are essentially "useless".
    • pousada an hour ago
      What went viral? To me it just seems like people are pretty divided on the topic, which makes sense as it's an emerging technology. I feel I see as many posts against AI as posts glazing it.
      • hunterpayne 12 minutes ago
        Which side of that argument has lots of marketing dollars behind it? I suspect what we are all seeing is marketing plus a few useful idiots against everyone else. I will change my mind when I start seeing actual apps created by LLMs which people actually use. What I do see is LLMs replacing search engines and lots of failed software projects. The basic problem here is that the powers that be think the economics of software are like manufacturing. They aren't; they are closer to music publishing, just a lot bigger. And AI isn't having any real impact there.
      • themacguffinman an hour ago
        The recent frontpage post I see is https://news.ycombinator.com/item?id=46926245 (not on frontpage anymore, probably downranked by flamewar detector since it has tons of comments)