What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production and LLMs generally don't help with that.
It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.
They're explicitly saying that most software will no longer be artisanal - a great literary novel - and instead become industrialized - mass-produced paperback garbage. But they're also saying that good software, like literature, will continue to exist.
As such, the article's argument fails right at the start, when it tries to claim that software production is not already industrial. It is. But if you look at actual industrial design processes, their equivalent of "writing the code" is relatively small. Quality assurance, compliance with various legal requirements, balancing different requirements for the product at hand, endless meetings with customer representatives to figure out the requirements in the first place: that is where most of the time goes, and those are exactly the places where LLMs are not very good. So the part that is already fast will get faster and the slow part will stay slow. That is not a recipe for revolutionary progress.
Your point that most software uses the same browsers, databases, tooling and internal libraries actually points to a weakness, a sameness that current AI can exploit to push that automation capability much further. Hell, why even bother with any of the generated code and infrastructure being "human readable" anymore? (Of course, there are all kinds of reasons that is bad, but just watch that "innovation" get a marketing push and take off. Which would only mean we'd need viewing software to make whatever was generated readable, as if anyone would read hundreds or millions of lines of generated complexity just to understand it.)
Perhaps a good analogy is the spreadsheet. It was a complete shift in the way that humans interacted with numbers. From accounting to engineering to home budgets - there are few people who haven't used a spreadsheet to "program" the computer at some point.
It's a fantastic tool, but it has limits. It's also fair to say people use (and abuse) spreadsheets far beyond those limits. They're great for accounting, but real accounting systems exist for a reason.
Similarly, AI will allow lots more people to "program" their computer. But making the programming task go away just exposes limitations in other parts of the "development" process.
To your analogy I don't think AI does mass-produced paperbacks. I think it is the equivalent of writing a novel for yourself. People don't sell spreadsheets, they use them. AI will allow people to write programs for themselves, just like digital cameras turned us all into photographers. But when we need it "done right" we'll still turn to people with honed skills.
It's the article's analogy, not mine.
And, are you really saying that people aren't regularly mass-vibing terrible software that others use...? That seems to be a primary use case...
Though, yes, I'm sure it'll become more common for many people to vibe their own software - even if just tiny, temporary, fit-for-purpose things.
Unlike clothing, software always scaled. So, it's a bit wrongheaded to assume that the new economics would be more like the economics of clothing after mass production. An "artisanal" dress still only fits one person. "Artisanal" software has always served anywhere between zero people and millions.
LLMs are not the spinning jenny. They are not an industrial revolution, even if the stock market valuations assume that they are.
The steam engine analogy is just garbage imho, because it's not like we weren't already automated up to our necks. And each new layer of automation required a new specialized skill set. We aren't horses being driven and about to be replaced, or miners who refuse to adopt jackhammers, or luddite weavers (most of the time).
I think most skilled professions that adopt automation just add new skill demand for each new multiplier of output.
The best analogy I can come up with is CNC machining. A machinist of old was very much a craftsman, making each mill pass according to a drawing and doing lots of measuring and such.
Now machining is front-loaded and automated. You spend all the effort on the model and design, then let one or more machines make parts, and you are there for intervention, smoothing, part repositioning, q/a, etc etc.
This is almost a perfect analogy for my day to day interaction with these agents. I let the process run and design in inspection points, test points etc to make sure it completes according to my spec. If not, it's just as often the right move to chuck the part in scrap and try again.
And what do you feel is the role of universities? Certainly not just to learn the language right? I'm going through a computer engineering degree and sometimes I feel completely lost with an urge to give up on everything, even though I am still interested in technology.
A lot of engineers and programmers did not go to school.
Maybe there'll be an enormous leap again, but I just don't quite see how this gets you to 'industrial' software. It made things a lot faster, don't get me wrong, but you still needed the captain driving the ship.
The question is more what becomes of all the rowers when you’re switching from captain + 100 rowers to captain + steam engine
They’re not all going to get their own boat and captain hat
A Knowledge Pool is the reservoir of shared knowledge that a group of people have about a particular subject, tool, method, etc. In product strategy, knowledge pools represent another kind of moat, and a form of leverage that can be used to grow or maintain market share.
Usage: Resources are better spent on other things besides draining the knowledge pool with yet another new interface to learn and spending time and money filling it up again with retraining.
Take this for example:
> Industrial systems reliably create economic pressure toward excess, low quality goods.
Industrial systems allow for low quality goods, but they also deliver quality way beyond what can be achieved in artisanal production. A mass-produced mid-tier car is going to be much better than your artisanal car.
Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.
Perhaps an industrial car is better than your or my artisanal car, but I'm sure there are people who build cars by hand of very high quality (over the course of years). Likewise fine carpentry vs mass-produced stuff vs IKEA.
Or I make sourdough bread and it would be very impractical/uncompetitive to start selling it unless I scaled up to make dozens, maybe hundreds, of loaves per day. But it's absolutely far better than any bread you can find on any supermarket shelf. It's also arguably better than most artisanal bakeries who have to follow a production process every day.
This has never been true for "artisanal" software. It could be used by nobody or by millions. This is why the economic model OP proposes falls apart.
I don't think this is true in general, although it may be in certain product categories. Hand-built supercars are still valued by the ultra-wealthy. Artisanal bakeries consistently make better pastries than anything mass produced... and so on
As a developer for almost 30 years now, if I think where most of my code went, I would say, quantitatively, to the bin.
I processed a lot of data, dumps and logs over the years. I collected statistical information, mapped flows, created models of the things I needed to understand. And this was long before any "big data" thing.
Nothing changed with AI. I keep doing the same things, but maybe the output has colours.
I think I've had just 2 or 3 projects overall where anyone has actually even tried the thing I've been working on.
Not that I disagree: I'm on record agreeing with the article months ago. Folks in labs have probably seen it coming for years.
Yes we’ve seen major improvements in software development velocity - libraries, OSes, containers, portable bytecodes - but I’m afraid we’ve seen nothing yet. Claude Code and Codex are just glimpses into the future.
If we use about 20 TW today, in a thousand years of 5% growth we’d be at about 3x10^34. I think the sun is around 3.8x10^26 watts? That gives us about 8x10^7 suns worth of energy consumption in 1000 years.
If we figure 0.004 stars per cubic light-year, we end up in that ballpark in a thousand years of uniform spherical expansion at c.
But that assumes millions ( billions?) of probes traveling outward starting soon, and no acceleration or deceleration or development time… so I think your claim is likely true, in any practical sense of the idea.
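For anyone who wants to check that back-of-envelope arithmetic, here it is written out, using only the figures already quoted (20 TW today, 5% annual growth, ~3.8x10^26 W per sun, 0.004 stars per cubic light-year):

```latex
% 1000 years of 5% growth on today's ~20 TW
P_{1000} \approx 2\times10^{13}\,\mathrm{W}\times 1.05^{1000}
         \approx 2\times10^{13}\times 1.5\times10^{21}
         \approx 3\times10^{34}\,\mathrm{W}

% expressed in suns (L_\odot \approx 3.8\times10^{26}\,\mathrm{W})
\frac{3\times10^{34}}{3.8\times10^{26}} \approx 8\times10^{7}\ \text{suns}

% stars inside a 1000-light-year sphere at 0.004 stars per cubic light-year
N \approx 0.004\times\tfrac{4}{3}\pi\,(1000)^{3} \approx 1.7\times10^{7}\ \text{stars}
```

The two estimates land within an order of magnitude of each other, which is all the claim needs.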
Time to short the market lol.
AI capabilities are growing exponentially thanks to exponential compute/energy consumption, but also thanks to algorithmic improvements. We've got an existence proof that human-level intelligence can run on 20 W of power, so we've got plenty of room to offset the currently-missing compute.
Correlation doesn't say anything about the sensitivity/scaling. (I recognize that my original comment didn't quite make this point, though the correlation is definitely not 100%, so that point does still stand.)
Can you tell the difference between the earth being lit by torches, candles, kerosene lamps and incandescent bulbs, versus LED lights? LED isn't glowing harder, it just wastes less energy.
A rocket stove, or any efficient furnace, can extract vastly more usable energy from the same fuel than an open fire can. I assume combustion engines have seen significant efficiency improvements since they were first introduced. And electric motors are almost completely efficient, especially when fed by an efficient, clean/renewable source.
How about the computing power of a smartphone versus a supercomputer from 1980?
What is more energy efficient, a carpenter working with crude stones or with sharp chisels?
And we can, of course, put aside whether any measurement of economic value is actually accurate/useful... A natural disaster is technically good for many economic measures, since the destruction doesn't get measured and the wealth invested in rebuilding just counts as economic activity.
And, of course, then there are creeptocurrencies, which use an immense amount of energy to do something that was previously trivial. It's even worse when they're used in place of cash. But even there, some are more efficient than others - not that anyone who uses them actually cares.
Another common misconception is that it is now easier to compete with big products, because the cost of building those products will go down. Maybe you think you can build your own office suite and compete with MS Office, or build an SAP with better features and quality. But what went into this software is not just code, but decades of feedback, tuning and fixing. The industrialization of software cannot provide that.
Basically every company that does anything non-trivial could benefit from tailor-made software that supports their specific workflow. Many small companies don't have that, either they cannot afford their own development team, or they don't know that/how software could improve their workflow, or they are too risk-averse.
Heck, even my small family of 4 persons could benefit from some custom software, but only in small ways, so it's not worth it for me to pursue it.
Once we're at the point where a (potentially specialized) LLM can generate, securely operate and maintain software to run a small to medium-sized business, we'll probably find that there are far more places that could benefit from custom software.
Usually if you introduce, say, an ERP system into a company that doesn't use one yet, you need to customize it and change workflows in the company, and maybe even restructure it. If it were cheap enough to build a custom ERP system that caters to the existing workflows, that would be less disruptive and thus less risky.
On the contrary, this is likely the reason why we can disrupt these large players.
Experience from 2005 just doesn't hold that much value in 2025 in tech.
But taking out features is difficult, even when they have near-zero value.
That's why it sometimes makes sense for new players to enter the market and start over, without the legacy.
This is indeed one of the value propositions in the startup I work in.
That would be why a significant portion of the world's critical systems still run on Windows XP, eh?
The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? They can only be consumed at the pace of the player. Agents? They can be reused once they fulfill a task sufficiently.
We're probably going to see a huge amount of customization where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy to re-create the same algorithms over and over again.
Are they, though? I am not aware of any indicators that software costs are precipitously declining. At least as far as I know, we aren't seeing the complements of software developers (PMs, sales, other adjacent roles) growing rapidly, which would indicate a corresponding supply increase. We aren't seeing companies like Microsoft or Salesforce or Atlassian or any major software company reduce prices due to a supply glut.
So what are the indicators (beyond blog posts) this is having a macro effect?
I'm personally doing just that, because I want an algorithm written in C++ in an LGPL library working in another language.
I like the article except the premise is wrong - industrial software will be high value and low cost as it will outlive the slop.
...Or so think devs.
People responsible for operating software, as well as people responsible for maintaining it, may have different opinions.
Bugs must be fixed, underlying software/hardware changes and vulnerabilities get discovered, and so versions must be bumped. The surrounding ecosystem changes, and so, even if your particular stack doesn't require new features, it must be adapted (a simple example: your React frontend breaks because the nginx proxy changed its subdirectory).
The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.
What I want is software that can glue these things together. Each week, announce the fixture and poll the team to see who will play.
So far, the complete fragmentation of all these markets (fixtures, chat) has made software solutions uneconomic. Any solution's sales market is necessarily limited to a small handful of teams, and will quickly become outdated as fixtures move and teams evolve.
I'm hopeful AI will let software solve problems like this, where disposable code is exactly what's needed.
With my last side project, I became frustrated with my non-technical founder because he had a lot of vague ideas and, in his mind, was sure he had a crystal clear vision of what he wanted... But with every idea he had, I kept finding massive logical holes and contradictions... Like, he wanted one feature and some other feature, but it was physically impossible to have both without making the UX terrible.
And it wasn't just one time, it was constantly.
He would get upset at me for pointing out the many hurdles ahead of time... When in fact he should have been thanking me for saving us from ramming our heads into one wall after another.
This sounds weird, or wrong. Do anonymous stats need cookies at all?
High-level languages are about higher abstractions for deterministic processes. LLMs are not necessarily higher abstractions but instead about non-deterministic processes, a fundamentally different thing altogether.
There is a difference between writing for mainstream software and someone's idea/hope for the future.
Software that is valued high enough will be owned and maintained.
Like most things in our world, I think ownership/stewardship is like money and world hunger, a social issue/question.
This whole article was interesting, but I really like the conclusion. I think the comparison to the externalized costs of industrialization, which we are finally facing without any easy out, is a good one to make. We've been on the same path for a long time in the software world, as evidenced by the persistent relevance of that one XKCD comic.
There's always going to be work to do in our field. How appealing that work is, and how we're treated as we do that work, is a wide open question.
The difference I return to again and again isn't tech depth. It's constraints.

Rough framework I'm using lately:

Consumer software aims at maximizing joy.

Enterprise software is all about coordination.

Industrial software operates in the real-world "mess", and seems to be more concerned with:

- failure modes
- long-term maintenance
- predictable behavior vs cleverness
But as soon as software is involved with physical processes, the tolerance for ambiguity narrows quickly.
Curious how others see it:
What's your mental line between enterprise and industrial? What constraints have affected your designs? "Nice abstractions": any instances where they failed the test of reality?
Your consumer/enterprise/industrial framework is orthogonal to the article's focus: how AI is massively reducing the cost of software.
The important thing is that goods =/= software. I, as an end user of software, rarely need specialized software. I don't need an entire app generated on the spot to split the bill and remember the difference if I have a calculator.
So, yes, we are industrializing software, but this reach that people talk about (I believe) will be severely limited.
Low cost/low value software tagged as disposable usually means the development cost was low but the maintenance cost is high; that's why you get rid of it.
On the other hand, the difference between good and bad traditional software is that, while the upfront cost is always going to be high, you want the maintenance cost to be low. This is what industrialization is about.
What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc, constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art, if it requires zero skill to generate it?
Quite the opposite is true. A large proportion of people would increase both the number of years they live and their quality of life by eating less.
I think the days when more product is always better are coming to an end - we just need to figure out how the economy should work.
It's always a choice between taking more time today to reduce the cost of changes in the future, or getting results fast and being less flexible later. Experience is all about keeping the cost of changes constant over time.
First, the core of the argument, that 'industrialization' produces low quality slop, is not true: industrialization is about precisely controlled and repeatable processes. A table cut by a CNC router is likely dimensionally more accurate than one cut by hand; in fact, many industrial processes and machines have trickled back into the toolboxes of master craftsmen, where they increased productivity and quality.
Second, from my experience of working at large enterprises and in smaller teams, the 80-20 rule definitely holds: there's always a core team of a handful of people who lay down the foundations and design and architect most of the code, with the rest usually fixing bugs or making bullet-point features.
I'm not saying the people who fall into the 80% don't contribute, or are somehow lesser devs, but they're mostly not well-positioned in the org to make major contributions. Another invariable aspect is that as features are added and complexity grows, along with legacy code, the effort needed to make a change or to understand and fix a bug grows superlinearly, meaning the 'last 10%' often takes as much or more effort than everything that came before.
This is hardly an original observation, and in today's environment of never-ending iteration, what counts as the last 10% is hard to define, but most modern software development is highly incremental and often focused on building unneeded features or sidegrade redesigns.
Oh wait. It is already a thing.
The mass production of unprocessed food is not what led to the production of hyper processed food. That would be a strange market dynamic.
Shareholder pressure, aggressive marketing and engineering for super-palatable foods are what led to hyper processed foods.
I think some people do instinctively feel like all different kinds of software have different shelf lives or useful lifetimes for different reasons.
But there's always so much noise it's not very easy to get the expiration date correct.
Mass production is pretty much a given when it comes to commodities, and things like long shelf life are icing on the cake.
The inversion comes when mass production makes the highly processed food more affordable than the unprocessed. After both have scaled maximally, market forces matter more than the amount of labor that was put in.
Strange indeed.
I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out, even with the help of AI; most programmers could probably have figured it out in a night, and with AI could write the software in a few nights. I would have my software for an affordable price, and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies of it, so they have an incentive to work with people like me and help me realize my ideas.
Programming is going to become a service instead of an industry, the craft of programming will be for sale instead of software.
As someone who has worked in two companies that raised millions of dollars and had hundred people tackling just half of this, tax software, you are in for a treat.
Edit: Just noticed I said "any business"; that was supposed to be "any small business." Edited the original post as well.
Edit: And if I was using C or C++ above, my lack of capitalization would either provoke an error too OR passably continue forward, referencing the wrong variable and resulting in a similar error to your transposition.
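A minimal sketch of those two failure modes in C (hypothetical variable names, purely for illustration):

```c
#include <stdio.h>

int main(void) {
    int total = 100;   /* the variable we meant to use */
    int Total = 0;     /* a near-duplicate differing only in capitalization */

    /* Failure mode 1: if `Total` had never been declared, using it below
       would not compile ("undeclared identifier"), because C identifiers
       are case-sensitive. */

    /* Failure mode 2: since both names exist here, this compiles cleanly
       and quietly reads the wrong variable, printing 0 instead of 100
       (at best the compiler warns that `total` is unused). */
    printf("%d\n", Total);

    return 0;
}
```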
One of the things that happened around 2010, when we decided to effect a massive corporate change away from both legacy and proprietary platforms (on the one hand, away from AIX & Progress, and on the other hand, away from .Net/SQL Server), was a set of necessary decisions about the fundamental architecture of systems, and which -- if any -- third party libraries we would use to accelerate software development going forward.
On the back end side (mission critical OLTP & data input screens moving from Progress 4GL to Java+PostgreSQL) it was fairly straightforward: pick lean options and as few external tools as possible in order to ensure the dev team all completely understand the codebase, even if it made developing new features more time consuming sometimes.
On the front end, though, where the system config was done, as well as all the reporting and business analytics, it was less straightforward. There were multiple camps in the team, with some devs wanting to lean on 3rd party stuff as much as possible, others wanting to go all-in on TDD and using 3rd party frameworks and libraries only for UI items (stuff like Telerik, jQuery, etc), and a few having strong opinions about one thing but not others.
What I found was that in an organization with primarily junior engineers, many of whom were offshore, the best approach was not to focus on ideally "crafted" code. I literally ran a test with a senior architect once where he & I documented the business requirements completely and he translated the reqs into functional tests, then handed the tests over to the offshore team to write code to pass them. They mostly didn't even know what the code was for or what the overall system did, but they were competent enough to write code to pass tests. This ensured the senior architect received something that helped him string everything together, but it also meant we ended up with a really convoluted codebase that was challenging to interpret holistically if you hadn't been on the team from the beginning.

I had another architect, a lead on one of the offshore teams, who felt very strongly that code should be as simple as possible: descriptive naming, single-function classes, etc. I let him run with his paradigm on a different project, to see what would happen. In his case, he didn't focus on TDD and instead just on clearly written requirements docs. But his developers had a mix of talents & experience and the checked-in code was all over the place. Because of how atomically abstract everything was, almost nobody understood how pieces of the system interrelated.
Both of these experiments led to a set of conclusions and an approach as we moved forward: clearly written business requirements, followed by technical specifications, are critical, and so is a set of coding standards the whole group understands and has the confidence to follow. We set up an XP system to coach the less experienced junior devs, ran regular show & tell sessions where individuals could talk about their work, and moved from a waterfall planning process to an iterative model. All of this sounds like common sense now that it's been standard in the tech industry for an entire generation, but it was not obvious or accepted in IT "Enterprise Apps" departments in low-margin industries until far more recently.
I left that role in 2015 to join a hyperscaler, and only recently (this year) have moved back to a product company, but what I've noticed now is that the collaborative nature of software engineering has never been better ... but we're back to a point where many engineers don't fully understand what they're doing, either because there's a heavy reliance on code they didn't write (common 3P libraries) or because of the compartmentalization of product orgs where small teams don't always know what other teams are doing, or why. The more recent adoption of LLM-accelerated development means even fewer individuals can explain resultant codebases. While software development may be faster than ever, I fear as an industry we're moving back toward the era of the early naughts when the graybeard artisans had mostly retired and their replacements were fumbling around trying to figure out how to do things faster & cheaper and decidedly un-artisanally.
I won't read intention into the text, because I did not check any other posts from the same guy.
That said, I think this revolution is not revolutionary yet. Not sure if it will be, but maybe?
What is happening is that companies are going back to a "normal" number of people in software development. Before, the growth was driven by the adoption of custom software, later by labour shortages; then we had a boom because people caught on to it as a viable career, but now it's scaling down again because one developer can (technically) do more with AI.
There are huge red flags with "fully automated" software development that are not being fixed, but to those outside the area of expertise they don't seem relevant. With newer restrictions related to cost and hardware, AI will be an even worse option unless there is some sort of magic that fixes everything about how it writes code.
The economy (all around the world) is bonkers right now. Honestly, I saw some junior devs earning six-figure salaries (in USD) and doing less than what my friends and I did when we were juniors. There is inflation and all, but the numbers don't seem to add up.
Part of it all is a re-normalisation, but part of it is certainly a lack of understanding of software and/or engineering.
Current tools, and I include even Kiro, Antigravity and whatever, do not solve my problems; they just make my work faster. It's easier to look for code, find data and read through blocks of code I haven't seen in a while. Writing code, not so much. If it is simple and easy they can certainly do it, but for anything more complex it seems faster and more reliable to do it myself (and probably cheaper).
The following is just disingenuous:
>industrialisation of printing processes led to paperback genre fiction
>industrialisation of agriculture led to ultraprocessed junk food
>industrialisation of digital image sensors led to user-generated video
Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture also ended hunger in all parts of the world which are able to practice it and even allows for export of food into countries which aren't (Without it most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.
The framing here is that industrialization degrades quality and makes products into disposable waste. While there is some truth to that, I think it is pretty undeniable that there are massive benefits which came with it. Mass produced products often are of superior quality and superior longevity and often are the only way in which certain products can be made available to large parts of the population.
>This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.
This just is not true and goes against all available evidence, as well as basic economics.
>For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.
This is just pure fiction. The author is comparing the highest quality goods of one point in time, which people took immense care of, with the lowest quality stuff people buy today, which is not even close to the average of the clothing people buy. The truth is that fabrics have become far better, far more durable and more versatile. The products have become better; what has changed is people's attitude towards their clothing.
Lastly, the author is ignoring the basic economics which separate software from physical goods. Physical goods need to be produced, which is almost always the most expensive part. This is not the case for software, distributing software millions of times is not expensive and only a minuscule part of the total costs. For fabrics industrialization has meant that development costs increased immensely, but per unit production costs fell sharply. What we are seeing with software is a slashing of development costs.
The Industrial Revolution created a flywheel: you built machines that could build lots of things better and for less cost than before, including the parts to make better machines that could build things even better and for less cost than before, including the parts to make better machines... and on and on.
The key part of industrialisation, in the 19th-century framing, is that you have built-in iterative improvement: by driving down cost, you increase demand (the author covers this), which increases investment in driving down costs, which increases demand, and so on.
Critically, this flywheel has exponential outputs, not linear. The author shows the Jevons paradox, and the curve is right there - note the lack of a straight line.
I'm not sure we're seeing this in AI software generation yet.
Costs are shifting in people's minds, from developer salaries to spending on tokens, so there's a feeling of cost reduction, but that's because a great deal of that seems to be heavily subsidised today.
It's also not clear that these AI tools are being used to produce exponentially better AI tools - despite the jump we saw around GPT-3.5, the quantitative improvement in output seems to remain linear as a function of cost, not exponential. Yet investment input seems to be exponential (which makes it feel more like a bubble).
I'm not saying that industrialisation of the type the author refers to isn't possible (and I'd even say most industrialisation of software happened back in the 1960s/70s), or that the flywheel can't pick up with AI, just that we're not quite where they think it is.
I'd also argue it's not a given that we're going to see the output of "industrialisation" drive us towards "junk" as a natural order of things - if anything we'll know it's not a junk bubble when we do in fact see the opposite, which is what optimists are betting on being just around the corner.