Tractors didn't just change farming. They emptied entire regions.
What saved the people (not the communities) was that other industries absorbed them. Factory work, services, construction. The question for software isn't whether AI creates efficiency. It's whether there's somewhere else for displaced engineers to go.
I've been writing code professionally for 16 years. The honest answer is I don't know. The optimistic scenario is that AI makes software so cheap that we build things we never would have attempted. The pessimistic one is that most of what needed building gets built, and the remaining work fits in fewer hands.
Both seem plausible. I'd bet on somewhere in between, but I'm not confident enough to tell anyone starting out that they should ignore the risk entirely.
Software engineering is one of the most intellectually demanding categories of white-collar work. I'm not saying it is invincible, but I don't see why SWEs should worry more than other white-collar workers.
You're on a site dominated by software engineers, in the field of software engineering, and likely have a lot of software engineer friends.
Translators got fucked; there's very little market for them now compared to decades past. Find their forums and I bet you'd see similar worry.
(or something like that. Obviously I'm too well adjusted to have these existential worries)
It's bad. The most depressing part is that it's because of defunding, not AI. At the same time, this field is probably one of the only venues for escaping the AI sinkhole, yet it's being dismantled rather than built up. Source: my partner, who works in research.
An absolutely massive number of farmers were replaced too. Farm management was grueling in the early 1900s. Farmers who embraced mechanization were able to buy up the surrounding farms that didn't, and grew in size. As the equipment got better, the amount of work per acre dropped, so farms expanded with more acreage, and the number of farmers and hands fell.
I've come to the conclusion that we won't be replaced, because the majority of our work is to split up business questions, probe, ask around for other people's knowledge, assemble it, build a plan, etc.
AI only knows what it knows; it doesn't go after the unknowns. It is reactive by nature.
Now, let's say something happens and I'm wrong, and AI can do all of that. I think when that happens, the economy as we know it collapses and we've got bigger fish to fry. If that happens, nearly all white-collar jobs disappear.
By the time software engineers are fully automated, executive assistants, accountants, business development people, marketers, administrative staff, researchers, and HR will also have been fully automated. At that point we have a revolution on our hands, and not having a coding job will be the least of your worries.
(or it happens slowly enough that we have time to adjust)
I don't think that's true, mainly because if it were true it would have happened a long time ago. We will never settle on one version of a thing (be it messaging, recipes, notes, image galleries, etc.). New variants emerge over time; the only thing AI does is accelerate this.
>We will never settle on one version of a thing
This depends on how well a monopoly can fit into the equation.
>We will never settle on one version of a thing (recipes)
Here is an example of missing the whole elephant because you're looking too closely. While the number of recipes is booming, the number of food distribution companies has collapsed into just a few mega-corporations. And those corporations now largely control the prices all of us must pay.
I think the main "concern" is that senior devs, code, essentially the entire current working body of programmers, is going to bootstrap AI, and once they're gone, there'll be no one to replace them. And at that point, there's no fallback system.
I think there is something here, but not much. The majority of businesses are carrying SaaS products that are an entire marching band when all they want is a drummer and a guitar player. Making bespoke, efficient tools will surge for sure.
The problem is that building these tools all serves the same end: concentrating industry control in a few players and further widening wealth inequality. Which leads us back to the question of where everyone goes to work at that point.
We are at some sort of societal inflection point where we need new industries, but only 20% of degrees are in some sort of science. 80% of degrees are in what is becoming nothing more than resume checkboxing for jobs that will no longer exist. Who is going to make the next big industry breakthrough with 20% of degrees in business management? I don't see any push to get people into college for actual meaningful progress.
It seems it did happen: humans have hit post-scarcity in survival terms (unevenly distributed). However, we have in no way planned for what happens here; fairness has never been a priority. The cutthroat capitalism that made this possible is now eating itself, with no plans to change.
LLMs, and specifically auto-regressive chatbots using transformers for prediction, will probably not replace engineers any time soon. They probably won't ever replace humans for the most cognitively demanding engineering tasks like design, planning, or creative problem solving. We will need a different architecture for that; transformers don't look like they get smarter in that way, even with scale.
>AI will replace humans in performing every cognitive task
This is probably true, but on a time horizon that is almost certainly much, much longer than we think. Centuries. Perhaps millennia, even. It's fun to go back to the newspapers of the 1920s, 30s, and 40s and see how absolutely CERTAIN they were this was going to happen to them. I'm sure there are examples from the 19th and 18th centuries as well.
Advancement happens in fits, and then tends to hibernate until another big breakthrough.
And even when it does happen, humans love to do things, just for the sake of them. So I highly doubt art, music, literature, or any other thing humans love to intrinsically do are going away, even if they can be done by AI. If anything, they'll be done MORE as AI enables wider participation by lowering the cost and skill barriers.
But yeah: self-driving cars are still not here; see, e.g., all the other AI booms.
The difference here is that we're seeing it with our own eyes and using it right now. There's so much absolutely existential competition between companies (even within them!) and geopolitically.
That's one of my triggers for when we've reached AGI. In many senses, self-driving cars are here: for the vast majority of tasks, self-driving likely works fine. It's the parts that need predictive capability, like figuring out what other idiots are about to do, or what some random object in the road is going to do, that our AI still can't handle.
Yeah, they are, even if you don't have one yet. We can rathole into whether they need to hit Level 5 before it "counts", but Waymos drive around multiple cities today, and Tesla FSD works well enough that I'd rather drive next to a Tesla with FSD than a drunk driver.
If your evidence that AI isn't something to be worried about is saying self-driving cars aren't here, when they are, well then, we're fucked.
The future is here, it's just unevenly distributed. For cars, this manifests as them physically not being available everywhere yet. For programming, it's unevenly distributed according to how much training data there was to scrape across the whole internet for that language and that domain.
Also, I think your argument and another comment's are great analogies to the AI situation: we can haggle over "ok, but what is {FSD, AGI} really?", and in many ways it's already here!
I agree totally, and I would just point out that we're at an even more intense moment in the AI space.
There were plenty of people in 1890 saying heavier than air powered flight was never going to happen.
>humans love to do things, just for the sake of them.
This said, it doesn't prove a negative. How many of those things would people keep doing if they couldn't get paid for them? It's easy to say these things in generalities, but if you do any specific thing, especially for a living, it could dry up and disappear.
Maybe? I guess the better question is "when?"
>unless you believe that there is something about biology that makes it categorically better for certain kinds of computation.
There's no reason to believe that's the case.
How about the fact that we don't actually know enough about the human mind to arrive at this conclusion? (yet)
And also at what cost and at what scale?
Will we be able to construct a supercomputer/datacenter that can match or exceed human intelligence? Possibly, even probably.
But that would be only one instance of such an AGI, and it would be very expensive. IMHO it will take a long time to produce something like that as a commodity.
A tractor can't reproduce or repair itself, but it is better than a horse for farming. A self-driving car can't learn by itself, but a datacenter can use its data to train a new version of the car software. A humanoid robot by itself might not be flexible enough to count as AGI, but it can defer some problems to an exascale datacenter.
We will be able to construct a datacenter that exceeds human intelligence. And every year after that the size of the datacenter will get smaller for the same intelligence output. Eventually it will be a rack. Then a single server. Then something that is portable.
Well, I don't actually remember, because, depending on your definition of digital computer, it was around 80 years ago and I wasn't born yet. Which is kind of my point. Eventually, we might get there. And I can imagine that simpler AI systems will help to bootstrap more AI systems. But there is still a lot of work to be done.
Why will they want to?
We might end up answering the Fermi paradox within our lifetimes.
I was just laid off from my job of 8 years, in which I was the UX researcher, designer, front-end dev, and customer UX support. In a week I have sold my house and am downsizing significantly, and in two years or less I will be working as an RN (nurse). I will try to get back into my field, but the current administration and the many tech layoffs have flooded the market with people like me looking for a job, all the while AI is eating my career and field. It just doesn't seem wise to bet that my career of 20 years is going to be around in the next ten years.
Also, will the interfaces we have today still exist in five to ten years? My guess is that AI becomes the interface that does everything for us through voice (OpenAI's upcoming device) or text. We could still have handheld AI phones or devices, but AI would do everything, including presenting the articles we read and the games we play, all from the device's lock screen (with websites rarely visited).
That said, I don't have much faith in the future of my programming career either. Unless robotics gets exponentially better, registered nurses are going to be way safer from automation (at least the ones doing physical treatment).
What's the expected compensation difference between your previous career and RN?
The dream of a Jira integration wired directly to an autonomous system that quickly closes stories with no human intervention will remain a dream for a long time for anything except the lowest-level 10% of stories. It's not interactive enough; the feedback loop needs to be tighter, the vibes need to be conversational, and businesses will get the most value when the pilot in the chair is someone who in years past called themselves a software engineer. I think we still will; the tools just change.
Tech is a tool. It will take away some jobs, and then create new ones. Think of a combine tractor -- it took away crop picking jobs, but created a new job of combine tractor driver. It bumps productivity.
The correct frame is "how can software engineers (or anyone, for that matter) use AI to increase their productivity?" With that frame, AI does not replace engineers; rather, engineers are in the best position to understand how it can deliver products faster and to implement that understanding.
The only reason society didn't collapse: there were enough other jobs to absorb those displaced workers. Will there always be?
> Combine tractors deleted jobs.
Number of jobs is not the metric to key off of. If it were, we should get rid of combine tractors and pay people to farm by hand, because it would increase the number of jobs.
1. The common (and correct) claim that software engineering is not just about writing code. (Counter-argument: with time, AI will be able to take on planning and debugging. Counter-counter-argument: if you have ever tried to just do what customers ask, you know you will get conflicting requirements; humans will need to help AI make decisions, not implement them.)
2. Related to the above: as long as a good software engineer + AI brings more ROI than a mediocre engineer + AI, which brings more ROI than a random person + AI, which brings more ROI than just AI, it will be economically wise to hire more good engineers to beat competitors who opted to save money and fired their engineering team. Salaries might go down, but top talent, e.g. an "AI whisperer" who can be not a 10x engineer but a 1000x one because they know how to get the most out of Claude Code / Cursor, will be paid accordingly.
3. Jevons paradox: perhaps making software ubiquitous and cheaper to produce will actually put software engineers in greater demand.
Clothing demand has increased greatly in the past decade due to fast fashion. Much of this clothing is designed to cost a few bucks, last a few wears, then get thrown out. It's an ecological disaster.
Maybe we'll see something similar happen with software — as production costs fall, trends will shift toward few-use throwaway software. I highly suspect this is already happening.
Software has worked this way since the rise of the internet and SaaS. Consumers rarely need to install anything locally other than a browser.
tl;dr: it argues that when there's a dramatic improvement in the efficiency of producing a good or service, its per-unit cost goes down so much that demand skyrockets, leading to greater demand for employees in that sector. The examples it gives are radiologists (after neural nets were predicted to be able to perform their jobs essentially for free) and dock workers.
If this happens in the case of SWEs, it would mean a 'unit' of software will be able to be produced much more cheaply, but the demand for and price (i.e. salaries) of SWEs might stay the same or increase.
The problem with this argument is that AI, or at least the vision of AI that companies and governments are spending trillions of dollars on, purports to replace the human itself. Put another way, it intends to automate all three steps (as well as any ancillary services: marketing the widget, legal work protecting the company, etc.). So any increase in demand does not lead to any additional labor, since the labor per unit is 0.
This video’s argument simply collapses the debate back to whether AI can largely replace human intelligence or not.
We still need / want to study physics to understand the universe despite the fact that it is a process that we have no control over.
AI is the same. Even when code is written by AGI you still need to be able to understand what it is and how it works.
The alternative is complete abdication, and that looks a lot more like religion, or even a cult.
Trust but verify.
Now, to be fair, this is my first attempt at vibe coding, so I might not know how to prompt the AI.
That basically turns your bad prompt into a good prompt, then executes on it.
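If it helps to picture it, here's a minimal sketch of that rewrite-then-execute idea in Python. The `complete()` helper, the `vibe_code()` name, and the prompt wording are my own placeholder assumptions, not any particular tool's API or behavior:

```python
# Minimal sketch of "turn a rough prompt into a good prompt, then execute on it".
# `complete()` is a placeholder, not a real library call; wire it to your own LLM/API.

def complete(prompt: str) -> str:
    """Placeholder: send `prompt` to your model of choice and return its reply."""
    raise NotImplementedError("hook this up to whatever LLM you use")

def vibe_code(rough_prompt: str) -> str:
    # Step 1: ask the model to rewrite the rough request as a precise spec,
    # with requirements, constraints, and acceptance criteria spelled out.
    spec = complete(
        "Rewrite the following request as a precise coding task. "
        "List requirements, constraints, and acceptance criteria:\n\n" + rough_prompt
    )
    # Step 2: execute on the improved prompt rather than the original one.
    return complete("Implement the following task. Return only the code:\n\n" + spec)
```

Whether the second step actually produces working code is, of course, still down to the model; the point is only that the rewriting pass happens before execution.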