Basically we do not rationally analyze what work can be automated and what work is forever safe. We just assume that "sexy work" is safe, and work backwards to figure out how to explain this belief to ourselves.
The other side of that irrationality coin is 2D extrapolation: a thing happened (or a context is a certain way now), so therefore I shall extrapolate it happening again (once or many times) into the future on a smooth line, so as to fit my bias.
Software engineering is falling prey to this trend too (somewhat).
The solution is to stop thinking of yourself as merely a software engineer and move up to the level of “manager of agents”. But actually, managers deal with human stuff, and this is fascinatingly mechanical; in fact, even the unpredictability of these new tools is quite predictable. And so, a more useful framing is “software development process engineer”.
You can look at all the literature on building factories and production lines for ideas on what you’ll be doing.
You shouldn’t ever just have your agent write the software then review and ship it. You are missing massive opportunities to take yourself out of more loops over time. What self-reflection are you and the model doing to catch opportunities to improve? What is your method for codifying your acceptance criteria, so your agents can do the work to higher quality over time without you in the loop to get it there? What’s your process for continuous improvement? How do your models know what work other team members’ models are doing simultaneously so there’s less stepping on toes? Can THAT be automated so you don’t need to sit in Slack and trade “human-verbal locks” on areas of the architecture?
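A minimal sketch of what a codified acceptance gate might look like, assuming a Python repo; the file name and the specific checks here are just illustrative, not a prescription:

```python
# acceptance_gate.py -- hypothetical sketch of codified acceptance criteria.
# Every criterion you would normally check by hand in review becomes an
# automated check, so an agent can iterate toward "done" without you in the loop.
import subprocess

ACCEPTANCE_CHECKS = [
    ["pytest", "tests/", "-q"],  # behavior matches the written spec
    ["ruff", "check", "."],      # the style nits you would flag in review
    ["mypy", "src/"],            # the type errors you would catch by eye
]

def passes_acceptance(workdir: str) -> bool:
    """Run every codified criterion; the agent only ships when all pass."""
    return all(
        subprocess.run(cmd, cwd=workdir, capture_output=True).returncode == 0
        for cmd in ACCEPTANCE_CHECKS
    )
```

Each time you reject an agent’s output for a reason this gate didn’t catch, add a check; that whole class of review leaves your plate for good.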
There’s immense room for creativity in the role of a software development process engineer.
People could learn things and join the workforce!
/s
New skills mean shit when there is no job market that can take everyone.
Usually, people with takes like yours never had to actually fight for months, even years, to finally get back on track.
Naturally, when selling AI, the take is to downplay its impact on people's lives.
We signed up for this. YOU signed up for this. No one owes anyone a job. When the activities that create value change, move with it or get left behind.
If you prefer a vocation which has been the same for centuries, that option is open to you. But to get into the software job market, you’d best ask whether the job you are trying to get is obsolete, and focus on fixing your skills and your job search process/methodology.
The biggest question is “where is the net-new hiring?” (as opposed to backfill hiring). And then, if you are out of the market, you have time on your hands to match your skills to your answer.
> People could learn things and join the workforce!
> /s
The point is to always, always blame the individuals being harmed for the structural problems they face.
Lost your job? Well fuck you if you can't afford to pay a lot of money to go back to school for years and support your family out of savings in the meantime. It's your own damn fault for not being rich enough.
Obviously in the long run this is good; more productivity per employee is always better. But in the short term, jobs are changing and people are likely getting laid off (or will at least have more free time).
I already do less coding than I used to, because agency work has slowly shifted to a mix of SaaS products, integrations via iPaaS, and serverless or managed containers.
The whole MACH development approach mantra.
Meaning that even in development, at least in consulting, teams have become a fraction of what they used to be.
AI is the next step in reducing team sizes.
I would have been interested in the experience and thoughts of someone whose opinions I respected, both as a social thing and to learn something.
In other words, some types of questions are asked 1) to build a social connection with the person you’re asking and 2) because you want to know what they, specifically, think about the topic.
AI can’t really replace either of these. AIs might function as a weak social replacement for some people, but you aren’t really going to advance in your personal or professional life by making friends with Claude.
A good example of the second one is AskMeAnything-type forum posts: I don’t care what some generic celebrity/famous figure thinks about something, I care specifically about what George Clooney thinks about it. The AI will always be guessing, building a model from what George has said in the past, but it will never actually say what he thinks right now.
For a more serious and contemporary example: there are dozens of videos on YouTube right now, interviews with various experts and pundits on the situation in Iran. Many of them have hundreds of thousands of views. But why would someone watch this instead of just asking ChatGPT what’s going on in Iran? Because we want to know what this particular person thinks.
Does the accounts payable team keep their jobs because their manager enjoys chatting with them? Does the junior analyst stay employed because the VP values their specific personal opinion on the Q3 revenue forecast? Note that the article is about work.
When you have employee X in a certain role, you know someone is “handling” a particular thing. With AI, that isn’t really clear. Maybe you just get one person owning the responsibilities that previously were split across 3 people.
So yes, white collar jobs will be replaced, but they won't be replaced entirely.
The way AI replaces work is that there is an enormous ROI to working with fewer (and smarter) people. Those social interactions are a big part of work, but they are only very rarely "the work", and they cost time. In the cases where they are required, they seem to cluster, and the ROI of fewer social synchronization problems increases even more.
But that might all be wrong. I'm not confident enough to say where we'll land. I also see it's possible that demand will go up faster, caused by and enabled by the increase in supply, and that the social aspect is "the real work" to be done.
If Block succeeds, we'll see more layoffs of that kind, probably even more extreme ones. You are not a top senior-level employee? Out. You don't single-handedly cause 30% of the AI spend on your 15-person team? Out.
People say how in five years there won't be seniors because junior hiring stopped... in five years the seniors won't be needed either. Already today, we have single person billion dollar exits, high schoolers making millions from food apps. This is thanks to LLMs.
The technology is there to replace most white-collar work, it's just not applied enough yet. The economic system needs to adapt to labor no longer being such a big redistributor.
I have started to say that it will be irresponsible for people to manually write code a year or two from now, and I am setting up the systems I work on for that.
It will happen sooner rather than later.
Even now, I cannot compete with agentic programming.
Single person, or single founder? I guess there's n0tch, but he hired people when he started making money. (There may very well be truly solo cases that I don't know about.)
A few others have commented that the job becomes a kind of hybrid. I already think of it like that. If you're a person who can talk to a client and then immediately implement something to solve a problem, that's still going to be part of the process for a while. The sales cycle is still going to be competitive, whether it's based on timing or insider connections. Software people are going to have to start thinking of themselves as small firms; you have to go close a deal and then your agent army can help you deliver.
The Block layoffs were due to years of overhiring.
> Already today, we have single person billion dollar exits
It was nowhere near that much, and this was more a coordinated marketing move by OpenAI than an organic process.
> high schoolers making millions from food apps
This app is a sign of the massive bubble we’re in. The developer should be ashamed of making people think calories can be estimated from an image.
There’s trillions of dollars riding on these AI companies succeeding. A lot of the hype you’re seeing is paid for. If you’re reading news articles, blogs, etc., and not digging any further, you’re being manipulated.
The chain of operation never ends either. Every AI system needs someone to run it. Whatever runs it needs to be built and maintained. Follow that chain as far as you like — human agency doesn't disappear, it scales up. The universe is not running out of things that need doing.
"AI will take our jobs" is not a civilisational concern. It's a failure to imagine what civilisation could actually be.
We need judgment when we can't verify or prove that the answer is correct, so we need a human we can trust. For example, in the author's example the pandas snippet is verifiably correct, and I don't really care about judgment in that case. When there is a verification test that gives the AI a clear pass/fail, it can just keep throwing stuff at the wall until it's green, and that's good enough for a lot of use cases.
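A minimal sketch of that throw-it-at-the-wall loop (the `ask_model` helper is a hypothetical stand-in for whatever LLM API you use, and `tests/` is assumed to hold the verifier):

```python
import subprocess

def ask_model(prompt: str) -> str:
    """Hypothetical stand-in for your LLM API call."""
    raise NotImplementedError

def generate_until_green(task: str, max_attempts: int = 10) -> str | None:
    """Regenerate candidates until the verifier passes, else give up."""
    for _ in range(max_attempts):
        candidate = ask_model(f"Write code that does the following: {task}")
        with open("candidate.py", "w") as f:
            f.write(candidate)
        # The test suite is the clear pass/fail verifier described above.
        if subprocess.run(["pytest", "tests/", "-q"]).returncode == 0:
            return candidate  # green: good enough for a lot of use cases
    return None  # never went green: this is where human judgment comes back in
```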
I suspect that will change as trust in automated systems increases. (For example the author seems to consider AI a source of "correctness", which implies this trust is already surprisingly high.)
On that day, it's over for consulting.
Or why he couldn't have asked a human about the NaN thing.
I know those are arbitrary examples, but the behavior doesn't really seem to depend on the category? It might have more to do with how urgently the knowledge is needed?
It’s not about where we are today folks (the intercept of the line). It’s about the rate of progress (the slope of the line).
We went from the first airplane flight to walking on the moon in about 60 years. We had regular supersonic commercial flights shortly after. Applying the same logic, we should all be routinely flying to Pluto, travelling in flying cars like in the Jetsons, and commuting from Sydney to New York every day like it's nothing.
I agree with you that this article isn’t particularly convincing.
For example, UI design can be replaced by AI. Unless UI or UX designers were bringing something like _taste_ instead of simply mechanically operating Figma, they are not keeping their jobs.
I genuinely don't need to learn SQL ever in my life. I just don't need it for dashboards or analytics use. A person whose main job was to translate requirements into SQL for a dashboard and nothing else would not keep their job anymore. The person they were providing the analysis to could just perform the analysis themselves using AI.
I do think that most jobs would change dramatically, but for sure some of them would be eliminated completely.
All of these foundation concepts are vocabulary.
We need vocabulary in order to understand and have reasonable conversations.
Do you need to be an expert? Probably not. But yes, we should all understand.
I think we'll develop personal moats automatically. Some people are naturally uninquisitive. They'll be most at risk.
SQL is a common interview question, covering things like joins and transformations. If it's so simple, maybe they shouldn't be asking it.
OP clearly does not have a white-collar job.
There is case after case of IT folks being replaced by AI because companies think AI is better than humans at everything.
The unemployment rate during the peak of the Great Depression was 25%, not 100%.
Clearly, some white-collar jobs will be replaced. Hard to argue against that, given it's already beginning to happen. So the question becomes what is the eventual rate of conversion and what is the subsequent economic impact over time? I don't think anyone has a credible handle on that, except to note that it won't be zero.
But who's to say that things will be 'running as they are now' for long? And who knows what a new economy will look like?
If and when that transition occurs, I think the job market will pick up.
By providing productivity tools, you do effectively replace jobs, because there's only so much of a good or service a person will want to consume.
For example, just because game dev studios can make 10x more games with AI, this doesn't mean the industry will make 10x more money unless demand for video games increases. Instead, what is likely to happen as the cost of making games drops is that the price of games for consumers will drop too as competition increases, which will in turn hurt game dev profits, so game dev studios will likely have to be 10x smaller in the future – even if there are still technically people working in the industry.
However, when the work of agricultural workers became increasingly automated, there were lots of other industries people could work in instead; at the time that was factory work. Although the details will be different, I'm sure to some extent this will happen with white-collar work too. But the question I'd ask today is: what is that alternative source of work, and is it as good as white-collar work?
Our economy went from farming -> factory work -> office work. I strongly suspect the next step will be more people working in manual labour jobs and in servant-type roles. It's hard to see where else the demand will come from.
We are only one major incident away from this trend reversing. Now that we have AI, regulation is less burdensome to comply with: more testing requirements, more certification requirements, more security requirements, more accessibility requirements.
Everyone keeps their jobs; the bar goes up. Whenever an industry gets better tools, we raise standards instead of making more cheap junk. We make $25K cars instead of $5K cars at 1960s engineering standards.
Doesn’t look like a stable, growing profession. And if you compare it to the ’70s-’00s, it got really rough around 2010 for obvious reasons.
I’m not saying it turns out bad 100% of the time, but it’s easy to forget, because good professionals make it look effortless. When the skill isn’t there, though, and you're used to only seeing professional photos, it becomes very obvious (and again, that's perfectly fine if you're not expecting professional photography).
My bad, yeah, that part is needed, but as artistic expression I don't see the point.
Company bosses somehow see this differently: now that the best performers are empowered by AI, they can cut the worst-performing workers and still enjoy the efficiency gains!
Maybe this will change in the future if AI-run companies emerge, get backing, and outcompete existing players.
What's stopping their customers from using AI directly instead of that company's services?
Companies massively overhired during Covid after receiving trillions in free money and are now cutting the fat after the well's run dry.
AI productivity is just the excuse to save face because people believe it.
Main Points, in Order of Importance
1. **Most White Collar Work Is Relationship-Based, Not Transactional.** The central claim. A dominant share of workplace "questions" aren't requests for correct answers -- they are social, trust-based exchanges where the relationship and the advisor's judgment are the actual product.
2. **Two Kinds of Question-Answering That Keep Getting Conflated.** The foundational distinction. Transactional questions have a correct answer and an imminent need. Relationship-based questions use the question as a pretext for social exchange, shared perspective, and felt understanding. AI handles the first well; it cannot substitute for the second.
3. **AI Cannot Replace Trust and the Weight We Give to Respected Opinions.** Even a correct AI answer carries less weight than advice from someone whose judgment you trust. This isn't irrational -- it reflects that the value in consulting, advising, and managing is partly in the relationship itself, not just the information delivered.
4. **Strategy Consulting as the Illustrative Case.** A concrete test domain. Buyers of consulting aren't purchasing correct answers; they want advice from trusted people, catharsis in being heard, and help clarifying their own thinking. None of that is substitutable by an AI, regardless of output quality.
5. **Human Factors Intensify in Procedural Organizations.** An underappreciated corollary. In government and military contexts, lacking market feedback mechanisms, human trust and social organization become even more load-bearing, not less.
Opinion
It's a short, clear piece with a genuinely useful distinction at its center -- but it doesn't fully earn its conclusion.
The two-question-types framework is clean and rings true experientially. Most people have felt the difference between wanting an answer and wanting a conversation, and the observation that these get conflated in AI replacement debates is fair and underappreciated.
Where it falls short is in the leap from "relationship-based questions exist" to "therefore white collar work won't be replaced." The argument proves that AI can't fully substitute for trusted human relationships -- it doesn't prove that organizations will continue to pay for those relationships at current rates, or that AI won't restructure which human interactions are deemed worth paying for.
A client might still want a trusted advisor but find that one advisor supported by AI can now serve ten clients instead of three.
There's also an implicit assumption that the relationship-based component is dominant in most white collar work. That may be true in strategy consulting, but it's a significant empirical claim that the piece asserts rather than argues across the broader category of white collar work.