(I suspect some very junior jobs have genuinely been taken by AI, but it seems to me that the driving factor is still a reversion to the mean after ZIRP.)
But now literally all of us are using it. The company gives us a $100 monthly stipend for it. We’re a small dev team, but our CEO is thrilled. He keeps bragging about how customers are gobsmacked by how quickly we’re delivering new features and he’s been asked if we’ve made a ton of hires in the last year. We’re actually down two developers from when I started.
I don’t love the code it writes, but the speed is unbeatable. We still need devs, and I don’t think that’s ever going to change. But we don’t need as many devs. We’re shipping like crazy with a small team. I don’t think more people would speed us up much at all.
To me it’s the cumulative effect of many things happening coincidentally:
- Massive offshoring
- Money becoming scarce after the ZIRP era and in the recession, except for AI investments
- Near-monopoly positions that allow layoffs as a strategy to push the stock price, without penalty for the decline in quality (enshittification)
- As the cherry on top, LLMs being able to do the work of juniors when used by seniors
If it was only about AI productivity we wouldn’t see this urgency.
The director purchased a subscription to Claude and will most likely get rid of him at the beginning of the new year, because the job can pretty much be automated at this point.
Many marketing/copywriting people have also been laid off over the last year for the same reasons.
"I don’t love the code it writes, but the speed is unbeatable. We still need devs"
I think this will be the problem going forward: fewer positions to fill and the same number of potential candidates. You will need more experience and credentials to compete.
> But we don’t need as many devs. We’re shipping like crazy with a small team.
... for "now". Wait until the debt kicks in.
So as LLMs are getting better, junior devs are getting worse.
Yet good software is still as scarce as ever, and if anything the situation is getting worse. So it seems like the effects of AI on software development are either too minimal to notice, or are just short-term gains that translate into technical debt down the line, which ends up erasing all of them.
And for the commercial and open source code you do see, how do you know if it's being produced more quickly or not?
And finally, even if LLMs speed up coding by 10% or 50% or whatever... writing code is only a fraction of the job.
At least at my org and a lot of my friends’ companies, they’ve just kind of stopped building (non-AI) features? Not completely, but something like a 70% reduction. And that’s apparently fine with the market, since AI promises infinite productivity in the future. As an example, Amazon had a massive outage recently, a world-scale, front-page-news outage. Then they followed it up with a massive layoff. In 2018 logic, the markets probably would have said “this company is fucked, people are going to start moving off of AWS if they can’t promise reliability”. In 2025 logic, it barely even registers. I guess the assumption is that even with fewer employees, AWS can be even more stable in the future because of better AI. Even companies who _should_ be more concerned with reliability aren’t, because _they’re too concerned about their next big AI move_.
I guess in the end I still think it’s about AI, but more about how companies are reacting to AI than about AI actually replacing that many jobs.
I think the glut is ZIRP, but the lack of recovery (or very slow pickup) is AI.
Nothing fancy. No Claude Code, Codex, Cursor, etc. Just focused Q&A with mostly free Gemini models.
I've been writing software for 25 years, though.
What I'm working on doesn't have much boilerplate, though, and I've only been programming for 18 years. Maybe I need to work for another 7 before it starts working for me.
Example stuff that helps:
- extensive test suites
- making LLMs use YAML for data-intensive files, instead of writing them inline
- putting up a lot of structural guard rails via type systems, parse-don't-verify, ...
- having well scoped tasks
- giving the LLM tight self-serve feedback loops
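The "parse, don't verify" guard rail in the list above can be sketched like this (in Python, purely as an illustration — the thread doesn't specify a language, and the names here are made up): untrusted LLM output gets parsed into a typed value at the boundary, so malformed output fails immediately instead of leaking through as an unchecked string.

```python
from dataclasses import dataclass

# Hypothetical sketch of "parse, don't verify": convert raw model output
# into a typed value once, at the boundary, or raise an error.

@dataclass(frozen=True)
class Severity:
    level: str  # guaranteed to be one of "low", "medium", "high"

def parse_severity(raw: str) -> Severity:
    """Turn untrusted LLM output into a typed Severity, or raise."""
    cleaned = raw.strip().lower()
    if cleaned not in {"low", "medium", "high"}:
        raise ValueError(f"unparseable severity: {raw!r}")
    return Severity(level=cleaned)

# Downstream code only ever sees a Severity, never a raw string,
# so the type system enforces the invariant everywhere else.
print(parse_severity("  High ").level)
```

The point is that the LLM's output is validated exactly once, and everything after that point can rely on the type rather than re-checking strings.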
Recently I made it fix many bugs in a PEG grammar and it worked really well at that. I made it turn a test suite from an extensive Go test array to a "golden file" approach. I made it create a search index for documentation and score the search quality using qualitative IR metrics, then iterate until the metrics met a minimum standard.

You also need to know what chunks of AI output to keep and what to throw away.
For example, Gemini 'Fast' quickly identified a problem for me the other day, based on the stacktrace. But its proposed solution was crappy. So, I was happy with the quick diagnosis, but I fixed the code manually.
Indeed, during the pandemic the data field sort of matured and the modern data stack emerged. No one knew what they were doing, and the field was extremely over-hired.
In addition, I highly doubt LLMs have a significant effect on data teams as their main jobs aren’t writing code.
LLMs are still very unreliable at converting natural language to SQL. I believe the best benchmark result was only around 70% accuracy. Who in their right mind can trust that level of accuracy?
Maybe the market is bifurcating along the lines of people who hate AI chat bots and people who love them?
As a result we have budget cuts, layoffs and movement to low cost countries like India.
Communication skills matter.
We are generally rather negatively judgy towards some mistakes (the ones that indicate you're an idiot) versus positive towards others (some mistakes show taste or talent).
AI is having some effect at the margins but it’s mostly an excuse.
Companies always prefer to avoid saying they are just laying people off. Paradoxically, even though cutting costs sounds good, a layoff can be a negative signal to investors, since it might indicate lower growth expectations. It also creates possible exposure to lawsuits, depending on the circumstances and the state.
The nice thing about AI as an excuse is you can say to your investors and board “we are shedding cost but still growing and in fact our productivity is up because we are leveraging AI!”
- ZIRP policy ended as countries started using currencies other than the dollar for international trade. Now the Fed and private banks can't print money and distribute zero-interest loans. This stopped the 'free cash' that used to flow into Silicon Valley. A lot of money has either moved to interest-bearing vehicles or foreign investments/currencies. So Silicon Valley doesn't have cash to burn as it did before. As a result, companies are having to back their valuations/stock prices with profitability, instead of relying on the infinite cash that kept coming into the market to do it for them. Hence the layoffs and the focus on profitability.
- Trump's 2017 tax law changed Section 174 so that, starting in tax year 2022, companies could no longer fully deduct software engineering work as research expenses in the year it was incurred; it had to be amortized over several years instead. And Biden did not reverse it (surprise). Whatever the details, it went into effect 2-2.5 years ago and made headcount more costly.
It correlates pretty well with the line showing technology jobs over time in the article at hiringlab.org: https://www.hiringlab.org/wp-content/uploads/2025/11/sector_...
The main difference seems to be that the number of jobs posted to HN has dropped significantly lower, well below the low point of 2020. It's really pretty dismal, which seems to have started around the middle of 2022. Maybe the types of jobs that get posted to HN are doing even worse than "technology" in general.
This is a company so large that its jobs data was used in lieu of the Fed's jobs data while the government was shut down.
Data scientist is an exception based on title, but in my experience there are a large number of people with that title who have never heard of the scientific method, let alone could apply it with any rigor.
I'm sure LLMs are taking some of these jobs, but a lot of the decrease is probably due to general cutbacks on overhead and a realization that they produce limited value.
The hiring bar is going up partly because it’s become possible to spend the effort that would’ve gone into hiring on vibecoding tools to do the easy bits of the required job and delegating the rest, while finding new efficiencies in other team members’ roles to make room for what’s newly delegated. The net result is the same work getting done with less headcount.
The broad effect is the economy becomes more efficient and new jobs get created just as old jobs get divvied up according to the “replaceability” of each of the many roles that make up a particular job.
It is hard to tell what is really going on. No company will admit that they are firing, e.g., DEI hires from 2023. I have seen some open source CoC loudmouths being fired, but that is not enough to establish a large trend.
That's because it's a fantasy you've contrived to make yourself feel good.
I’d recommend talking to people in the trades first. Not saying it can’t be a good move, but it is definitely hard and has its own huge downsides like poor working environments, long hours, and years to actually get into decent paying roles.
Faddish career advice is usually bullshit, or it’s too late and the bus it’s telling you to jump on left the stop years ago.
That's not coding. Those are bullshit jobs.
As companies tighten their belts, they’re quicker to cut low performers that had been hanging around for too long anyway because:
1) cost reduction
2) companies had been lazy about getting rid of low performers when the market was hot and they didn’t need to cut (and couldn’t find better devs to replace them anyway)
3) with the market this skewed towards employers, you can replace low performers with better talent anyway because everyone’s looking
Judging by your reaction, guessing you’re the low performer?
Edit: judging by your comment history, you sure have a grudge with the world. So this clearly isn’t about me. Have a good night, dude.