> In a productivity boom such as this, a rise in unemployment may not indicate increased slack. As such, our normal demand-side monetary policy may not be able to ameliorate an AI-caused unemployment spell without also increasing inflationary pressure
I'm not saying AI isn't impacting the employment market, but this statement isn't really about AI so much as it is an advance warning that inflationary monetary policy is unavoidable if all the people saying that software engineering is dead are correct.
Before Trump it was, for good reason, incredibly taboo to place pressure on the Fed or even hint at interfering. Most economists are pretty horrified that that particular barrier has been crossed.
The Fed has a pretty big stick, and a mandate to try to balance inflation with unemployment. Throwing politics into the mix is a very bad idea, since politicians worry about very different things and adhere to election timelines.
The president has no business getting involved here.
the rest, and in particular the economics profession, is window dressing
i think gold is a terrible money, for example. great savings vehicle though, should be tax free to convert money into and from.
they'd throw me out on my ear
AI is helping produce more software, right? Including more software that is for sale?[1] Or more online services that are for sale?
[1] One of the interesting things here is going to be liability. You can vibecode an app. You can throw together a corporation to sell it. But if it malfunctions and causes damage, your thrown-together corporation won't have the resources to pay for it. Yeah, you can just have the company declare bankruptcy and walk away, leaving the user high and dry.
After that happens a few times, the commercial market for vibecoded apps may get kind of thin. In fact, the market for software sold by any kind of startup may also get thin.
Today I am planning an exit strategy. Anyone else considering what they'll do in a post-AI software engineering world?
And that's fine, I use it as a tool, but am I faster? Not where detail matters, and I tend to do a lot of work in that area.
Generating dashboards? Absolutely. Converting to JavaScript vNext UI framework? Yes.
Kernel code? No.
If the doom really comes to pass then what future is there for us? I fear a life of impecunious servitude and poverty more than death.
I don't have time to post significantly about it but I'd love to trade thoughts and figure this out.
Email?
AI can code - but can it understand what is missing from the organisation and persuade it to change - can it spend years at industry conferences?
Look at Starliner. NASA just announced that Boeing stuffed up, not with an engineering mistake (no one still knows exactly what broke) but that the whole organisation is so screwed up and so political that NASA just doesn't believe Boeing can fix it.
AI cannot fix our turf wars. That's not a problem of intelligence (humans know going to war is bad, but Putin still exists). It's the systems we live in, and work in.
Changing those is feasible - once they are coded, transparent, and open to inspection in a democracy.
We need programmable introspective systems of organisation - democracies in other words.
The engineering was not the problem - the problem was the organisation was more or less toxic and incapable of doing engineering. Writing code that won't get used because of politics is a job we and AI can both do.
A lot of companies will use the speed of AI to wallpaper over the fact that they don't know what to make or how to prioritize.
Writing code faster alone doesn't change a great deal. Frankly, it'll just create a larger influx of noise. Focus is already very difficult; it'll only become harder with the advent of LLMs.
Another way to frame it is what would you do in a low trust environment where corporations and the government were not to be trusted. You would likely avoid things like bubble bursting AI stock investments, jostling for rank in a company, etc.
But writing code was never much more than 35-40% of my job while working for companies/others. Most of my time has always gone towards communication, design, and validation. None of those three is particularly vulnerable to mass AI automation except in the most trivial of scenarios, and I have not seen evidence that this has changed in over 2 years of so-called "improvements".
My "exit plan" ultimately is to be one of the engineers capable of using these tools to scale my impact accordingly so I can focus on higher order problem solving, which ultimately is what is most valuable. I would be more concerned if I was in marketing/sales or frankly middle management.
Maybe this is just "copium" on my part, who knows, this sector is moving fast.
If LLMs provide substantial material to produce what one envisions faster, that is great. But LLMs will not be doing the envisioning. Most humans are already poor at that; hence there are very few real 'visionaries' in history.
Envisioning always requires deep thinking. If LLMs eat away at a human's ability to sit and think, this will make envisioning solutions harder. So you'll see more stuff produced, but largely more crap.
As for what happens after that, I'd really prefer not to have to do physical labor or trades. And it doesn't seem like any other white collar occupation is really going to be insulated, other than perhaps medical. So my strategy is to basically wait and see what society looks like after the transition and I guess I'll try and decide on something then?
https://www.reuters.com/world/us/us-third-quarter-productivi...
Productivity up 5%.
Productivity/dollar up 3% Q2 and 2% Q3 even as labor costs up 1%.
https://www.stlouisfed.org/on-the-economy/2025/nov/state-gen...
> ... on average, industries with 1 percentage point higher time savings experienced 2.7 percentage points higher productivity growth relative to their prepandemic trend. We stress that this correlation cannot be interpreted as causal, and that labor productivity is determined by many factors. However, the current results are suggestive that generative AI may already be noticeably affecting industry-level productivity.
The tech sector employs about 2% of the labor force. Even if AI was dramatically increasing labor productivity in the tech industry, it would have a negligible effect on these statistics.
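To make the point concrete, here's a back-of-envelope sketch. Aggregate labor productivity is total output over total hours, so a sector's contribution to aggregate productivity growth scales with its output share. The shares below are illustrative assumptions, not BLS figures:

```python
# Back-of-envelope: how much could an AI-driven tech-sector productivity
# jump move the aggregate productivity number? All shares are assumed.
tech_hours_share = 0.02        # ~2% of the labor force (from the comment)
tech_output_share = 0.08       # assumption: tech's output share exceeds its hours share
tech_productivity_gain = 0.20  # suppose AI lifted tech productivity by 20%

# Aggregate productivity = total output / total hours.
# Holding other sectors and total hours fixed, only tech output grows,
# so the aggregate gain is roughly output share times the sector gain:
aggregate_gain = tech_output_share * tech_productivity_gain
print(f"{aggregate_gain:.1%}")  # → 1.6%
```

Even with a generous 20% sector gain, the contribution to the headline number is well under the ~5% reported, which is the commenter's point.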
> People in much more important and powerful positions than her
I said "understanding," you said "power." There's a difference: presidents and CEOs say lots of dumb stuff.
It would be helpful if this was articulated in depth. It's used as a shibboleth alongside "productivity" but it's rarely followed with the concrete details
This isn't the first time that new technology has reshaped society, or even just the economy. How well were the results of prior shifts predicted ahead of time?
More importantly, are they planning to do anything about it?
https://www.washingtonpost.com/technology/2026/02/23/ai-econ...
This is about labor productivity, a standard national-level economic indicator (see https://www.bls.gov/news.release/pdf/prod2.pdf and https://fred.stlouisfed.org/series/OPHNFB) going up 4.9%, as reported in this article linked in TFA: https://www.reuters.com/world/us/us-third-quarter-productivi...
How comforting. Sounds to me like "ZIRP won't fix this one folks, it's gonna take something other than money to fix what's coming."
The closest thing we've seen in terms of scope/velocity is probably the introduction of the web in the late 90s to the broader world. Very few jobs were killed by that, though, relatively speaking.
Today we use Luddite as an epithet, but they were right about the effect of automation on their jobs.
So, frankly, we know what this looks like: poverty, a drastic reduction in social mobility, continued disparity between the Haves and Have-Nots, good jobs replaced with shitty jobs - and now that the gig economy is played out, nothing remaining to soak up surplus labor, especially surplus overeducated labor - anger, frustration, depression, increasing deaths of despair, violence, political instability, and mobs of people rallying behind populists and possibly continuing to support those populists through violence.
That’s it. An eroding tax base necessitates one of those or a combination.