This is the lump of labor fallacy: the belief that there’s a fixed amount of economically valuable work that technology and capital can eliminate through automation or capital accumulation instead of transforming it.
Middle-class status anxiety manifesting as rhetoric about neofeudalism.
If you can perform all the same jobs I can for a penny a day, and food and rent cost a dollar a day, I'll have a hard time earning enough to remain fed and housed.
Until now, I've always been competing against other flesh-and-blood humans who needed to eat and pay rent, so I've never had to worry about the labor price floor too much.
1: verbal/literary labour with your vocal cords, or the use of an output tool, to play back existing works in environments where mechanical methods are disallowed,
2: aesthetic labour through precise recreations of others' mental labour in the performing arts,
3: legal labour in acting as an agent to help or hinder existing legal processes in systems that give humans standing,
4: biological labour as a substrate for growing certain transplantable products,
5: smuggling labour (either non-invasive or surgically invasive),
6: political labour to incrementally sway electoral results in any polity where you are still enfranchised,
7: security labour as a canary for infohazards (such as 'diplomat' programs produced by another ASI) inside of a quarantined environment,
8: frontman work as a deniable patsy to enable economic forgery, deception, revenge or warfare by a patron ASI, possibly without your knowledge.
There are plenty of ways to gain some currency token that can pay for your negentropy upkeep costs.
Unless you're a few centuries old, you haven't. You've had the potential to be competing against industrial and computational technology your whole life. Go back further, and the prevalence of slaves served as a similar cost differential (free humans versus enslaved, human versus AI).
Slaves at least still need to be fed and housed, but I'm sure they were tough competition indeed for independent laborers.
Then you've constructed a tautology. Humans remain competitive in various applications of their labour, broadly defined, despite entire categories having become uncompetitive. If we exempt those categories then the historical record looks static, but only because we defined away the change.
> Slaves at least still need to be fed and housed, but I'm sure they were tough competition indeed for independent laborers
I believe there is evidence for this all over the place. By analogy, however, AI is orders of magnitude less power efficient than humans. This places a floor on the price of AI and thus human labour that competes with it. (Though that floor, as with pre-information age floors, is well below almost everyone on this forum.)
It is both more and less power efficient, depending on the task. When a coding task is easy enough that an LLM can actually do it, I've seen e.g. Claude do a week's worth of human labour in a few hours of wall-clock time for what amounted to 0.25 euro of subscription cost. When it can't, it will churn as many tokens as you've got and leave a big ball of mud behind, as seen with the recent attempt at a vibe-coded browser.
When Stable Diffusion is good enough, the energy cost per image output is comparable to the calorific consumption of a human living long enough to type the prompt. When Stable Diffusion isn't good enough, no amount of re-prompting gets you something it can't do.
And while AI might be less power efficient than me on some tasks, power is cheap enough that I don't think the energy price floor affords my continued survival.
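For intuition, a rough back-of-envelope on that energy comparison (every number below is an assumption, order-of-magnitude only, not a measurement):

    # Assumed: a consumer GPU drawing ~300 W for ~10 s per image, versus a human
    # at ~100 W metabolic rate (~2,000 kcal/day) spending ~30 s typing the prompt.
    gpu_power_w, gen_time_s = 300, 10
    human_power_w, typing_time_s = 100, 30

    image_wh = gpu_power_w * gen_time_s / 3600        # ~0.8 Wh per image
    prompt_wh = human_power_w * typing_time_s / 3600  # ~0.8 Wh to write the prompt
    print(f"image: {image_wh:.2f} Wh, prompt author: {prompt_wh:.2f} Wh")

Under those assumptions both land around a watt-hour, i.e. fractions of a cent of electricity: the energy price floor is real, but it sits nowhere near a living wage.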
My argument is that this is, as presented, baseless.
> power is cheap enough that I don't think the energy price floor affords my continued survival
Again, slaves were cheap enough. Industry is cheap enough. Yet global labour rates remain far from homogenous.
Even if that's true, AI is going to tank the bargaining power of the working class even harder than it already has been.
It's already the reality for many that they're working for minimum wage, in toxic environments, with no benefits, and with more overtime than is legal in places that regulate it, solely because they have few better choices.
Furthermore, this power inequality directly translates into influence over the economic output of our civilization - by the time the value of human menial or cognitive labor goes low enough to delete jobs, all that will be left will be various equivalents of being a sugar baby for the rich - fulfilling their emotional, sexual, and social needs. Not even art, because that's among the things gen AI is displacing the most effectively.
> Middle-class status anxiety manifesting as rhetoric about neofeudalism.
The middle class is a tiny, tiny fraction of the population nowadays. Even among those working high-earning jobs like tech/healthcare/finance, most are just upper working class.
Kinda, but also no.
Yes, there are a lot of people (including me) who genuinely enjoy the output of these models; but art isn't only aesthetics. I observe it also being a peacock's tail, where the cost is the entire point.
Why are originals more valuable than reproductions? Nobody who understands the tech can seriously claim that a robot with suitable brush and paints is incapable of perfectly reproducing any old masterwork down to the individual brush strokes — of course a robot can do that, the hardest part of that is compiling the list of requisite brush strokes, but that too can be automated.
But such a copy, and let's say the paints were chemically perfect and also some blend of plant and petroleum derivatives so as to fool even a carbon-dating test, would never command as much money as the original unless someone deliberately mixed them up so that nobody would even know which was which.
However, I don't know that this would ever help the masses. Perhaps a quadrillionaire in a space mansion would like to buy all of Earth and all the people on it, but that doesn't mean we'd get any better than being forced to LARP whatever folly* they chose for us.
What I'm saying is, I have a cozy bullshit job that gives me the prospect of someday not being in the working class anymore. But if that wasn't the case, I'd 100% fuck that and look for alternative lifestyles.
The real risk here is the military implications of pairing AI and robots. If the army doesn't need lots of people then there is a real problem. But robotics will take a long time to get there even in an optimistic case. Labour will still have value for a long time to come.
The thing is, if bread becomes insanely cheap and cloth becomes insanely cheap, then a very inefficient weaver can still afford food. It runs into comparative advantage theory eventually - just because someone else is much more efficient than you at literally everything doesn't mean you can't still do something inefficiently and set up a win-win situation. Although there might still be something better to put your time towards. I recommend managing your own capital - maybe even building it and maintaining it yourself - but people seem dead set against the idea for some reason.
I dunno why geohot thinks in this article that shares in a granary are a bad idea; someone has to profit from storing food. May as well be me. I'll do it if he won't. I like granaries.
So you go after high-income earners who are not concerned with price, but this requires proximity, networking, and then ongoing relationship management with a much smaller group of people who are all much more likely to talk to each other.
Effectively, this is the problem the grandparent comment outlined with extra steps; it isn’t your labor by itself that is uniquely valuable, it is your relationship with buyers, and those relationships can sour. Vendors who signed on with Walmart experienced something similar where they began scaling to meet demand only to find themselves completely reliant on Walmart’s orders to service their loans. Walmart was then free to dictate the terms of the relationship. This is a very familiar dynamic in societies that never successfully divorced themselves from feudal ideas (e.g. Most of South Asia, parts of South America).
Mass market handicrafts have never been a viable means of subsistence anywhere, ever, if we benchmark against what I think a reasonable lifestyle looks like. That is why we stopped doing things that way.
Nevertheless mass market handicrafts are still a theoretical option. It hasn't become a worse option over time, it is in fact a better option now than it ever was in the past. Today is the best day in history to be a weaver, even for those that do it by hand. A weaver in the 1700s would weep tears of joy at the opportunity to weave things by hand today at market rates. They would say things like "wow, I can afford a much better life than in the 1700s with my weaving skills", "what just happened to me? What was that time portal?" and "maybe I should learn to code, they earn even more money than I do".
I actually did very menial work in a food processing plant while still in education. I'm not better than the people working there, but I'm different from them in interests and in upbringing (even though I didn't come from wealth, the people I studied with and shared hobbies with did). I wasn't able to discuss the things I read with these people and they weren't able to talk about their interests. I believe if I had to work there for years or decades it would lead to dysfunction.
There were plenty of craftsmen who lived lifestyles that were acceptable for the time periods they were living in. From around 1400 to the early 1900s, European settlements consisting of more than a few hundred people would have had blacksmiths, cobblers, tailors, furniture makers, and various other craftsmen. It wasn’t until the late 19th century that this arrangement started to be replaced by the economies of scale made possible by the factory system.
> A weaver in the 1700s would weep tears of joy at the opportunity to weave things by hand today at market rates.
The idea that a medieval peasant would be envious of our standard of living is doing nothing to further the point you are trying to make.
This is why the population of labor animals plummeted in the 20th century. And to put it plainly, it means that there's no guarantee you can always sell labor for enough to keep your home from getting paved over for a data center.
And the fact is, for those souls who are motivated to do so, they can make a living hand-weaving anyway and do not need to weave 3,000x faster. They weave at a similar pace to the one people always have. They can still afford bread. Society will almost give bread away to people; it is absurdly cheap.
We’re nowhere near it, but there is a point at which the marginal utility of laborers is worth less than the security risk the laborers represent by continuing to exist. This is already happening with a lot of manufacturing and resource extraction.
> This is the lump of labor fallacy: the belief that there’s a fixed amount of economically valuable work that technology and capital can eliminate through automation or capital accumulation instead of transforming it.
What if I consider the labor conditions that exist after this transformation to be undignified?
In general, yes. For many groups, no.
It assumes that there is something of value for them to do and as shown by masses of long term unemployed in many areas, that is not always the case.
For example, people on the autism spectrum and with disabilities have persistently high unemployment. Because of various limitations, there is nothing for them to do in many cases. The market should have corrected this (especially over the long term) if reconstitution was consistently possible.
If AI makes all humans seem limited in a similar fashion, the idea of labour reconstitution falls apart.
There is also a large portion of the population on social assistance so while there are things they can do, the market value of what they can do is often well below their needs.
> If AI makes all humans seem limited in a similar fashion, the idea of labour reconstitution falls apart.
I think the problem here is you’re comparing a relative minority to “all humans”. Unfortunately, what affects a minority of society inherently has a small effect on society as a whole. If “all humans” now have no employment value because AI or automation can do it all, there will still be a cost to that production. Even if you assume the AI part is $0, the power needed or the raw materials becomes the main cost as opposed to labor. Then you need to have enough demand from those non-working, non-wage-earning humans for whatever that AI is producing. Otherwise, what is the point of the production in the first place?
Maybe extreme automation would put the wealth gap on hyperdrive. Only the handful who happen to own an automated production company could have any income. However, what do you imagine the final outcome of that would be in a democratic society? Like I know it’s fashionable to cry at the state of democracy, but despite the recent inflation and affordability crisis and income insecurity etc, we don’t have an “all humans” level of unemployment. What do you think would happen if we automate, and subsequently fire, “all humans”?
Let’s assume AI will actually replace 99% of jobs eventually. Society will completely change at that time to adapt. What else is the point? Are AIs gonna be producing stuff for other AIs’ leisure?
The problem is that the road to there might be painful before society is forced to change to adapt. It won’t all happen at once, so it’ll keep happening in waves, and each wave will be painful until things get better, then another wave comes. That’s assuming the prophecy that “all humans” labor is no longer needed comes to pass.
Before I say why though, there's a lot of bad futures possible even with remote controlled humanoid robots whose only true AI are the basics, to keep themselves standing upright and to walk and run on command without falling over. All those people (even here on HN) who claim (to quote a recent exchange) "Every $1 taken by illegal workers is $1 stolen from Americans", how are they going to react to finding every minimum wage job in the US has been replaced with a Tesla Optimus whose "AI" is "actually Indians" thanks to a low-latency Starlink connection to Jodhpur? How do anti-authority protests (whether anti-ICE or anti-Iranian-government) work with any robots where the "AI" is smoke and mirrors of remote control? This is plenty enough for techno-feudalism, even without AI replacing all jobs.
But back to the original. As a proof of possibility that tech can, in fact, delete labour: Labour is made of atoms. We have always known, for as long as records show, how to make a bunch of atoms that can be forced to do labour for us, even though we have not understood the details of production until very recently. It's called "slavery", and the rules enforcing and maintaining slavery are encoded into the oldest laws we know of, and that text itself references older laws we've not yet been able to identify[0].
That just leaves a question: can technology also arrange atoms to solve arbitrary labour?
Yes, obviously we can physically do it, we're a species that rearranges them at every level, from disassembling mountains for ore, to re-engineering DNA and RNA for crops and medicine and making animations by moving around single atoms.
But that leads to a more precise question: can we learn what pattern we need to do this deliberately, and not with a 20 year delay from simply using our existing biology? There's a quote for that, too:
If the human brain were so simple that we could understand it, we would be so simple that we couldn’t.
I consider intelligence to be the inverse of how many examples you need to learn something, by which standard all machine learning is very very stupid, and only makes up for it to an extent by being very very stupid very very quickly. Will we solve how to reduce that to only "very stupid", or even just "stupid"? Perhaps, perhaps not. Without improvements, machine learning systems will not take all our jobs, but may still make slaves of us for the reasons previously given, remote controlled robotics. Current ML approaches still have yet to solve for fully-general all-conditions self-driving even in the easy mode that is the USA; they need too many examples to take all our driving jobs despite millions of vehicles on the road gathering training data from the behaviour of all the vehicles they can observe.

When you say:
> This is the lump of labor fallacy: the belief that there’s a fixed amount of economically valuable work that technology and capital can eliminate through automation or capital accumulation instead of transforming it.
some certainly frame it that way, e.g. it sure was weird to see the exchange with Sen. Richard Blumenthal, and the response from Altman, which is pretty close to what you're saying rather than "let me rephrase that, we're all going to die if we get this wrong":
I think you have said in fact, and I'm gonna quote, development of superhuman machine intelligence is probably the greatest threat to the continued existence of humanity. End quote. You may have had in mind the effect on, on jobs, which is really my biggest nightmare in the long term.
- https://www.techpolicy.press/transcript-senate-judiciary-sub...

Not that it takes much intelligence to become an existential threat: ML may only improve at the per-example rate of evolution, but evolution created covid and ebola and the black death and HIV.
But yeah, techno-feudalism? All that needs is for Optimus' on-board AI to not fall over when told to walk and run, and to keep its balance while holding heavy objects or wielding a nightstick or firing a gun.
It offers no constructive alternative and the author (yes, I know who he is) seems to have no issue with Google hosting their email.
It's hard to take this too seriously (even if there is some legitimate worry here)
By working for them you are enabling this.
I understand that sentiment, but it's very one-sided.
I enjoy having the world's knowledge at my fingertips. I enjoy being able to video-call my family from anywhere in the world at any time. I enjoy never being lost because I always have a map showing where I am. I enjoy having group chats with all my different social groups, big and small. I enjoy being able to easily work from home.
None of the above was possible just 20 years ago. All of them are enabled by big tech and none of them is based on surveillance, ads or social media.
Yes there are drawbacks. I also find them bad to the point of threatening society. But we need to acknowledge the positives, otherwise it's not an honest debate but only a mix of ranting and populist propaganda.
Most of those things were actually possible. In many cases they weren’t as convenient, but as a child of the 80s I can tell you that life wasn’t like the dark ages before we all got smart phones.
In any case, I don’t think anyone here is arguing against technological progress. What we’re saying is that big tech has been too powerful, and too unregulated, for far too long.
However, I wasn’t talking specifically about libraries. The web did still exist 20 years ago. Wikipedia is more than 20 years old. And newsgroups have been around much longer too.
The web was also mobile accessible for more than 20 years (WAP, for example, was introduced in 1999).
There were also phone numbers you could ring that provided quick searches for information lookup. People are most familiar with them in terms of telephone directory services (e.g. ring an operator to ask for the phone number of someone else) but there were other general knowledge services too. In fact I used one once when my bike chain broke: I walked to a local pay phone and enquired how to put a chain back on.
Even now, there’s a plethora of information at local government information and audit offices which isn’t available online, most of which is stored on microfilm. A friend needed to visit one office recently to look at historic maps to trace the origins of a public right of way (which is a legal public footpath through farmland in the UK).
Like I said before, we weren’t living in the dark ages before smartphones came along.
It's not ranting to not acknowledge every positive if the negatives clearly outweigh them. I would much rather live 20 years ago than live now without a job. Wouldn't you?
Alternatively, suppose you get to keep your job. What percentage of the population being unemployed do you think would make it worse for you personally than going back 20 years? Because there is going to be more unemployment, and it will affect your environment unless you have got a private island (some people do - some AI owners do).
Literally all of them are.
Also, we've got an entire generation growing up on ads, algorithmic brainrot, and now AI slop.
You're also forgetting algorithmic price fixing, algorithmic pricing, the billions in R&D into making internet platforms and services more addicting and effective at siphoning out your money, etc.
That's not true. All of the examples you mentioned are possible without Big Tech. There are F/LOSS and community supported alternatives for all of them. Big Tech might've contributed to parts of the technology that make these alternatives possible, but that could've been done by anyone else, and they are certainly not required to keep the technology functional today.
Relying on Big Tech is a personal choice. None of these companies are essential to humanity.
> none of them is based on surveillance, ads or social media.
That's not true either. All Alphabet and Meta products are tied to and supported in some way by advertising. All of these companies were/are part of government surveillance programs.
So you're highly overestimating the value of Big Tech, and highly underestimating the negative effects they've had, have, and will continue to have on humanity.
Not only that, but big tech proprietary products have depended and depend heavily on F/LOSS and community supported code.
You can simply not use any of that shit and it literally doesn't kill you. There are tons and tons of people who don't use social media or take any of this tech stuff as seriously as you do.
If this is how you feel, it's time for a detox. Go outside. Into the woods or something. Anything to learn that your phone and PC are not your life and Elon Musk is not your (wanted or unwanted) master.
You choose who and what controls your life. Choose wisely. Choose yourself.
In my home country several old people had to close their shop as they were forced to move to a digital accounting system; they didn't have a choice. My bank only allows me to go to their office without an appointment 1 day a week (maybe not even). My grandpa, who doesn't have a phone (he never even got a landline), doesn't have internet and barely even drives, has to depend on others to call and make appointments. If you want to apply for a job, you need an internet connection. Many won't even hire you unless you own a car (even if you could perfectly well commute by bicycle or public transport).
If you think we're at the end of this 'evolution', we're just getting started. My grandpa could do everything perfectly well on his own until 2010; by 2018 it was getting almost impossible; in 2026 he feels like a burden for not being into technology.
Except that’s already happening. Through social media being engineered to be addictive, advertising and user data collection being used to manipulate voters, AI bosses proudly claiming they’re putting people out of work, and games companies preying on the weak with loot boxes and other massively overpriced in-game transactions.
And why isn’t there any legislation against these predatory tactics? Because big tech also donate millions to the very people we elect and who are supposed to serve the citizens.
And that’s without discussing the indirect costs of big tech from data centres ruining the lives of local residents, to independent stores getting screwed by knockoffs from Amazon and cheap Chinese stores.
> seems to have no issue with Google hosting their email.
That’s a pretty weak counterpoint. In fact it’s basically what we call an “ad hominem attack”. What you’re doing is arguing about the individual rather than discussing their points directly.
It’s like saying “you can’t be worried about climate change because you own a car.”
> It's hard to take this too seriously (even if there is some legitimate worry here)
If you think there is legitimate worry then you should take their points seriously. It would be contradictory to do otherwise.
The good ol' AGI and then ASI singularity everyone likes to talk about. To be fair, it is possible.
I work for one of these companies. I also have bills to pay. I'd like to understand what a real, good alternative is.
Frontier labs such as Google DeepMind are not just going to shutter their doors because 10% of the peons dropped their jobs. I believe, at least, that we should be demanding political accountability and safeguards for society. I only get to live once: if I am to spend it on social change, I'd best maximize my expected return.
And quitting a job in a capitalist society probably has negative return overall.
You can't help but swing between "AGI is going to save us, praise the tech lords" and "AGI is going to kill us, tech lords have mercy" if you believe there is no counterpower to the tech lords.
"You, alone, leaving your job" is not a great way to counteract the tech lords (although at least it makes a point and show other people there is a problem.)
But there is the option to use your counterpowers (you know, legislators and all that?). The tech lords are actively trying to avoid that (see the hilarious Musk vs Breton feud.)
It would be better if your system did not give money the power to choose lawmakers.
Maybe AI will make USA realize the definition of corruption and proper election funding laws.
But, if you don't want to join the underclass, maybe, just maybe, consider not picking tech lords as kings next time.
What about 20%? 50%? 90%? 100%?
The "real, good" part all depends on your expectations of life.
There's a real shortage of tradespeople and I'd love to see ChatGPT fix a leaky pipe, build a house or make a chair. So switching to the trades/manual labor, while financially tough at the beginning, might be a good long-term choice. But this requires much more physical work than most of us on HN are used to.
Moving away from capitalist society into a cheap tiny off grid house in a rural area and leading a much more basic life is also an option. You don't need 100k to survive, but you do need it in populated areas. (Also, I'm European and therefore not dependent on employment for health care, so I'm ignoring that part.)
There are many choices we can make that remove our dependence on big tech. But big tech is hella convenient and so is having disposable income, so it's a tough choice to make.
> seems to have no issue with Google hosting their email.
There's this meme where person A says "we should improve society somewhat", and B replies "yet you participate in society! curious". Very similar argument.
This means, work somewhere else, or even _do_ something else.
From my experience, the problem I saw, and why I really respect OP's post, is that many good and smart people were lying to themselves in those environments. They'd do exactly what you do and try to find reasons to justify working in tech.
Go into your average modern tech engineering team at e.g. Amazon, and ask how many of the engineers in there use and support the software they're creating. Of the tiny fraction who say they do use it and support it, go check their usage and you'll see half of them were overinflating it. HN knows it better than anywhere: many of these tech companies are not producing great tech to improve people's lives.
To your point about "no constructive alternative" - think about it this way: if you're spending your life writing something you won't even use, for reasons that boil down to "it's just not valuable for me, especially knowing how it's made", then doing literally anything other than working there is a more valuable use of _your_ time for you.
Look at your household and figure out what you need and what would improve your lives. If it's "6 figures salary and a world owned by megacorps", then working in places like Amazon is the best thing you can do for your family.
If you're a small household without kids, like a lot of people in these engineering environments, then instead of spending 12 hours a day Monday to Friday addicted to trying to solve this really cool little engineering problem (which just so happens to help e.g. Amazon), you'd be far better off solving some really cool little engineering problem that just so happens to help your family, like building some cool home automation thing for them, or working on your own house to make it more efficient so you can use less energy and anyone else working in your household can retire earlier with smaller outgoings. Or even just being a housewife/husband will improve the lives of the people you care about in more valuable and appreciated ways than anything you could do working at Amazon.
Now, I appreciate I'm in a lucky place to be able to do this, but if you've been able to work as an engineer in top engineering environments and this post is relevant to you, then you are already more than lucky enough to be able to walk away from those environments and do things that are consciously useful and appreciated by other humans whom you value.
There is some nuance in what "not working for big tech" means though. The general gist is to not take work making tools that can foreseeably be used to hurt people and the social fabric at large. Reject "disruption." Don't take money to make your life worse. That sort of thing.
This won't actually work though. The only reason we even have this discussion is because we're rich enough that pure survival isn't even really in our instinct anymore. Most of us haven't experienced actual hardship for years and we live in luxury.
There are plenty of people in the world who are smart and poor and living tough lives, who are ready to replace people who quit because they have the luxury to quit. Just look at the huge number of Indian people moving across the world to work in tech. These people aren't going to pass up the opportunity to significantly improve their lives because they might be working on software that could negatively impact society at some point. You could see this exact thing happen when Elon took over Twitter. Many people left because they disagreed with Elon, while many H-1B holders stuck around because they (and their families) actually had something to lose.
I don't think many of us on HN realize how incredibly spoiled we are with the lives we live.
"Will"? If you don't see how this is happening today, you're either a part of the problem, or blissfully ignorant.
> It offers no constructive alternative
WDYM? The article clearly suggests that people should stop working for these companies.
Besides, why must every criticism propose a solution? The problem should be fixed by those who created it.
So yes, it is rather absurd to demand radical changes from society when you are unwilling to endure even minor inconveniences yourself.
We can see this logic reflected at times in business history. Ford paid workers double the daily wage so they could afford the cars they built and Costco pays employees 50% more than Walmart. They're not doing these things out of the goodness of their heart but out of greed to increase long term profits.
Once the robots, energy, and weapons all belong to the same small group, that group no longer needs to sell anything to anyone. Production continues, but only for themselves and their enclosed system. The rest of humanity becomes a surplus population that can simply be allowed to die off.
In other words: the economy you’re worried about preserving is already obsolete the moment the owners of the machines no longer require wage slaves or consumers to keep the system running. At that point, mass demand is no longer a feature, it’s a bug that gets patched out.
Markets and consumers would still exist, unless the AI is smart enough to solve the local knowledge problem via centralized computation.
A consumer doesn't have to be a person. It's an abstraction, and behind it sits any entity with capital. That includes non-human entities.
An example would be bidirectional trade between an AI company who mines resources, and an AI company who makes robots to mine resources.
Or if it's one big monopoly, they can have an internal market to facilitate competition and cooperation, like how Samsung operates.
If there is so much of something that everyone can have one, effectively prices would drop until everyone could afford one. Like how even homeless people can afford air, because there is so much of it that there is no point charging people for access. Or YouTube: they just let anyone watch it because it is so cheap to push bytes out over the internet.
Removing humans from the production process would, in theory, be similar. There is a certain amount that gets produced and we come up with some way to allocate it. Prices and wages adjust to that reality.
> We can see this logic reflected at times in business history. Ford paid workers double the daily wage so they could afford the cars they built...
That just-so story is probably a lie. The math wouldn't work out and Ford would know it - he was just competing with other companies for labour. It is like programmers getting paid huge amounts - it isn't because the companies think it helps them via some vague circular logic about what happens in the broader market. They just need the skills, now.
The only goods that will be produced will be the ones that machines need for their survival or for whatever unfathomable goals they will have. Human goods will only be produced as long as human labour still has some value.
What we’ve also seen in recent decades is a massive shift to people borrowing money to pay for luxury goods. This means that businesses can still continue to tank the economy because their profits are propped up by other people’s debts.
And in fairness, it’s not just consumer goods that are sold this way either. Entire businesses are run on borrowed money and suppressed wages with the hope that they win the “business lottery” and receive a massive buyout. Often they’re deliberately selling their products below cost price to boost their client portfolio and thus making it entirely uneconomical for normal businesses to compete on price.
And then we wonder why the economy is so volatile. The whole thing is held together by gum and prayers and the only people benefiting are those who are already wealthy.
You don’t need anyone to buy them if you already have all the capital. You sell goods and services to make more capital, but if you’ve got enough capital to provide all your needs, you don’t need any buyers.
It's difficult to see where it might head that doesn't lead to population collapse and some form of dystopia.
Show me the robot that can plumb a new sink in, or brick up an old doorway... Because I'd really like to buy it, those things are hard and time consuming!
I find it quite interesting, and somewhat disturbing, that we've so quickly come to the point of seeing the AI power drivers as openly adversarial to people and deeply entangled with equally adversarial government forces.
But are we actually (and realistically) talking about technofeudalism in the next couple of decades?
There’s a lot of fear around what will happen with AI, not so much of extinction but rather of two things: fear of losing income, and arguably more importantly, fear of losing identity.
People often are invested in what they do to the point that it’s who they are. That being replaced or eliminated might be a bigger psychological threat than lack of income, at least to those of us fortunate enough to be well off right now.
However, these threats are outweighed by the benefits that AI can eventually bring. Medical advances, power generation, manufacturing capability. Our systems for running society have a lot of problems, economically, politically, epistemologically. These can also be improved with AI assistance.
The real problem is the transition, it’s such a huge shift, and it will happen all at once to everyone, uprooting our idea of the world and our place in it.
What we need is to embrace AI and find a way to make sure that the transition and benefits of AI are distributed instead of concentrated.
For me this looks like the following: companies must commit to retaining some minimum number of employees in every currently existing function, to be determined proportional to their profit taking. This sets a floor on the job losses that can come later when AI really comes on stream.
The justification for this is threefold: firstly, it’s a safety mechanism; it ensures that regardless of the capabilities of an AI system, there are multiple humans working with it to verify its results. If they aren’t verifying diligently, then they’re not doing their jobs.
Secondly, jobs aren’t just a way of making income, they’re wrapped up in identity and meaning for at least some people, and this helps to maintain that existing identity structure across a meaningful cross section of society.
Third, it keeps the economy running, money circulating. You can’t have a market economy without consumers. UBI is one component of this too, but this is both more direct, more useful and more meaningful.
Benefits come to those who have the means to access it, and wealth is a measure of the ability to direct and influence human effort and society.
How exactly do you propose that AI will serve the wellbeing of the worker/middle classes after they've been made obsolete by it?
Goodwill of the corporations working on them? Of their shareholders, well-known to always put welfare first and profit second? Government action increasingly directed by their lobbying?
> What we need is to embrace AI and find a way to make sure that the transition and benefits of AI are distributed instead of concentrated.
Sure. How? We've not done it with any other technological advances so far, and I don't see how shifting the power balance further away from the worker/middle class will help matters.
There's a reason why the era of techno-optimism has already faded as quickly as it's begun.
Let me be clearer: I said “companies must commit to” where the stronger phrasing is “companies are forced to by legislation”. But to begin with this might be voluntarily done by some number of companies.
Also, in this vision of society the AI companies (OpenAI, Anthropic, Google etc.) are taxed heavily. The taxation is redistributed; there is UBI for some fraction of the population, maybe the majority. Others still work in companies mandated to keep employees as I outlined above.
Importantly, we as a society specifically aim to bring about these benefits of AI by using the redistributed funds in part to invest in them.
Part of this is the free market, part is planned government investment. If one fails, maybe the other succeeds. Either way, we try to spread the benefits and importantly to ensure the benefits are actually there in the first place.
https://www.noahpinion.blog/p/plentiful-high-paying-jobs-in-...
Also, my p(doom) is 1.0-epsilon under the status quo without AGI/ASI, due to old age and disease. Under some assumptions, self-interest says that I may as well roll the dice.
His argument is based on comparative advantage, and he says
"The key difference here is that everyone — every single person, every single AI, everyone — always has a comparative advantage at something!".
This is why everyone in the world has a job and a decent salary today, just as humanity will in the future(!!). In reality it is not like this, and he talks about some of the reasons for this in the above mentioned paragraph.
I also disagree massively with his discounting of the scenario where energy gets reallocated to AI instead of food production. He paints that as unlikely, while I think it is almost inevitable. I don't necessarily think it will happen in one fell swoop with force, but it can definitely happen over a generation through pure market forces. The owners of AI just have more money, and will use this to buy energy. The cost of food will rise compared to the value of labour, aka the cost of living will rise, and it will be harder and harder to sustain a family. Since we somehow think that taxing wealth is absurd, we will keep taxing labour.
It seems to me that he tries to wave away the fact that if AI becomes much more productive than humans at everything, then economics predicts that it will be allocated the energy, not humans. And he waves this away with 'politics will save us', which I find unlikely.
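To make the comparative-advantage point concrete, here is a minimal toy model (all numbers purely hypothetical): even when one side holds an absolute advantage in everything, comparative advantage and gains from trade still exist, but nothing pins the weaker side's income above subsistence.

    # Daily output if each party spends the whole day on one good (hypothetical).
    ai_compute, ai_cloth = 1000.0, 1000.0  # AI: absolute advantage in both goods
    hu_compute, hu_cloth = 1.0, 10.0       # human: relatively less bad at cloth

    # Opportunity cost of one unit of cloth, measured in compute forgone.
    print("AI:   ", ai_compute / ai_cloth)   # 1.0
    print("Human:", hu_compute / hu_cloth)   # 0.1 -> human's comparative advantage is cloth

    # Mutually beneficial trade exists at any cloth price between 0.1 and 1.0 compute...
    cloth_price_in_compute = 0.5
    # ...but if food/energy is priced by what AI owners will pay for it:
    food_per_compute = 0.001               # assumed exchange rate
    income_in_food = hu_cloth * cloth_price_in_compute * food_per_compute
    print("human earns", income_in_food, "food/day; needs 1.0")  # 0.005 vs 1.0

The trade is "win-win" in the textbook sense, yet the resulting income sits far below subsistence, which is the gap the comment above points at: comparative advantage guarantees that a trade exists, not that its terms keep you fed.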
A: participate and have a chance to not be part of a perpetual underclass
B: for moral reasons, don't participate, be part of the underclass
I kinda would have hoped for
C: <something> to stop this from happening
Otherwise it's the worst sales-pitch ever
Such situations usually correct themselves violently.
Historically, they did because everyone's capacity for violence was equal.
What about now that the best the average person can do is a firearm against coordinated, organized military with armoured vehicles, advanced weaponry, drones, and sooner than later, full access to mass surveillance?
Also, how will a revolution happen once somebody trains a ML model end-to-end to identify desire for revolution from your search and chat history and communication with like-minded others?
Assuming the intent isn't prevented altogether through algorithmic feeds showing only content that makes it seem less attractive.
> capital is the only thing separating one set of humans from the other, and that separation is large, and the overwhelming majority of humans are in the underclass.
Obviously, it's a matter of degree. You could reasonably argue that any capitalist society meets the criteria depending on your definition of "large", and depending on how you interpret the "capital is the only [sic] thing" part
Then again, maybe this is why Brazil finds itself in this steady state of economic inequality and endemic violence but without critical mass for a civil war: the same diversity that makes Brazilians apt at navigating conflicts is what makes us incapable of finding a common enemy and building a united front?
I left the UK for this reason and live very comfortably on around £15k. I rent a city centre flat with 600 megabit fibre and really good amenities. I have time and space to build what I want.
"Give me the place to stand, and I shall move the earth." - Archimedes.
Unfortunately in the UK it's really hard to survive, let alone actually have time to do anything meaningful. I don't know if it's engineered by big tech/property/finance or some other demon. Maybe the monster in qntm's "There Is No Antimemetics Division" is allegorical.
That said, the real point is paying off your mortgage (or getting fixed low interest). With no mortgage I could almost live on that little in Sweden.
Utilities are particularly cheap here, I pay around £10 per month for water and electricity. Many working people live on around RM3000 or about £550 a month here.
This is echoing a term coined by Varoufakis about an increasing amount of money being held by a smaller and smaller group, not a return to literal peasant existence. It’s not feudalism, it’s ‘neo-feudalism’.
The argument that labour can move is true, except where it can’t. Look at the entire towns of miners made irrelevant with no replacement for their jobs. Sure, you might say they can move halfway across the country to clean toilets, but they have skills, a family and houses somewhere else.
Where the argument of a feudal analogy really rings true is the increasing attempt to go back to extraction of rent for everything. Subscriptions for everything, including homes, are becoming more and more normal. Are we really okay with a world in this form?
The nub is something I've thought about before. My contingency plan for AI turning the industry I work in upside down is to make hay while the sun shines before that point. Have enough saved or invested for a (lean) retirement (depending on how far away that point is).
But what if AI turns every industry upside down? Will there be enough overall economic activity to actually invest in at all? Then we're all poor regardless of how much we've individually saved, or what kind of social safety net exists, simply because there's not enough economic activity to fund it.
That is, at the moment, and I hope forever, a very remote possibility. For a whole host of reasons, technological and economic ones. But if that did happen in the next 20-30 years...
It's not always the case that given an opinion, both extremes are wrong. But in this specific case, it certainly is. Neither the "LLMs will usher in the post-scarcity economy" view nor the "LLMs will doom everyone to unemployment" view (which are remarkably closer to each other than it would appear at first glance) are correct. LLMs are a useful tool with inherent, fundamental limitations that mean that they will never be able to do everything. AGI is currently a pipe dream, and LLMs are not going to be the technology that achieves it.
- "In the future, when labor is fully marginalized..." Hasn't happened in the history of the world, not going to happen in the future either. Some forms of labor were replaced by machines, which then gave rise to new types of jobs, such as building and maintaining the machines. The human cost cannot be neglected, because many people do find it difficult to retrain to other jobs. But on the whole, there are more jobs and higher-paying jobs now than there were a hundred years ago. Higher-paying not just in absolute financial terms, but also in terms of what can be purchased with that money. The richest man of the 19th century couldn't buy an air-conditioned house, not with all his millions.
- "GPT$$$ is surely smart enough to separate you from whatever you have..." Assumes an unbounded growth curve in the "smarts" of AI, and worse than that, assumes that that AI will take the form of an LLM. This is laughable. LLMs will not ever achieve AGI; they are simply not capable of it. If AGI is achievable at all (which I doubt), it will come from one of the currently-neglected avenues of research whose funding is currently being neglected because LLMs are sucking all the metaphorical oxygen out of the room.
- "the neofeudal world": assumes that all companies are like that. Yes, there are many companies that suck to work for because they treat their workers as mere cogs in a machine, instead of as human beings. But not all companies operate that way. If you are being treated as a cog in a machine, start looking for opportunities to jump ship to a better working environment. I've worked in both types of places, and I would be willing to take a big pay cut to work for a company that didn't treat me as a cog. They're out there, but it might take some looking. Tip: ask employees what it's like working for the company, don't just take the interviewers' word at face value.
If one feels morally compelled to pay with their own income to stop these parasites, hats off to them. It is nasty that people are being put in this position. They need support from society. Society needs information and debate about real issues, which requires them to be free from the barrage of falsehoods and yellow journalism. And possibly it would help to have a Roosevelt, but culture is the biggest hindrance to change.
- Yuval Noah Harari, Ideas for the Future
This has been suggested a bunch of times in the comments of HN as well as on other social media, but what exactly would that look like?
As this seems like the ultimate prisoner's dilemma, where the winning solution is always to be the first to make a deal, even if we accept the premise of AI turning us all into an underclass (a prediction often made about revolutionary technology, I might add).
How can

> capital is the only force,
be squared with
> A pile of money will buy you nothing in the neofeudal world.
and
> didn’t operate on capitalist principles
? Capital being the driving force is the very definition of capitalism. It's even in its name!
There are many things that can happen before we're all enslaved by AGI. It might well not happen. We might enter a war, or a cycle of civil wars that change society in a way we can't predict. Or, most probably, some jobs will disappear, some others will become available and AI will become a commodity, just as machines did after the Industrial Revolution. It's extremely hard to predict the future. Telling people to "stop participating" (how? By quitting their job? By fighting the class war?) is a bit irresponsible.
Imagine if you owned a million humanoid robots and a data center with your own super-intelligence. Would you produce doodads to sell to people? Or would you build your own rockets to mine asteroids, fortresses and weapons systems to protect yourself, and palaces for you to live in?
I don't agree that this is where we are headed, but that is the idea. Thinking about this in relation to our current economy is missing the point.
1) Instead of LLMs, imagine large models trained end-to-end on ALL online content and the impact it has on public opinion and discourse. What about when everything is an algorithmic feed controlled by such a model under the control of the elite? You might be resistant (but probably aren't), but in aggregate this will be effective mind control over society.
2) Money directs human effort. Every quantum of bargaining power the worker/middle class lose due to being less needed is the reduction in our ability to have a say in who society should serve and how.
3) Don't forget regulatory capture is a thing. Not just a thing but happening as we speak. Are you still optimistic?
4) Tech is already addicting and ads are already everywhere even without technology that has a theory of mind.
5) Do not forget that humans are social creatures, power over others is not just an accidental byproduct of wealth. Once you're unnecessary for labor, what's left? Fulfilling sexual/emotional/social whims of the wealthy elite? Hunger games? Being a pet in a billionaire's human zoo city so he can brag about his contributions to humanity?
Yes, the transition can be painful and some people will lose out and face hard career changes.
But overall, the multiplicative power of investment only increases, helping to make everything cheaper, and everyone richer.
People focus too much on their own small part of experience - like Claude Code replacing CRUD developers - without appreciating that the LLM revolution (and broader AI like AlphaFold) also includes PhD students who don't need to lose time programming tooling, interns who might have spent their time on tooling but can now use LLMs to learn faster and actually contribute to their fields, and disadvantaged students who can learn more directly and in a personal way, without it being dependent on their physical location.
All of this means you get more experimentation, more ideas, and more successes.
I have several times seen the claim that the change from hunter-gatherer to agriculture lead to a lower quality of life.
But do we really know how to measure "quality of life"?
The system is flawed for different reasons. Tolerance for high vertical integration and oligopolies has seriously damaged the efficiency of the market and limited people's ability to disrupt. Capital concentration has created a new form of aristocracy. They have successfully lobbied to significantly weaken the mechanisms supposed to spread this money, notably inheritance tax. The Supreme Court has significantly altered how democracy functions by lifting limits on funding and giving far too much power to the richest.
The last forty years have basically torn down all the foundations Tocqueville saw as fundamental to the success of the young USA. People should fight to get things back on track.
AI is mostly incidental in that. It doesn't matter if AI temporarily concentrates some wealth if the mechanisms for it to then be spread again are in place.
Not to excuse the guy, but I think that, looking deeper, the situation with geohot is more involved. He grew up in a lower-middle-class household and was lucky to be a smart kid in a time when being a nerd could be a ticket out.
I guess not unlike many of us here on HN.
Unlike many of us, his explorations in the corporate world were all short stints. If I’ve kept tabs correctly, he never stayed longer than a year. Sometimes only for weeks.
Apart from that, I often take the pattern you noticed more as confession, penance, and a "tell your children not to walk my way" kind of message. Maybe I read this stuff too generously.
OP's post had neither.
“Opt out of capitalism” doesn’t work when you’re trying to feed your family. He offers no alternative, speaks from a place of safety with no acknowledgment that the people he’s addressing don’t have the same safety net as he does.[0]
He’s not wrong. We are all fucked. But if it were as simple as “not participating” (whatever that means), then we wouldn’t be.
[0]: to be fair he does address others at tech companies, maybe he assumes that everyone working in big tech has a safety net, which is perhaps not as unreasonable as I first thought.
That's why they are also more egocentric, racist, etc. When people do not feel the threat of society, it is easier to hold opinions that verge outside the norm or that could restrict further employment (and also opinions that are wrongfully or rightfully policed in society).
Ranting about how everyone else should opt-out after he’s filled his cup at the bosom of these behemoths is hard to swallow.
In his defence, he does say that he’s targeting folks who already work in big tech.
Even with the poor judgement to join Musk's twitter at all, he left a few weeks after joining, soon after ChatGPT was released. Before ChatGPT, the idea that the singularity was anywhere near was utterly fringe: tech version of all the new-age stuff, I think Charlie Stross described it as "Rapture for atheists" or something like that.
It's now… well, a lot of people with a lot of power are trying to *make it* be the singularity. I still don't think this is "it", despite how useful I find what we do have, but of the top 10 most valuable companies by market cap in Q4 last year, 9 are chasing it; the money is definitely interested, in a way it just wasn't when he worked at those places.
b) I'd argue that decentralization of power and knowledge has always been a main driver for George Hotz¹²³ and possibly a reason why he is no longer at Facebook, Google, Twitter.
¹) https://geohot.github.io/blog/jekyll/update/2025/10/06/alway...
²) https://geohot.github.io/blog/jekyll/update/2023/11/04/disru...
³) https://geohot.github.io/blog/jekyll/update/2021/06/10/a-cir...
It's a realistic take. I personally wouldn't absolve him of his contribution though.
Capital leads to class difference, often immense class difference, which is not an argument against describing our society as primarily capitalist but in favor of it. If you took away all the food grown in America and the clothes woven in Bangladesh and the laptops manufactured in China, there would be no Amazon, no Google, no Microsoft, no "technofeudalism." The economic base is still defined by the exchange of commodities; it's just that the US does not produce many industrial goods anymore, so the US economy is mostly a service-based economy. Chinese citizens do not experience their lifeworld in terms of service-based industries; they are surrounded by mass markets and complex factories and very material evidence of mechanization which we often do not see directly in the West, only the end product. So to many Americans it feels like they live in a magical society where they click some keys on their laptop and food and clothes and whatever they need shows up on their doorstep--but there are real workers out there tooling all the machines and developing all the architecture to make those things appear, to reduce the basic struggles of life and give time for greater and more advanced forms of social organization beyond the need to survive.
This is not what peasants had; for them, despite having a relatively complex existence, a bad season could and often would kill their entire family. Or a raiding band would take all their food, or they'd die of the plague... life was far more tenuous, and the basic mode of production was not commodity production; it was growing food and animal husbandry. International trade, artisanal crafts, and capital improvements on industrial production were nowhere near the level they were at even in the early modern period. Nothing about our contemporary society resembles this way of living.
Addendum: The claim that somehow everyone in tech could just "stop," like consciously decide to stop creating things, is absurd. Amazon is very good at what it does, but it does not have exclusive control over the trade of all goods in the whole world. Rakuten is a major competitor in Japan, and there are many other companies that have strongholds in their local markets. You take a Bolt in Germany, not an Uber. Chinese users can query DeepSeek, which is surely more proficient in Mandarin than ChatGPT. Even if a state uses its sovereign power to artificially control industry, it only slows the development of capital, since other states may allow their own companies and technologies to flourish, like China is doing now with its electric vehicles. If Amazon does not meet its projections, it fails, its employees all lose their jobs, and Jeff Bezos might even go bankrupt. There is a constant pressure of competition.
As a worker, your goal should not be to arbitrarily stop working--you may not enrich others, but you certainly won't be enriching yourself either. The goal should be to capture far more of the wealth that is the result of your labor. This is only possible through labor organizing, which does not permanently halt the means of production; it only takes control of them. But business continues and people still produce things and do services and enjoy the wealth of those things and services. One should basically desire to live in a wealthy, prosperous society. This article does nothing but ask workers to go into voluntary poverty; it is reactionary and backwards.
It doesn't mean people are literally serfs on their lord's manor growing subsistence crops. Are you serious?
Yet the fact that this was necessary is tangential, the Bell system didn't exist to sell switches or phones. The phone network monopoly was AT&T's fief, the rent was the phone bill everyone had to pay!
If you aren't AMD, nVidia, Google, or Apple how much luck do you think you'll have putting in an order to TSMC for 2nm? Or Samsung? Or Micron? Or Hynix?
No, gpt18Pro won't cost $1B. The buck (or the bubble) will stop somewhere.
I'd be more worried for the people making trades run 1ms faster; they literally create no value to the world beyond something their own peers believe in.
There are billions of people who don't know and don't care what ChatGPT is, and while it might hit them hard, humans are more flexible and less impressionable than most people think (I mean, some people think it's the other way around as well, and they might be right in some situations).
Why?
But if you disagree I have an NFT to sell you
capitalism is artificial intelligence. we don't control capitalism, capitalism controls us and through us builds its next vessel
He’s not wrong, but he’s in a weird position to say that. Also, this post isn’t constructive in any possible way.
But this isn't technical.