If anything, I would argue that the strategic decisions actually can be automated/performed via broader consensus. With that handled, all that's left is the cartel that CEOs have invented to justify their exorbitant pay packages.
CEO compensation is determined by board committees mostly made up of other CxOs. They write letters to each other's shareholders about how valuable CEOs are to build up the edifice.
I wish my compensation were determined by fellow engineers who "truly know my worth". I'd pay it forward if I were on a committee determining colleagues' pay packets.
Sort of a stock suicide pact ...
What's the common element between these successes?
Except, Elon has been largely successful at being the "CEO" of these companies because he attracts talent to them. So...either:
(1) Businesses still will require human talent and if so, I don't see how an AI bot will replace a CEO who is necessary to attract human talent.
OR
(2) Businesses don't require any human talent beneath the CEO and thus a CEO is still necessary, or at least some type of "orchestrator" to direct the AI bots beneath them.
If there was a job description to "throw this football 50 yards into a trash can, a couple of times per week" I wouldn't be able to do the job at all, but an NFL quarterback might be able to do the job for 5 different companies while also Tweeting 50 times a day.
s/operates/operates on/
So they are a surgeon? Wouldn't be surprised at the damage they cause, considering the business results of so many companies.
I thought determining vision/goals/direction was the responsibility of the board. The Chief Executive Officer is supposed to execute the board's wishes.
A CEO's job is (roughly) to maximize a company's valuation. It is not to run the company themselves, not to be nice, not to improve the world. I'm not claiming this is what _should_ be, just how it _is_. By this metric, I think Musk has done really well in his role.
Edit: Tangentially related -- at the end of the musical "Hadestown", the cast raise their glasses to the audience and toast "to the world we dream about, and the one we live in today." I think about that a lot. It's so beautiful, helps enforce some realism on me, and makes me think about what I want to change with my life.
It's called "lying to customers and investors".
> And the bad-person strategy works well for him.
Worked. Tesla is not doing that well recently. Others are a bit better.
Maybe, maybe not. We often see technology reach a threshold that allows for sudden progress, like Newton and Leibniz both coming up with calculus at around the same time (https://en.wikipedia.org/wiki/Leibniz%E2%80%93Newton_calculu...), or Darwin rushing to publish On The Origin of Species because someone else had figured out the same thing (https://en.wikipedia.org/wiki/Alfred_Russel_Wallace).
SpaceX benefited immensely from massive improvements in computing power, sensors, etc.
You can decide if that’s a touch of luck. I’m sure he had a few near misses in combat with an element of luck, too.
I do not want to take credit away from SpaceX for what they achieved. It sure is complex. But it's also possible to give someone excess credit by denying others their due. I don't know which part of 'reusable rockets' you are talking about, whether it's the reusable engines and hardware or the VTOL technology. But none of that was 'invented' by SpaceX. NASA had been doing all of it for decades before, but never had enough funding to put it all together.

Talking about reusable hardware and engines, the Space Shuttle Orbiter is an obvious example: the manned upper stage of a rocket that entered orbit and was reused multiple times over decades. SpaceX doesn't yet have an upper stage that has done that. The only Starship among the 9 to even survive reentry never entered orbit in the first place. As for the 'reusable engine', do you need a better example than the RS-25/SSME of the same orbiter?

Now let's talk about VTOL rockets. Weren't the Apollo LMs able to land and take off vertically back in the 1960s? NASA also had a 'Delta Clipper' experiment in the 1990s that did more or less the same thing as SpaceX's Grasshopper and Starship SN15: 'propulsive hops', multiple times.

Another innovation at SpaceX is the full-flow staged combustion cycle used in the Raptor engine. To date, it is the only FFSC engine to have operated in space. But both NASA and the USSR had tested such engines on the ground. Similarly, Starship's silica heat tiles are entirely of NASA heritage, something they never seem to mention in their live telecasts.
I see people berating NASA while comparing them with SpaceX. How much of a coincidence is it that the technologies used by SpaceX fall squarely within NASA's expertise? The real engineers at SpaceX wouldn't deny those links. Many of them are veterans who worked with NASA to develop them. And that's fine. But it's very uncharitable to not credit NASA at all. The really important question right now is: how many of those veterans are left at SpaceX, improving these things? Meanwhile, unlike SpaceX, NASA didn't keep getting government contracts no matter how many times they failed. NASA would find their funding cut every time they looked like they had achieved something.
> It's the same as Steve Jobs, the Android guys were still making prototypes with keyboards until they saw the all screen interface of the iPhone.
Two things that cannot be denied about Steve Jobs are that he had an impeccable aesthetic sense and the larger-than-life image needed to market his products. But nothing seen in the iPhone was new even in 2007. Full capacitive touch screens, multi-touch technology, etc. were already on the market in niche devices like PDAs. The technology wasn't advanced enough back then to bring it all together. Steve Jobs had the team and the resources needed to do it for the first time. But he didn't invent any of those. Again, this is not to take away credit from Jobs for his leadership.
> Sometimes it requires a single individual pushing their will through an organization to get things done, and sometimes that requires lying.
This is the part I have a problem with. All the work done by others is just neglected. All the damage done by these people is also neglected. You have no idea how many new ideas from their rivals they drive into oblivion to retain their image. Leaders are a cog in the machine, just like everyone else working with them to generate the value. But this sort of hero worship, neglecting everyone else and the heroes' transgressions, is a net negative for the human race. They aren't some sort of divine magical beings.
The entire point of an MBA is networking for executive roles.
The secret sauce is execution.
Hired CEOs are there to execute a board vision.
Made CEOs are there to execute their vision.
Their level of expertise, access, relationships, etc. all scale with the business. If it's big, you need someone well connected who can manage an organization of that size. IANAE, but I would imagine having access to top schools would be a big factor as well.
Maybe he’s just that good at what he does?
If you need to be lucky in meeting the right people, you can increase your chances by spending your evenings in your nearest financial district watering hole. We've easily established that luck can be controlled for, which puts us back into skill territory.
What specifically must one luck out on? Have you tried?
I played every single day, and I played at different locations. I also made sure I performed my pre-ticket rituals which I learned from other lottery winners. Other people could have done the same. It’s absolutely a skill issue.
Every one of us here has an unbroken line of lucky (lucky enough!) ancestors stretching back a billion years or so. Pretending luck is not a thing is silly.
When you're born matters. Where you're born matters. Who you encounter matters. etc. etc. etc.
> What specifically must one luck out on? Have you tried?
I think perhaps we have different definitions of luck.
Sure, but that's not what's being asserted. I am not "permanently locked out" of megacorp CEO roles; I'm just vanishingly unlikely to get one.
There are lots of people who have enough singing/dancing skill to be a mega popstar like Taylor Swift. There just aren't enough slots.
Could I become the next Steve Jobs? Maybe! I'd have to get really lucky.
Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
I assume you’re talking about the former and yet I don’t think you’ve thought this through. I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality. The only way to figure that out is for you to offer up your understanding of what one must luck out on?
Because they're a form of luck?
If you're born in the developed world, that's luck. If you're born to supportive parents, that's luck. If you're Steve Jobs and you wind up high school buddies with Woz in Mountain View, CA, that's luck. White? Luck. Male? Luck. Healthy? Luck. A light touching of psychopathy? Luck!
> Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
Both.
> I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality.
There are many, many people who devote time, perseverance, and grit to their endeavours without becoming a "hugely expensive" CEO. Hence, luck. Is it the only thing? No. Is it a thing? Yes, absolutely.
Those people who devote time - do they devote time to becoming a hugely expensive CEO or just some “endeavours”?
I think we’re fundamentally disagreeing on whether or not lack of luck can be adequately compensated for by exerting more effort. I have not yet heard of a compelling argument for why that’s not the case.
Again, no one said they're requirements. Just significant factors. You don't have to be white, you don't have to be male, you don't have to be from the developed world… but you do have to have some substantially lucky breaks somewhere.
A quadriplegic orphan of the Gaza War might become the next Elon Musk. But the odds are stacked heavily against them.
I did.
> It talks about CEOs in general, not just megacorp ones, even if it does use megacorp CEOs in the intro.
This does not accurately describe the article.
So what does that tell you?
It must be luck plus something else.
That is why I said “significant role”, not “the only requirement”, yes.
And what is typically done is you ignore it. It’s always there, it’s random, and it applies to all samples.
Same with luck and success. You can control for luck, so you focus on what's left.
You aren't the CEO of anything, like most, because you aren't good enough.
Set up two identical agents in a game with rules guaranteeing a winner, and you will end up with a loser who is exactly equal to the winner.
I agree that CEO positions in aggregate are likely generally filled by people better at "CEOing", but there is nothing ruling out "losers" who were equally skilled or even better that just didn't make it due to luck or any of the innumerable factors playing into life.
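The "equally skilled loser" point is easy to demonstrate with a toy simulation (entirely hypothetical, just to make the statistics concrete): give two agents identical skill and a rule that guarantees a winner, and one of them still loses every single game.

```python
import random

def win_rate(skill_a, skill_b, rounds=10_000, seed=0):
    """Pit two agents against each other repeatedly. With equal skill,
    every game still produces a winner and a loser, but who wins is
    decided purely by chance."""
    rng = random.Random(seed)
    wins_a = 0
    for _ in range(rounds):
        # Each agent draws a performance from the same distribution.
        a = rng.gauss(skill_a, 1.0)
        b = rng.gauss(skill_b, 1.0)
        if a == b:
            # Guarantee a winner: break (vanishingly rare) ties by coin flip.
            a += rng.choice((-1e-9, 1e-9))
        wins_a += a > b
    return wins_a / rounds

print(win_rate(1.0, 1.0))  # identical skill: win rate hovers near 0.5
```

Run over one game instead of ten thousand, and you get a decisive winner and a decisive loser who are, by construction, exactly as skilled as each other.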
The movie makes it quite clear, actually.
The Bobs were actually way better than the stereotypical layoff consultants. They even caught on to the crazy management chain and the busywork generated by TPS reports. Sure, they wanted to lay off good engineers, but that doesn't invalidate their genuinely good findings.
Did we ever see him interacting with a customer? I don't remember that part and I can't find any clip of it. We see him in many other situations. We know he was not respected and was a weirdo in many ways, but that doesn't say anything about the quality of his customer communication.
Getting thousands of employees to all work towards a common goal is EXTREMELY difficult. Not to mention selling it to customers, investors, etc.
It doesn’t matter how technically proficient you are - you will fail if you don’t have people skills.
And people skills are far harder to measure, so we basically filter by success (which everyone knows is imperfect).
And there are far, far fewer people with the kind of people skills needed than people who can program a computer. Hence, pay is far higher.
You might as well ask why people don’t use AI pickup coaches.
It is good that CEOs also get some of this "You will be replaced by AI!" flak that we usually hear from big-tech CEOs directed at developers. Do those CEOs think their job is more complex than the software developer job they are so eager to replace? How many times more urgently do we want to replace the CEO, considering salaries? How about we put that many times the amount of money into it that we are putting into trying to replace developers?
In the end, neither will work out any time soon, judging by current "AI"'s actual level of intelligence. I think we still need some 2-3 architectural leaps forward for that. And by that I don't mean simply building bigger ANNs and ingesting more data. It already seems like the returns on that are rapidly diminishing.
You can estimate the difficulty of a job by what fraction of the population can successfully do it and how much special training it takes. Both are reflected in the labor supply curve for that job.
> How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?
Pretty sure that (avg developer pay * number of developers) is a lot more than (avg CEO pay * number of CEOs).
Since businesses need to start somewhere and most startups fail, I think most people who even get into the role of CEO are doing it successfully. However, this is largely due to circumstances and many factors outside of their control. There are also many CEOs ruining their businesses with bad decisions. It is not certain that an "AI" wouldn't do at least as well as those failing CEOs. Similarly, many developers ruin things they touch, introducing tons of complexity and dependencies, breaking user workflows, or making workflows cumbersome without listening to user feedback, and so on.
In short many people do a bad job and businesses are carried by others, who do a good enough job to make a net positive for the final product. Or consequences of messing up are happening slowly, like a slow user drain, or a user replacement with bad actors until good actors start to leave, or any other possibility.
About the pay argument: well, these days you still need a good crew of developers to make the shiny AI toys do what you want them to do, so you are not replacing all of the developers, and you can't calculate like that. If we take some Silicon Valley CEO making 2 million and a developer making 100k-200k, we are still at a ratio of 10x-20x. If we manage to make only one CEO obsolete, or 2 out of 3 CEOs 1.5x as efficient, we have achieved a cost saving of 10-20 developers! Yay!...
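Treating the comment's numbers as given (illustrative figures, not real salary data), the back-of-envelope arithmetic works out like this:

```python
# Illustrative figures only: a hypothetical Silicon Valley CEO vs. a
# hypothetical developer salary band, as stated in the comment above.
ceo_pay = 2_000_000
dev_pay_low, dev_pay_high = 100_000, 200_000

# Pay ratio: one CEO costs as much as 10-20 developers.
ratio_low = ceo_pay // dev_pay_high   # 2M / 200k = 10
ratio_high = ceo_pay // dev_pay_low   # 2M / 100k = 20

print(f"One CEO salary = {ratio_low}-{ratio_high} developer salaries")
```

So on these assumptions, automating away a single CEO pays for keeping 10-20 developers, which is the point the comment is making.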
I thought you meant "AI-startup CEO" for a moment and was going to agree.
My dad had a manager (who was a VP) that he privately nicknamed "VPGPT", because despite being a very polite and personable guy he pretty much knew nothing about the engineering he was ostensibly managing, and basically just spoke in truisms that sounded kind of meaningful unless you do any kind of analysis on them.
I'm not saying that AI would necessarily be "better", but I do kind of hate how people who are utterly incapable of anything even approaching "technical" end up being the ones making technical decisions.
You'll easily find people preaching or selling that sort of thing on Twitter, and the sort of people who are still on Twitter are probably buying it.
(Probably mentally unhealthy people, but still it happens!)
Don't steal this idea it's mine I'm going to sell it for a million dollars.
It’s not entirely irrelevant to the commercial case but you really need to start with technical feasibility and how resilient the job is to mistakes.
And something that is high risk, high leverage and very softskills/experience driven is a bad place for AI imo
Anthropic's shenanigans with their AI vendor should illustrate how far we are from credible AI CEOs
The entire job is almost entirely human to human tasks: the salesmanship of selling a vision, networking within and without the company, leading the first and second line executives, collaborating with the board, etc.
What are people thinking CEOs do all day? The "work" work is done by their subordinates. Their job is basically nothing but social finesse.
So, writing emails?
"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."
There, I just saved you $20 million.
I think that you don't appreciate that charismatic emails are one of the few things that modern AI can do better than humans.
I wouldn't trust ChatGPT to do my math homework, but I would trust it to write a great op-ed piece.
It would actually be nicely self-reinforcing and resistant to a change back, because now it's in the board's interest to use an LLM that cannot be smooth-talked into bad deals. Charisma becomes a negative signal and excludes more and more people.
If it were this easy, you could have done it by now. Have you?
In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.
I confess that I have not yet completed the first step.
Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.
And that trust can only be placed in a person who is innately human, because the AI will make decisions that are holistically good rather than specifically directed towards the above goals. And if some of the above goals are in conflict, the CEO will make decisions that benefit the more powerful group because of an innately uncontrollable reward function, which is not true of AI by design.
You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?
That applies to every call to replace jobs with current-gen AI.
But I can't think of a difference between CEOs and other professions that works out in favor of keeping the CEOs over the rest.
Everyone is indispensable until they aren't.
I can think of plenty, but none that matter.
As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.
The market they serve is themselves and powerful shareholders. They don't serve finicky consumers who have dozens of low-friction alternatives, as in AI-slop YouTube videos or logo generation for a new business.
A human at some point is at the top of the pyramid. Will CEOs be finding the best way to use AI to serve their agenda? They'd be foolish not to. But if you "replace the CEO", then the person below that is effectively the CEO.
Or maybe the person you're describing is right, and CEOs are just like a psy-rock band with a Macbook trying out some tunes hoping they make it big on Spotify.
The real job is done behind the curtain. Picking up key people based on their reputation, knowledge, agency, and loyalty. Firing and laying off people. Organizational design. Cutting the losses. Making morally ambiguous decisions. Decisions based on conversations that are unlikely to ever be put into bytes.
Lots of people have legal obligations.
In this case, I assume you're referring to a fiduciary duty (i.e. to act in the best interests of the company), which is typically held not by the CEO but by the directors.
Ultimately the responsibility to assign daily operation of the company rests with the board, both legally and practically, as does the decision to use a human or AI CEO.
More practically, legal accountability would be placed on the individuals approving the LLM's actions and/or the entity providing the LLM service. The latter aspect is why many AI vendor deals fall through: everything is awesome until the contract comes and the vendor wants to take no responsibility for anything that results from their product.
Anything that removes power from CEOs and gives it to the workers should be highly encouraged. Economic democracy is the final frontier of human empowerment, and giving workers the means to have democratic control over the economy can only unlock more human potential, not less.
High performance sports teams have a captain that is often elected in some form from the team.
Likewise the crew of a pirate ship used to elect their captain.
Both examples run contrary to your point, and there's no reason you couldn't have something similar in business: a cooperative that elects a CEO, rather than it being done by a board of other CEOs.
Pretty sure the moment you do this, the workers liquidate the company and distribute the assets among themselves, as evidenced by the acceptance rate of voluntary severance offers in many past downsizings, such as the Twitter one.
The only thing that will do this is if workers are the resource bottleneck.
> Economic democracy is the final frontier of human empowerment and giving workers then means to have democratic control over the economy can only unlock more human potential, not less.
This already exists. It's called free enterprise and freedom of association.
Unless of course you mean that nobody can own or expend resources without (nominally) everybody agreeing... which has also been tried, and failed horribly.
Except replacing CEOs with AIs will not do this.
It won't make the companies worse run; why would workers want to destroy their means of living? CEOs do this with no skin in the game; the workers should take that skin, as they will always be better stewards than a single tyrant.
Where is your evidence that companies won't be worse run? Workers could just vote to give themselves massive raises and hemorrhage the company, ironically like how some private equity firms operate, but en masse. No one would start companies in this sort of scenario, causing the economy to fall, especially in comparison to companies that don't have this sort of voting system.
It seems like you'd need some sort of fairly radical control structure (say, no board, just ai interacting directly with shareholders) to get around this. But even this ignores that the automation is not neutral, it is provided by actors with incentives.
It would be an interesting experiment to promote an executive assistant to CEO though.
I swear there’s a joke or cautionary tale here somewhere about “first they came for..” or something along those lines. The phrasing escapes me.
Maybe the problem isn't that you can't automate a CEO; it's that the actual tangible work just isn't worth as much as some companies pay for it, and this thread is touching a few too many raw nerves.
Well, either way it’s hilarious.
My business experience is that company culture is very important to a company’s success and I’m just doubtful that this can be created through AI.
(Surprisingly though, that's enough for them to recognize that you're a human. Their models can identify your complex thought progression in your prompts - no matter how robotic your language is.)
The REAL problem here is the hideous narrative some of these CEOs spin. They swing the LLMs around to convince everyone that they are replaceable, thereby crashing the value of the job market and increasing their own profits. At the same time, they project themselves as some sort of super-intelligent divine beings with special abilities without which the world will not progress, while in reality they maintain an exclusive club of wealthy connections that they guard jealously by ruining the opportunities for the others (the proverbial 'burning the ladder behind them'.) They use their PR resources to paint a larger-than-life image that hides the extreme destruction they leave behind in the pursuit of wealth - like hiding a hideous odor with bucketfuls of perfume. These two problems are the two sides of a coin that expose their duplicity and deception.
PS: I have to say that this doesn't apply to all CEOs. There are plenty of skilled CEOs, especially founders, who play a huge role in setting the company up. Here I'm talking about the stereotypical cosmopolitan bunch that comes to our mind when we hear that word. The ones who have no qualms in destroying the world for their enjoyment and look down upon normal people as if you're just fodder for them.
For the soft CEO skills, not so much.
Not that that's a deal-breaker. I have a vision of an AI CEO couched as a "strategic thought partner," which the wet-CEO just puppets to grease the skids of acceptance among the employees.
I'd fully trust an AI CEO's decision making, for a predictable business, at least. But some CEOs get paid a lot (deservedly so) because they can make the right decisions in the thick fog of war. Hard to get an AI to make the right decision on something that wasn't in the training corpus.
Still, business strategy isn't as complex as picking winners in the stock market.
I think an AI could be strong at a few skills, if appropriately chosen:
- being gaslightingly polite while firmly telling others no;
- doing a good job of compressing company wide news into short, layperson summaries for investors and the public;
- making PR statements, shareholder calls, etc; and,
- dealing with the deluge of meetings and emails to keep its subordinates rowing in the same direction.
Would it require that we have staff support some of the traditional soft skills? Absolutely. But there’s nothing fundamentally stopping an AI CEO from running the company.
I have zero idea what most CEOs that I've worked for do ... and they seem to want it that way.
Every time the LLM CEO gets caught doing a crime and goes to 'jail', the LLMs on the exec board can vote to replace it with another instance of the same LLM model.
Forget 'limited liability', this is 'no liability'.
Every year I feel a bit less crazy in my silly armchair speculation that the Second Renaissance from the Animatrix is a good documentary. If AI "takes over" it will be via economic means and people will go willingly until they have gradually relinquished control of the world to something alien. (Landian philosophers make the case that hyperstitional capitalism has already done this)
I would take the over that this will happen sooner than later -- when it's proven to make a lot of money to have an AI CEO, suddenly everyone will change their tune and jump on the bandwagon with dollar signs in their eyes, completely ignoring what they are giving up.
Except unlike e.g. the metaverse/cryptocurrency bandwagon of yesteryear, there's no getting off.
Also, I think it misses the critical point. C-suite executives operate under immense pressure to deliver abstract business outcomes, but the lack of clear, immediate feedback loops and well-defined success metrics makes their roles resistant to automation. AI needs concrete reward functions that executive decision-making simply doesn't provide.
1. I will take five automated CEOs. If I can split my company into five distinct companies (one per product), it would be amazing. We are splitting the company into two to streamline focus on different/incompatible industries, and I am dreading the process of finding another CEO. It is very, very hard.
2. I know a lot of CEOs. It helps. I didn't know a single one when I started. It is no more a cult than my programmer's peer group was.
3. Did I tell you how hard it is to find a good CEO? It is VERY, VERY hard. Think of hiring a great product guy with agency to do whatever needs to be done, with people skills to attract talent, a sales drive, and a willingness to deal with finance & legal. Oh, and I am in the tech field, so I need him to be very hardcore technical. Your mileage might vary, but this is who I need. Anyone who has that is running their own companies. Oh, and the person has to have a proven track record. I cannot let someone unproven ruin the company and well-being of hundreds of employees and tens of thousands of customers.
4. I don't believe CEOs are special in any way beyond how most other professionals are special. There are probably some underlying qualities, but they're all so different.
5. Some CEOs got there because they were lucky, but they didn't stay there for long because of luck. It is very, very simple to screw up as a CEO.
6. Growing someone within an organization to become a CEO is very hard. We are trying - giving some people more and more responsibilities, trying to involve them in more and more aspects of the organization. The filter is - repeatable success. You don't have to succeed all the time, but you have to succeed most of the time. Most people don't want the pressure, aren't interested in certain aspects, or are unsuccessful more often than they should.
7. Boards are not a cult either; they don't have the CEO's back. Boards represent investors (pension funds, wealthy individuals, etc.) and will oust the CEO if the company's performance suffers. They are willing to pay the CEO a lot because ... it is so hard to find a good CEO.
if (marketCrash) { sendEmailToGovernmentAskingForBailout(); }
An even more interesting one is: What will we reward?
We've been rewarding labor quantity, as well as quality via higher wages - as motivation and as incentives for more education. This reflected the productivity primacy of knowledge work in modern economies, but that might not be the case down the road.
We've also been rewarding capital. Originally this was a way for the elites to keep themselves in place (a.k.a. economic rents), but in modern times it's been more of an entrepreneurial incentive (a.k.a. economic profits.)
Without the economic profit rationale, there's no reason to reward capital accumulation. Only pro-profit decisions are good for society, pro-rent decisions are awful. If there's no profit to incentivize, capitalism is just bad all around.
If AI becomes a better profit decision-maker than an entrepreneur, any humans left in the loop are nothing but high-rollers gambling with everyone else's money.
It’s been tried before, it didn’t work out well.
Whatever the merits of the argument here (and my bolshie side has also flippantly pushed it in the past) the motivation and thrust of the essay needs to be considered in that ideological grounding.
The main job of CEOs is not decision making. 99% of company decisions are made below the level of CEO. For the ones that make it to CEO, the board tends to have final say.
It’s a leadership role where people interactions are the most important. The CEO sets the tone, gets people on the same page, and is the external face of the company.
It’s silly to think a robot can replace that.
The investors can organize the government bailouts themselves. You don't need a CEO.
If you've ever worked at a company that's a chaotic shitshow, you'll know how strong the effect of the CEO is - it always comes down to the guy at the top not being up to it.
The leverage of the role is enormous, and the strength of someone who can carry out this role well for a large company is sky high - not many such people in the world, and they only need one.
So the math all comes out very straightforward: even at obscene looking salaries, they're still a bargain.
Could be good, but could also be bad if it turns out the AI is able to be even more ruthless in how it treats its workforce.
The good news is that it doesn't need to be very accurate in order to beat the performance of most execs anyways.
Where "very often" means "almost never?"
This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.
So you will find people who make average salaries defending the stratospheric salaries of CEOs because they believe they'll one day be the one benefitting or they've fallen for some sort of propaganda such as the myth of meritocracy or prosperity gospel.
Our entire economy is designed around exploiting working people and extracting all of their wealth to a tiny portion of the population. And we're reaching the point where the bottom 50% (if not more) have nothing left to exploit.
AI and automation could be used to improve all of our lives. It isn't and it won't be. It'll be used to suppress wages and displace workers so this massive wealth transfer can be accelerated.
I get the point of the article. But those with the wealth won't let themselves be replaced by AI, and seemingly the populace will never ask why they can't be replaced until economic conditions deteriorate even further.
It's not that difficult to get into the top 1%. Most Americans earn a top 1% income. Even the top 1% in America starts at only a salary of around $500k. It's possible 19% of survey takers were in the top 1%, or were on a path to make that in the future.
I don't see how it's definitionally untrue to believe you could make $500k a year at some point...Let alone $34,000 a year...
1% of Americans earn a top 1% income. They weren't being asked "do you make more than an amputee kid in Gaza?"
> It's possible 19% of survey takers were in the top 1%…
There's a whole field of math devoted to preventing this. Polling works quite well, all things considered.
> They weren't being asked "do you make more than an amputee kid in Gaza?"
Context matters.
Often posed as a multiple choice question.
I'm not; this sort of thing is quite well documented.
https://phys.org/news/2024-09-people-underestimate-income.ht...
> Barnabas Szaszi and colleagues conducted four studies to explore how well people understand the wealth held by others. In one study, 990 US residents recruited online were asked to estimate the minimum annual household income thresholds of various percentiles of American earners.
But more relevant is the top 1% of net worth is currently ~$11.6M [1], which is vastly more unattainable.
Also, the net worth of the bottom 99% is skewed by house prices. You might be sitting on a house worth $1M but when every other house also costs $1M and you have to live somewhere, you don't really have a net worth of $1M.
[1]: https://finance.yahoo.com/news/among-wealthiest-heres-net-wo...
I don't know how that particular poll was worded, but in general, if you're a politician who rails against the top 1%, you might suffer from the fact that people have widely varying conceptions of who the 1% are.
Because building psychopathic AI's is - at the moment - still frowned upon.
But as someone who has the honor of working with a really good CEO, I can definitely say that you cannot automate them. Maybe in some streamlined corporate machine like IBM or something, but not in a living, growing company.