Regarding the autopilot branding, would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?
Tesla has had it both ways for ages - their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".
Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.
The average person thinks they know what an autopilot does; they're just wrong.
I think the example you provided supports that.
Pilots undergo rigorous training with exam after exam they must pass.
No one is handed the keys to a Boeing 747 after some weekly evening course and an hour's driving test.
To me, it's reasonable to assume that the "autopilot" in a car I drive (especially back in 2019) is going to defer to any input override that I provide. I wouldn't want it any other way.
With Robotaxi the liability will get even higher, as it will clearly be 100% the company's fault.
1. Insurance companies price in the risk, and insurance pricing absolutely influences manufacturers (see the absolute crap that the Big 3 sold in the 70s).
2. The government can force a recall based on a flaw whether or not the manufacturer agrees.
What's the difference? And does it matter?
Both are misleadingly named, per the OP:
> In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”
> Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.
> This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.
FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect this sort of accident to happen with FSD.
The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.
FSD at one point had settings for whether it could roll through stop signs, or how much it could exceed the speed limit by. I've watched it interpret a railroad crossing as a weirdly malfunctioning red light with a convoy of intermittent trucks rolling by. It took the clearly delineated lanes of a roundabout as mere suggestions and has tried to barrel through them in a straight line.
I'd love to know where your confidence stems from.
My argument was that the idea that the name Autopilot is misleading comes not from Tesla naming it wrong but from what most people think "autopilots" on aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)
Autopilot on a Tesla historically did two things: traffic-aware cruise control (keeps a gap from the car in front of you) and staying in its lane. If you tell it to, it can suggest and change lanes. In some cases, it'll also take an exit ramp (which was called Navigate on Autopilot).
Autopilots on planes do roughly the same: they keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane (like Tesla drivers still get you on the highway and off).
Full Self-Driving (to which they've now added the word "Supervised", probably because of court cases, though it was always quite obvious it was supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot, btw) is a different AI model that even stops at traffic lights, navigates parking lots, everything. That's the true "summon my car from LA to NY" dream, at least.
So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.
But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections)? I don't know the details of each recent crash, though.
Tesla’s Autopilot being unable to swap from one road to another makes it way less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.
Can you elaborate? My knowledge is very limited, but it's of very real airplane autopilots in little Cessnas and Pipers: they are in fact far simpler than cars. They are a simple control feedback loop that maintains altitude and heading, that's it. You can crash into the ground, a mountain, or other traffic quite cheerfully. I would not be surprised to find adaptive cruise in cars is a far more complex system than a basic aircraft "autopilot".
Levels of safety are another consideration: car autopilots don't use multiple levels of redundancy on everything, because cars can stop without falling out of the sky.
It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude you want and over a reasonable timescale it will do just that.
That seemingly shifts the difficulty from the autopilot to the airframe. But that's not actually good enough; it doesn't keep an aircraft flying when it's missing a large chunk of wing, for example. https://taskandpurpose.com/tech-tactics/1983-negev-mid-air-c...
Instead, you're talking about the happy path, and if we accept the happy path as enough, there are weekend-project equivalents of self-driving cars built with minimal effort. Being production worthy is about more than being occasionally useful.
Autopilot is difficult because you need to do several things well or people will definitely die. Self-driving cars are far more forgiving of occasional mistakes, but again it's the "or people die" bits that make it difficult. Tesla isn't actually ahead of the game; they are just willing to take more risks with their customers' and the general public's lives.
Accidents like this are obviously tragic, but let's remember that self driving software is already ~10x safer than a human. Unfortunately, lawsuits like this will slow down the rollout of this life-saving technology, resulting in a greater loss of life.
This is, technically speaking, pure bullshit. You have no proof because none exists.
I can understand the argument that in the abstract over-regulation kills innovation but at the same time in the US the pendulum has swung so far in the other direction that it’s time for a correction.
It should discourage them from making unsafe products. If it's not economical for them to make safe products, it's good that they go bankrupt and the economic resources - talent, money - go to someone else. Bankruptcy and business failure are just as fundamental to capitalism as profit.
Businesses also claim that, all the time. We need some evidence.
I remember doctors claiming that malpractice lawsuits were out of control; the research I read said it wasn't an economic issue for doctors, and that malpractice itself was out of control.
My read is that people overpowered the safety interlock, after which the lid (predictably) flew off, and they were injured (mostly by the hot steam and bits of food). I think it's ridiculous for people to expect safety mechanisms to be impossible to bypass, but maybe you disagree!
It seems clear that "autopilot" was a boisterous overclaim of its capabilities that led to people dying.
It may be minorly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller numbers don't act as an effective deterrent to people like Elon Musk.
So... informally, "Tesla Co-Pilot" => "You're still the pilot but you have a helper", vs "Tesla Autopilot" => "Whelp, guess I can wash my hands and walk away b/c it's AuToMaTiC!"
...it's tough messaging for sure, especially putting these power tools into people's hands with no formal training required. Woulda-coulda-shoulda, but similar to the 737MAX crashes: should "pilots" of Teslas have been required to train on the safety and navigation systems before they were "licensed" to use them?
Tesla Autopilot seems to mostly drive hubris. The fine print says you're still supposed to maintain control. They don't have sensors as sophisticated as competitors' because Elon decreed "humans don't have LiDAR, so we don't need to pay for it."
Nobody is saying it has to be perfect, but Tesla hasn't demonstrated that it's even trying.
> The company essentially argued that references to Elon Musk’s own public claims about Autopilot, claims that Tesla actively used to sell the feature for years, were somehow unfair to present to a jury. Judge Bloom was right to reject that argument.
Of course, since Elon Musk has lied and over-promised a lot about Tesla's self-driving technology. It's an interesting defense to admit your CEO is a liar and can't be trusted.
> Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?
I believe Musk wants to hype humanoid robots because he can't get away with irrationally hyping electric cars or self-driving technology like he used to.
Tesla was never a car company, their real product is sci-fi dreams.
By the fact that they don't have autonomous driving. And this very judgement demonstrates that.
If you have to keep your full attention on the road at all times and constantly look out for the 10% case where the autopilot may spectacularly fail, it instantly turns off the vast majority of prospective users.
Funny enough, the tech that Musk's tweets and the Tesla hype machine have been promising for the last decade is actually on the streets today. It's just being rolled out by Waymo.
That's like paying for a "self-juicing juicer" that only works with proprietary juice packages sold through an overpriced subscription.
Edit: Mostly a criticism. I have no bone to pick with Elon, but subscription slopware is the reason why Chinese EVs are more desirable to average Joes like me.
Waymo on the other hand has a level 4 system, and has for many years, in many cities, with large service areas.
Tesla is unquestionably in the dust here, and the delusional, er, faithful are holding out for this mythical switch flip where Elon snaps his fingers and every Tesla turns into a level 4 robotaxi (despite the compute power in these cars being on the level of an RTX 5090, and the robotaxis having custom hardware loadouts).
If you want to call this "autonomous" well then we are arguing semantics. But I think colloquially, autonomous means "no human".
Though they did update the Model Y (looks like a duck), they just cancelled the Model S and X.
In 2 years Tesla will be replacing most factory workers with fully autonomous robots that will do most of the work. This will generate trillions in revenue and is totally definitely trust me bro possible.
Expect huge updates on this coming in the near future, soon. Tesla will be the most valuable company on Earth. Get in the stock now.
(cars, solar panels, energy storage, and robotaxis are no longer part of the roadmap because optimus bots will bring in so much money in 2 years definitely that these things won't matter so don't ask about them or think about them thanks.)
Tesla Model 3 highest overall in owner satisfaction.
"Left in the dust?"
That's why Chevy has a bunch of awards like that, including "Highest initial quality".
> Consider also how many lives have been saved by the autopilot.
> Be careful what you wish for.
How many? Tell me.
https://www.youtube.com/watch?v=A3K410O_9Nc
https://www.youtube.com/watch?v=Qy6SplEn4hQ
So? I just watched those. They don't prove anything about "lives [that] have been saved by the autopilot." They all look like scenarios a human driver could handle (and I, personally, have handled situations similar to some of those). If autopilot is saving lives, you have to show, statistically, that it's better than human drivers in comparable conditions.
Also the last one appears to be of a Tesla fanboy who had just left a Tesla shareholder meeting, and seems pretty biased. I'd say his Cybertruck actually reacted pretty late to the danger. It was pretty obvious from the dashcam that something was wrong several seconds before the car reacted to it at the last second.
I think that's true. Though I recall that Waymo limits their cars to safer and more easily handled conditions, which is totally the right thing to do, but it probably means that statistic needs an asterisk.
> Not sure if that generalizes to Tesla FSD, though.
I don't think it does. They're two totally different systems.
And I have doubts about that man's reliability.
You can also google "how many lives has tesla autopilot saved?" and the results suggest that the autopilot is safer than human drivers.
If you accidentally kill someone with your car, do you think you should have to pay $243m?
It would be a large portion of my net worth for sure. Maybe also prison time. Can we put Autopilot or Tesla in a prison?
If you have billions of dollars, and somehow can't go to prison, yes you should. If not in compensation to the victim, then in some kind of fine scaled to wealth. The amount paid needs to be high enough to be a deterrent to the individual who did wrong.
Besides, nobody makes you turn on the autodrive.
What is autodrive? Are you talking about basic autopilot, enhanced autopilot, or full self-driving? They are separate modes:
https://en.wikipedia.org/wiki/Tesla_Autopilot#Driving_featur...
Which revision of the hardware and software is the "good one"? Remember that Tesla claimed in 2016 that all Teslas in production "have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver". But that was, of course, a lie:
https://web.archive.org/web/20240730071548/https://tesla.com...
https://electrek.co/2025/10/22/tesla-changes-all-cars-have-s...
What Tesla used to claim was "full autonomy" is now called "Full Self-Driving (supervised)", whatever that's supposed to mean. How many times has "Full Self-Driving (supervised)" gone dangerously wrong but was stopped? How many times was supervision not enough:
https://electrek.co/2025/05/23/tesla-full-self-driving-veers...
Show me some concrete numbers to back your claims. If you can't do that then I think you've fallen victim to Tesla's dishonest marketing.
My current bet is that optimus will fail spectacularly and Tesla gets left far behind as Rivian's R2 replaces it.
One thing I will note: I know folks that work at TSLA. Musk is more of a distraction. If he goes and if competent leadership is brought in, there's still enough people and momentum to make something happen...
You’re a lot more optimistic about this than I am.
Appealing is expensive because they have to post a bond with 100% collateral, and pay for it yearly. In this case, probably around $8 million a year.
So in general it's not worth appealing for 5 years unless they think they'll knock off 25-30% of the judgment.
Here it's the first case of its kind, so I'm sure they will appeal, but if they lose those appeals, most companies that aren't insane would cut their losses instead of trying to fight everything.
> Tesla has indicated it will appeal the verdict to a higher court.
The appeal will go to the 11th circuit.
"""Results. At a discount rate of 3 percent, males and females aged 20-24 have the highest PVLE — $1,517,045 and $1,085,188 respectively. Lifetime earnings for men are higher than for women. Higher discount rates yield lower values at all ages."""
1. It could have been ANY car with similar auto-steer capabilities at the time.
2. Why the hate? Because of some false promise? Because as of today the same car would save the guy in the exact same situation, since FSD now handles red lights perfectly. Far better and safer vs ANY other tech included in the avg car price of the same segment ($40-50k).
Perfectly isn’t a descriptor I would use. But this is just anecdotal.
Another name for "false promise" when made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it "proximate cause to manslaughter".