(Though, there is still an element of owner/operator maintenance for level 4/5 vehicles -- e.g., if the owner fails to replace tires below 4/32", continues to operate the vehicle, and it causes an injury, that is partially the owner/operator's fault.)
The product you buy is called "FSD Supervised". It clearly states you're liable and must supervise the system.
I don't think there's a law that would allow Tesla (or anyone else) to sell a passenger car with an unsupervised system.
If you take Waymo or Tesla Robotaxi in Austin, you are not liable for accidents, Google or Tesla is.
That's because they operate under limited state laws that allow them to provide such a service, but the law doesn't allow selling such cars to individuals.
That's changing. Quite likely this year we will have a federal law that allows selling cars with fully unsupervised self-driving, in which case the insurance/liability will obviously land on the maker of the system, not the person present in the car.
FSD isn't perfect, but it is everyday amazing and useful.
If the company required a representative to sit in the car with you and participate in the driving (e.g. by monitoring and taking over before an accident), then there's a case to be made that you're not fully autonomous.
> https://news.ycombinator.com/newsguidelines.html
You keep posting the same thing across the thread. Do better.
This is news to me. This context seems important to understanding Tesla's decision to stop selling FSD. If they're on the hook for insurance, then they will need to dynamically adjust what they charge to reflect insurance costs.
Also, self-driving is a feature of a vehicle someone owns; I don't understand how that should exempt anyone from insuring their property.
Waymo and others are providing a taxi service where the driver is not a human. You don't pay insurance when you ride Uber or Bolt or any other regular taxi service.
Well practically speaking, there’s nothing stopping anyone from voluntarily assuming liability for arbitrary things. If Tesla assumes the liability for my car, then even if I still require my “own” insurance for legal purposes, the marginal cost of covering the remaining risk is going to be close to zero.
And the system is designed to set up drivers for failure.
An HCI challenge with mostly autonomous systems is that operators lose their awareness of the system, and when things go wrong you can easily get worse outcomes than if the system was fully manual with an engaged operator.
This is a well known challenge in the nuclear energy sector and airline industry (Air France 447) - how do you keep operators fully engaged even though they almost never need to intervene, because otherwise they’re likely to be missing critical context and make wrong decisions. These days you could probably argue the same is true of software engineers reviewing LLM code that’s often - but not always - correct.
Really? That's crazy.
The last few years of Tesla 'growth' show how this transition is unfolding. S and X production is shut down; just a few more models left to shut down.
Any car has varying degrees of autonomy, even the ones with no assists (it will safely self-drive you all the way to the accident site, as they say). But the car is either driven by the human with the system's help, or is driven by the system with or without the human's help.
A car can't have 2 drivers. The only real one is the one the law holds responsible.
Cars are traditionally sold with the customer holding liability. Nothing stops a car maker (or even an individual dealer) from selling cars today while taking on all the insurance liability, in any country I know of. They don't, for what I hope are obvious reasons (bad drivers will be sure to buy those cars since it's a better deal for them, and in turn a worse deal for good drivers), but they could.
Self-driving is currently sold with the customer holding liability because that is how it has always been done. I doubt it will change, but only because I doubt there will ever be enough advantage to make it worth it for someone else to take on the liability - but I could be wrong.
In reality, you acquired a license to use it. Your liability should only go as far as you have agreed to indemnify the licensor.
If Tesla didn't want Lemonade to provide this, they could block them.
Strategically, Tesla doesn't want to be an insurer. They started the insurance product years ago, before Lemonade also offered this, to make FSD more attractive to buyers.
But the expansion stalled, maybe because of state bureaucracy, or maybe because Tesla shifted priority to other things.
In conclusion: Tesla is happy that Lemonade offers this. It makes Tesla cars more attractive to buyers without Tesla doing the work of starting an insurance company in every state.
If the math was mathing, it would be malpractice not to expand it. I'm betting that their scheme simply wasn't workable, given the extremely high costs of claims (Tesla repairs aren't cheap) relative to the low rates that they were collecting on premiums. The cheap premiums are probably a form of market dumping to get people to buy their FSD product, the sales of which boosts their share price.
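To make the "math wasn't mathing" point concrete, here is a back-of-the-envelope loss-ratio calculation. All the numbers below are hypothetical assumptions for illustration, not Tesla's actual figures; the point is only that expensive claims against discounted premiums push the loss ratio above 1, meaning every policy pays out more than it collects.

```python
# Hypothetical numbers (assumptions, not real Tesla data) showing how
# cheap premiums plus expensive repairs can make an insurer lose money.
annual_premium = 1800       # assumed discounted premium, $/year
claim_frequency = 0.08      # assumed claims per policy per year
avg_claim_cost = 30000      # assumed average repair/claim cost, $

expected_loss = claim_frequency * avg_claim_cost   # ~$2400 per policy
loss_ratio = expected_loss / annual_premium        # >1 means losing money

print(f"expected loss per policy: ${expected_loss:.0f}")
print(f"loss ratio: {loss_ratio:.2f}")
```

Under these assumed inputs the loss ratio is about 1.33, i.e. $1.33 paid out per $1.00 collected, which is only sustainable as a subsidy for something else (like FSD sales).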
It'll come back.
Hey Lemonade or Tesla, if you find this, let's collab - I'm a founder in Sunnyvale, insurtech vertical at pnp.
They released the Tesla Insurance product because their cars were excessively expensive to insure, increasing ownership costs, which was impacting sales. By releasing the unprofitable Tesla Insurance product, they could subsidize ownership costs, making the cars more attractive to buy right now, which pumped revenues immediately in return for an "accidental" write-down in the future.
[1] https://peakd.com/tesla/@newageinv/teslas-push-into-insuranc...
Now that they are offering this program, they should start getting much better data by being able to correlate claims with actual FSD usage. They might be viewing this program partially as a data acquisition project to help them insure autonomous vehicles more broadly in the future.
What do you mean?
In fact, Tesla Insurance - the people who already have direct access to the data - already loses money on every claim [1].
[1] https://peakd.com/tesla/@newageinv/teslas-push-into-insuranc...
Teslas only do FSD on motorways where you tend to have far fewer accidents per mile.
Also, they switch to manual driving if they can't cope, and because the driver isn't paying attention this usually results in a crash. But hey, it's in manual driving, not FSD, so they get to claim FSD is safer.
FSD is not and never will be safer than a human driver.
For the rest - many of them live in a place where not enough others will follow the same system, and so they will be forced to own a car just like today. If you live in a less dense area but still manage to walk/bike almost everywhere (as I do), renting a car is on paper cheaper for the few times you need one - but in practice you don't know about that need several weeks in advance, and so they don't have one they can rent to you. Even if you do know weeks in advance, sometimes they don't have one when you arrive.
If you live in a very dense area, such that you regularly use transit (but sometimes walk or bike) and only need a car a few times per year, then not owning a car makes sense. In this case the density means shared cars can be a viable business model despite not being used very much.
In short, what you say sounds insightful, but the reality of how cars are used means it won't happen for most car owners.
Or, if they are Hertz, they might have one but refuse to give it to you. This happened to my wife. In spite of payment already having been made to Hertz corporate online, the local agent wouldn't give up a car for a one-way rental. Hertz corporate was less than useless, telling us their system said there was a car available, and suggesting we pay them hundreds of dollars again and go pick it up. When I asked the woman from corporate whether she could actually guarantee we would be given a car, she said she couldn't. When I suggested she call the local agent, she said she had no way to call the local office. Unbelievable.
Since it was last minute, there were... as you said, no cars available at any of the other rental companies. So we had to drive 8 hours to pick her up. Then 8 hours back, which was the drive she was going to make in the rental car in the first place.
Hertz will hurt you.
Half-jokes aside, if you don't own it, you'll end up paying more to the robotaxi company than you would have paid to own the car. This is all but guaranteed based on all SaaS services so far.
If all of those people switch to cars, you end up with it taking an hour to travel 1 mile by car.
It's almost as if they have buses for a reason.
Efficient for whom is the problem.
The point of a car is that it takes you door to door. There's no expectation of walking three blocks from a stop; many US places are not intended for walking anyway. Consider heavy bags from grocery shopping, or similar.
Public transit works in proper cities, those that became cities before the advent of the car, and were not kept in the shape of large suburban sprawls by zoning. Most US cities only qualify in their downtowns.
Elsewhere, rented / hailed self-driving cars would be best. First of all, fewer of them would be needed.
Cars are mostly idle and could be cheaper if shared. But why make them significantly cheaper when you can match the price and extract more profits?
Even better — charge 10% less and corner the market! As long as nobody charges 10% less than you…
And with just 6 people, the overhead of an imperfect route and additional stops will be measured in minutes.
And of course, it's pretty easy to imagine an option to pay a bit more for a fully personal route.
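A rough estimate of that overhead, under assumed numbers (per-stop dwell time and detour lengths are illustrative guesses, not measured data): even in the worst case where you board first and every other rider adds a stop plus a small detour, the added time stays in single-digit minutes.

```python
# Back-of-the-envelope sketch (assumed numbers) of pooled-ride overhead:
# each additional rider adds one stop plus a small routing detour.
riders = 6
stop_time_min = 1.0      # assumed dwell time per extra pickup/dropoff
detour_min = 0.5         # assumed extra driving per added rider

extra_stops = riders - 1                           # stops beyond your own
overhead = extra_stops * (stop_time_min + detour_min)
print(f"worst-case added time: {overhead} minutes")   # 7.5 minutes
```

With these assumptions the worst case is 7.5 minutes, which is the scale of overhead the comment is pointing at; paying extra for a direct route simply buys that time back.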
Yeah, this would rely on robust competition.
Subscription for self driving will almost be a given with so many bad actors in tech nowadays, but never even being allowed to own the car is even worse.
And what do you even mean by subscription to changes to the law?
Law - when a government changes the driving laws. Government can be federal (I have driven to both Canada and Mexico. Getting to Argentina is possible, though I don't think it has ever been safe. Likewise it is possible to drive over the North Pole to Europe), or state (or whatever the country calls their equivalent). When a city changes the law they put up signs, but if a state passes a law I'm expected to know it even if I have never driven in that state before. Right-turn-on-red laws are the only ones I can think of where states differ - but there are likely others.
Laws also cover new traffic control systems that may not have been in the original program. If the self driving system can't figure out the next one (think roundabout) then it needs to be updated.
Please don't use rules as a cudgel or at least have more tact doing so.
genocide /jĕn′ə-sīd″/ noun
The systematic and widespread extermination or attempted extermination of a national, racial, religious, or ethnic group. The systematic killing of a racial or cultural group.

"Uhm aktually it's not a genocide it's just a fascist police state"
Multiple humanitarian organizations define mass displacement as genocide and/or ethnic cleansing.
The Holocaust literally started with mass deportations/detentions. Then the Nazis figured out that it was easier to kill detainees.
It may not be on the marketing copy but it’s almost certainly present in the contract.
Anybody know??
Tesla FSD is still a supervised system (= ADAS), afaik.
> Fair prices, based on how you drive [...] Get a discount, and earn a lower premium as you drive better.
I did get lots of traction issues with a FWD EV - in any sort of wet conditions you need to baby it.
Booting the go pedal at every stop sign or light just feels like being a bit of a childish jerk after a short while on public roads once the novelty wears off.
No thanks. I unplugged the cellular modem in my car precisely because I can't stand the idea that the manufacturer/dealer/insurance company or other unauthorized third parties could have access to my location and driving habits.
I also generally avoid dealers like the plague and only trust the kind of shops where the guy who answers the phone is the guy doing the work.
As an extreme end of the spectrum, there's been worry and debate for decades over automating military capabilities to the point where it becomes "push button to win war". There used to be, and hopefully still is, lots of restraint towards heading in that direction - in recognition of the need for ethics validation in automated judgements. The topic comes up now and then around Teslas, and the impossible decisions that FSD will have to make.
So at a certain point, and it may be right around the point of serious physical harm, the design decision to have or not have human-in-the-middle accountability seems to run into ethical constraints. In reality it's the ruthless, bottom-line-focused corps - which don't seem to be the norm, but may have an outsized impact - that actually push up against ethical constraints. But even then, I would be wary as an executive of documenting a decision to disregard potential harms at one of those shops. That line is being tested, but it's still there.
In my actual experience with automations, they've always been derived from laziness / reducing effort for everyone, or "because we can", and sometimes a need to reduce human error.
Are you saying that Tesla's investments in FSD have been made with the goal of letting drivers get away with accidents? The law is black and white.
On the surface, this looks like an endorsement of Tesla's claims about FSD safety.