52 points by Bender 8 hours ago | 4 comments
  • LorenPechtel 5 hours ago
    I suspect this comes down to the same problem we've seen in other forms--their system stinks at detecting that a stationary object is in the road.
    • AustinDev 5 hours ago
      This is one thing LIDAR is pretty good at.
      • dzhiurgis 4 hours ago
        Do people still believe this is the clutch?

        I see way more crash compilations from Waymo than Tesla (despite Tesla having something like 300k FSD subscribers and over 1M outright purchasers).

        Sure, LIDAR can fill maybe 5% of the gaps, but let's not pretend it's anything other than the underlying AI model doing the grunt work. Which raises the question of why Waymo hasn't scaled nationwide and why Cybercab hasn't ramped up yet. Neither is doing that amazing.

        • jerlam an hour ago
          Probably selection effect. Tesla owners with FSD are often aware of its shortcomings and will not use it in situations where it wouldn't work, much less post clips of its mistakes online. People seem to agree it works fine on highways, where cars travel in consistent patterns.

          Waymos are in the exact opposite situation. They only run in busy cities so there are lots of bystanders to take a video of the situation, including the passenger, who has no incentive to hide the issue. Waymos can't revert to a driver in the car when things get tough; they call back to their monitoring center and come to a halt, which draws further attention and mockery.

          You cannot assume that online algorithms are giving you an unbiased, neutral view of the world. They are specifically tuned against that.

          • FireBeyond 43 minutes ago
            > will not use it in situations where it wouldn't work

            Often "cannot": FSD will simply refuse to engage in those situations.

            But Elon will trot out "so much safer", omitting "for some conditions, on some roads, in some weather", versus "all drivers, all conditions, all roads, all weather".

            "You see, we win the vast majority of games when we just don't play the ones we thought we might lose!"

          • dzhiurgis 33 minutes ago
            But the assertion is that FSD is more dangerous because people don't monitor the situation until it is too late.

            Claiming there are no Teslas in busy cities is ridiculous.

            Given all the scrutiny Tesla gets (good, it made them unstoppable), you'd expect all sorts of activists driving to Austin and literally crashing into robotaxis.

        • fooblaster 3 hours ago
          Tesla has not pulled the driver. It's just not comparable.
          • dzhiurgis 37 minutes ago
            Says 11 vehicles are unsupervised: https://robotaxitracker.com/?provider=tesla
          • dangus 3 hours ago
            Their website now prominently states “supervised” since they got into so much hot water overselling the capabilities.

            Tesla FSD is really in a pointless middle ground where the steep $99/month they ask for it is just not worth it.

            It does basically nothing for you on the highway to alleviate fatigue above and beyond a standard adaptive cruise control system you can find in a Volkswagen Jetta.

            The FSD on city streets is not autonomous enough to take away supervision, so for the 10-20 minutes people typically spend driving in city traffic before reaching their destination, it's not saving a whole lot of effort over just…driving yourself.

            I would think that if I owned a car that wasn't an old-ass beater like mine, I would mainly benefit from adaptive cruise control on long trips and perhaps some convenience features like automatic parking.

            • dzhiurgis 31 minutes ago
              What is the point of such trolling?
    • FireBeyond an hour ago
      Tesla has always had a weirdness with trains. A couple of years ago in Pennsylvania, I watched, bemused, as a train rolled by at a crossing (we were driving manually). On the car's display it looked like an erratic convoy of trucks, depending on whether each train car carried a container or not.

      Tesla stans will say "well, just because it doesn't visualize the train properly doesn't mean it doesn't know it's a train", but shit like this today just bolsters the case that that's garbage.

      I still want to see how Tesla does in my town, where there's a fun intersection: four lanes heading west hit a T. Drivers can turn north or south, but there are only two lanes on the north-south road, so the sequence is staggered: the left two lanes can turn north or south, and then the right two lanes get their turn (that way drivers in the left two lanes turning north don't hit drivers in the right two turning south, and nobody has to merge four lanes into two while turning).

      I guarantee FSD would absolutely shit the bed (sorry, I mean, "disengage" to preserve Elon's stats, I mean "your safety") on this intersection.

      It's not ready for primetime. And it's still not close.

  • qwerpy 4 hours ago
    My gated community has a gate similar to a railroad gate. My FSD 12 HW3 Model Y cannot be trusted at it. My FSD 14 HW4 Cybertruck does fine except when another car is in front of me; then it tries to tailgate the car in. Strangely, the Y has ultrasonic distance sensors and the Cybertruck does not. The truck seems able to handle gate detection but doesn't understand the rule that only one car can go at a time.

    That being said, if I were first in line at a railroad crossing I think I’d disengage FSD to be safe. If I were in a Waymo I’d be very nervous. LiDAR or not, an error can be catastrophic.

    • strogonoff 4 hours ago
      If one claims that an error at a railroad gate can be catastrophic and therefore FSD should be disabled in that situation, how does one ethically reconcile that with enabling FSD on any regular street with pedestrians?

      The principal difference that comes to mind is that in the latter case the catastrophe falls on others rather than on yourself: you are the train in that situation, except pedestrians have no airbags, and without a railroad-gate equivalent they are not even made aware that they are taking this risk.

      • qwerpy 3 hours ago
        That’s a very interesting way to look at it! But my reasoning for continuing to do what I do is that FSD is bad at thin gates and much better at avoiding pedestrians. So it’s not an all or nothing thing for me.