43 points by voxadam 7 hours ago | 10 comments
  • hannahstrawbrry 7 hours ago
    "Something unexpected where the machine drove like a machine rather than a person..." is nonsense. There were over 1500 reports of humans driving the wrong way on Arizona roads last year. Humans drive that way all the time.
    • prewett 5 hours ago
      I don't generally have to entrust myself to humans doing crazy things, though. An actual person driving a taxi is familiar enough with the roads that they are unlikely to do something like that. Probably also Uber/Lyft drivers. And there is usually an option to rent a car and drive yourself, in which case any craziness the driver suffers from is at least my own fault.
    • munk-a 7 hours ago
      Do we think it's acceptable to target a level of driving proficiency for automation so extremely inept that it could be featured in dashcam footage uploaded to YouTube, or are we aiming for a higher target of perhaps a middling amount of driving proficiency?
      • potato3732842 6 hours ago
        Most autonomous failures aren't failures in a way that makes good video, though. It's like your grandma or 16-year-old daughter who gets stuck at a yield for no good reason. Nobody is gonna watch that, so nobody uploads it.

        A robotaxi with a "low enough to be acceptable" frequency of the above failure mode is likely to have enough occasional "full send" failure modes to make for YouTube fodder when deployed at scale, even if they're rare compared to humans, to the other failure type, or to some other standard.

      • deepspace 6 hours ago
        That is a very disingenuous take on the comment. We should of course target a higher level of proficiency than that, but the point is that many humans make stupid driving decisions every day. We can hold machines to a higher standard, but perfection is an unrealistic standard.
        • munk-a 6 hours ago
          I don't think I was being disingenuous but I did try to specifically call out aiming for a middling (and not perfect) proficiency. Driving onto tram tracks on a clear day is unacceptably poor performance. This is something that a good driver is unlikely to ever do in their lifetime and - if it happened - likely involved some extreme circumstances.
    • puttycat 3 hours ago
      > Humans drive that way all the time.

      Why do we hold calculators to such high bars? Humans make calculation mistakes all the time.

      Why do we hold banking software to such high bars? People forget where they put their change all the time. Etc etc.

    • shputil 7 hours ago
      1500 out of a lot. It's probably more accurate to say that those 1500 didn't drive like humans either.
      • Bjartr 7 hours ago
        By definition, they did. Humans do weird, irrational things sometimes. It's part of being human.
        • shputil 6 hours ago
          Yeah, sure, if you want to take everything that any human does as "being human" "by definition," then I guess it's human to eat spiders and bathe in your own shit. I think it would be more useful to at least consider the normal level of behavior.
    • drob518 7 hours ago
      But how many humans drove on train tracks? Yes, humans make scads of mistakes, but they don’t typically make this one.
      • eightysixfour 7 hours ago
        > But how many humans drove on train tracks?

        I have seen this twice in my life. One was a person who freaked out because they stopped on the tracks and then turned onto them; the second, I still have no idea how they got there.

        I think there are two really big issues with the roll out of self-driving cars that are going to be hard for us to overcome:

        1. Their mistakes are going to be highly publicized, but no one is publicizing the infinite number of dumbass things human drivers do every day to compare it to.

        2. They're going to make mistakes that are extremely obvious in hindsight or from a third party perspective, that most humans will say no human would have ever done. It is likely that a human has and would have made similar and worse mistakes, and makes them at a higher rate, and we will have to accept these as a reality in a complex world.

        • arcfour 7 hours ago
          > Their mistakes are going to be highly publicized, but no one is publicizing the infinite number of dumbass things human drivers do every day to compare it to.

          Idea: "Waymo or Human," a site like Scrandle where we watch dashcam clips of insane driving or good driving in a challenging situation and guess if it's a human or self-driving car.

        • mmmlinux 6 hours ago
          > 1. Their mistakes are going to be highly publicized, but no one is publicizing the infinite number of dumbass things human drivers do every day to compare it to.

          People still complain about that one cat that got run over. As if the Waymo jumped the curb and chased it down.

        • jsbisviewtiful 7 hours ago
          Frustratingly, Americans seem to inherently despise public transit (probably because owning a car has become so necessary due to poor city planning, ON TOP OF the classist appeal) despite the advantages, and local/state govs refuse to give public transit options proper funding and oversight, leading to even more distaste for public transit.

          Personally I won't be using one of these cars because I want to contribute to other humans' paychecks, but I would much rather be using public transit over adding more and more cars to more and more roads/lanes.

          All of the negative publicity around the autonomous cars is justified IMO because, even if these cars are "safer" than a human, they are still clearly not as safe as they need to be when considering liability, local laws, basic driving etiquette and the cost to other humans' incomes.

          • eightysixfour 6 hours ago
            > but I would much rather be using public transit over adding more and more cars to more and more roads/lanes.

            Good luck rearchitecting the entire way of life of the vast majority of Americans, not to mention somehow tearing out and replacing the entirety of our transportation infrastructure. I'm generally of the persuasion that we should reduce our reliance on cars and I intentionally live in a dense city with half-decent transit but this fever dream that highly individualistic Americans are going to get on board with shared transit is just that, a fever dream.

            It would be good for us, but that doesn't mean it is inevitable or even possible at this time. Acknowledging that is important because it means you invest in alternatives that may actually get adopted.

            > All of the negative publicity around the autonomous cars is justified IMO because, even if these cars are "safer" than a human, they are still clearly not as safe as they need to be when considering liability, local laws and the cost to other humans' incomes.

            So now we come to the other half of your argument. Waymos are safer and it isn't even close. If I am an insurance company and you are asking me to cover a human or a Waymo I'm taking the Waymo 10/10 times. Humans are actually pretty bad at driving and we're getting worse as we're more distracted, not better. The simple math of the liability is going to move us towards self-driving cars more rapidly, not slow it down.

            The only other argument I see buried in here is "cost to other humans' incomes." Whether you mean gig economy workers, taxi drivers, or transit operators, I have a dozen arguments here, but the simplest is: maybe you should prioritize the 40k lives lost every year to motor vehicle accidents over income. We'll find other places for productivity and income. You don't get the dead people back.

          • etempleton 6 hours ago
            Americans don’t despise public transit. They despise poorly maintained / insufficient public transit. Outside of New York and San Francisco, public transit is really not sufficient to get you where you need to go.

            Many cities could do better to have more robust public transit, but the reality is America is vast and people commute long distances regularly. The cost of deploying such vast amounts of public transit would be prohibitively expensive.

            • eightysixfour 6 hours ago
              > Americans don’t despise public transit. They despise poorly maintained / insufficient public transit. Outside of New York and San Francisco, public transit is really not sufficient to get you where you need to go.

              I used to believe this; I'm not sure it is actually true, though, for a large percentage of Americans. There is some unmet demand that would be satisfied, but beyond that, most Americans value their individualism and control (even if it is controlling where a driver takes them via an app) too much unless they were raised around good transit. That means that even if we build good transit, it will probably take more than a generation for people to use it fully and effectively.

              • kyleee 2 hours ago
                It also depends a lot on the culture of other riders. It takes relatively few undesirables to cause the preference to swing back to personal transport options.
      • nwallin 7 hours ago
        These are light rail/tram tracks, not railroad tracks. The road is the same type of road you normally drive on; they just have train tracks embedded in the road surface, signs telling you not to drive there, and every now and then a tram drives along it.

        Functionally, they're no different than bus lanes or a wide shoulder. Humans drive on them all the time, because there's no traffic on them and they can get to where they're going faster. They shouldn't, it's illegal, and they can get ticketed for it, but they do it anyway. If you load up google street view in Phoenix/Tempe/Gilbert you can see a few people driving on them.

        • lokar 7 hours ago
          People have been getting confused and driving cars towards/into the SF muni tunnel for ages.
        • drob518 6 hours ago
          Okay, so these are tracks embedded in and parallel with the road surface, not tracks with cross-ties sitting on ballast. That’s a bit more understandable, then.
      • yibg 7 hours ago
        I don't know if there are stats for this, but it wouldn't surprise me if there were a non-zero number of incidents. Drivers that are high/drunk, mentally impaired, etc. More broadly, lots of cars driven by humans collide with trains, which is at least one of the core issues here.

        EDIT: anecdotally at least for this type of ground level light rail, I've seen people drive on similar streetcar tracks (that are not shared with cars) in Toronto more than one time.

        • paleotrope 6 hours ago
          Happens all the time in Boston on the green and sometimes the silver lines.
      • hannahstrawbrry 7 hours ago
        Seen this exact thing happen a handful of times myself lol (I live in Phoenix)
      • thunfischbrot 7 hours ago
        Spending 2 minutes on google news or youtube equipped with keywords such as "car", "train tracks" and "stuck" will show you otherwise.
      • gretch 6 hours ago
        I’ve seen it personally in San Jose. A guy turned left, but instead of continuing onto the crossing road, he turned onto the VTA rails in the middle of the road. Then he proceeded to get stuck on the concrete partition past the intersection, and work crews had to come out to fix the mess.
      • nickff 7 hours ago
        There are many pictures of cars driving in bike lanes in my city (and yes, the bike lanes are very small and well-signed).
      • arcfour 7 hours ago
        I absolutely believe that humans would drive on train tracks. There is no shortage of terrible, insane, ignorant, and purely self-interested drivers on the road. Just look at any dashcam video compilation!

        The difference, of course, is that when a human does it we just say "what an idiot!" But when a machine does it, some people say "well, obviously machines can't drive and can never drive as well as a human can," which is silly.

        • drob518 7 hours ago
          Of course humans have driven on tracks. The point was that humans driving on tracks is far more rare than driving the wrong way on a one-way street, so this sticks out.
        • munk-a 7 hours ago
          I think there's an epistemological issue in your statement. When a human does it we say "what an idiot" because the driver is performing at a level below the generally accepted proficiency for driving. I think our reasonable expectation for autonomous driving is around an average level of proficiency. I also don't think it's reasonable to delay implementing the technology until it's better than the best of us - but this was an utter failure and is not within the bounds of acceptability.

          I do think it's fair to argue that this is probably an oversight that can be corrected now that it has been revealed, though.

          • arcfour 7 hours ago
            But in many ways, self-driving cars are better than or equal to the average driver. If I make a mistake once but am otherwise an exemplary driver, am I a bad driver? The same goes for these. The question isn't whether they are perfect (which is an unfair standard); it's whether they are in aggregate safer than humans, which is objectively true. People making big examples of issues like this serve only to muddy the water and scare the public needlessly.

            So, in the interests of avoiding needless regulation that would make us less safe, I think it's important to point out that these comparisons are unfair & examples are typically extremely rare edge cases.

            • munk-a 6 hours ago
              I agree that in many ways self-driving cars are better than average. And I selfishly want to accelerate adoption, since I'll appreciate it most when other drivers are using it, reducing my chances of getting hit by a drunk or highly distracted driver.

              However, I think that driving on tram tracks is unacceptably bad - it is something that a good driver would simply never do outside of really strange circumstances (like complete vision loss due to heavy storm weather). This example shouldn't be used on its own to bar autonomous vehicles, but it should be properly recognized as unacceptable and not to be repeated.

    • lopifjun 7 hours ago
      This is such a terribly ineffectual comparison, but okay... and I'd jump out of their car if I was a passenger and they were ignoring my pleading with them to stop.
    • kgwxd 7 hours ago
      Driving yourself onto tracks is natural selection. AI driving a person onto tracks is artificial selection.
    • kazinator 7 hours ago
      Do they continue to drive that way if they have a passenger yelling, "we are on train tracks?" to the point that said passenger has to bail out?

      (Sure, within the wide boundaries of mental health, it is not impossible.)

      • duskwuff 6 hours ago
        Bingo. Humans can make mistakes, but they can also recognize that something's gone wrong and change tactics to recover from it. Current self-driving systems can't do that effectively; they'll just keep going, even when they probably shouldn't.
      • potato3732842 7 hours ago
        >Do they continue to drive that way if they have a passenger yelling, "we are on train tracks?"

        On non-separated rail like in the video, where you can just turn off at any time, I can see a lot of people continuing to do it just to spite their spouse for screeching about the obvious.

        Or on the other side of the coin I can see a lot of people just say nothing because it's probably fine and they'd rather not have the argument.

      • kyleee 2 hours ago
        That would open up some very fun avenues for mischief if you could start yelling at the Waymo and pleading for it to change course
    • comrade1234 7 hours ago
      lol. I knew someone was going to come in with this reasoning for dismissing it. hn has become a parody of itself.
      • lopifjun 7 hours ago
        If the "It is difficult to get a man to understand something, when his salary depends on his not understanding it" and "notice me senpai" memes had a baby raised by an automaton nanny it would be the majority HN user.
  • gortok 6 hours ago
    The comments in this thread are wild.

    Folks in this thread are trying to compare Waymo to human driving as some sort of expectation setting threshold. If humans can’t be perfect why should we expect machines to be?

    We don’t expect humans to be perfect. When a human breaks the law we punish them. When they are sued civilly and found liable, we take their money/property.

    There’s also a sense of self-preservation that guides human decision making that doesn’t guide computers.

    Until we account for the agency that comes along with accountability, and the self-preservation mechanisms that keep humans from driving someone else onto a light rail track, we are making a false equivalence in saying that somehow we can’t expect machines to be as good as humans. We should expect exactly that if we’re giving them human agency but not human accountability, and while they still lack the sense of preservation of self or others.

    • shputil 6 hours ago
      Because we necessarily need higher standards for a self-driving system than for humans. A human failure is isolated; a machine failure is systemic.

      I, as a somewhat normal driver, am not personally at much risk if some other driver decides to drive on the rails. That won't be true if I'm in a Waymo and there's nothing I can do about its bugs.

      And I don't blame people who are skeptical that Waymo will be properly punished. In fact, do you suppose they were punished here?

      • xnx 5 hours ago
        > A human failure is isolated; a machine failure is systemic.

        Some truth to this, but a machine failure can be patched for all cars. There's no effective way to patch a problem with all human drivers.

  • drob518 7 hours ago
    The biggest issue for the wide deployment of autonomous cars is the legal liability for stuff like this, where it’s obvious that a human was unlikely to make this mistake. The robot can make human mistakes, but it can’t make robot mistakes.
    • bryanlarsen 7 hours ago
      If a human kills or seriously injures somebody with a car, the insurance company will pay out 6 or 7 figures.

      If Waymo et al can successfully limit payouts to that, they'll be fine.

      OTOH, if every death or serious injury comes with a 9 figure+ punitive damage award, they'll be in trouble.

      • drob518 6 hours ago
        Exactly. The bots are probably saving lives at this point just being more reliable than humans on the basics, but if they make mistakes that lead to outsized payouts, then the market won’t be insurable and will fail for legal reasons, not for technical reasons. I suspect we’ll figure it out in the long run because it’s worth it to do so, but the short run might be pretty rocky.
      • supertrope 5 hours ago
        Many state minimum insurance limits are laughably low. Like $30,000.
        • potato3732842 5 hours ago
          Good. Auto insurance suffers from the same cost-spiraling principal-agent problems as healthcare.

          I think states that nudge but don't completely require insurance have it right (I live near one of them). Even though most people have insurance, the plausible threat of having to actually litigate against someone and attempt to collect seems to put huge downward pressure on everything, saving even the insured a ton of money and more than offsetting the "risk" of sharing a state with uninsured people. Having laughably low minimums is the next best thing.

  • xnx 6 hours ago
    As an aside, light rail that shares roadways with cars seems like such a dumb idea. More expensive and less flexible than buses, can get stuck in traffic, lots of accidents because drivers don't expect it, etc.
    • supertrope 5 hours ago
      Leaders see the cost of a raised right of way, or even just a dedicated lane, and balk. Which is really dumb, because after spending all the money on rail you screw up the last 1/10 by having a tram get stuck behind single-occupancy cars. One person illegally parking can hold up dozens of riders. But our politicians can still exploit our bias of rail = good, bus = bad and show up for the ribbon cutting ceremony.
      • xnx 5 hours ago
        > But our politicians still can exploit our bias of rail = good, bus = bad

        There's also a lot more campaign donation money for huge train projects from engineering and construction firms.

  • justin_hancock 6 hours ago
    The expectation that a machine should not make these mistakes is reasonable: humans might make these mistakes, but they will generally realise their mistake, while the machine doesn't know it's wrong. A human learns to drive with reasonable competency within 20-40 hours, versus the effectively millions of hours of computation used to train the models. Given this, the expectations aren't as unrealistic as some on here claim.
  • csb6 5 hours ago
    Around the same time, not far from there, I saw a Waymo car partially blocking a lane of traffic as if it had frozen partway through a turn. I wonder if they temporarily shut down all cars in that area until they could figure out what was going on.
  • tapper 6 hours ago
    I am glad I was not the passenger. I am blind and would not even know we are driving down tracks.
    • kyleee 2 hours ago
      Ooh boy, that would be a good civil lawsuit. They should actually address that edge case (and related ones) before people are killed/maimed. Hopefully they have
  • returnInfinity 7 hours ago
    "The society will tolerate some amount of deaths to AI vehicles"
    • wmf 7 hours ago
      That's the correct answer; the problem is that society may not tolerate those deaths.
  • moralestapia 7 hours ago
    >“I actually felt a little sorry for the car. It obviously made a bad decision and got itself in a difficult place,” said Andrew Maynard, an emerging and transformative technology professor at ASU.

    Oof, psychopath much?

    • jlebar 7 hours ago
      This is literally the opposite of psychopathic behavior.
      • chrisco255 7 hours ago
        The machine is risking not only his life but the life of every other person on the road. Meanwhile, the machine has no feelings and has no life to lose. It feels no pain. It has no memory nor identity.
    • arcfour 7 hours ago
      Feeling empathy (for a machine) is... psychopathic?
      • lopifjun 7 hours ago
        > empathy: the ability to understand and share the feelings of another.

        Does this car "feel"?

        When these shit suckers peddling these slop slingers want you to anthropomorphize them it's a positive signal to stop doing that.

        • lokar 7 hours ago
          Having empathy for someone or something does not require the target of the empathy to itself have any feelings on the subject, or at all.
          • lopifjun 6 hours ago
            Show me a single source supporting this convoluted claim.

            > or at all

            That's called anthropomorphizing, as noted in my gp, and it is a different phenomenon from empathy.

            > Anthropomorphism (from the Greek words "ánthrōpos" (ἄνθρωπος), meaning "human," and "morphē" (μορφή), meaning "form" or "shape") is the attribution of human form, character, or attributes to non-human entities [0]

            "There is a medical condition known as delusional companion syndrome where people can have these feelings of empathy to a much more extreme extent and can be convinced that the objects do have these emotions — but it is much less common than the average anthropomorphizing, Shepard said." [1]

            [0] https://en.wikipedia.org/wiki/Anthropomorphism

            [1] https://www.cnn.com/2024/09/07/health/empathize-inanimate-ob...

            • lokar 5 hours ago
              Empathy is an emotional response people have to someone or something.

              It is an internal (to the person experiencing it) phenomenon. Feeling empathy does not require the object of the empathy to be intelligent, have emotions, or even thoughts. And it does not require the person experiencing it to believe that the object has the attributes, it does not require anthropomorphizing.

              People feel empathy towards all sorts of people, things, and groups.

  • sgt101 6 hours ago
    "Waymo passenger flees after remote operator drives on Phoenix light rail tracks"

    There, fixed it.