The company left out some key details regarding the incident involving one of its robotaxis and a pedestrian.


On October 2, 2023, a woman was run over and pinned to the ground by a Cruise robotaxi. Given the recent string of very public malfunctions the robotaxis have been experiencing in San Francisco, it was only a matter of time until a pedestrian was hurt by the self-driving cars. New reports, though, suggest that Cruise held back one of the most horrifying pieces of information: that the woman was dragged 20 feet by the robotaxi after being pushed into its path.

The LA Times reports:

A car with a human behind the wheel hit a woman who was crossing the street against a red light at the intersection of 5th and Market Streets. The pedestrian slid over the hood and into the path of a Cruise robotaxi, with no human driver. She was pinned under the car, and was taken to a hospital.

But this is what Cruise left out:

What Cruise did not say, and what the DMV revealed Tuesday, is that after sitting still for an unspecified period of time, the robotaxi began moving forward at about 7 mph, dragging the woman with it for 20 feet.

read more: https://jalopnik.com/woman-hit-by-cruise-robotaxi-was-dragged-20-feet-1850963884

archive link: https://archive.ph/8ENHu

  • FoundTheVegan@kbin.social · 1 year ago

    I remember someone here chiding others for criticizing Cruise. They were talking up the “fact” that the car stopped and let emergency services dictate what to do instead of risking harming her further. It was GOOD that the car stopped on her.

    No matter what happens, anytime or anywhere, there will always be people defending it. I wonder if that person is glad to hear the car heroically dragged her out of traffic?

      • supercriticalcheese@lemmy.world · 1 year ago

        Tell that to the self-driving car “experts.” Because there’s a silicon chip in control, they think it’s automatically better than a human. Said experts have never heard of random failures (that’s besides bugs).

      • Not_mikey@lemmy.world · 1 year ago

        The question isn’t whether they’re infallible, just whether they’re less fallible than humans, which is a far lower bar when it comes to driving.

        • theluddite@lemmy.ml · 1 year ago

          A bold argument to make on a thread about a car dragging a woman stuck under it for 20 feet and then the company covering it up. The one story contains both the technical problems that people like me have been warning about since day one (bad at edge cases) and the structural and political problem of corporate control over infrastructure.

          • Not_mikey@lemmy.world · 1 year ago

            Yeah, but you don’t have to look far to see humans doing way worse with cars. Even in this case, the most reckless, irresponsible actor wasn’t the AV or the company, but the person who did the initial hit-and-run in the first place.

            Ideally we’d get all cars off the streets; their use is dangerous in and of itself. But after being around these things for 3 years now, I’d take them over the human drivers I repeatedly see speeding through intersections.

            • theluddite@lemmy.ml · 1 year ago

              As far as I know, there doesn’t exist a single shred of actual, empirical evidence that we can make self driving cars actually better than humans, outside of our faith in technological improvement. Companies used to publish their data on simulated injury rates for their internal testing, and they were way way way worse than professional human drivers, like taxi drivers, which is what Cruise is trying to replace.

              I don’t want to be right. I want self-driving cars that work. It’d be personally very convenient for me, so much so that I would buy a brand new car for the first time in my life (I’m in my mid thirties) if I had actual, robust, empirical reason to believe that they work as advertised.

              • Not_mikey@lemmy.world · 1 year ago

                That article was written over a year ago, and since then a lot has changed. Both Waymo and Cruise have now been approved for AV taxis in San Francisco. This decision seems to have been motivated not by hype but by a good track record, with fewer incidents than a human driver. Cruise claims it’s greater than 50% better than a comparable human driver, and while that may be just their own flawed study, this article also agrees that they’re about even with, if not better than, human drivers.

                You’ll probably have to wait a while to buy one, though, as these are decked out with a larger assortment of sensors compared to a Tesla and probably cost a couple hundred thousand all said and done. They are also specifically trained on the surface streets of San Francisco, so they probably won’t be able to take you on your commute any time soon. Hell, I’d give it another 5 years before they’re even able to take you to the airport 10 miles south of the city, since it would have to get on the freeway. But in this relatively limited problem space, they do quite well.

                • theluddite@lemmy.ml · 1 year ago

                  Both of those links rely on the same self-reported data by Cruise… We’ll see, I suppose! Happy to be proven wrong. I live on a frozen hilly dirt road, so realistically, for me, it’s not going to happen.

    • yiliu@informis.land · 1 year ago

      On the other hand, there are people who will condemn the very concept of self-driving cars because of the kind of event that happens every day with regular ol’ human-piloted cars.

      This is a serious incident; it should be thoroughly investigated, and regulations on safety and reporting should be seriously considered. But don’t strangle the baby in the crib: self-driving cars have the potential to be much safer than human-driven cars (arguably, they already are). When there are stories about Cruise taxis stopping in an intersection and the response is an overwhelming flood of “ban all self-driving cars!”, it causes proponents to get overly defensive.

      • FoundTheVegan@kbin.social · 1 year ago

        When there’s stories about Cruise taxis stopping in an intersection and the response is an overwhelming flood of “ban all self-driving cars!”, it causes proponents to get overly defensive.

        But the problem is those events are happening. People want to strangle that baby because the kid is causing harm in the real world. The cars DO need to be taken off the streets, but that doesn’t mean they should be outlawed. However, they absolutely are not fit for public service. As someone who has “driven” a Tesla while it was in “Full Self-Driving” mode, I gotta tell ya that it just can’t handle the rules of the road. It’s not better than a human. We can adapt to new situations on the fly, but a self-driving car can only operate within certain parameters.

        While I sympathize that there is nothing you can do about this, the real fault lies with the companies putting vehicles on the road before they are ready. Without a doubt, someday they will be. But as corporations push this out the door in the rush to be the first and the new name brand, they do so at the public expense. It’s the root problem for the cycle of backlash, resentment and defensiveness.

        • Not_mikey@lemmy.world · 1 year ago

          If these need to be taken off the streets, then all cars need to be, which I’m not totally opposed to. These ones have been on the streets in SF for a while, and I’m more afraid of human drivers than of these; they are very cautious, and more often than not they’ll err on the side of just stopping. That’s what most of the incidents have been: the car just stopping and holding up traffic. Even in this scenario, it was a human who did the actual hit-and-run. I’ve been in them a couple times now and feel safer in them than in an Uber most times; they never try to blow through a yellow light because they want to get to their next ride, and they won’t even speed.

          These also aren’t comparable to Tesla Autopilot. They have way more sensors, while Tesla seems to be focused purely on cameras. Teslas are also trying to make a general solution for the whole country, whereas these were specifically trained on and work in SF. There’s a reason they got approved for full self-driving in the city while Tesla hasn’t even applied yet.

      • yA3xAKQMbq@lemm.ee · 1 year ago

        self-driving cars have the potential to be much safer than human-driven cars

        And if car makers have extensively proven that this is, in fact, the case, they might be allowed on the streets.

        (arguably, they already are)

        Narrator: they weren’t.

        “Tesla is having more severe — and fatal — crashes than people in a normal data set,”

        And I guess this normal data set is including US drivers only, who arguably are … not that good.

        it causes proponents to get overly defensive.

        Uh, no. Getting “overly defensive” is your choice, and yours alone. “Look what you made me do” is really a terrible excuse for anything.

        • Not_mikey@lemmy.world · 1 year ago

          You’re comparing two very different systems. That’s like going to a gas station, grabbing some sushi that gets you sick, then saying all sushi is dangerous. Teslas have fewer sensors than the Cruise cars and aren’t trained on a contained, specific dataset (just San Francisco) like the Cruise cars have been for over 2 years. For these reasons they are at least even with humans, if not safer already, and have been approved for self-driving in the city, while Teslas are far off from even applying.

          • yA3xAKQMbq@lemm.ee · 1 year ago

            You post this: “And because California law requires self-driving companies to report every significant crash, we know a lot about how they’ve performed” as a comment on an article where it’s being revealed that said company covered up an important detail about a “significant crash”?

            Okay 👌

            • Not_mikey@lemmy.world · 1 year ago

              Still better than the hit and run driver who caused all this and hasn’t reported anything.

      • Pipoca@lemmy.world · 1 year ago

        That’s why we have grade-separated rails, and stations with barriers and doors to the train. Trains also have a fraction of the deaths per passenger mile vs cars.

        Public transit has a variety of benefits. For one thing, the natural enemy of the driver is literally other drivers. Cars are very space intensive, so car-centric cities tend to sprawl.

        Public transit supports walkable/bikeable density, because it does better with a good walkshed around stations and also has really good passenger throughput. That’s good for people’s health - people living in walkable areas are on average less sedentary, and have lower rates of obesity and diabetes. It also tends to be good for the creation of third spaces, and seems to be good for social engagement on average.

        The reason to oppose self-driving cars is really the same reason to oppose car-centric infrastructure broadly: the alternatives are way better.

  • can@sh.itjust.works · 1 year ago

    I didn’t know Cruise was a company and at first envisioned this happening aboard a large ship.

  • glibg10b@lemmy.ml · 1 year ago

    The pedestrian slid over the hood and into the path of a Cruise robotaxi, with no human driver. She was pinned under the car, and was taken to a hospital.

    That’s one way to do it.

  • IvanOverdrive@lemm.ee · 1 year ago

    My vision of self driving cars was of an integrated system where all the parts weave together to create a safer and faster environment. But self driving cars are just not able to deal with the edge cases that will pop up. Even that would be okay, but GM tried to cover up this horrific accident. That inspires the opposite of trust. I gotta wonder how many other incidents have been covered up. GM is a company with limited resources. Alphabet, the parent company of Waymo, has a virtually infinite budget. How many incidents have they hidden from the public eye?

    • Omegamint [comrade/them, doe/deer]@hexbear.net · 1 year ago

      Anyone with even a hobby level of coding knowledge knows that the edge cases are the real issue when solving software problems. In this case, I wouldn’t be surprised if automation reaches similar or lower levels of traffic incidents, but the real shitty part is gonna be how much harder it is to get justice from a large corporate entity owning these robotaxi fleets versus nailing the little guy driving his Uber/taxi.

  • HowMany@lemmy.ml · 1 year ago

    Still working out some of the bugs. Not to worry. Not many of you will have to die in order for us to get the software right.

  • selokichtli@lemmy.ml · 1 year ago

    These cars should be monitored by human beings until their AI evolves enough to actually be safer than professional, by-the-law human drivers. If a human had been monitoring the car, they probably could have stopped it immediately, or even held it before it started dragging that poor woman.

    Only if these cars can do the same or better than the human overseeing their activity will they be safe enough to offer a public service. Also, as shameful as it may be, this incident should get the most publicity possible, because other competitors should test their AI against this specific situation as soon as possible.

    • ricecake@sh.itjust.works · 1 year ago

      I mean, humans are perfectly capable of not knowing someone is trapped under the car and doing something like this. It’s awful, but it happens pretty regularly; pulling over to the side after an accident is a heavily ingrained reflex.

      In this case, it’s not the technology that scares me, but the company developing it not being honest.
      The car could have a safety record vastly better than any human, but if the company making it isn’t transparent about incidents it entirely undermines our ability to trust that safety record.

    • discodoubloon@kbin.social · 1 year ago

      I don’t fully disagree, but they have clearly tried to jump to market without fully investigating the problems. Things like this need serious regulation, and that doesn’t exist right now.