• h3rm17@sh.itjust.works · 7 months ago

        Yeah, for real. “Someone will, 100%. Do you want it to be your friends/family/people you know, or some absolute random stranger?” Some lemmitors would surely answer “My people, for sure.”

      • blackn1ght@feddit.uk · 7 months ago

        The human does it out of self-preservation, but the car doesn’t need to preserve itself.

        By getting in the car, the passengers should be aware of the risks and that, if there is an accident, the car will protect pedestrians over the occupants. The pedestrians had no choice, but the passengers have a choice of not getting in the vehicle.

        I feel like car manufacturers are going to favour protecting the passengers as a safety feature, and then governments will eventually legislate it to go the other way after a series of high profile deaths of child pedestrians.

        • Thorny_Insight@lemm.ee · 7 months ago

          You’re probably overestimating the likelihood of a scenario where a self-driving car needs to make such a decision. Also take into account that if a self-driving car is a significantly better driver than a human, then by definition it’s going to be much safer for pedestrians as well, even if it’s programmed to prioritize the passengers.

        • ඞmir@lemmy.ml · 7 months ago

          On the flip side, if you know a car will kill a passenger to save an outsider, it becomes very easy to “accidentally” murder a passenger and get away with it…

      • Hacksaw@lemmy.ca · 7 months ago

        Nah, I think most people would crash into a tree rather than clear a sidewalk. Cars are designed to protect you in a crash; pedestrians don’t have seatbelts, crumple zones, or airbags.

        • dream_weasel@sh.itjust.works · 7 months ago

          I think you’re way overestimating driver reflexes and reaction capabilities. Most accidents don’t give you a good long time to consider the next step.

    • Thorny_Insight@lemm.ee · 7 months ago

      Who would buy a car that will sacrifice the passengers in the event of an unavoidable accident? If it’s a significantly better driver than a human would be, then it’s safer for pedestrians as well.

    • Rinox@feddit.it · 7 months ago

      It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.

      I mean, what’s the situation where a car can’t brake but has enough control that it HAS to kill a pedestrian in order to save the passengers?

      • MeanEYE@lemmy.world · 7 months ago

        Tesla on Autopilot at night. All the time, basically. There were a number of motorcycle deaths where a Tesla just mowed the rider down. The reason? The motorcycle had two tail lights side by side instead of one big light, so the Tesla thought it was a car far away and just ran through people.

        • Rinox@feddit.it · 7 months ago

          That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time; the issue was that the car didn’t know what was happening. That situation wouldn’t have engaged the kind of trade-off we’re discussing.

          As an aside, if what you said is true, people at Tesla should be in jail. WTF