General Motors’ driverless Cruise taxis can no longer operate on California roads without a safety driver, effective immediately.

  • ImFresh3x@sh.itjust.works

    Self-driving cars have been a perfect example of something not living up to what it was touted to be. We are way behind in the tech, and way ahead in what we’re being sold/told. Meaningful self-driving (fully fleshed-out Level 4 or 5) is realistically 10-20 years away.

    • guacupado@lemmy.world

      Nothing is ever what it’s touted to be in the beginning. In all the movies where you see this stuff working, their worlds would have gone through exactly what we’re going through now.

    • dan1101@lemm.ee

      I’ve concluded that either self-driving vehicles need to be completely isolated from all the humans on or near public roads doing unexpected things, or it would take actual artificial intelligence to really share the roads with human drivers. And once you have actual artificial intelligence, that opens up a whole new set of logistical, ethical, and possibly existential problems.

      • Johanno@feddit.de

        You mean like put them underground, maybe on rails, and then make them bigger to carry more people? Like... a subway?

        • dan1101@lemm.ee

          Yeah, like a train or subway. Trains and subways don’t have to deal with traffic conditions, but they’re still expected to attempt an emergency stop if there’s a hazard ahead. Self-driving cars don’t even do that reliably.

    • lepthesr@lemmy.world

      There are a lot of benefits to them, but you’re right, they aren’t all there yet.

      What I could agree to in the meantime is dedicated lanes on freeways and shit. They’d have to have their own infrastructure, and California does that really quick.

  • jeffw@lemmy.world

    My only concern with self-driving vehicles is their inability to respond in emergencies. Ambulances, fire trucks, etc. have all had serious issues with them in California.

      • jeffw@lemmy.world

        In terms of failure rates, it’s a tiny problem. A person gets confused for a few seconds, versus a computer stopping and completely blocking traffic.

  • guacupado@lemmy.world

    Cruise to San Francisco: If you don’t stop criticizing us, we’ll pull out of your city.

    California to Cruise: Actually, you’re leaving the state.

  • mctoasterson@reddthat.com

    Why the fuck would anyone sign up for the “safety driver” job?

    It would be boring: you wouldn’t have much control, but you also couldn’t multitask or do anything else during your shift. Presumably it wouldn’t pay all that much, because the entire business model of this company is to create a market efficiency on the labor side. Finally, there’s the outstanding question of how much liability you assume as the “safety driver”. If the self-driving features start bugging out and you don’t stop the car in time, are you on the hook for the damage it causes? I would want a ton of legal liability assurances before I took that job.

  • dan1101@lemm.ee

    That seemed very obvious from the start. Even if computers don’t get distracted or tired, they don’t have the intelligence to match even a moderately engaged human driver.

  • circuitfarmer@lemmy.sdf.org

    The barrier to entry here should be very, very high. These cars can’t just work most of the time, or only under typical conditions. Human lives are on the line. When a typical car accident happens, we might blame the driver, or assume it was a perfect storm that another driver might have handled differently. But these systems all operate identically, so any single failure points to a widespread issue across the entire design in a way that just isn’t true of humans.

    When things get frenetic, I think most people don’t want to be the one sitting in the back of a confused self-driving car. And let’s not forget that the only ones really benefitting are companies that no longer need to employ drivers. Personally, I am not willing to jeopardize public safety for a little bit of novelty, nor for companies to save money.

    • wahming@monyet.cc

      “The barrier to entry here should be very, very high.”

      No, the barrier should simply be “better than the average human driver.”

    • Franzia@lemmy.blahaj.zone

      These companies need to stop beta testing on public streets and build their own Hollywood set to go drive on.