Employers are letting artificial intelligence conduct job interviews. Candidates are trying to beat the system.

“And when they got on the phone, Ty assumed the recruiter, who introduced herself as Jaime, was human. But things got robotic.”

  • deweydecibel@lemmy.world · 134 points · 7 months ago

    And when they got on the phone, Ty assumed the recruiter, who introduced herself as Jaime, was human. But things got robotic.

    If regulators are looking for where to start with AI regulation, this is it.

    It should be a law that no LLM/“AI” is allowed to pass itself off as human. It must always state, up front, what it is. No exceptions.

    • PM_Your_Nudes_Please@lemmy.world · 14 points · 7 months ago

      I would argue that AI also shouldn’t be allowed to make legally binding decisions, like deciding who to hire. Since a computer can’t be held accountable for its decisions, there’s nothing stopping it from blatantly discriminating.

    • FlumPHP@programming.dev · 10 points · 7 months ago

      It should be illegal to use an AI in the hiring process if it can’t accurately explain its decisions. There’s too much risk of bias in the training data to empower a black-box system. ChatGPT can lie, so anything powered by it is out.

    • NoRodent@lemmy.world · 9 points · 7 months ago

      They also should not harm a human being or, through inaction, allow a human being to come to harm.

    • FiveMacs@lemmy.ca · 4 points (1 downvote) · 7 months ago

      Yes. I assume anyone from a company is a bot right out of the gate.