• edwardbear@lemmy.world · +139 / -4 · 16 days ago

      No reason to be concerned, citizen. The former head of the largest surveillance agency in the world just joined the largest data scraping company in the world as a C-level member.

        • WhatAmLemmy@lemmy.world · +28 / -2 · 16 days ago

          You will tell the AI all of your most private thoughts and feelings. The AI will be your closest friend, lover, and confidant.

          If you refuse to let the AI know everything about you, you will be considered a terrorist pedophile… a TERROR-PEDO!

          • edwardbear@lemmy.world · +11 · 16 days ago

            How dare you have secrets? What are you hiding there? Why are you trying to have privacy? How dare you?

            • Optional@lemmy.world · +10 · 16 days ago

              Remain calm. Assume the position. Your patience is appreciated. A legally authorized operative will be with you shortly. Stop resisting. Or else it gets the hose again.

          • Etterra@lemmy.world · +6 · 16 days ago

            Remember that Friend Computer loves you. Returning that love TO Friend Computer is MANDATORY, and failure to comply will be considered treason and thus grounds for IMMEDIATE TERMINATION. Thank you citizen and have a mandatorily happy day!

    • ricecake@sh.itjust.works · +26 / -6 · 16 days ago

      It’s a bit of a non-story, beyond basic press release fodder.

      In addition to its role as “digital panopticon”, the NSA also has a legitimate role in cyber security assurance, and it’s perfectly good at it. The guy in question headed both the world’s largest surveillance entity and the world’s largest cyber security entity.
      Opinions on the organization aside, that’s solid experience managing a security organization.
      If OpenAI wants to make the case that it takes security seriously, a former head of the NSA, Cyber Command, and the Central Security Service, who has also been a department director at one university, a trustee at another, and holds a couple of master’s degrees, isn’t a bad way to try to send that message.

      Other comments said OpenAI is the biggest scraping entity on the planet, but that title pretty handily goes to Google, or more likely to the actual NSA, given the whole “digital panopticon” thing and the fact that Google can’t FISA-warrant the phone company.

      Joining boards so they can write memos to the CEO/dean/regent/chancellor is just what former high ranking government people do. The job aggressively selects for overactive Leslie Knope types who can’t sit still and feel the need to keep contributing, for good or bad, in whatever way they think is important.

      If the US wanted to influence OpenAI in some way, they’d just pay them. The Feds’ budget is big enough that bigger companies will absolutely prostrate themselves for a sample of it. Or if they just wanted influence, they’d… pay them.
      They wouldn’t do anything weird with retired or “retired” officers when a pile of money is much easier and less ambiguous.

      At worst it’s OpenAI trying to buy some access to the security apparatus to get contracts. That seems less likely to me, since I don’t actually think they have anything valuable for that sector.

      • exanime@lemmy.today · +1 / -1 · 15 days ago

        At worst it’s OpenAI trying to buy some access to the security apparatus to get contracts. That seems less likely to me, since I don’t actually think they have anything valuable for that sector.

        Didn’t you also just say this in the same post?

        The Feds’ budget is big enough that bigger companies will absolutely prostrate themselves for a sample of it

        • ricecake@sh.itjust.works · +3 / -1 · 15 days ago

          Those aren’t contradictory. The Feds have an enormous budget for security, even just “traditional” security of the kind everyone else uses for their systems, not the “offensive security” we think of when we hear “federal security agencies”. Companies like Amazon, Microsoft, and Cisco will change products, build out large infrastructure, or even share the source code for their systems to persuade the Feds to spend that money. They’ll do this because they have products that are valuable to the Feds in general, like AWS, or because they already have security products and services that are demonstrably valuable to the civil security sector.

          OpenAI does not have a security product; they have a security problem. The same security problem as everyone else, one the NSA is in large part responsible for managing for significant parts of the government.
          The government certainly has an interest in AI technology, but OpenAI has productized its solutions with a different focus. The government has already bought, from Palantir, what everyone thinks OpenAI wants to build.

          So while it’s entirely possible that they are making a play to open those lines of communication to government decision makers for sales purposes, it seems more likely that they’re aiming to leverage “the guy who oversaw the implementation of security protocols for the military and key government services is now overseeing the implementation of our security protocols, so aren’t we secure and able to be trusted with your sensitive corporate data?”.
          If they were aiming for security productization and building ties on that side of things, someone like Krebs would be more suitable, since CISA is a bit better positioned for those ties to turn into early information about product recommendations and such.

          So yeah, both of those statements are true. This is a non-event with bad optics if you’re looking for it to be bad.

          • exanime@lemmy.today · +1 / -2 · 15 days ago

            So you are speculating this is all good and innocent, while I’m speculating they hired this guy to aim their data harvesting in a way the government would pay tons for… Yet your speculation is apparently more valid than mine because, checks notes, reasons.

            • ricecake@sh.itjust.works · +2 / -1 · 15 days ago

              Yes, neither of us is responsible for hiring someone for the OpenAI board of directors, making anything we think speculation.

              I suppose you could dismiss any thought or reasoning behind an argument as “reasons” to try to minimize it, but that’s a weak position to argue from. You might consider instead justifying your beliefs, or saying why you disagree, instead of just “yeah, well, that’s just, like, your opinion, man”.