• jaamulberry @beehaw.org · 37 points · 1 year ago

    Counterpoint: I would wager people are more productive scrolling through a Facebook post for 5 minutes than taking a 30-minute coffee break talking to various coworkers. I would hate this. Also, if you’re a developer, how would you research anything? No Stack Overflow? No access to forums to solve particular problems? Not sure this is sustainable.

    • HarkMahlberg@kbin.social · 18 points · 1 year ago

      Losing access to language reference docs would be huge. What are they gonna do, save them all locally? Maintain copies of those sites on the company intranet, at the company’s expense? What happens when the next version of Python is released?

      This is a real cut-off-your-nose-to-spite-your-face move. Google would hemorrhage developers.

      • Phoenix [she/they]@beehaw.org · 13 points · 1 year ago

        I mean, Google does index and cache most webpages internally already. So yeah, maybe. But after reading the article it doesn’t sound like they’re doing that.

        • HarkMahlberg@kbin.social · 6 points · 1 year ago

          I mean let’s say they solve that part, sure. Let’s go back to Google’s original intent for this maneuver: they want to beef up “security.”

          Ars Technica’s sub-title says “You can’t get hacked if you aren’t on the Internet.” That is utter nonsense. I’ll take “What is E-Mail?” for 500, Alex. Surely they wouldn’t block email, right? How would they communicate with vendors, partners, governments, etc.? Does Google think phishing emails, ransomware, etc. don’t work if you don’t have internet access?

          • t3rmit3@beehaw.org · 1 point · 1 year ago

            Actually, most email malware is staged now, so it wouldn’t work. PDFs with the malware embedded get flagged, so PDFs containing a link to the malware have replaced them. Even most ransomware arrives via an external link.

      • Mischala@lemmy.nz · 6 points · 1 year ago

        Can’t Google your obscure package’s runtime error? Guess you aren’t gonna do anything of value for the rest of the day.

      • wim@lemmy.sdf.org · 4 points · 1 year ago

        Why not? They already do that for the vast majority of this stuff. It’s not that much data, and releases of these things are structured and indexed everywhere anyway.

      • The Doctor@beehaw.org · 2 points · 1 year ago

        Storing local copies of docs is a thing some companies do. I’ve worked at a couple of places that did that. And when the next version of $foo is released, and the devs get the go-ahead to use it, wget gets executed to make a new copy. Sucks, but that’s the threat model in some places.
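
        If anyone’s curious what that looks like in practice, it’s usually just a recursive mirror job pointed at the official docs. A minimal sketch, assuming the Python docs and a made-up intranet path (neither is specific to any employer I’ve had):

            # Mirror the Python 3 docs onto an internal web server for offline use.
            # --mirror          recursive download with timestamping
            # --convert-links   rewrite links so pages work when browsed locally
            # --page-requisites also fetch the CSS/JS/images the pages need
            # --no-parent       don't wander above the docs tree
            wget --mirror --convert-links --page-requisites --no-parent \
                 --directory-prefix=/srv/intranet/docs/python-3 \
                 https://docs.python.org/3/

        Re-running the same command after a new release refreshes the mirror; the timestamping means unchanged files are skipped.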

      • abhibeckert@beehaw.org · 1 up, 1 down · 1 year ago

        If I had access to a good LLM, that’d be enough for 99% of my research. And the other 1% I could probably do on a phone.

        • aksdb@feddit.de · 14 points · edited · 1 year ago

          LLMs produce text; they don’t answer questions. If the keywords in the question correlate strongly enough with an answer in the training data, the model might (re)produce the actual answer, but you can never be sure.

          LLMs are not a source of information.

    • Mischala@lemmy.nz · 7 points · 1 year ago

      Joke’s on them, half of their developers’ code comes from Stack Overflow.

      RIP productivity.

  • Whirling_Ashandarei@beehaw.org · 20 points · 1 year ago

    Prepare for productivity to tumble lol. Switching tabs and continuing to work is much more efficient than switching between your phone and your computer screen if you’re at a desk job. I guess I can understand why they want to do this, but they’d better get a lot more lax about people being on their phones, which I’m not gonna hold my breath on. Just more ways to shit on employees for other companies to emulate. Love this capitalist innovation!

    • Hirom@beehaw.org · 1 point · edited · 1 year ago

      I hope organisations invest in Qubes OS and other container/virtualization tools to make them more practical.

      Taking radical steps like cutting off internet access would hurt productivity as much as it improves security.
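
      For the container side, even something as blunt as running the toolchain with no network at all goes a long way. A rough sketch with Docker (the image and mount path are just placeholders):

          # Hypothetical: run a dev shell with networking fully disabled, so a
          # compromised dependency can't phone home, while the host keeps
          # whatever restricted connectivity policy the org decides on.
          docker run --rm -it \
              --network none \
              -v "$PWD":/workspace -w /workspace \
              python:3 bash

      Qubes takes the same idea further by putting the whole workspace in a disposable VM that simply has no network qube attached.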

    • conciselyverbose@kbin.social · 3 points · 1 year ago

      Honestly, restricting internet access to those who require it and going the extra mile to make your whole building a Faraday cage would still seem basically fine to me.

      You’d need a good way for people to get emergency messages, but it’s a genuine security hole that could genuinely (it’s not super likely, but it’s also far from impossible) cost your business a boatload of money.

      • Bilb!@lem.monster · 11 points · 1 year ago

        I’ve worked in “secure” environments for the US military, and yeah, open access to the internet while you’re on the job is absurd to expect.

  • nickwitha_k (he/him)@lemmy.sdf.org · 10 points · 1 year ago

    Seems rather bizarre to me, though it could make sense for some non-technical roles. For developers it seems a bit impractical; much of a language’s documentation lives online, and odd errors, both common and esoteric, are frequently absent from the docs entirely. This seems likely to push devs to either use unauthorized devices or waste time digging through source (possibly that of the programming language itself) to figure things out.

    However, the remark about root access makes me hope that there are not people logging into systems at Google as root. A sudoer, sure, but root is a big no-no.
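
    For anyone who hasn’t had to care about the distinction: the usual pattern is to give named admins sudo rights and shut direct root logins out entirely, something like this (the username is just an example):

        # /etc/sudoers.d/oncall : privileged commands run as a named person,
        # so there's an audit trail of who did what
        alice ALL=(ALL) ALL

        # /etc/ssh/sshd_config : and root can't log in over the network at all
        PermitRootLogin no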

    • RealAccountNameHere@beehaw.org (OP) · 7 points · 1 year ago

      su root

      rm -rf /SteveHuffmanData/SearchHistory/RealStuff

      mv HorseNPigPorn.jpg LemonParty.html TubGirl.png SteveHuffmanData/SearchHistory

    • skwerls@waveform.social · 1 point · 1 year ago

      Seems like they could have an air-gapped machine with higher-level access, and a less secure machine for browsing the internet but not internal tools. Would still suck for copy-paste and things along those lines, but would probably work in most cases.

      • nickwitha_k (he/him)@lemmy.sdf.org · 2 points · 1 year ago

        I would think that this would be an approach that absolutely makes sense for corporate infra systems like domain servers, systems with access to network configs, etc.

        Maybe adding an additional security tier? Something like a “sandbox dev” tier where new third-party libraries and technologies can be tested, and a “production dev” tier which is more restricted. That might be the “right” way.

        The problem that I’d see is that productivity, development velocity, and release cadence would all take a nose-dive as software engineers have to continually repeat work, roughly doubling the real amount of work needed to release any piece of software. This would likely be seen as incompatible with modern business and customer expectations.
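
        At the network level, the “production dev” tier could be as crude as an egress policy that only permits internal ranges. A purely hypothetical sketch (the addresses and tooling are mine, not anything Google has described):

            # Allow loopback and the internal 10.0.0.0/8 range, reject everything
            # else, i.e. no direct path from this tier to the public internet.
            iptables -A OUTPUT -o lo -j ACCEPT
            iptables -A OUTPUT -d 10.0.0.0/8 -j ACCEPT
            iptables -A OUTPUT -j REJECT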

  • chunktoplane@beehaw.org · 3 points · 1 year ago

    Ars Technica is just parroting a CNBC report third-hand, when they could add some useful context by sharing traffic numbers to their site from Google-owned IP addresses and user agents.