I know a lot of people want to interpret copyright law so that allowing a machine to learn concepts from a copyrighted work is copyright infringement. But consider what that would actually do: it would keep AI out of the hands of regular people and place it specifically in the hands of the people and organizations wealthy and powerful enough to train it for their own use.

If this isn’t actually what you want, then what’s your game plan for placing copyright restrictions on AI training that will actually work? Have you considered how it’s likely to play out? Are you going to be able to stop Elon Musk, Mark Zuckerberg, and the NSA from training an AI on whatever they want and using it to push propaganda on the public? As far as I can tell, all that copyright restrictions will accomplish is to concentrate the power of AI (which we’re only beginning to explore) in the hands of the sorts of people who are the least likely to want to do anything good with it.

I know I’m posting this in a hostile space, and I’m sure a lot of people here disagree with my opinion on how copyright should (and should not) apply to AI training, and that’s fine (the jury is literally still out on that). What I’m interested in is what your end game is. How do you expect things to actually work out if you get the laws that you want? I would personally argue that an outcome where Mark Zuckerberg gets AI and the rest of us don’t is the absolute worst possibility.

      • veridicus@kbin.social · 1 year ago

        No, I’m not your Google. You can easily read the background of Stable Diffusion and see it’s based on Markov chains.

        • IncognitoErgoSum@kbin.socialOP · 1 year ago

          LOL, I love kbin’s public downvote records. I quoted a bunch of different sources demonstrating that you’re wrong, and rather than own up to it and apologize for preaching from atop Mt. Dunning-Kruger, you downvoted me and ran off.

          I advise you to step out of whatever echo chamber you’ve holed yourself up in and learn a bit about AI before opining on it further.

          • veridicus@kbin.social · 1 year ago

            My last response didn’t post for some reason. The mistake you’re making is that a neural network is not a neural simulation. It’s relatively simple math, just on a very large scale. I think you mentioned earlier, for example, that you’ve played with PyTorch. You should then know that the NN stack is based on vector math. You’re making assumptions based on terminology, but when you read deeper you’ll see what I mean.
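            To make the "it's just vector math" point concrete, here's a minimal sketch (in pure Python, with made-up toy weights, not anything from a real model) of what a single layer in a framework like PyTorch boils down to: a matrix-vector product followed by a nonlinearity.

            ```python
            import math

            def layer(weights, bias, x):
                """One 'neuron layer': y_i = sigmoid(sum_j W[i][j] * x[j] + b[i]).
                Frameworks like PyTorch do exactly this, just vectorized on GPUs."""
                return [1 / (1 + math.exp(-(sum(w * v for w, v in zip(row, x)) + b)))
                        for row, b in zip(weights, bias)]

            W = [[0.5, -1.0], [1.5, 2.0]]          # toy weights, purely illustrative
            out = layer(W, [0.0, 0.1], [1.0, 2.0]) # two activations, each in (0, 1)
            print(out)
            ```

            Whether "simple math at scale" counts as a simulation of neurons is exactly what's being argued here; the code only shows what the arithmetic is.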

            • IncognitoErgoSum@kbin.socialOP · 1 year ago

              I said it was a neural network.

              You said it wasn’t.

              I asked you for a link.

              You told me to do your homework for you.

              I did your homework. Your homework says it’s a neural network. I suggest you read it, since I took the time to find it for you.

              Anyone who knows the first thing about neural networks knows that, yes, artificial neurons are simulated with matrix multiplications, which is why people use GPUs to do them. The simulations are not down to the molecule because they don’t need to be. The individual neurons are relatively simple math, but when you get into billions of something, you don’t need extreme complexity for new properties to emerge (in fact, the whole idea of emergent properties is that they arise from collections of simple things, like the rules of the Game of Life, for instance, which are far simpler than simulated neurons). Nothing about this makes me wrong about what I’m talking about for the purposes of copyright. Neural networks store concepts. They don’t archive copies of data.
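              The Game of Life comparison is easy to demonstrate: the entire rule set fits in a few lines of Python, yet oscillators and gliders emerge from it. A minimal sketch (my own toy implementation, storing live cells as a set of coordinates):

              ```python
              from itertools import product

              def step(live):
                  """Advance one Game of Life generation; `live` is a set of (x, y) cells."""
                  counts = {}
                  for (x, y) in live:
                      for dx, dy in product((-1, 0, 1), repeat=2):
                          if (dx, dy) != (0, 0):
                              cell = (x + dx, y + dy)
                              counts[cell] = counts.get(cell, 0) + 1
                  # Birth on exactly 3 neighbors; survival on 2 or 3.
                  return {c for c, n in counts.items()
                          if n == 3 or (n == 2 and c in live)}

              blinker = {(0, 1), (1, 1), (2, 1)}  # horizontal bar
              print(step(blinker))                # flips to a vertical bar
              ```

              Two rules, and you get self-sustaining patterns; the point is just that emergent behavior doesn't require complex components.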