Which is why they should be legally compelled to publish all of their datasets, models, and research, and to share any profits with the creators of works they can trace provenance for; otherwise it’s an unfair use of the public sphere of content.
One could very easily argue that adblockers are piracy, and that they steal from every social media creator, small blog, and independent news site. Yet I don’t see many people arguing against them, even though those harmed very much include people who aren’t wealthy corporations.
The issue isn’t necessarily the use of the copyrighted content, it’s the unfair legal stance taken on who can use the content, and how they are allowed to profit (or not profit) from it.
I’m not saying there are no downsides, but a simple black-and-white dichotomy doesn’t capture how similar piracy and generative AI training are in terms of who they take from. What truly matters most is what is done with the content after it is taken.
Generative AI is not going back into the bag. If not OpenAI, then someone else will control it. So we deal with them the next best way: force them to serve us, the people.
Then they can either pay for the copyrighted data they want to train on or lobby for copyright to be reined in for everyone. Right now, they’re acting like entitled twats with a shit business model demanding they get a free pass while the rest of us would be bankrupted for downloading a Metallica MP3.
The problem isn’t necessarily the use of copyrighted works (although it can be a problem in many ways); it’s the unfair legal determination of who is allowed to do so.
If OpenAI wants a pass, then just like how piracy services make content freely open and available, they should make their models open.
Give me the weights, publish your datasets, slap on a permissive license.
If you’re not willing to contribute back to society with what you used from it, then you shouldn’t exist within society until you do so.
Piracy steals from the rich and gives to the poor. ChatGPT steals from the rich and the poor and keeps for itself.
No they shouldn’t. They should cease to exist.
I think this better solves the issue.
Nobody should profit from copyright violation. Yes, copyright law needs to change, but making money shouldn’t buy you an exception.
It probably will, though, once model collapse sets in.
That’s the irony, really… the more successful it is, the sooner it’ll poison itself to death.
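For what it’s worth, the “poisons itself” mechanism has a simple toy illustration (this is a generic sketch of model collapse, not any specific lab’s training setup): fit a simple model to data, sample from the fit, refit on those samples, and repeat. With finite samples, each refit loses a little tail mass, so the learned distribution’s diversity tends to shrink generation after generation.

```python
# Toy sketch of model collapse: repeatedly fit a Gaussian to samples drawn
# from the previous generation's fitted model. Small sample sizes amplify
# the effect, so the fitted spread tends to decay across generations.
import random
import statistics

random.seed(0)

def fit_and_resample(data, n):
    """Fit a normal distribution to `data`, then draw `n` samples from the fit."""
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)], sigma

# Generation 0: "human-made" data from a standard normal distribution.
data = [random.gauss(0.0, 1.0) for _ in range(20)]

sigmas = []
for generation in range(200):
    data, sigma = fit_and_resample(data, 20)
    sigmas.append(sigma)

# Each generation trains only on the previous generation's output;
# the estimated spread drifts downward, i.e. diversity collapses.
print(f"spread at generation 1: {sigmas[0]:.3f}")
print(f"spread at generation 200: {sigmas[-1]:.3f}")
```

The sample size of 20 and the 200 generations are arbitrary illustration parameters; with larger samples the drift is slower but the direction is the same.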