This is also why I am thankful for the American first amendment.
It would appear that in that user’s country, taking the nude scene out of context is additionally considered a crime.
I think a lot of people online take freedom of speech for granted, not realizing that many supposedly civilized countries have an increasing number of restrictions on unpopular speech, critical speech, or otherwise undesirable speech like this.
Make no mistake. The US is heading in the same direction. Look at the proposed anti-deepfake laws. That guy could be prosecuted extremely harshly under those.
It will be interesting to see that tested in court. I don’t think anyone would complain about, for example, a pencil sketch of a naked celebrity; that would be considered free speech and fair use, even if it is a sketch of a scene from a movie.
So where does the line go? If the pencil sketch is legal, what if you do a digital sketch with Adobe Illustrator and a graphics tablet? What if you use Adobe’s AI features to help clean up the image? What if you take screen grabs of a publicity shot of the actor’s face and a nude image of someone else, and use them together to trace the image you end up painting? What if you then use AI to help you select colors and handle shading? What if you do each of those steps separately, but have AI perform each one? That is not functionally very different from giving an AI a publicity shot and telling it to generate a nude image.
As I see it, the only difference between the AI deepfake and a fake produced by a skilled artist is the amount of time and effort required. And while that definitely makes it easy to turn out an awful lot of fakes, it’s bad policy to ban one and not the other simply based on the process by which the image was created.
It’s messy legislation all around. When does it become porn vs. art vs. merely erotic or satirical? How do you prove it was a deepfake and not a lookalike? If I use a porn actress to make a deepfake, is that also illegal, or is it about how the original source content was intended to be used and consumed?
I’m not saying that we should just ignore these issues, but I don’t think any of this will be handled well by any government.
That’s easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.
Y’all need to be more cynical. These lobby groups do not make arguments because they believe in them, but because the arguments get them what they want.
I was responding to a comment above. The guy who was arrested in the OP’s article was posting clips from movies (so not deepfakes).
That being said, for deepfakes, you’d need the original video to prove it was deepfaked. Additionally, you’d then probably need to prove they used a real person to make the deepfake. Nowadays it’s easy to make “fake” people using AI. Not sure where the law sits on creating deepfakes of fake people who resemble other people.
I didn’t make my point clear. The original scenes themselves, as released by the studio, may qualify as “deepfakes”. A little digital post-processing can be enough to qualify them under the proposed bills. Then sharing them becomes criminal, fair use be damned.
Actually I was thinking about this some more and I think there is a much deeper issue.
With the advent of generative AI, photographs can no longer be relied upon as documentary evidence.
There’s the old saying, ‘pics or it didn’t happen’, which, flipped around, means that sharing pics proves it did happen.
But if anyone can generate a photorealistic image from a few lines of text, then pictures don’t actually prove anything, unless you have some bulletproof way to tell which pictures are real and which are generated by AI.
And that’s the real point of a lot of these laws: to try and shove the genie back in the bottle. You can ban deepfake porn and order anyone who makes it to be drawn and quartered, you can make an AI watermark its output, but at the end of the day the genie is out of the bottle, because someone somewhere will write an AI that skips the watermark, and its photos will be passed off as real.
I’m open to any possible solution, but I’m not sure there is one. I think this genie may be out of the bottle for good, or at least I’m not seeing any way that it isn’t. And if that’s the case, perhaps the only response that doesn’t shred civil liberties is to preemptively declare defeat, acknowledge that photographs are no longer proof of anything, and deal with that as a society.
One solution that’s been proposed is to cryptographically sign content. This way someone can prove they “made” the content. It doesn’t prove the content is real, but it means you can verify the originator.
However, at the end of the day, you’re still stuck with needing to decide who you trust.
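To make that concrete, here’s a minimal sketch in Python of what signing content could look like, using Ed25519 keys from the ‘cryptography’ package. The key type, variable names, and placeholder bytes are just illustrative assumptions; real provenance proposals (C2PA, for instance) are considerably more involved:

```python
# Minimal sketch: a creator signs a photo's bytes, and anyone holding the
# creator's public key can check that the file is unchanged since signing.
# Assumes the third-party 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The creator generates a keypair once and publishes the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Stand-in for the raw bytes of an image file.
photo_bytes = b"\xff\xd8...pretend this is a JPEG..."

# "Signing" the content: the signature covers every byte of the file.
signature = private_key.sign(photo_bytes)

# Verification: proves who signed it and that it wasn't altered afterward.
try:
    public_key.verify(signature, photo_bytes)
    print("Signed by the holder of this key, unmodified since signing.")
except InvalidSignature:
    print("Not from this key, or altered after signing.")
```

Note what this does and doesn’t buy you: it ties the file to a key, not to reality. A bad actor can just as easily sign an AI-generated image with their own key, which is exactly the ‘who do you trust’ problem.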
I’m fine with photos no longer proving anything.
Let the statists cry about that one. Cry, little statists, that you can’t inflict pain in that justified way you love so much.
I’m not fine with that, as it will have wide-ranging repercussions on society at large that aren’t all good.
But I fully accept it as the cold hard reality that WILL happen now that the genie’s out of the bottle, and the reality that any ham-fisted legal attempt to rebottle the genie will be far worse for society and only delay the inevitable acceptance that photographs are no longer proof.
And as such, I (and most other adults mature enough to accept a less-than-preferred reality as reality) stand with you and give the statists the middle finger, along with everyone else who thinks you can legislate any genie back into its bottle. In the 1990s it was the ‘protect kids from Internet porn’ people, in the 2000s it was the ‘protect kids from violent video games’ and ‘stop Internet piracy’ people, I guess today it’s the ‘stop generative AI’ people. They are all children who think crying to Daddy will remake the ways of the world. It won’t.
I am infinitely more worried about the backlash and the enclosers than the tech itself.
That’s the appropriate reaction to many of these so-called threats to society. Internet chat rooms, generative AI, drugs, opioids, guns, pornography, trashy TV, you name it. I think it’s been pretty well demonstrated throughout history that the majority of the time some ‘threat to public safety’ comes along and a well-meaning group tries to get the government to shove the genie back in the bottle, the cure ends up being worse than the disease. And it’s a lot easier to set up a bureaucracy than to dismantle it.
The sad thing is, whatever regulation they set up will be pointless. Someone will download an open-source model and run it locally with the watermark code removed. Or some other nation will realize that hobbling its AI industry with stupid regulations won’t help it get ahead in the world, and it will become a source for non-watermarked output and watermark-free models.
So we hobble ourselves with some ridiculous AI enforcement bureaucracy, and it will do precisely zero good because the people who would do bad things will just do them on offshore servers or in their basement.
It applies everywhere else too. I’m all for ending the opioid crisis, but the current attempt to end opioids entirely is not the solution. A good friend of mine takes a lot of opioids, prescribed by a doctor, for a serious pain condition resulting from a car accident. This person’s back and neck are full of metal pins and screws and plates and whatnot.
For this person, opioids like OxyContin are the difference between being in constant pain and being able to do things like work out at the gym and enjoy life.
But because of the well-meaning war on opioids, this person and their doctor are persecuted. Pharmacies don’t want to deal with OxyContin, and the doctor is getting constant flak from insurers and the DEA for prescribing too much of it.
I mean really, a pain management doctor prescribes a lot of pain medication. That’s definitely something fishy that we should turn the screws on him for…
It’s really infuriating. In my opinion, the only two people who should decide what drugs get taken are a person and their doctor. For anyone else to try and intrude on that is a violation of that person’s rights.
Same as anything else: if it causes someone harm (and in America, financial harm counts), it gets regulated.
There are exceptions that allow people to disregard laws as well. It’s legal to execute a death-row prisoner.
One is banned because it can affect someone’s earnings and is theft; the other is not banned because no one is harming another party by making a pencil drawing of a celebrity or scene.
Guess if they pass we’ll see how they stand up to the First Amendment.
I also would’ve expected nudity to be less taboo there. Would it have been just as likely to be arrested for sharing fully clothed still shots? That would actually make a lot more sense: distribution of copyrighted, non-promotional material.
I think the issue isn’t nudity but sexualization; i.e., a nude scene in the context of a film is fine, but chopping the nude scene out of the film is basically turning the actress into a porn star, and that’s not fine. The same attitude is why the actress called it molestation. A different attitude as a society, I guess.
It seems to me these scenes are introduced in films to sexualize them. More often than not they don’t add anything to the story. But blood and sex get more viewers. So I find the whole thing hypocritical.
It brings to mind an episode of the hilarious series “Coupling”, where Jeff says that the actress in the film “The Piano” (?) was naked for the whole film. His friends say she wasn’t; it was only one scene in the film. And Jeff replies, “it depends on how you watch it” 🤣
I agree it’s hypocritical, but for different reasons.
I think a nude/sex scene can be important to the plot and add a lot to the story, in some situations. Yeah, it’s often thrown in as eye candy to get more viewers, but sometimes it counts for a lot. Look at Season 1 of Game of Thrones, for example: there are a couple of sex scenes between Dany and Khal Drogo, and IMHO they do a lot more to further the story than to show T&A. In the first one Dany is basically being raped, but as the season goes on you see her start to fall in love with Drogo, and it becomes more like making love. It’s hard to get the same effect without sex scenes.
Same thing anytime you have two people in bed: crappy, unrealistic TV sex where the girl never takes her shirt off, followed by a cut to half a second later where they’re both wrapped tightly but conveniently in sheets, can break the suspension of disbelief.
So I can sympathize with an actor who agrees to artistic nude scenes or sex scenes because they’re important to the plot, but then has that specific 20 seconds of video taken out of context and circulated on porn sites.
At the same time, an actor doesn’t get to order the audience to experience the film in any particular way. Just as you say about ‘The Piano’, it depends on how you watch it. It’s not illegal to buy the film, fast-forward to the nude scenes, and stop watching when they’re done. So to think you get any sort of control over that is hypocritical; it’s like ordering a reader to read the entire book and never share passages with a friend.
Personally, I disagree on the value of sex/nude scenes, but it’s a subjective matter, of course. Your final argument is absolutely fair and logical, and very general too. Extremely well put. I subscribe 110% to it!