South Korea is poised to criminalize possessing or looking at sexually explicit AI-manipulated deepfake photos or video.

https://www.cbsnews.com/news/south-korea-deepfake-porn-law-ban-sexually-explicit-video-images

Posted by Exastiken

2 comments
  1. I’d be interested to know how they define these illegal images. “Sexually explicit AI-manipulated deepfake videos” seems precise, but porn is notoriously the fastest industry to adopt new technology.

    Imagine a porn studio records a video with 10 different people and then face-swaps it so that each person has 10 videos (1 original, 9 “deepfakes”).

    This seems like the kind of thing a porn studio would do to multiply its output, and it doesn’t seem immoral, since all parties consented to share their likeness. Yet 9 of those videos would be “sexually explicit AI-manipulated deepfake videos.”

    Or maybe a blockbuster movie has a sexually explicit scene where the actor uses a body double and has herself digitally inserted with AI tools. This too would be a “sexually explicit AI-manipulated deepfake video.”
