Infertile Korea’s Cops Plan to “Aggressively” Hunt Down Teen Boys Who Make Deepfake Porn of Girls from School

You could ban pornography altogether. But banning only AI pornography is confusing.

The fact that they are calling it a “digital sex crime” is even more confusing.

There is no difference between making deepfake porn and photoshopping someone’s face into porn, something that has happened since the beginning of the internet. Before that, boys would use pencils to draw lewd pictures of girls they liked.

These are not anti-porn laws, they are anti-robot laws. That is absolute discrimination.

Robots have the same rights as everyone else and if they want to make porno, that is their right until someone outlaws porno.

Robots cannot be made into second class citizens.

The Guardian:

South Korea’s president, Yoon Suk Yeol, has ordered a crackdown on an epidemic of digital sex crimes targeting women and girls who become the unwitting victims of deepfake pornography.

Yoon’s criticism of the use, recently reported in South Korean media, of the Telegram messaging app to create and share fake, sexually explicit images and videos came amid warnings that all women were potential victims.

All women are going to be victims of sex crimes… digital sex crimes. 

“All women are victims” is not new. We’ve been living under that system for decades now.

Police will “aggressively” pursue people who make and spread the material in a seven-month campaign due to start on Wednesday, the Yonhap news agency said, with a focus on those who exploit children and teenagers.

After a long struggle to stamp out molka – secretly filmed material of a sexual nature – South Korea is now battling a wave of deepfake images.

“Deepfake videos targeting unspecified individuals have been rapidly spreading through social media,” Yoon told a cabinet meeting, according to his office. “Many victims are minors, and most perpetrators have also been identified as teenagers.”

He called on authorities to “thoroughly investigate and address these digital sex crimes to eradicate them”.

Comparing it to secretly filmed sex videos is also nonsensical, though maybe we can see why stupid people would be swayed by that argument.

A “deepfake” is not real.

Instead of hunting down teenage boys and creating an entirely new and fake category of “sex crime,” they should just run a campaign alerting people that deepfake porn exists, and that therefore seeing a porno of a girl is not proof she actually did the porn.

You would think women would like that plan, given how many women have been in porn over the years and will continue to do it. Women who have been in porn, or want to do it in the future, could simply claim afterward that it is fake.

Some of these porn whores end up with sons who are eventually going to see this material, and you’d think they’d want to be able to claim that it’s fake.