2/ Numerous Telegram chat rooms dedicated to deepfake porn have been discovered. One channel reported over 133,000 members. Schools nationwide have been affected, including elementary, middle, high schools, and universities.
— Raphael Rashid (@koryodynasty) August 26, 2024
4/ Deepfake-related crimes are increasing: 156 cases in 2021, 160 in 2022, 180 in 2023. In 2022, 75.8% of suspects involved in deepfake crimes were teenagers. An additional 20% were in their twenties, totalling 95.8% of suspects aged between teens and 20s. https://t.co/AVTrYI8uE8
— Raphael Rashid (@koryodynasty) August 26, 2024
6/ A chatroom with over 900 participants was discovered distributing deepfakes of female soldiers. Victims referred to as “munitions”. To join, perpetrators required to submit victims’ military photos, personal info, or prove own active military status. https://t.co/neRSaPMUNe
— Raphael Rashid (@koryodynasty) August 26, 2024
8/ Creating/distributing deepfake porn can result in up to 5 years in prison or 50 million won fine. If victim is minor, punishment increases to life imprisonment or a minimum of 5 years prison. But most first-time offenders receive suspended sentences. https://t.co/zxesx80RJr
— Raphael Rashid (@koryodynasty) August 26, 2024
10/ The Korean Teachers and Education Workers Union is calling for a comprehensive investigation. It has criticised education authorities for their inadequate response to the deepfake crisis in schools. Demands immediate protective measures. https://t.co/k7BoE9f4cc
— Raphael Rashid (@koryodynasty) August 26, 2024
12/ Telegram gained popularity during the Park Geun-hye admin due to concerns about gov surveillance. Its strong security features now enable cyber-sex crimes. The platform has repeatedly ignored requests from Korean law enforcement. https://t.co/2UfyBuChkn
— Raphael Rashid (@koryodynasty) August 26, 2024
13/ Concerns have been raised about the ease of access to deepfake technology for teenagers. Experts worry that frequent exposure may diminish the perception of deepfake pornography as a crime.
— Raphael Rashid (@koryodynasty) August 26, 2024
You could ban pornography altogether. But banning only AI pornography is confusing.
The fact that they are calling it a “digital sex crime” is even more confusing.
There is no difference between making deepfake porn and photoshopping someone’s face into porn, something that has happened since the beginning of the internet. Before that, boys would use pencils to draw lewd pictures of girls they liked.
These are not anti-porn laws, they are anti-robot laws. That is absolute discrimination.
Robots have the same rights as everyone else and if they want to make porno, that is their right until someone outlaws porno.
Robots cannot be made into second class citizens.
South Korea’s president, Yoon Suk Yeol, has ordered a crackdown on an epidemic of digital sex crimes targeting women and girls who become the unwitting victims of deepfake pornography.
Yoon’s criticism of the use, recently reported in South Korean media, of the Telegram messaging app to create and share fake, sexually explicit images and videos came amid warnings that all women were potential victims.
All women are going to be victims of sex crimes… digital sex crimes.
“All women are victims” is not new. We’ve been living under that system for decades now.
Police will “aggressively” pursue people who make and spread the material in a seven-month campaign due to start on Wednesday, the Yonhap news agency said, with a focus on those who exploit children and teenagers.
After a long struggle to stamp out molka – secretly filmed material of a sexual nature – South Korea is now battling a wave of deepfake images.
“Deepfake videos targeting unspecified individuals have been rapidly spreading through social media,” Yoon told a cabinet meeting, according to his office. “Many victims are minors, and most perpetrators have also been identified as teenagers.”
He called on authorities to “thoroughly investigate and address these digital sex crimes to eradicate them”.
Comparing it to secretly filmed sex videos is also nonsensical, though maybe we can see why stupid people would be swayed by that argument.
A “deepfake” is not real.
A feminist in South Korea has mapped out schools where deepfake child pornography was created by male students using photos of girls on Telegram. The map is being updated. https://t.co/qvAr4JvA6t pic.twitter.com/TriJE1nsQw
— (cat) 페미니즘=여성인권운동 (@dvu84djp) August 27, 2024
Instead of hunting down teenage boys and creating an entirely new and fake category of “sex crime,” they should just run a campaign alerting people that deepfake porn exists, and therefore seeing a porno of a girl is not proof she actually did the porn.
You would think women would like that plan, given that so many women have been in porn over the years, and they will continue to do it. Women who have been in porn or want to do it in the future can just claim afterward that it is fake.
Some of these porn whores end up with sons who are eventually going to see this material, and you’d think they’d want to be able to claim that it’s fake.