Listening to women isn’t working.
Look, the world has big problems. Big, big problems.
Deepfake porn is not at the top of the list for any objective observer. But if you ask women, they’re probably going to list deepfake porn as the number one problem, or at least in the top 5.
No one can even explain why this is illegal.
The anger was palpable. For the second time in just a few years, South Korean women took to the streets of Seoul to demand an end to sexual abuse. When the country spearheaded Asia’s #MeToo movement, the culprit was molka – spy cams used to record women without their knowledge. Now their fury was directed at an epidemic of deepfake pornography.
For Juhee Jin, 26, a Seoul resident who advocates for women’s rights, the emergence of this new menace, in which women and girls are again the targets, was depressingly predictable. “This should have been addressed a long time ago,” says Jin, a translator. “I hope that authorities take precautions and provide proper education so that people can prevent these crimes from happening.”
Yeah, of course she’s a translator. She’s able to read all of these Western feminist publications.
The National Police Agency said this week that it was investigating 513 cases of deepfake pornography – in which the faces of real women and girls are digitally superimposed onto a body without their knowledge or consent. That represents a 70% jump in cases in just 40 days, the Yonhap news agency said, underlining the country’s struggle to rein in the use of digital technology to sexually abuse women and girls.
It’s been established that you don’t have a copyright on your face.
So why is this illegal?
No one knows.
Recent reports about the rapid rise in deepfake porn have prompted a new round of soul-searching in a country whose positive contribution to global pop culture is being sullied by its status as the world’s digital sex crime capital.
The exact number of victims is difficult to verify, but the number of reported cases of deepfake porn has risen steadily in recent years, from 156 in 2021 to 180 in 2023, and if the current trend continues it is expected to reach a record high by the end of the year.
The victims are predominantly young women and girls, including students, teachers, and soldiers. Last year almost two-thirds were in their teens. Local media reports say the perpetrators are also often minors. Teenagers accounted for 79% of those detained in the first nine months of this year, according to Yonhap.
Yeah.
Not a real problem.
It’s teenage boys having fun.
No one can explain how it is illegal. If porn is legal and you don’t have a copyright on your face, how is this illegal?
The scale of the problem has stunned many South Koreans. One Telegram chatroom known for creating and distributing deepfake pornography reportedly had 220,000 members, another more than 400,000 users. Some rooms encouraged members to humiliate or degrade women through deepfakes.
Telegram became aware of the severity of the problem of deepfake crimes in South Korea through the media, so it decided to remove the videos at the request of the Korea Communications Commission. They provided the Korean government with a dedicated email address for collaboration https://t.co/aTh5I8Xnf9
— 🇰🇷 (cat) Feminism = women’s rights movement (@dvu84djp) September 3, 2024
Several years after South Korea made international headlines with its molka problem, the government is again under pressure to stamp out this wave of online sex crimes. A large protest is scheduled to be held in Seoul on 21 September.
South Korea holds the unenviable title of the country most targeted by deepfake pornography. Its female singers and actors constitute 53% of the individuals featured in deepfakes worldwide, according to a 2023 report by Security Hero, a US startup focused on identity theft protection.
Police have launched an investigation into Telegram, and the country’s media regulator plans to hold talks with the messaging app’s representatives to discuss a joint response to the problem. The education ministry has launched a taskforce to investigate incidents at schools, teach children how to protect their images and support victims.
Oh, so it ends up as another way to censor Telegram.
Big shock there.
🇰🇷 South Korean journalist Jung-joo Lee (@jungjoo_seoul) reveals first-hand details about the deepfake porn crisis gripping her country.
Full interview coming soon https://t.co/gsc6imynrR pic.twitter.com/twmmWd2ozj
— Yawen Xu (@YawenXu17) September 4, 2024