In September 2024, South Korean news agencies rushed to cover the case of teenagers and university students producing and distributing, via Telegram, sexually explicit deepfake images created from photographs of young women. Reports suggested that thousands of people may have joined these illicit chatrooms, prompting growing concerns over women’s privacy in the online sphere, as well as over the rapid development of artificial intelligence (AI) outpacing awareness of its ethical implications.
According to several reports investigating the chatrooms, each group appears to have its own system for entry qualifications and payment requirements for production. For example, one group, which allegedly has the largest number of participants, generates illicit images from provided photos with the assistance of an AI-powered bot. Once a user has generated more than three deepfake images, the bot prompts them to either invite a new participant to the chatroom or make a payment.
What made the case even more alarming is the existence of chatrooms specifically targeting middle and high school students as well as university students and staff. Hundreds of groups carried the names of educational institutions in their titles, sharing deepfake images made from photos of students or staff. Many were found to be operated by teenagers. A list of hundreds of schools reportedly linked to such Telegram rooms quickly spread across X (formerly Twitter), accumulating over 21 million views at the time of writing.
Digital sex crimes in South Korea
Even before the recent deepfake case on Telegram, South Korea had been grappling with a series of digital sex crimes for years.
Some of the most well-known crimes involve the use of “spycams” — small cameras secretly installed in locker rooms, restrooms, and hotels. Perpetrators have also secretly filmed victims ranging from complete strangers to close acquaintances, and the non-consensual distribution of such footage has been prevalent, with the individuals featured never agreeing to the sharing of their images.
The vast majority of victims of these digital sex crimes are women. In the case of spycam crimes, for example, more than 93 per cent of the victims were female as of 2020. In contrast, most of the perpetrators were male: in the same year, over 96 per cent of those convicted of spycam crimes were men.
Following the exposure of several web archives containing non-consensual footage, a wave of protests against digital sex crimes erupted in 2018. Young women, chanting the slogan “My Life is Not Your Porn”, took to the streets in the heart of Seoul. In response, the South Korean Government at the time promised to strengthen regulations on digital sex crimes.
Another notorious digital sex crime case in the country is the “Nth Room” scandal, which involved blackmail, sex trafficking, and the distribution of sexually exploitative videos via Telegram, primarily between late 2018 and March 2020. The case centred on a man named Cho Joo-bin and several accomplices who blackmailed women into sending sexual images. The members of their chatrooms, including those who paid to see more extreme photos, numbered up to 260,000. Reportedly, there were more than 100 victims in total.
The Government’s countermeasures against digital sex crimes
Perpetrators of digital sex crimes rarely face significant punishment under the Korean legal system. Human Rights Watch found that, in 2017, only two per cent of the 5,437 individuals arrested for digital sex crimes were imprisoned. Most offenders received relatively modest fines or were subjected to mandatory social service programmes, often accompanied by a period of probation.
Even when a digital sex crime is reported to the police, prosecutors may choose to dismiss the case during the investigation process. According to the human rights watchdog, over 40 per cent of digital sex crime cases — including those involving sexual violence through cameras, the non-consensual distribution of images, and the production and distribution of child and youth sexual exploitation material — were dropped by prosecutors.
Deepfake crimes follow a similar pattern.
A legal provision addressing deepfake sex crimes, categorising them as “fake imagery”, was introduced in March 2020 following the public outcry surrounding the Nth Room case. Under this provision, perpetrators can face up to five years in prison, and if the imagery was created for profit, the sentence can be extended to as long as seven years.
However, court rulings have generally been lenient. Of the 125 perpetrators accused of creating fake imagery, approximately half received suspended sentences, and only five were sentenced to five or more years in prison.
Moreover, the legal provision does not provide a basis for prosecuting individuals who watch or possess deepfake imagery. It explicitly states that perpetrators can be punished only if they “distribute” the imagery, meaning prosecutors must prove intent to distribute in order to secure a conviction.
Amid growing concerns over deepfake crimes in South Korea, the National Assembly recently passed a bill that allows the prosecution of individuals who possess, purchase, archive, or watch deepfake imagery. The bill stipulates a maximum penalty of three years in prison or a fine of up to 30 million won (US$ 22,500). Lawmakers also removed the term “distribute,” thereby eliminating the requirement that the purpose of creating and sharing sexually exploitative deepfake content had to be proven.
Why is South Korea susceptible to digital sex crimes?
Telegram has become a hotspot for digital sex crimes in South Korea, as the app had long been considered free from government surveillance. Recently, however, the messaging platform announced plans to expand its privacy policy, stating it would provide user information to authorities not only in cases of suspected terrorism but also for problematic AI-driven content. When deepfake crimes made headlines in September, Telegram took down some of the videos shared in these chatrooms at the request of the Korea Communications Standards Commission (KCSC) and offered to share additional content information if the organisation sought further assistance.
However, chatrooms that share AI-generated sexually exploitative imagery can still be found on Telegram. Recent posts on X have shared screenshots of active chatrooms where sexual images are created using photos of both ordinary individuals and celebrities. 4i MAG has also identified several active accounts with Telegram contact information that are receiving requests to create sexual deepfakes, as reported on X.
South Korea is indisputably one of the most vulnerable countries to digital sex crimes. A 2023 report by Security Hero found that 53 per cent of deepfake pornography victims are from South Korea, with many of the victims being singers.
Some argue that the country needs to educate children on AI ethics. “The problem is that schools and private institutions teach advanced computer programming, but don’t address what the worst-case outcomes could be, to prevent (digital sex crimes),” said Lee Soo-jung, a forensic psychology professor at Kyonggi University, during a policy discussion on November 3 at the National Assembly, aimed at preventing and finding solutions to deepfake-related digital sex crimes.
On the other hand, many point out that the root of the issue lies in gender education, emphasising that framing digital sex crime as a “new” byproduct of technological development fails to offer a comprehensive solution. “We should not overlook the fact that any powerful technology can be used to harm women in a society that tends to view women as a tool,” said Kim Yeo-jin, president of the Korea Response Center for Cyber Sexual Violence.