Exclusive
As of: January 5, 2024 at 6:08 AM
AI is increasingly being misused to create child sexual abuse images, research by the SWR investigative format Vollbild has shown. Some images are entirely artificial; others are based on real photos of children or even on real child abuse.
Written by Fabian Sigurd Severin, SWR
Thousands of doctored images of children and young people are distributed on the social media platform Instagram under specific hashtags. Many of them show children photographed in skimpy lingerie, swimsuits, or sexualized poses. This is shown by research of the SWR investigative format Vollbild.
Some accounts that share such images are linked to trading, crowdfunding, or community platforms. In some cases, explicit images of child sexual abuse created with artificial intelligence (AI) are distributed there. Experts warn that AI-generated abuse images are dangerous: they make investigative work more difficult and could increase the willingness of child abusers to commit real assaults.
Trade in AI-generated images of child abuse
According to Vollbild's research, one of the associated community platforms is a Japanese website that shares explicit AI-generated images of child abuse. The site is also known to the Federal Criminal Police Office (BKA) and, judging by user reviews, appears to be used by people with pedophilic disorder to communicate with one another.
Many users there link to a website containing real images of child abuse. This is confirmed by the Internet Watch Foundation, an international reporting center for child sexual abuse based in Great Britain.
AI-generated images based on real child abuse are also in circulation. This is indicated by user comments on the AI-generated images as well as by observations of the Internet Watch Foundation.
Missing security mechanisms
“People sell access to such images or get paid to produce them,” says Dan Sexton of the British NGO. So far, the organization has found most AI-generated child abuse images on the dark web. According to Sexton, as their number grows, so does the risk that more and more of them will reach the open internet. “This is not something that might happen in the future; it is something that is already happening now.”
According to Vollbild's research, the artificial child and youth pornography is produced primarily with one version of the AI software Stable Diffusion. Unlike the two major competing image generators DALL-E and Midjourney, Stable Diffusion is open-source software whose code is publicly available. The software version in question contains no safety mechanisms that would prevent, for example, the creation of nude images. This was shown by the Vollbild editorial team's own tests.
Possible overload of the authorities
The BKA does not record cases of AI-generated pornography separately, but classifies the overall risk posed by child and youth pornography as high. In 2022, the number of cases rose by 7.4 percent for child pornography and by 32.1 percent for youth pornography. In addition, real, everyday images could serve as source material for AI-generated pornography. According to the BKA, synthetic images are difficult to distinguish from real ones.
“What I find worrying is that the amount of available material will increase, but above all the quality of the content,” says Senior Prosecutor Markus Hartmann, who heads the Cybercrime Central and Contact Point (ZAC) in North Rhine-Westphalia. According to Hartmann, this development could lead investigative authorities to mistakenly assess AI-generated images as evidence of new, real abuse and thereby reach the limits of their resources.
AI child pornography can provoke perpetrators
But artificially generated child pornography also poses a risk for people sexually attracted to children, says Professor Klaus-Michael Beier, director of the Institute of Sexology and Sexual Medicine at the Charité in Berlin. He also heads the Berlin prevention program “Kein Täter werden” (“Don't become an offender”), a support service for pedophiles.
The problem: like real child pornography, AI-generated images also distort perception, says Beier. They lead pedophiles to believe that sexual contact between children and adults is possible and that children want it.
His warning is supported by an international study by the Finnish NGO Protect Children, in which more than 8,000 people who consume child abuse images on the darknet took part. About a third of respondents stated that they sought contact with children after viewing such images.
A potential legal loophole in Germany?
In Germany, the distribution, acquisition, and possession of pornographic images of children or young people is prohibited under Sections 184b and 184c of the Criminal Code. But because there is as yet no case law on AI-generated child and youth pornography, uncertainty remains, according to the justice minister of Rhineland-Palatinate, Herbert Mertin (FDP): “One problem may be the mere production without distribution.” “People want to get away with it if they do it with artificial intelligence. If they do it with real children, it is punishable by law.”
The Conference of Justice Ministers has therefore asked Federal Minister of Justice Marco Buschmann (FDP) to set up an expert committee to deal with the new developments, says Mertin.
Platform providers obligated under EU law
A spokeswoman for the Federal Ministry of Justice wrote to Vollbild that “criminal law instruments” are being examined on an ongoing basis. In addition to real images, “pornographic representations of children and young people generated using artificial intelligence” are as a rule also punishable. If illegal content is distributed via online platforms such as Instagram or the Japanese site mentioned above, the Digital Services Act applies. The EU law requires platforms to establish reporting procedures and to take action against misuse of their services.
Spokespeople for Instagram and the Japanese community platform told Vollbild that they take decisive action against child pornography. Instagram said it acts not only against sexually explicit content but also against profiles, pages, or comments that share images of children without an overtly sexual connotation, if the captions, hashtags, or comments contain signs of inappropriate affection.
Platforms are insufficiently responsive
However, Vollbild's research shows that Instagram and the Japanese site do not adequately comply with their obligation to remove illegal content. In the course of the research, the editorial team reported dozens of Instagram accounts whose owners advertised the sale of real child and youth pornography.
Instagram deleted only a third of the accounts within 48 hours. For others, it initially found no violation of its community guidelines; those accounts remained online for the time being.
A spokeswoman for the Japanese site wrote to Vollbild that the AI-generated images containing child and youth pornography had been deleted. But even after her reply, similar AI-generated images of child abuse could still be found on the community platform.
“In Germany, we have an adequate legal framework that allows something like this to be removed from the internet,” says Rhineland-Palatinate Justice Minister Mertin. “Our problem is always getting hold of the perpetrator.”
Many perpetrators reside abroad and are therefore difficult to apprehend. In addition, international cooperation is sometimes difficult. Prosecutor Hartmann sees the main problem in the fact that it is not easy for platform operators to recognize the relevant images.