The images were allegedly created by a Wisconsin man, who then used them to try to entice a 15-year-old.
The US Department of Justice arrested a Wisconsin man last week for creating and distributing AI-generated child sexual abuse material (CSAM). In what appears to be the first case of its kind, the DOJ is looking to establish a judicial precedent that exploitative material is still illegal even when no children were used to create it. "CSAM generated by AI is still CSAM," Deputy Attorney General Lisa Monaco said plainly in a press release.
The DOJ says 42-year-old software engineer Steven Anderegg of Holmen, Wisconsin, used a fork of the open-source AI image generator Stable Diffusion to create the images, which he then allegedly used to try to lure a minor into sexual situations. The latter will likely play a central role in the eventual trial on the four counts of "producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16."
The government says Anderegg's images showed "nude or partially clothed minors engaging in sexual intercourse with men or lasciviously displaying or touching their genitals." The DOJ claims he used specific prompts, including negative prompts (extra guidance that tells an AI model what not to include in its output), to coax the generator into producing the CSAM.
While cloud-based image generators like Midjourney and DALL-E 3 have safeguards against this type of activity, Ars Technica reports that Anderegg allegedly used Stable Diffusion 1.5, a variant with fewer restrictions. Stability AI told the publication that the fork was produced by Runway ML.
According to the DOJ, Anderegg communicated online with the 15-year-old, describing how he had used the AI model to create the images. The agency says the accused sent the teen direct messages on Instagram, including several AI images of "minors lasciviously displaying their genitals." To its credit, Instagram reported the images to the National Center for Missing and Exploited Children (NCMEC), which alerted law enforcement.
Anderegg faces five to 70 years in prison if convicted on all four counts. He is currently in federal custody ahead of a hearing scheduled for May 22.
The case will challenge the notion some may hold that CSAM's illegality rests solely on the children exploited in its creation. Although AI-generated CSAM doesn't involve any living humans (other than the person entering the prompts), it can still normalize and encourage the material, or be used to lure children into predatory situations. That appears to be something the federal government wants to clarify as the technology rapidly advances and grows in popularity.
Monaco stressed that the department's commitment to protecting children will not change as technology does. The DOJ will aggressively pursue anyone who produces or distributes child sexual abuse material, she said, regardless of how that material was created. In other words, AI-generated CSAM is still CSAM, and the department will hold accountable those who use AI to create obscene, abusive, and increasingly photorealistic images of children.