On Wednesday, attorneys general from all 50 US states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI could be used to exploit children through the creation of child sexual abuse material (CSAM). The letter also calls for expanding existing laws against CSAM to explicitly cover AI-generated material.
“As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions,” the letter reads. “And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult.”
In particular, open source image synthesis technologies such as Stable Diffusion make it easy to create AI-generated pornography, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation’s top prosecutors. (It’s worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)
“Creating these images is easier than ever,” the letter reads, “as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are ‘open source,’ the tools can be run in an unrestricted and unpoliced way.”
As we have previously covered, it has also become relatively easy to use social media photos to create AI-generated deepfakes of people without their consent. The attorneys general mention a similar concern, extending it to images of children:
“AI tools can rapidly and easily create ‘deepfakes’ by studying real photographs of abused children to generate new images showing those children in sexual positions. This involves overlaying the face of one person on the body of another. Deepfakes can also be generated by overlaying photographs of otherwise unvictimized children on the internet with photographs of abused children to create new CSAM involving the previously unharmed children.”