
Alaska Man Reported Someone for AI CSAM, Then Arrested for Same Thing

If you are going to contact the police to report someone who claims an interest in child sexual abuse material (CSAM), it is probably not the best idea to have the same material on your own devices, or to consent to a search that lets law enforcement gather more information. But a man from Alaska allegedly did just that, and it landed him in police custody.

404 Media reported earlier this week that a police search of the devices of a man named Anthaney O’Connor allegedly turned up artificial intelligence-generated child sexual abuse material (CSAM).

From 404:

According to newly filed charging documents, Anthaney O’Connor reached out to law enforcement in August to warn them that an unidentified airman had shared child sexual abuse material (CSAM) with him. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. According to the criminal complaint, an examination of his electronics revealed that O’Connor allegedly offered to create virtual reality CSAM for the airman.

According to police, the unidentified airman shared with O’Connor a photo of a child he had taken at a grocery store, and the two discussed how they could place the child in a virtual reality world.

Law enforcement claims to have found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which he said he had intentionally downloaded, along with several “real” images that had unintentionally mixed in. At O’Connor’s home, law enforcement uncovered a computer, as well as multiple hard drives hidden in an air vent; an examination of the computer allegedly revealed a 41-second video of child rape.

In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers but “still felt sexually gratified by the images and videos.” It’s unclear why he decided to report the airman to law enforcement. Perhaps he felt remorse, or perhaps he genuinely believed his AI-generated CSAM did not break the law.

AI image generators are typically trained on real photos, meaning the pictures of children “generated” by AI are fundamentally based on real images; there is no way to fully separate the two. In that sense, AI-generated CSAM is not a victimless crime.

The first known arrest for possession of AI-generated CSAM came back in May, when the FBI arrested a man who had used Stable Diffusion to create “thousands of realistic images of teenage minors.”

Proponents of AI will argue that it has always been possible to create explicit images of children using Photoshop, but AI tools make it exponentially easier for anyone to do so. A recent report found that one in six members of Congress have been targeted by AI-generated deepfake porn. Many products have guardrails to prevent the worst uses, in the same way that printers will not allow money to be photocopied; implementing similar barriers would stop at least some of this behavior.