Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.

In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.

“It made me a lot more upset after I saw the pictures because it made them so much more real for me,” one Lancaster victim, now 16, told Forbes. “They’re very graphic and they’re very realistic,” her mother said. “There’s no way someone who didn’t know her wouldn’t think: ‘that’s her naked,’ and that’s the scary part.” There were more than 30 images of her daughter.

The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children,” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”

  • Serinus@lemmy.world · 10 days ago
    This act of sexual abuse is going to change how 60 girls and soon-to-be women respond to sex, likely for the rest of their lives. These images may follow them forever.

    No, it’s not. No, it shouldn’t.

    First, it’s so, so much easier to deal with when you have the response of “that’s not me”. Second, it’s current AI. How real do these things even look?

    These girls were not sexually abused. Sexual harassment is a more appropriate crime. Maybe libel. Maybe a new crime that we could call “sexual libel” or something.

    • Coskii@lemmy.blahaj.zone · 10 days ago
      Current AI for generating sexual images is on the real side of the uncanny valley at this point. If you’re really looking you might be able to tell, but I don’t think most people looking for porn are going to scrutinize anything too closely in the first place… So real enough.

      However, I don’t see how 60 images of what’s effectively a face plastered on a miscellaneous body doing something sexual would follow anyone anywhere. Anyone who recognizes them and outs themselves has just admitted to possessing child porn…

      Most people don’t have facial features unique enough for something like this to follow them around in the first place.

      As for the criminal aspect of it, that’s a societal thing to figure out, so here they go figuring it out.