• Neshura@bookwormstory.social

      Also, CSAM detection algorithms are known to misfire on occasion (it’s hard to impossible to tell apart a picture of a naked child shared for porn purposes from one that isn’t), and people want to avoid any false allegations of that if at all possible.