• Dukeofdummies@kbin.social · 11 months ago

    So fix that. Don’t build an AI to dole out justice against police like some messed-up lottery. This is such a hollow solution in my mind. AI struggles to identify a motorcycle, yet people expect it to identify abuse?

    • quirzle@kbin.social · 11 months ago (edited)

      So fix that.

      Were it so simple, it would have been fixed decades ago. The difference is that having AI review the footage is actually feasible.

      • Dukeofdummies@kbin.social · 11 months ago

        Until you realize that the people who make the final decision on whether something the AI flagged is indeed too far or extreme are the exact same people making that decision now, and all we’ve succeeded in doing is building a million-dollar system that makes it look like they’re trying to change.

        • quirzle@kbin.social · 11 months ago

          So what’s your proposed solution? Your directive to “fix that” was a bit light on details.

          This is a step in the right direction. The automated reviews will supplement, not replace, the reviews triggered by manual reports that you supported in your initial comment. I’d argue the pushback from police unions is a sign that it actually might lead to some change, given the reasoning they give in the article.