A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.

Google has an automated tool to detect abusive images of children. But the system can get it wrong, and the consequences are serious.

Click to view the original at nytimes.com

Hasnain says:

Sigh. This is just a scary situation all around. Appeals processes need to be better, as does detection of CSAM. There are so many trade-offs here, but I wish companies would strive to do better. Or at least apologize when they get it wrong (though I understand why they can’t — lawsuits).

“A Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.

Ms. Hessick, the law professor, said the cooperation the technology companies provide to law enforcement to address and root out child sexual abuse is “incredibly important,” but she thought it should allow for corrections.

“From Google’s perspective, it’s easier to just deny these people the use of their services,” she speculated. Otherwise, the company would have to resolve more difficult questions about “what’s appropriate behavior with kids and then what’s appropriate to photograph or not.”

Posted on 2022-08-21T16:34:10+0000