Experts say case highlights well-known dangers of automated detection of child sexual abuse images

Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it’s an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned about the limitations of automated systems for detecting child sexual abuse images, particularly as companies face regulatory and public pressure to help tackle such material.
