UK Police’s Porn-Spotting AI Keeps Mistaking Desert Pics for Nudes
Posted by EditorDavid on 23rd December 2017


An anonymous reader quotes Gizmodo:
London’s Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next “two to three years.” But, in its current state, the system can’t tell the difference between a photo of a desert and a photo of a naked body… “Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” Mark Stokes, the department’s head of digital and electronics forensics, recently told The Telegraph. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.”
The article concludes that the London police software “has yet to prove that it can successfully differentiate the human body from arid landscapes.”
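The quote suggests the false positives come from colour-based skin detection: sandy hues fall in roughly the same red-green-blue range as many skin tones. The Met has not described how its system actually works, but a minimal sketch of a naive skin-tone heuristic (a hypothetical illustration, not the police software) shows how a desert screensaver could trip such a filter:

```python
# Hypothetical sketch of a colour-ratio skin detector, NOT the Met's system.
# It illustrates why sand and skin confuse a purely colour-based classifier.

def looks_like_skin(rgb):
    """Crude skin-tone test: warm hues where red > green > blue."""
    r, g, b = rgb
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

def flag_as_indecent(pixels, threshold=0.4):
    """Flag an image if a large fraction of its pixels look skin-coloured."""
    skin = sum(looks_like_skin(p) for p in pixels)
    return skin / len(pixels) > threshold

# Synthetic patches standing in for real images (illustrative values only).
skin_patch = [(224, 172, 105)] * 100    # a typical skin tone
desert_patch = [(237, 201, 175)] * 100  # a typical sand colour

print(flag_as_indecent(skin_patch))    # True
print(flag_as_indecent(desert_patch))  # True -- sand passes the same test
```

Both patches clear the threshold, which is the failure mode Stokes describes: any approach that leans on pixel colour alone, rather than shape or context, will keep flagging arid landscapes.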

