UK Police Computer Mistakes Desert Photography for Illegal Porn -- Curvy sand dunes mistaken for suggestive female poses


Police have launched an AI image-scanning system to spot illegal pornography, but the algorithm mistakes desert photography for porn.

London's Metropolitan Police are using the image recognition software, which can already identify drugs and guns, to flag indecent images so that officers don't have to search through gigabytes of explicit material themselves.

“For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color,” said Mark Stokes, the Met's head of digital and electronics forensics.

For now, the system cannot tell the difference between a desert landscape and a naked body; the sand's flesh-like color and curving contours are the likely culprits.
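To see why color alone trips up such a filter, consider a minimal sketch of a naive skin-tone detector. This is purely illustrative: the Met has not published how its classifier works, and the rule below (a widely cited rule-of-thumb RGB skin test in the style of Peer et al.) and the sample colors are assumptions, not the force's actual method.

    # Illustrative sketch only: a naive skin-tone heuristic of the kind that
    # confuses desert sand with skin. The police classifier is unpublished;
    # the thresholds and sample colors here are hypothetical.

    def looks_like_skin(r, g, b):
        """Classic rule-of-thumb RGB skin test (thresholds after Peer et al.)."""
        return (r > 95 and g > 40 and b > 20 and
                max(r, g, b) - min(r, g, b) > 15 and
                abs(r - g) > 15 and r > g and r > b)

    def skin_like_fraction(pixels):
        """Fraction of (r, g, b) pixels the naive test labels as skin."""
        return sum(looks_like_skin(*p) for p in pixels) / len(pixels)

    # Hypothetical sample colors: typical human skin tones vs. desert sand tones.
    skin_tones = [(224, 172, 105), (198, 134, 66), (141, 85, 36)]
    sand_tones = [(237, 201, 175), (225, 169, 95), (194, 150, 90)]

    print(skin_like_fraction(skin_tones))  # 1.0 -- real skin passes
    print(skin_like_fraction(sand_tones))  # 1.0 -- sand passes the same test

Both lists pass the test at 100%, which is exactly the failure Stokes describes: on color alone, a dune and a body look the same, so a detector also needs texture, shape, or learned features to tell them apart.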

While a seemingly benign mistake, the over-reliance on AI underscores concerns about an inevitable Orwellian state fueled by absolute, black-and-white, “Judge Dredd”-style enforcement.

Comment

Comment by DTOM on December 21, 2017 at 8:35am

The UK police have no interest in going after the real pedophiles and deviants in power, i.e. the establishment, just as they have no interest in dealing with real crime.

Much easier to go after the proles for revenue, with endless new laws designed to suppress free speech and what freedoms remain, and to cash in on victimless crimes.
