Police have launched AI image-recognition tools to spot illegal pornography, but the algorithm mistakes desert photography for porn.
London police are using image-recognition software, already capable of identifying drugs and guns, to flag adult content so officers don't have to search through gigabytes of porn themselves.
“For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color,” said lead forensics expert Mark Stokes.
For now, it cannot reliably distinguish a desert landscape from a naked body; the flesh-like color and curved contours of sand are the likely culprits.
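The failure mode described above, flagging images by the presence of skin-colored pixels, can be illustrated with a toy color-threshold rule. This is a hypothetical sketch of that general approach, not the police software's actual method; the function name and thresholds are illustrative (loosely based on common published RGB skin-detection heuristics).

```python
# Toy skin-tone heuristic: a pixel is "skin" if it is reddish, moderately
# bright, and has R > G > B. Real systems are far more sophisticated, but
# any color-based rule faces the same ambiguity shown below.

def looks_like_skin(r, g, b):
    """Return True if an (R, G, B) pixel falls in a naive 'skin' range."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15  # not grayscale
            and abs(r - g) > 15                    # distinctly reddish
            and r > g and r > b)

print(looks_like_skin(224, 172, 105))  # a common skin tone -> True
print(looks_like_skin(237, 201, 175))  # sandy desert beige -> also True
print(looks_like_skin(135, 206, 235))  # sky blue -> False
```

Because desert sand sits squarely inside the same beige-to-tan color range as human skin, a detector leaning on color alone will score a dune-filled screensaver much like a body, which is exactly the confusion the forensics team reports.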
While a seemingly benign mistake, the over-reliance on AI underscores concerns about an inevitable Orwellian state fueled by absolute black-and-white, “Judge Dredd”-style enforcement.
The UK police have no interest in going after the real pedophiles and deviants in power - i.e. the establishment - just as they have no interest in dealing with real crime.
It's much easier to go after the proles for revenue, with endless new laws designed to suppress free speech, erode what freedoms remain, and cash in on victimless crimes.