Flickr’s Image Recognition Tool Is Making Some Embarrassing Errors

Well, this is awkward. Flickr’s seemingly impressive image recognition system is making some embarrassing slips when identifying black people and concentration camps, according to the Guardian.

The newspaper explains that the new algorithm, which is designed to tag and then filter images by content, is “misfiring frequently.” It describes how a portrait of a black man named William got auto-tagged as “blackandwhite” and “monochrome” along with “animal” and “ape.” (Incidentally, a picture of a white woman also got the same treatment.) Elsewhere, pictures of the Dachau concentration camp were tagged with “jungle gym” and “sport,” while one of Auschwitz was also tagged as depicting “sport.”

But it’s worth pointing out that such mistakes are natural, and even useful. Flickr uses a machine-learning approach to identify images, comparing new pictures to ones it has seen in the past to try to work out what they show. As a result, it won’t always get them right; the important step is for users to delete inappropriate tags so the system can learn from its mistakes. In other words, it learns faster by making a few errors and having them corrected.
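
To make that correct-and-retrain loop concrete, here’s a minimal sketch in Python using a nearest-neighbour classifier from scikit-learn as a stand-in. Flickr hasn’t published its actual model, so the features, labels, and classifier choice below are all hypothetical; the point is only the cycle of mistag, user correction, retrain.

```python
# Toy illustration of the feedback loop described above.
# All data and the model choice are hypothetical stand-ins.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Pretend feature vectors extracted from images (a real system would use
# features from a neural network, not two handmade numbers).
features = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
labels = ["ape", "ape", "portrait", "portrait"]

model = KNeighborsClassifier(n_neighbors=1)
model.fit(features, labels)

# A new photo lands nearest the wrong cluster and gets mistagged.
new_photo = np.array([[0.7, 0.3]])
print(model.predict(new_photo))  # -> ['ape']  (the embarrassing slip)

# The user deletes the bad tag and supplies the right one; the corrected
# example joins the training set and the model is refit.
features = np.vstack([features, new_photo])
labels.append("portrait")
model.fit(features, labels)

print(model.predict(new_photo))  # -> ['portrait']
```

Each correction adds a labelled example the system didn’t have before, which is why a few public slips, caught and fixed by users, can leave the tagger better off than it started.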

So, yes, the slips that Flickr’s algorithm has made are embarrassing. But they’re also going to make it less awkward in the future. [Guardian]

