Well, this is awkward. Flickr's seemingly impressive image-recognition system is making some embarrassing slips when identifying black people and concentration camps, according to the Guardian.
The newspaper explains that the new algorithm, which is designed to tag and then filter images by content, is "misfiring frequently." It describes how a portrait of a black man named William was auto-tagged "blackandwhite" and "monochrome" along with "animal" and "ape." (Incidentally, a picture of a white woman received the same treatment.) Elsewhere, pictures of the Dachau concentration camp were tagged "jungle gym" and "sport," while one of Auschwitz was also tagged as depicting "sport."

But it's worth pointing out that such mistakes are natural, and even useful. Flickr uses a machine-learning approach to identify images, comparing new pictures to ones it has seen in the past to work out what they show. As a result, it won't always get them right, so an important step is for users to delete inappropriate tags so that the system can learn from its mistakes. In other words, it can improve faster by making a few mistakes and then having them corrected.
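The correction loop described above can be sketched in a few lines. This is a toy illustration only: the class and method names below are invented for the example and bear no relation to Flickr's actual system, which uses far more sophisticated machine-learning models. The idea is simply that a user's deletion of a tag becomes a stored negative example, so the same bad suggestion isn't repeated.

```python
# Toy sketch of a tag-correction feedback loop (hypothetical names,
# not Flickr's real API): suggestions are filtered against tags that
# users have previously deleted for a given image.

class ToyAutoTagger:
    def __init__(self):
        # tag -> set of image ids for which users rejected that tag
        self.rejected = {}

    def suggest_tags(self, image_id, candidate_tags):
        """Return candidate tags, skipping ones a user rejected for this image."""
        return [t for t in candidate_tags
                if image_id not in self.rejected.get(t, set())]

    def record_deletion(self, image_id, tag):
        """A user removed a tag: store it as a negative example."""
        self.rejected.setdefault(tag, set()).add(image_id)


tagger = ToyAutoTagger()
before = tagger.suggest_tags("img1", ["portrait", "ape"])   # bad tag slips through
tagger.record_deletion("img1", "ape")                       # user corrects it
after = tagger.suggest_tags("img1", ["portrait", "ape"])    # correction sticks
```

A real system would generalize from such corrections across images rather than memorizing them per image, but the principle, that user feedback supplies the training signal, is the same.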
So, yes, the slips that Flickr's algorithm has made are embarrassing. But they're also going to make it less awkward in the future. [Guardian]