Yelp's algorithms are being criticised after search results implied that Korean and Chinese restaurants were serving diners dog and cat meat. In cities across the US, Yelp suggested Korean and Chinese restaurants when "dog meat" or "cat meat" was entered into the search bar, alluding to a longstanding stereotype about East Asian restaurants.
Most people are aware that algorithms control what you see on Facebook or Google, but automated decision-making is increasingly being used to determine real-life outcomes as well, influencing everything from how fire departments prevent fires to how police departments prevent crime. Given how much these (often secretive) systems have come to dominate our lives, it's time we got specific about how algorithms can hurt people. A new report seeks to do just that.
An alliance of more than 50 civil liberties groups and more than 50 individual AI experts sent dual letters to the US Department of Homeland Security (DHS) today, calling for the end of a plan to screen immigrants with predictive "extreme vetting" software. In a separate petition also launched today, several groups specifically urged IBM not to help build the extreme vetting tool. This winter, representatives of IBM, Booz Allen Hamilton, LexisNexis and other companies attended an information session with DHS officials interested in their capacity for predictive software, The Intercept reports.
When Apple debuted its new facial recognition unlock system, Face ID, in September, the company faced questions about how it would sidestep the security and bias problems that have undermined similar facial recognition systems in the past. Senator Al Franken was one of the many people curious about exactly how Apple was going to ensure Face ID's success, and today, Apple responded to a series of questions sent by Franken's office the day after the system was announced.