The translation service in China's biggest messaging app, WeChat, is being retooled after offering a racist slur as a translation for the phrase "black foreigner."
Ann James, a black theatre director based in Shanghai, messaged her colleagues in English on Wednesday to say she was running late. When a coworker replied in Chinese, WeChat translated their message into English as "The nigger is late." As Sixth Tone explains, "hei laowai," the term the coworker actually used, is a neutral phrase meaning "black foreigner." But until the issue was raised following James' rude awakening, WeChat sometimes translated it as the n-word.
WeChat sent Sixth Tone the following apology, but gave no further explanation: "We're very sorry for the inappropriate translation. After receiving users' feedback, we immediately fixed the problem." The platform boasts a staggering 700 million users worldwide and, in China, is used for everything from booking plane tickets to paying utility bills to office communications.
WeChat confirmed that its software uses neural machine translation: AI trained on vast quantities of text to learn new vocabulary and, crucially, to discern the specific contexts in which those words are used. That second part may be what triggered the slur. From Sixth Tone:
A local English-language media outlet, That's Shanghai, reported the story and found that the translator gave neutral translations in some instances but used the slur when the phrase in question included a negative term, such as "late" or "lazy." Sixth Tone's own testing on Wednesday evening found similar results.
Recognising patterns is the core of language AI: neural language models pick up on associations between related words, then spit them back out. In 2016, for example, researchers used algorithms trained on Google News copy to uncover the associations buried in that text. The algorithm determined that "Emily" is to "Ebony" as "pancakes" are to "fried chicken." In another case, it found that "man" is to "woman" as "doctor" is to "nurse."
This is essentially how you "teach" AI to be racist. AI literalizes negative connotations. If the phrases "black foreigner" or "black person" are used as slurs in conjunction with words like "lazy" or "slow" in the source text, the AI picks up on those patterns and makes them explicit. All the AI does is repeat the associations buried in its source.
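The analogy mechanism described above can be sketched in a few lines. The vectors, dimension labels, and `analogy` helper below are all invented for illustration; real systems learn representations with hundreds of dimensions from billions of words, but the arithmetic is the same: associations that co-occur in the training text end up encoded as directions in vector space.

```python
# Toy illustration of word-vector analogies. These 3-dimensional vectors
# are hand-made for demonstration, not learned from any corpus.
TOY_VECTORS = {
    # invented axes: [gender, medical, seniority]
    "man":      [ 1.0, 0.0, 0.0],
    "woman":    [-1.0, 0.0, 0.0],
    "doctor":   [ 0.9, 1.0, 0.8],
    "nurse":    [-0.9, 1.0, 0.3],
    "engineer": [ 0.8, 0.0, 0.7],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector arithmetic: b - a + c."""
    va, vb, vc = (TOY_VECTORS[w] for w in (a, b, c))
    target = [y - x + z for x, y, z in zip(va, vb, vc)]
    # Return the nearest word that isn't one of the inputs.
    candidates = (w for w in TOY_VECTORS if w not in (a, b, c))
    return max(candidates, key=lambda w: cosine(TOY_VECTORS[w], target))

print(analogy("man", "woman", "doctor"))  # prints "nurse"
```

Because the hand-built "nurse" vector sits where "doctor" does on every axis except the gender one, the arithmetic lands on it; a learned model reproduces whatever correlations, benign or bigoted, its training text contained.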
Interestingly, the derogatory translations seem to have come from an unspecified service, while the inoffensive translations were explicitly performed by Microsoft Translator. That's Shanghai couldn't replicate the offensive translation in either Bing Translator or Microsoft's Neural Machine Translation system: "hei laowai," even when coupled with words like "lazy" or "rude," still produced "black foreigner." One could infer either that Microsoft had already scrubbed the slur from its platform or that the association was never made in the first place.
Neural language processing was invented so AI could speak and think more like humans. Sadly, it's learning the worst of what we have to offer.