Are we training our translation machines to be racist?

October 18, 2017

One could be forgiven for assuming that machines using artificial intelligence would be neutral in their approach, free from prejudice and seeing all humans as essentially the same. However, it seems that machine learning is vulnerable to picking up racist and sexist attitudes, as Chinese messaging app WeChat has just demonstrated.

Racial slur

WeChat has hit the headlines for its controversial translation of ‘hei laowai.’ This is a neutral phrase in China with a literal translation of ‘black foreigner.’ However, WeChat’s neural network-based translation function has been found in some instances to translate the phrase to something far more offensive: the n-word. 

The circumstances behind the translation are interesting. Hei laowai was not rendered as the racial slur in every instance. When used in a positive context, the translation was literal, as in the example “Black foreigners is cool” (the poor grammar is, of course, another reason why machine translation is not to be trusted, but that’s beside today’s main point!). However, when combined with negative words such as ‘late,’ ‘lazy’ or ‘thief,’ hei laowai was translated very differently. Shanghai-based black American Ann James first reported the error to WeChat after the system gave her a translation reading: “The nigger’s still late.”

Learning from humans

The problem seems to be that machine learning of this kind is trained on a vast array of texts written by humans. Those texts contain countless instances of racism, sexism, ageism and pretty much every other -ism imaginable, both in the attitudes they express and in the language they use. As such, we are training machine learning systems to absorb the very same prejudices and biases that humans produce. At a basic level, machines learn from the data we feed them; if that data is flawed, the resulting program will reflect those flaws.
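To see how this plays out in practice, here is a minimal, hypothetical sketch in Python. It uses the third-party gensim library and publicly available GloVe word vectors; the model name and the analogy queries below are illustrative assumptions, not a reproduction of WeChat’s or Google’s systems. The idea is simply to probe the associations a model absorbs from ordinary human-written text.

```python
# Illustrative sketch only: word vectors learned from human-written text
# pick up human associations. Assumes the gensim library and the publicly
# available "glove-wiki-gigaword-100" vectors; exact outputs will vary.
import gensim.downloader as api

# Download/load pretrained GloVe vectors trained on a large human-written corpus.
vectors = api.load("glove-wiki-gigaword-100")

# Classic analogy probe: "man is to doctor as woman is to ...?"
# If the training text links professions to genders, the nearest
# neighbours returned here will mirror that association.
print(vectors.most_similar(positive=["doctor", "woman"], negative=["man"], topn=3))

# The reverse probe for "nurse" tends to point the other way, echoing the
# gendered assumptions seen in some machine translation output.
print(vectors.most_similar(positive=["nurse", "man"], negative=["woman"], topn=3))
```

Probes like this routinely surface gendered or racialised associations, and it is exactly this kind of learned bias that can leak into downstream tools such as translation engines.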

Machine-based sexism

In the case of the WeChat controversy, the messaging system’s algorithm has now been updated, removing the use of the n-word entirely. This is not the first time that machine learning has been shown to be flawed when it comes to providing translation services.

Google has come under scrutiny for the sexist bias of some of its machine-translated phrases. One issue was flagged up by Turkish and English speakers. Turkish pronouns are gender-neutral, yet when phrases such as “they are a doctor” and “they are a nurse” were translated into English, Google’s system assumed that the doctor was male and the nurse was female.

Human translation services

These examples highlight the ongoing need for human translation services, despite technological advances. In WeChat’s own words,

“As with other machine translation engines, which require time to learn and optimise their vocabulary banks in order to attain accurate translation, our automated translation engine is still undergoing the learning and optimisation process.”

Essentially, machine translation technology isn’t yet good enough to rival human translation. It may be serviceable in terms of making oneself understood, but grammatical errors and linguistic flaws mean that it is a long way from threatening the livelihoods of those who provide professional translation services for a living.

Add in the potential for machine-learned prejudices and it’s likely to be a long time before a truly convincing and trustworthy machine translation service emerges!

Final thoughts

Do you think it will ever be possible to train a machine to translate with the finesse of a human? Will attempts to do so always produce flawed results, based on our own flaws? Share your thoughts via the comments. 
