Google Translate is sexist and it needs a little gender-sensitivity training

Online translation tools have helped us learn new languages, communicate across linguistic borders, and view foreign websites in our native tongue. But the artificial intelligence behind them is far from perfect, often replicating rather than rejecting the biases that exist within a language or a society.

Such tools are especially vulnerable to gender stereotyping because some languages, such as English, largely do not assign grammatical gender to nouns, while others, such as German, do.

When translating from English to German, translation tools have to decide which gender to assign to English words like “cleaner”. Overwhelmingly, the tools conform to the stereotype, opting for the feminine form in German.
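
As a rough illustration (not taken from the research itself), the short Python sketch below probes an openly available English-to-German model, Helsinki-NLP/opus-mt-en-de from the Hugging Face hub, standing in here for Google Translate. The occupation list and sentence template are our own choices for demonstration; because German marks gender on the article and the noun ending (“der Arzt” versus “die Ärztin”), the output makes the model’s default choice visible at a glance.

```python
# Minimal sketch: probe an open English->German model for gendered defaults.
# The model is a public stand-in for Google Translate; the occupations and
# sentence template are illustrative choices, not the study's test set.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")

occupations = ["cleaner", "doctor", "nurse", "engineer"]
for job in occupations:
    sentence = f"The {job} finished the work."
    translation = translator(sentence)[0]["translation_text"]
    # The German article and noun ending reveal which gender was chosen.
    print(f"{sentence} -> {translation}")
```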

Biases are human: they are part of who we are. But when left unchallenged, biases can emerge in the form of concrete negative attitudes towards others. Now, our team has found a way to retrain the AI behind translation tools, using targeted training to help it to avoid gender stereotyping….
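
To give a flavour of what “targeted training” can look like in practice, here is a hypothetical sketch, not the team’s actual method or data: the same open model is fine-tuned on a tiny, hand-made set of gender-balanced sentence pairs, so that each occupation is seen with both a masculine and a feminine German translation rather than only the stereotyped one.

```python
# Hypothetical sketch of "targeted training": fine-tune an open English->German
# model on a small gender-balanced set of sentence pairs. The pairs below are
# invented for illustration; they are not the researchers' training data.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Each occupation appears with both a masculine and a feminine translation.
pairs = [
    ("The cleaner is here.", "Der Reiniger ist hier."),
    ("The cleaner is here.", "Die Reinigerin ist hier."),
    ("The doctor is here.", "Der Arzt ist hier."),
    ("The doctor is here.", "Die Ärztin ist hier."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
for epoch in range(3):
    for source, target in pairs:
        batch = tokenizer(source, return_tensors="pt")
        labels = tokenizer(text_target=target, return_tensors="pt").input_ids
        loss = model(**batch, labels=labels).loss  # standard seq2seq loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```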
