Language defines the world, and machines that translate it inevitably contribute to this definition. Although AI-based translation tools keep producing better results, they still exhibit many shortcomings and distortions with regard to "gender" and "race". In a project entitled Artificially Correct, the Goethe-Institut – which promotes inclusive language – is working with experts to develop new tools that minimise bias in translations. The aim is to strengthen the position of translators and establish a conscious approach to translation machines, whilst ensuring that the realities of as many people as possible are included in the translation process. In this panel, participants and jury members of the Artificially Correct Hackathon demonstrate ways of approaching this bias problem.
Speakers: Janiça Hackenbuchner, Master's student, Cologne; Marvin Mouroum, computer vision engineer, Berlin; Danielle Saunders, computer scientist, RWS Group
Moderation: Simon Caton, Professor of Computer Science, University College Dublin
The AI Festival WHEN MACHINES DREAM THE FUTURES is a project organized by the Goethe-Institut and the Deutsches Hygiene-Museum, Dresden.
Goethe-Institut, Generation A=Algorithmus, 2021
www.goethe.de/ai-festival
www.goethe.de/generationa