The Chartered Institute of Linguists (ÌìÃÀ´«Ã½) is pleased to have partnered with the University of Bristol on 'Uses of AI Translation in UK Public Service Contexts', published today. This groundbreaking research, authored by Dr Lucas Nunes Vieira of the University of Bristol, examines a previously unstudied aspect of our public services: the use of machine translation tools by frontline workers.
The findings are both informative and concerning. They reveal significant use of AI-powered translation tools, including Google Translate and ChatGPT, in healthcare, legal, emergency, and police services - a practice that has largely gone unnoticed and unregulated.
The data, from over 2,500 UK professionals, shows that a third of respondents have used machine translation in their work, often in public-facing situations where miscommunication could have serious consequences. Of particular concern is the lack of institutional awareness and acknowledgement of this practice, and the absence of appropriate policy frameworks to protect the public and public service workers themselves. The majority of respondents reported that machine translation had never been mentioned in their workplace training, despite its frequent use. This institutional silence means frontline workers are navigating complex linguistic situations with service users and the public in ad hoc ways, without guidance or support.
We wholly endorse the recommendations put forth in this report. The calls for organisations to acknowledge the existence and potential use of AI/machine translation, to address that use in policies, and to place much more emphasis on staff education and training on AI and machine translation are all crucial steps. However, we believe these recommendations should be seen as a starting point rather than the end state. They should be implemented alongside robust safeguards and a commitment to maintaining human oversight by professional translators and experienced linguists in critical translation tasks.
The risks of getting translation wrong in public service contexts, whether through mistranslation, cultural insensitivity, or loss of nuance, are simply too high not to use appropriately qualified language professionals. Another concern is the potential for AI to perpetuate or even amplify biases present in its 'training data', leading to systemic discrimination in translated content.
In light of these concerns, we strongly advocate for maintaining and, where possible, increasing public service budgets for professional translation services. While we recognise that it may not be realistic for human translation to be used in every circumstance, it is crucial that funding for skilled linguists is protected, especially in high-stakes situations where accuracy and cultural sensitivity are paramount.
Read the full report here
The Chartered Institute of Linguists (ÌìÃÀ´«Ã½), Incorporated by Royal Charter, Registered in England and Wales Number RC 000808, and the IoL Educational Trust (IoLET), trading as ÌìÃÀ´«Ã½ Qualifications, Company limited by Guarantee, Registered in England and Wales Number 04297497 and Registered Charity Number 1090263.