AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles
Created: 2026-03-12
Wikipedia editors have implemented new policies and restricted a number of contributors who were paid to use AI to translate existing Wikipedia articles into other languages, after discovering that these AI translations introduced "hallucinations," or errors, into the resulting articles.
Wikipedia editors investigated how OKA was operating and found that it was mostly relying on cheap labor from contractors in the Global South, and that these contractors were instructed to copy and paste articles into popular LLMs to produce translations.
Grok, which also produces an entirely automated alternative to Wikipedia called Grokipedia, is prone to errors precisely because it does not use humans to vet its output.
Using AI to check AI output for errors is a method that has itself proven historically error-prone. For example, we recently reported on an AI-powered private school that used AI to check AI-generated questions for students; internal testing found the approach had a failure rate of at least 10 percent.