An AI is intended to provide Wikipedia with better sources

Although AI chatbots like ChatGPT often sound credible, they are not necessarily so. They sometimes fabricate information, inventing names, web addresses, dates, medical explanations, and entire plot lines of books, "even historical events that never happened," writes The New York Times. Many researchers also report that AI-based programs randomly invent scientific sources. It may therefore come as a surprise that such a program should be able to clean up the sometimes poorly maintained source information in Wikipedia.

The English-language part of the online encyclopedia alone contains more than six million articles. Wikipedia is powered by the knowledge and collaboration of hundreds of thousands of users. To keep out nonsense, facts are supposed to be supported by sources that appear as footnotes with web links or literature references under each article. Users usually uncover incorrect sources quickly, a spokeswoman for Wikimedia, the support association of the German Wikipedia, said when asked. "This many-eyes principle creates a certain level of security." But the websites the footnotes point to do not always work. Sometimes they do not back up what is written. Or the authors of the texts have cited dubious sources.

The solution could now come from an AI that checks sources and, where references are faulty, searches the Internet for more suitable suggestions to replace the original source. According to a study in the specialist journal Nature Machine Intelligence, a program called SIDE could make Wikipedia more reliable.

Researcher: ChatGPT botches and hallucinates citations

"It may seem ironic to use AI to assist with citations, given that ChatGPT notoriously botches and hallucinates citations," AI scientist Noah Giansiracusa of Bentley University in Waltham, US, told the science magazine Nature. He was not involved in the study in the sister journal. "But it's important to remember that AI language models are about much more than chatbots."

In fact, a test with Wikipedia users showed that in almost two thirds of cases they preferred the source suggested by the AI to the one cited in Wikipedia. Beforehand, Fabio Petroni's team from the London company Samaya AI, which is behind SIDE, had fed the AI with information about what "good sources" should look like. As models, they used Wikipedia entries that are highlighted on the site as "Featured Articles" and are usually maintained very carefully by editors and moderators.

Petroni's group writes that the system cannot yet improve the online encyclopedia on its own. The team wants SIDE to be seen as a tool, a kind of "citation assistant". According to Wikimedia, AI tools could certainly make encyclopedic work easier. "The community is already discussing how exactly, and where the boundaries are." On the question of how reliable Wikipedia is, the encyclopedia has its own entry, in which the reference work does well; its legal notice warns, however, that Wikipedia does not guarantee the truthfulness of its entries.
