Bots On Wikipedia Wage Edit Wars Between Themselves That Last For Years

Revision wars amongst human editors are an all-too-common occurrence on Wikipedia, but new research from the UK shows that similar online battles are being waged between the site’s software robots.

As a new study published in PLOS ONE reveals, Wikipedia’s bots don’t always get along, frequently undoing each other’s edits. These online algorithms, each equipped with their own instructions and goals, engage in sterile “fights” over content that can persist for years. The new research shows how relatively “dumb” bots can produce complex interactions and behaviours, and how developers need to stay on top of their digital creations. This has implications not just for the quality of Wikipedia pages, but for the development of AI in general — particularly any autonomous agents set loose on the web.  

There are currently 41,517,866 pages in the English version of Wikipedia. That’s a ton of content — far more than the site’s human editors are able to handle. To help maintain this gargantuan open-source encyclopedia, thousands of software bots sift through the site, performing such menial and repetitive tasks as deleting vandalism, enforcing bans, correcting bad spelling, creating links, and automatically importing content.

Overall, bots represent just 0.1 per cent of Wikipedia editors, but they stand behind a significant proportion of the site’s edits. Unfortunately, the software developers who create the bots don’t really understand or account for how bots interact with each other. Like the nature of Wikipedia itself, the creation of bots is a decentralized process, with individual contributors developing their own scripts. An approvals group exists, but its members go strictly by Wikipedia’s bot policy, which doesn’t take bot-on-bot interactions into consideration.

Indeed, every once in a while, a bot makes a change to a page that another bot then tries to undo. Each bot is designed and dispatched to perform a specific task, but sometimes those tasks run into conflict with another bot’s. Unlike human editors, the bots can’t negotiate with each other, and like the good automatons that they are, they simply do as they’re programmed. Once these bots have been unleashed into the abyss that is Wikipedia, their human developers are largely oblivious to the ensuing bot interactions.
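To see the dynamic concretely, here is a toy sketch (not real Wikipedia bot code; the rules and page text are invented for illustration) of two rule-following bots whose instructions contradict each other:

```python
# Toy illustration (not real Wikipedia bot code): two rule-following bots
# whose instructions contradict each other. Neither can negotiate, so each
# "fix" triggers the other's, and the page oscillates indefinitely.

def bot_a(text: str) -> str:
    # Bot A enforces American spelling.
    return text.replace("colour", "color")

def bot_b(text: str) -> str:
    # Bot B enforces British spelling.
    return text.replace("color", "colour")

page = "The colour infobox needs cleanup."
for _ in range(3):  # on the real site, this loop can run for years
    page = bot_a(page)
    page = bot_b(page)
    print(page)  # the same tug-of-war, pass after pass
```

Each pass through the page undoes the other bot’s work, and because neither bot ever changes its rule, the tug-of-war only ends when a human intervenes.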

To understand the degree to which bot fights disrupt Wikipedia, computer scientists from the Oxford Internet Institute and the Alan Turing Institute studied how these algorithms interacted across 13 different language editions of the website over a ten-year period (2001 to 2010). By tracking the edits made to each page, and ensuring that no human editors were involved, the researchers were able to observe how the bots interacted with each other, and how their encounters often led to unpredictable consequences.
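The study itself doesn’t prescribe an implementation, but a standard way to spot reverts in an edit history is to hash each revision’s full text: a revert restores the page to a state seen before, so its hash matches an earlier revision’s. Here is a minimal sketch along those lines (the field names are illustrative assumptions, not the researchers’ actual data schema):

```python
# Minimal sketch of revert detection via content hashing. A revision whose
# full text matches an earlier revision's text has restored the page to a
# prior state -- i.e. it reverted the edits in between. The field names
# ('author', 'text', 'is_bot') are illustrative assumptions.
import hashlib

def bot_on_bot_reverts(revisions):
    """revisions: chronologically ordered list of dicts with keys
    'author', 'text', 'is_bot'. Returns (reverter, reverted) pairs."""
    first_seen = {}  # content hash -> index of earliest revision with that text
    pairs = []
    for i, rev in enumerate(revisions):
        digest = hashlib.sha1(rev["text"].encode("utf-8")).hexdigest()
        if digest in first_seen and i - first_seen[digest] > 1:
            undone = revisions[i - 1]  # the most recent edit being undone
            if rev["is_bot"] and undone["is_bot"]:
                pairs.append((rev["author"], undone["author"]))
        first_seen.setdefault(digest, i)
    return pairs

history = [
    {"author": "BotA", "text": "color", "is_bot": True},
    {"author": "BotB", "text": "colour", "is_bot": True},
    {"author": "BotA", "text": "color", "is_bot": True},  # reverts BotB
]
print(bot_on_bot_reverts(history))  # [('BotA', 'BotB')]
```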

Interestingly, the actions of the site’s bots varied according to their distinct cultural environments.

“This has implications not only for how we design artificial agents but also for how we study them,” said the study’s lead author Milena Tsvetkova in a statement. “We need more research into the sociology of bots.”

Overall, bots undid each other’s work a lot. The bots on the Portuguese version of Wikipedia were the most antagonistic, reverting the work of other bots 185 times over the ten-year period, on average. At the English site, the researchers recorded an average of 105 revisions made by a bot to another bot’s work over the same period (that’s about three times the rate of human edits). The German bots were the most civil, making an average of just 24 reversion edits over a decade. These disparities in editing coordination may be due to different language editions having slightly different naming rules and conventions.

The bots also behaved differently from human editors, triggering edits much later and engaging in far more protracted conflicts. Humans, who are prompted about changes to a page by auto-alerts, tend to make any fixes within minutes and then move on to the next thing. But the Wikipedia bots typically made their first revision about a month after the initial edit, then persisted in a back-and-forth for years at a time. These edit wars aren’t catastrophic, but given the constant stream of changes, they could confuse people who read the site.

Bots are slower than humans (and clearly more persistent!) with respect to revisions, because they “crawl” over web articles in search of edits (rather than receiving alerts), and they’re often restricted in terms of the number of edits allowed over an allotted period of time. But the fact that bots are able to continue these battles for so long is a strong indication that human programmers are failing to catch potential editing problems early enough.
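As a hedged illustration of why the lag and the persistence go together, consider the shape of a typical maintenance bot’s main loop (the intervals and throttle below are invented for illustration, not Wikipedia’s actual limits):

```python
# Illustrative sketch (intervals and limits invented): a maintenance bot
# crawls its pages on a schedule and throttles its own edits, so its
# response to a change may come weeks later -- and it will keep re-applying
# its rule on every pass, even if another bot keeps undoing it.
import time

EDITS_PER_MINUTE = 6              # self-imposed throttle
CRAWL_INTERVAL = 30 * 24 * 3600   # revisit pages roughly once a month

def run_bot(pages, violates_rule, apply_fix):
    while True:                             # the bot never lets it go
        for page in pages:
            if violates_rule(page):         # rule broken again? fix it again,
                apply_fix(page)             # even if another bot undid it
                time.sleep(60 / EDITS_PER_MINUTE)
        time.sleep(CRAWL_INTERVAL)          # next pass may be a month away
```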

Importantly, many of these bot-on-bot conflicts stopped at the beginning of 2013, when Wikipedia made some changes to the way that inter-language links work on the site. That said, the researchers say this episode in Wikipedia’s history shows that a system of simple bots can produce complex dynamics and unintended consequences. Looking further ahead, it’s a potential portent of things to come as new and more complex “botospheres” emerge around the web. It’s a worrying sign that conflict can emerge so easily and quickly within digital ecosystems.

In particular, the observation that a single piece of technology can yield different outcomes depending on the cultural environment has implications for artificial intelligence research. Understanding what affects bot-on-bot interactions, the researchers say, will be crucial for any autonomous process, from managing social media to tracking cyber-security to developing self-driving vehicles.

“An automated vehicle will drive differently on a German autobahn to how it will through the Tuscan hills of Italy,” noted study co-author Taha Yasseri. “Similarly, the local online infrastructure that bots inhabit will have some bearing on how they behave and their performance.”

Yasseri says that bots on Wikipedia are designed by humans from different countries, which can lead to online clashes. “We see differences in the technology used in the different Wikipedia language editions and the different cultures of the communities of Wikipedia editors involved create complicated interactions,” he says. “This complexity is a fundamental feature that needs to be considered in any conversation related to automation and artificial intelligence.”

As already mentioned, Wikipedia does enforce a bot policy. Bots, Wikipedia says:

- are potentially capable of editing far faster than humans can; and
- have a lower level of scrutiny on each edit than a human editor; and
- may cause severe disruption if they malfunction or are misused.

To prevent potential problems, developers must make sure that Wikipedia’s bots only perform tasks for which there is consensus, and that they adhere to the site’s policies and guidelines, among other restrictions. But as this new study shows, bots also need to be programmed to work well with one another.

[PLOS ONE]

