Bot wars

In particular, Yasseri and co focus on whether bots disagree with one another. One way to measure this on Wikipedia is by reverts—edits that change an article back to the way it was before a previous change.
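For the curious, here is a minimal sketch of how a revert can be detected in a page's revision history: a revision counts as a revert if its content hash matches that of some earlier revision, i.e. it restores the article to a previous state. The field names ("user", "sha1") are illustrative placeholders, not taken from the paper.

```python
# Minimal sketch of identical-revert detection on a page's revision history.
# Assumes each revision is a dict with "user" and "sha1" (content hash) keys;
# the field names are illustrative, not tied to any particular client library.

def find_reverts(revisions):
    """Return (reverting_user, reverted_user) pairs.

    A revision counts as a revert if its content hash matches an earlier
    revision's hash, i.e. it restores the article to a previous state. Every
    editor whose intervening revisions were wiped out counts as reverted.
    """
    seen = {}          # sha1 -> index of earliest revision with that content
    reverts = []
    for i, rev in enumerate(revisions):
        h = rev["sha1"]
        if h in seen:
            # Everyone who edited between the restored state and this revert
            # had their change undone.
            for undone in revisions[seen[h] + 1 : i]:
                reverts.append((rev["user"], undone["user"]))
        else:
            seen[h] = i
    return reverts


if __name__ == "__main__":
    history = [
        {"user": "HumanA", "sha1": "aaa"},
        {"user": "BotX",   "sha1": "bbb"},
        {"user": "BotY",   "sha1": "aaa"},  # restores HumanA's version, reverting BotX
    ]
    print(find_reverts(history))  # [('BotY', 'BotX')]
```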

Over a 10-year period, humans reverted each other about three times on average. But bots were much more active. “Over the 10-year period, bots on English Wikipedia reverted another bot on average 105 times,” say Yasseri and co.

And this:

Bots and humans differ significantly in their revert habits. The most likely time for a human to make a revert is either within two minutes after a change has been made, after 24 hours, or after a year. That’s clearly related to the rhythms of human lifestyles.

Robots, of course, do not follow these rhythms: rather, they have a characteristic average response time of one month.  “This difference is likely because, first, bots systematically crawl articles and, second, bots are restricted as to how often they can make edits,” say Yasseri and co.

Nevertheless, bots can end up in significant disputes with each other, and behave just as unpredictably and inefficiently as humans.

Many of the bots seem to be designed to make the different language versions of the same Wikipedia pages consistent with each other, yet the bots do not always agree. Solve for the equilibrium, as they say…

Here is the article, via Michelle Dawson.

Comments

“Over the 10-year period, bots on English Wikipedia reverted another bot on average 105 times”

To think that a machine programmed to do the same action repeatedly does the same action repeatedly.

Irony recognition fail.

Well, at least the prior is consistent. /ducks

Could this simply be the result of bots, on average, making more edits than humans?

On Twitter, about 25% of posts are made by bots… and the bots are funnier: the magic realism bot, kim kierkegaardashian, wisdom of chopra.

As cars become driverless, machine driving algorithms will probably also disagree with each other quite often, but likely in ways that are much safer than the ways humans disagree.

Ideally, the large majority of Wikipedia should probably be written by algorithm within 50 years. I'll run it by Jimmy.

I'm not familiar with the bot edits, but it sounds more like an issue of replication where there is no authoritative change. Most replication software that manages datasets residing and updated in multiple locations typically has something in place to address such conflicts. I suspect the bots lack the AI elements to know better.
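For readers wondering what "something in place to address such conflicts" might look like, here is a minimal sketch of one common scheme, last-writer-wins: when two replicas hold different values and there is no authoritative change, keep the most recently written one. The names and values are purely illustrative.

```python
# Illustrative only: last-writer-wins conflict resolution, one common way
# replication systems reconcile divergent copies of the same record.

from dataclasses import dataclass


@dataclass
class ReplicaValue:
    value: str
    timestamp: float  # e.g., seconds since epoch of the last write


def resolve_last_writer_wins(a: ReplicaValue, b: ReplicaValue) -> ReplicaValue:
    """Pick whichever replica was written most recently."""
    return a if a.timestamp >= b.timestamp else b


if __name__ == "__main__":
    en = ReplicaValue("link points to Article_v1", timestamp=1_000.0)
    de = ReplicaValue("link points to Article_v2", timestamp=2_000.0)
    print(resolve_last_writer_wins(en, de).value)  # "link points to Article_v2"
```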
