Should Wikipedia have robot helpers?

Wikipedia is famously open for anyone to edit. In practice, whether your edits stick depends on other Wikipedia editors and the enforcement of the various norms and rules that have emerged.

The WSJ highlighted a new type of Wikipedia editor — essentially a robot.

Sverker Johansson could be the most prolific author you’ve never heard of.

Volunteering his time over the past seven years publishing to Wikipedia, the 53-year-old Swede can take credit for 2.7 million articles, or 8.5% of the entire collection, according to Wikimedia analytics, which measures the site’s traffic. His stats far outpace any other user, the group says.

He has been particularly prolific cataloging obscure animal species, including butterflies and beetles, and is proud of his work highlighting towns in the Philippines. About one-third of his entries are uploaded to the Swedish language version of Wikipedia, and the rest are composed in two versions of Filipino, one of which is his wife’s native tongue.

An administrator holding degrees in linguistics, civil engineering, economics and particle physics, he says he has long been interested in “the origin of things, oh, everything.”

It isn’t uncommon, however, for Wikipedia purists to complain about his method. That is because the bulk of his entries have been created by a computer software program—known as a bot. Critics say bots crowd out the creativity only humans can generate.

Mr. Johansson’s program scrubs databases and other digital sources for information, and then packages it into an article. On a good day, he says his “Lsjbot” creates up to 10,000 new entries.

It is actually not a simple task to build an algorithm that writes Wikipedia entries: you have to understand context and sort the data appropriately. Moreover, what Johansson apparently does is create "stubs" (entries that should be in Wikipedia but are not) and populate them with some basic data. To complete an entry, someone else will have to add to it. But there is a sense that having an entry available is better than not having one, and many human editors identify and create stubs as well.
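To make the idea concrete, here is a minimal sketch of how a bot might turn one row of a taxonomic database into stub wikitext. This is not Lsjbot's actual code; the field names, template, and example record are illustrative assumptions, and a real bot would also have to handle context, de-duplication, sourcing, and Wikipedia's bot policies.

```python
# Minimal sketch of a stub-generating bot (illustrative only, not Lsjbot).
# Assumes each database record is a dict with the fields used below.

STUB_TEMPLATE = (
    "'''{species}''' is a species of {group} in the family "
    "[[{family}]]. It was first described by {author} in {year}."
)


def make_stub(record: dict) -> str:
    """Turn one structured database record into stub wikitext."""
    return STUB_TEMPLATE.format(**record)


if __name__ == "__main__":
    # Example record: the Old World swallowtail butterfly.
    record = {
        "species": "Papilio machaon",
        "group": "butterfly",
        "family": "Papilionidae",
        "author": "Linnaeus",
        "year": 1758,
    }
    print(make_stub(record))
```

The hard part, of course, is not the template but everything around it: deciding which entries are genuinely missing, keeping the source data clean, and knowing when a record is too thin to publish.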

However, Johansson’s robot has created controversy.

Achim Raschka is one of the people who would like Mr. Johansson to change course. The 41-year-old German Wikipedia enthusiast can spend days writing an in-depth article about a single type of plant.

“I am against production of bot-generated stubs in general,” he said. He is particularly irked by Mr. Johansson’s Lsjbot, which prizes quantity over quality and is “not helping the readers and users of Wikipedia.”

Mr. Raschka says these items “only contain more or less correct taxonomic information, not what the animal looks like and other important things.”

This seems to me to be misplaced criticism. After all, Google organises the world’s information using algorithms, and people don’t ordinarily call for the use of humans instead. Moreover, Johansson is really just using a tool to augment the powers of Wikipedia editors and, along the way, not violating any of the “rules.” Indeed, this suggests that there is more opportunity for algorithm-supported editors to improve Wikipedia. After all, if robots can now write and deliver the news …
