The Wall Street Journal writes “For This Author, 10,000 Wikipedia Articles Is a Good Day’s Work”. The “author” is, as some of you might have guessed, a “bot”, i.e. a program that searches for information on the Internet and compiles it into a Wikipedia entry.

For some, the problem with such an approach to encyclopaedic knowledge is that the entries are usually very short and very similar in style and structure. I suppose we all understand that a program isn’t going to produce 10,000 literary works of art every day – at least not today ;-)
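To see why the output tends to be so uniform, imagine the simplest possible version of such a bot: it pulls a few structured facts from a data source and pours them into a fixed sentence template. The sketch below is purely illustrative (the records and the make_stub function are invented for this example); it is not the actual program behind the articles, just the general idea.

# Illustrative sketch of template-based article generation.
# The species records and the template are made up for this example;
# they are not taken from any real bot's code or data source.

species_records = [
    {"name": "Aptenodytes forsteri", "group": "bird", "family": "Spheniscidae",
     "described_by": "G. R. Gray", "year": 1844},
    {"name": "Panthera uncia", "group": "mammal", "family": "Felidae",
     "described_by": "Schreber", "year": 1775},
]

def make_stub(record):
    """Fill one fixed sentence template with structured facts."""
    return (
        f"{record['name']} is a species of {record['group']} in the family "
        f"{record['family']}. It was first described by {record['described_by']} "
        f"in {record['year']}."
    )

for rec in species_records:
    print(make_stub(rec))

Every entry produced this way has the same length and the same sentence structure; only the facts change. That is exactly the “very short and very similar” effect mentioned above, and it is why churning out 10,000 articles a day is easy for the machine while adding almost no stylistic variety.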

Once it was monkeys doing the typing, now it’s robots. Photograph: Getty Images. (Click the image to go to the site of The Guardian, for an article titled “Could robots be the journalists of the future?”)

On the other hand, if an encyclopaedia wants to be truly all-encompassing, then it needs more than just entries about popular themes. Let me quote the WSJ for an example:

On Swedish Wikipedia, for instance, he says, there are more than 150 articles on characters from “The Lord of the Rings,” and fewer than 10 about people from the Vietnam War. “I have nothing against Tolkien and I am also more familiar with the battle against Sauron than the Tet Offensive, but is this really a well-balanced encyclopedia?”

So after all is said and done (written) by a bot, it’s still up to humans to add more substance. But machines have already transformed our society in the past; who knows what smart software machines will bring?

