Friday, 26 March 2010


There are only a few buzzwords which irritate me more than this one. But what exactly is crowdsourcing? According to Wiktionary, it means “delegating a task to a large diffuse group, usually without substantial monetary compensation”. Meanwhile, Jeff Howe, who coined this horrid word back in 2006, gives us two definitions:
The White Paper Version: Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.

The Soundbyte Version: The application of Open Source principles to fields outside of software.
Thus, according to the White Paper Version, crowdsourcing is a kind of outsourcing. Which, like it or not, is a means to increase capitalist exploitation by paying less for the same job. I like the Soundbyte Version better, even though it is a bit too vague. But what does it have to do with crowdsourcing?

Folk music is not created by some amorphous Volk mass but by individual musicians, most of whom do have names. Similarly, open content is not created by a crowd. In the words of Dan Woods,
There is no crowd in crowdsourcing. There are only virtuosos, usually uniquely talented, highly trained people who have worked for decades in a field. Frequently, these innovators have been funded through failure after failure. From their fervent brains spring new ideas. The crowd has nothing to do with it. The crowd solves nothing, creates nothing.
Take this call to the community regarding the stereochemistry of digitonin (mentioned on my other blog). Where is the crowd? Everyone who responded is an expert. (Yes, that includes me, even if I say so myself.) And was the problem solved? No.

Wikipedia’s list of crowdsourcing projects includes Wikipedia itself, “despite objections by co-founder Jimmy Wales to the term” (and, I bet, to the annoyance of many authors of Wikipedia articles); InnoCentive, Goldcorp and other companies which give cash prizes to individual solvers responding to a challenge; a few nice examples of citizen science; and, wait, some bona fide crowdsourcing:
In January 2008, the State of Texas announced it would install 200 mobile cameras along the Texas-Mexico border, to enable anyone with an Internet connection to watch the border and report sightings of alleged illegal immigrants to border patrol agents.
Now you’d think that this latter initiative, being not as intellectually challenging as, say, virtual protein folding, might actually work. But no: all these millions of web hits have so far failed to translate “into much law enforcement work”.