
Is Wikipedia part of a new ‘global brain’?

Everyone from Time to TV networks is singing the praises of user-generated ‘people’s content’ on the world wide web. But is it reliable?

Theresa Clifford


The internet is often celebrated for giving a voice to anybody and everybody. But in a world of wannabe journalists and self-appointed ‘experts’ on every subject, it’s worth asking whether these people know more than (or even as much as) those established, apparently old-fashioned experts who came before them.

For Time magazine, 2006 was all about ‘community and collaboration on a scale never seen before’. It was about ‘the cosmic compendium of knowledge Wikipedia and the million-channel people’s network YouTube and the online metropolis MySpace’. Accordingly, the new web world has become ‘a tool for bringing together the small contributions of millions of people and making them matter’ (1). This trend looks set to continue in 2007, enabled by new Web 2.0 tools such as discussion groups, blogs, wikis and podcasts.

The rise of user-generated content has brought about an elevation of the role of amateurs. News organisations are crying out for ‘citizen journalists’, asking the public to help ‘make the news’; they are also offering money for eye-witness accounts and mobile-phone video clips (2). TV companies trawl the internet looking for entertaining clips to televise. In Britain, Channel 4 has just launched Homemade, a user-generated video show, and ITV1 gave primetime coverage to I Was There: The People’s Review, which aired grassroots, frontline footage from big and not-so-big events of 2006.

Meanwhile, big brand companies ask their customers to design new products, create original advertising campaigns and help to develop future company policies. Nokia’s ‘Concept Lounge’, L’Oreal’s ‘You Make the Commercial’ and Orange’s ‘Talking Point’ are cases in point (3).

With this proliferation of user-generated content has come the notion of a collective intelligence, or what some have termed ‘a global brain’ (4). Such collective intelligence is supposedly made up of an international community of ordinary people who contribute and peer-review content to ensure that the highest level of knowledge on a particular subject is attained. Dan Gillmor, technology journalist and avid promoter of citizen journalism, sums up the thinking behind these developments with his mantra: ‘My readers know more than I do…’ (5)

The idea behind collective intelligence is that anyone can contribute to the knowledge pool on any chosen subject. As such, all views contribute to the collective sum of human knowledge. And increasingly, collective intelligence is seen as preferable to professional expertise. Indeed, many in the blogosphere seem fundamentally suspicious of funded or ‘engineered’ content – that is, researched, credible, verified and edited content.

Of course, everyone is entitled to voice their own opinion and contribute online, if they have access. But is it really the case that we no longer need professional expertise? And are all opinions equally valid – or are some opinions simply more valid than others?

Wikipedia is an online encyclopaedia that relies on volunteers to pen its millions of articles. Unlike Encyclopaedia Britannica, which charges for its content and pays a staff of experts to research and write its articles, Wikipedia gives away its content for free and allows anyone – amateur or professional, expert or novice – to submit and edit entries. Much was made of a study conducted by Nature magazine at the end of 2005, which found that Wikipedia was about as accurate in covering scientific topics as was Encyclopaedia Britannica. According to the survey, based on 42 articles reviewed, the average scientific entry in Wikipedia contained four errors or omissions, while the average entry in Encyclopaedia Britannica contained three. Of eight ‘serious errors’ the reviewers found, including misinterpretations of important concepts, four came from Wikipedia and four from Encyclopaedia Britannica.

However, soon after this report was published, Encyclopaedia Britannica issued a damning response accusing Nature of misrepresenting its own evidence. Dozens of inaccuracies attributed to Encyclopaedia Britannica were, in fact, not inaccuracies at all, and a number of the articles examined were not even in Encyclopaedia Britannica. It has since been reported that the study was poorly carried out and that its findings were laden with errors; one publication accused Nature of ‘cooking’ the report (6).

Yet hundreds of publications jumped on the Nature story, echoing the argument that Wikipedia (based on collective intelligence) was as good as Encyclopaedia Britannica (based on professional knowledge). Jimmy Wales, founder of Wikipedia, continues to cite the Nature survey in his defence when quizzed about the accuracy of information on Wikipedia.

Those who advocate collective intelligence believe that the whole is greater than any individual part – expert or amateur. But reaching such understanding requires pulling all the disparate parts together into a coherent, reliable and credible whole. Who acts as the filter? Surely the only way of achieving a coherent overview is to invite experts to sift through the barrage of content and judge what is of quality and what is not?

It is true that a small proportion of bloggers are authorities in their field or are ‘professional amateurs’ (7). In fact, over 70 per cent of the content on Wikipedia is written by fewer than two per cent of its contributors. But this does not warrant the idea that we are witnessing the ascendancy of a new collective intelligence that will develop and disseminate ideas across the globe.

As demonstrated by Wikipedia, even ‘best of breed’ advocates of collective intelligence cannot maintain or guarantee intellectual rigour by mere virtue of being a self-monitoring community. Quoted in The New Yorker, Eric Raymond, the open-source pioneer whose work inspired the development of Wikipedia, argued that ‘the open-source model is inapplicable to an encyclopaedia. For software there is an objective standard: either it works or it doesn’t. There is no such test for the truth.’ (8)

Ironically, Nature magazine has itself come up against the limits of user-generated content. It recently announced that it is abandoning its experiment with an open, online peer-review process to help vet scientific research, citing a lack of interest in a review process where real knowledge was necessary for any meaningful participation (9).

As outlined in the Annual Online Customer Engagement Survey 2006, internet-based technologies are potentially extremely useful tools for interactivity. And despite its weaknesses, Wikipedia is a great entry point for finding information and linking to other sites of interest (10). There is undoubtedly a role for user-generated content today.

And yet, the new web world can surely only fulfil its true potential through more rigorous standards, frameworks and filters on the quality of content generated (11). Such content should not be elevated to the grandiose, overblown status of an emergent ‘collective intelligence’ – especially if that is at the expense of expert knowledge, which remains invaluable today.

Theresa Clifford is director of digital agency cScape.

(1) Time’s Person of the Year: You, Time, 13 December 2006

(2) See the South Korean news website whose motto is ‘Every citizen is a journalist’

(3) Advertisers look to grassroots marketing, CNET, 4 April 2006

(4) See Customer-Made

(5) We the Media: Grassroots Journalism By the People, For the People, Dan Gillmor, O’Reilly, 2004, pp xiv, 113

(6) Nature mag cooked Wikipedia study, The Register, 23 March 2006

(7) The Pro-am Revolution: How Enthusiasts Are Changing Our Economy and Society, Charles Leadbeater, Paul Miller, Demos, 2007

(8) Know It All: can Wikipedia conquer expertise?, Stacy Schiff, The New Yorker, 31 July 2006: http://www.newyorker.com/fact/content/articles/060731fa_fact

(9) Nature’s Failure Shows the Limits of User-Generated Content, Information Week, 22 December 2006

(10) Annual Online Customer Engagement Survey, cScape/E-Consultancy, 2006

(11) For further reading on the role of filters see The Long Tail: How Endless Choice is Creating Unlimited Demand, Chris Anderson, 2006
