Communication breakdown

Microsoft's chatroom shutdown won't protect children, but it might harm adults' freedoms.

Sandy Starr


‘In the UK, we’ve taken a decision to close chat completely. There will be no moderation, there will be no unmoderated chatrooms. It just will not exist.’

So said Matt Whittingham, head of customer satisfaction at Microsoft Network (MSN) UK, earlier this week. He was speaking alongside me on the Jeremy Vine Show on BBC Radio 2, explaining Microsoft’s decision to close down all of its chatrooms in Europe, the Middle East, Latin America and most of Asia.

Microsoft’s aim is ‘protecting users from unsolicited information such as spam and to help safeguard children from inappropriate communication online’. Its decision was made in the wake of various paedophile panics, sexual abuse cases and teenage elopements, such as the high-profile disappearance of 12-year-old Shevaun Pennington with an older man she met in a chatroom (1).

Surely such rare incidents, while distressing, do not justify shutting down a service used by millions of people? And why should a communications medium bear responsibility for the activities of those who use it? Paedophilia was not created by the internet; it existed long beforehand. And underage girls were running away with older men long before the internet was invented.

As the UK Daily Telegraph points out, Microsoft’s arguments ‘could be advanced for closing down the telephone system, the Royal Mail, the Church of England, the Boy Scouts, the Girl Guides, the Youth Hostels Association, the Duke of Edinburgh’s Award Scheme, all schools, the NHS, the railway network, the seaside holiday…’ (2).

Guardian columnist Emily Bell argues that ‘disappointing a number of paedophiles for whom the forum is a low-effort alternative to visiting the local swimming baths or joining the Scouts or becoming ordained into the Catholic church’ does not justify a decision that ‘reinforces the disappointingly widely held belief that the internet is a tool of Satan’ (3).

But when such arguments were put to Matt Whittingham on the Jeremy Vine Show, he was adamant that ‘although the number of paedophiles may be very small, the cases that have happened are extremely serious and disturbing’. It seems that for Microsoft, the abhorrent nature of child abuse justifies any means taken to suppress it, regardless of the true scale of the problem and whether the internet bears any responsibility for it.

John Carr, associate director of the charity National Children’s Homes, was also on the show. He argued: ‘I’m sure for the great majority of children who used chatrooms, they were perfectly safe most of the time. But sadly, over the past two or three years, there have been at least 26, 27 cases, thereabouts, where children, typically 13- or 14-year-old girls, have gone in there, met somebody, been groomed by them, who’s persuaded them to meet them in real life, where they’ve then been raped or otherwise seriously sexually assaulted. And we only know about those because the guys were caught, convicted and sent to jail. What we don’t know about are all of the cases where the police couldn’t get enough evidence together.’

This argument highlights a problem with today’s reactions to the paedophile panic. It takes a small number of cases – ‘26, 27 cases, thereabouts’ – and blows them out of proportion. It invokes the category of ‘grooming’ to confuse communication between a child and an adult, where the child comes to no tangible harm, with actual child abuse. And it makes unknown quantities out to be sinister, suggesting that there is a terrible multitude of cases of child abuse that ‘we don’t know about’.

This last fallacy is perhaps the most misleading of all. Invoking ‘cases where the police couldn’t get enough evidence together’ obscures the principle that, under an equitable system of law, only when a conviction has been secured can it legitimately be assumed that a crime occurred. If the police ‘couldn’t get enough evidence together’, perhaps that was because there was insufficient reason to believe that child abuse had taken place.

While praised by charities and child protection organisations for closing down its chatrooms, Microsoft has also received a fair amount of flak – not least from chatroom users, presumably unimpressed by the imputation that they are potential paedophiles (4). Ironically, however, much of the criticism of Microsoft is based on the same false assumptions that motivated its decision in the first place.

For example, Microsoft’s decision has been declared ‘irresponsible’ by competitor Lycos, because Microsoft is supposedly abdicating its responsibility to champion ‘regulated chatrooms’. Another Microsoft competitor, Freeserve, declared the decision ‘reckless’, on the grounds that ‘MSN…is sending chatroom users underground’ (5).

Academic Rachel O’Connell, author of the alarmist research paper A Typology of Child Cybersexploitation and Online Grooming Practices, argues that by driving children to communicate more intimately outside of chatrooms, Microsoft’s decision creates a ‘perfect opportunity for paedophiles to exploit’ (6).

Such criticism misses the point, by buying into the questionable notions that there is a big threat to children from online paedophiles, and that it is incumbent on internet service providers to do something about it. In truth, it is the responsibility of parents, and parents alone, to supervise the activities of their children – whether online or off.

Microsoft’s more cynical critics have accused it of shutting down its chatrooms for economic reasons, and then passing the decision off as a moral one for PR purposes. Research company Gartner argues that ‘this is a business decision. Chatrooms in themselves do not drag in a whole lot of money’. The Guardian accused Microsoft of ‘making a virtue out of a necessity’ (7).

There may well be some truth to this. It would be reprehensible if Microsoft closed down its chatrooms for economic reasons, but dressed its justifications up in moralistic terms. Even if this were an economically driven decision, you could argue that shutting down chatrooms was still shortsighted; it would suggest a loss of faith on Microsoft’s part in the importance of free and easy internet communication (8).

It is also possible that Microsoft’s decision is a response to the increasing liability imposed upon internet service providers for the content that they host (9). If this is the case, then the decision is a cowardly one. As a leader in its field, Microsoft should be challenging such legal developments that threaten to stifle the internet, not capitulating to them.

Whatever Microsoft’s true motivations, there can be little doubt that the company is adopting an increasingly moralistic stance toward its customers, and internet users in general. Microsoft arrogantly describes its chatroom decision as ‘the latest in a series of measures to be announced by MSN and Microsoft in the battle against inappropriate use of the internet’ (10).

Who is Microsoft to dictate whether or not a particular use of the internet is ‘inappropriate’? Microsoft’s job is to provide internet users with world-class products that enable them to use the internet efficiently for whatever purpose they choose. To dictate user behaviour, as a way of cashing in on irrational internet panics, only sells the internet short.

Responding to my concerns about Microsoft’s decision, John Carr provided listeners of the Jeremy Vine Show with a chilling vision of what he would like to see the internet become: ‘People behave badly on the internet because they think they can get away with it. If we can convince them that there’s a 99.99 per cent probability that if they commit a crime, they can be quickly identified and apprehended, they’ll stop doing it. And then we can have the chatrooms back again, and they’ll be a lot safer than they are today.’

This vision of the internet may be comforting to Carr. But for the rest of us, an internet where ‘there’s a 99.99 per cent probability’ of being ‘identified and apprehended’ for what you say and with whom you fraternise, isn’t an internet worth having.

Sandy Starr has consulted and written on internet regulation for the Organisation for Security and Cooperation in Europe, and for the European Commission research project RightsWatch. He is a contributor to Spreading the Word on the Internet: Sixteen Answers to Four Questions, Organisation for Security and Cooperation in Europe, 2003; From Quill to Cursor: Freedom of the Media in the Digital Era, Organisation for Security and Cooperation in Europe, 2003; and The Internet: Brave New World?, Hodder Murray, 2002.

(1) MSN UK Takes Action on Chat, Microsoft, 24 September 2003. See Shevaun and the scaremongers, by Sandy Starr

(2) Chatting comes at a price, Daily Telegraph, 24 September 2003

(3) The myth of Satan’s web, Emily Bell, Guardian, 25 September 2003

(4) See Internet users debate MSN’s closure, BBC News, 24 September 2003

(5) Rivals condemn Microsoft chatroom closure, Helen Carter, Guardian, 25 September 2003

(6) Chatroom closure under fire, BBC News, 24 September 2003. See Shevaun and the scaremongers, by Sandy Starr

(7) Microsoft chat move ‘irresponsible’, BBC News, 24 September 2003; City diary, Richard Adams, Guardian, 25 September 2003

(8) See The worm turns on Gates, by Sandy Starr

(9) See Copycat copyright, by Sandy Starr

(10) MSN UK Takes Action on Chat, Microsoft, 24 September 2003


