Defending the indefensible online

The internet is a medium for words and pictures, not deeds. And words and pictures should be allowed free expression even where they appal us.

Sandy Starr

Topics Politics

The internet is potentially a great vehicle for freedom of speech – but only if we let it be.

This technology allows for the indulgence of any interest, and allows for communication between any number of like-minded individuals, via text, image and multimedia. But the emergence of the World Wide Web, and the explosion of email use over the past decade, have been met with suspicion, fear and hostility, and consequent attempts to limit free speech online.

Governments worldwide, faced with a medium that defies regulation and makes a nonsense of national legal boundaries, have responded with measures like the UK prime minister’s 80-strong team of ‘cyber cops’ (1).

Most of the moral panics surrounding the internet result from a fear of the diversity and quantity of information exchanged online. And given existing traditions of regulating publishing, broadcasting and other modes of expression (even in the world’s foremost democracies), fear of an unregulated internet is understandable.

But while the internet may be a document of the extremes of human interests, it says nothing about the extent of human actions. It is a medium for words and pictures, not deeds. And words and pictures should be allowed free expression even where they offend and appal us.

Those who support freedom of speech on the internet are often put in the position of defending the seemingly indefensible – ‘hate’ speech, pornography, or other commonly reviled internet material. Why? Because there is no ‘free speech for everybody online except racists’. There is just free speech online, and it is either defended or trampled upon.

Two recent examples illustrate the difficulties in defending offensive material on the internet – but also the importance of doing so. One is the issue of Nazi memorabilia; the other is child pornography.

In September 1999, internet portal and search directory Yahoo! was commended before a US Senate Judiciary Committee for combating racism on the internet. Those who testified approved of Yahoo!’s decision to remove listings of over 70 sites judged to be run by ‘hate groups’, and advocated pressuring Internet Service Providers (ISPs) to clamp down on undesirable online material (2).

This (entirely voluntary) gesture was great PR for Yahoo! – but it would come back to haunt the company, demonstrating how self-censorship lends legitimacy to censorship proper.

The International League against Racism and Anti-Semitism (LICRA) (3) and the Union of French Jewish Students (UEJF) (4) filed a lawsuit in France against Yahoo! in April 2000. Yahoo! was charged with hosting auctions of Nazi memorabilia on its US website (5). Under article R.645-2 of the French criminal code (6), the sale or display of any item that incites racism is illegal.

No Nazi memorabilia was being offered for auction on Yahoo!’s French website (7) (which adheres to French law). But the plaintiffs argued that the distinction between the websites was meaningless, given that the US website is only a mouse click away from its French equivalent.

In this case, the plaintiffs asserted the primacy of French law over US law (where free speech is protected by the First Amendment of the Constitution) (8). As Yahoo!’s lawyer Christophe Pecnard put it, ‘The question put before this court is whether a French jurisdiction can make a decision on the English content of an American site, run by an American company…for the sole reason that French users have access via the internet’ (9). The court answered ‘yes’ to that question. Yahoo! was forced to argue before a US court that France had no jurisdiction in the case.

French law assumes that an object defined as ‘racist’ can lead automatically to racist violence. It is incontestable that items perceived to be racist cause offence – the French judge in the Yahoo! case argued that the auction of Nazi memorabilia offended ‘the collective memory of the country’ (10). But in other countries, such as the USA, the fact that something causes offence is not sufficient grounds for a ban – the law rightly distinguishes between an expression of racism and racist violence.

Although Yahoo! appealed against the French ruling, it announced in January this year that it would begin self-regulating the auction of Nazi memorabilia on its US website. The company now uses special software to bar the sale of Nazi-related items. This self-censorship represents a turnaround in the company’s defence against the initiators of the French lawsuit, but may open it up to further litigation in future.

Yahoo!’s previous voluntary self-censorship in fact added moral weight to the court case against the company, by implying that internet portals like Yahoo! have a responsibility for the content that they host and link to.

The idea that Yahoo! should take responsibility for the Nazi memorabilia being auctioned in this case is like holding a landlord responsible for the items sold in the garden sale of one of his tenants. Yahoo! is not the content editor of an auction site – the users provide the content (the auction items).

When hosting content can be interpreted as an association of responsibility, ISPs suddenly have cause to dictate the content of their customers’ websites. When a mere hyperlink can be interpreted as an association of responsibility, anybody with an online presence becomes fair game for regulation: from Nazism to Yahoo! CEO in one move.

If Nazi memorabilia provokes a strong reaction, then child pornography is probably the most morally contentious type of internet content. It is distinct from almost all other kinds of pornography, in that photographing the abuse of a child is (in most countries) photographing a crime.

But US and UK laws on child pornography have recently been extended for the digital age. They now conflate the actions of actual child sex offenders with the creation of artificial images.

In the 1990s, authorities in both the USA and the UK became concerned that with widely available software such as Microsoft Photo Editor and Paint Shop Pro, sexual fantasies about minors could be satisfied by images artificially assembled from preexisting elements. The UK addressed this concern in a clause of the Criminal Justice and Public Order Act 1994. The Act incorporated revisions to a previous piece of legislation outlawing child pornography, the Protection of Children Act 1978. Every reference to an ‘indecent photograph’ in the earlier Act was changed to refer to an ‘indecent photograph [or pseudo-photograph]’ (11).

Two years later in the USA, the Child Pornography Prevention Act 1996 was signed into law by Bill Clinton. Again, this incorporated revisions to a previous piece of legislation outlawing child pornography, 18 USC 2256(8). A depiction of a minor could now be classified as child pornography (and therefore as illegal) where ‘such visual depiction has been created, adapted or modified to appear that an “identifiable minor” is engaging in sexually explicit conduct’ (12). (‘Identifiable minor’ here means identifiable as being a minor, not identifiable as a particular minor.)

Since, in both the USA and the UK, a child no longer actually has to be photographed for the offence of creating an indecent photograph of a child to be committed, the emphasis of these laws has shifted from criminalising acts against children towards criminalising thoughts about children.

This has taken place largely without contest in the UK, but in the USA, where there is a strong tradition of free speech and civil liberties, the Child Pornography Prevention Act 1996 has been challenged by the American Civil Liberties Union (ACLU) (13). ‘There is a real difference between touching children sexually and touching computer keys to create images’, the ACLU rightly argues (14).

Most people find the very idea of paedophilia so objectionable that the distinction between paedophile acts and paedophile thoughts seems irrelevant. But the distinction is crucial – viewing pornography is a world away from enacting its contents.

A sexual predilection for minors rightly strikes most of us as vile. But in a free society, should we not tolerate individual fantasies that are not acted upon with harmful consequences? Otherwise, who is to say where we draw the line?

It is in hard cases such as these, where content seems beyond the possibility of defence, that the case for free speech online is won or lost. Standards of free speech on the internet should be maintained not by technical default (as they are at present – wherever regulation of the network is technically possible, it tends to be pursued), but by debate and argument on and offline.

The exchange of thoughts and items that profoundly offend your sensibilities, between people to whom you wouldn’t give the time of day, is a necessary (and relatively small) price to pay for the greatest communications medium in human history.

Sandy Starr has consulted and written on internet regulation for the Organisation for Security and Cooperation in Europe, and for the European Commission research project RightsWatch. He is a contributor to Spreading the Word on the Internet: Sixteen Answers to Four Questions, Organisation for Security and Cooperation in Europe, 2003 (download this book (.pdf 576 KB)); From Quill to Cursor: Freedom of the Media in the Digital Era, Organisation for Security and Cooperation in Europe, 2003 (download this book (.pdf 399 KB)); and The Internet: Brave New World?, Hodder Murray, 2002 (buy this book from Amazon (UK) or Amazon (USA)).
Read on:
spiked-issue: Free speech
(1) See the Guardian
(2) See, for instance, the testimony by Rabbi Abraham Cooper of the human rights group Simon Wiesenthal Center
(3) See the LICRA website
(4) See the UEJF website
(5) See Yahoo! US website
(6) See the French Criminal Code
(7) See Yahoo! France
(8) See the Bill of Rights
(9) See BBC News Online
(10) See BBC News Online
(11) Protection of Children Act 1978, Section 1. See also Criminal Justice and Public Order Act 1994, Section 84, subsection 2.
(12) 18 USC 2256(8). See Cyber-rights and cyber-liberties
(13) See the ACLU website
(14) See the New York Times
