‘The answer to hate speech is more speech’
Parler’s chief policy officer, Amy Peikoff, on the app’s cancellation by Big Tech.
It’s early 2021 and Big Tech censorship has already escalated to unprecedented levels. As more and more accounts are shut down or silenced, many social-media users have decamped to alternative platforms in search of a less censorious environment. One such platform was Parler – until earlier this month, when it was barred from the Apple App Store, the Google Play Store and, crucially, from Amazon Web Services. These firms told Parler it had not yet done enough to tackle objectionable content on its app.
Amy Peikoff is a writer, a professor of philosophy and law, and chief policy officer at Parler.
spiked: What happened when Parler was taken down?
Amy Peikoff: On 8 January, we received notice from Apple asking us to present a plan to more effectively enforce our community guidelines. Apple said if we didn’t do so to its satisfaction within 24 hours, we would be removed from the App Store. That was also released to the press.
Then, about an hour-and-a-half later, we received similar communication from Amazon. We did not hear directly from Google, but heard through the media that we had been removed from the Google Play Store.
This all happened soon after Trump was banned from Twitter and Facebook. We were receiving a huge influx of people. Our app was number one on the App Store, for example, during that time period. Then, suddenly, everything was under threat of being cut off.
We said that we were starting to take action to deal with the issue. But Apple would not keep us up on the App Store and Amazon took us down from the server on 10 January.
spiked: What was it about the content on your platform that upset Apple, Google and Amazon?
Peikoff: The content the companies gave examples of did clearly violate our community guidelines. They were cases of incitement to violence, threats, calls for sedition and so on. The question was not about the desirability of removing it from the platform. It was about what processes to use to do that: they said our processes were inadequate.
They accused us of being unwilling to remove the content simply because we had a community jury model, whereby we asked people to report content and then let the jury decide whether it violated our guidelines. We had already been exploring the use of AI. But we wanted to do it in a way that was consistent with preserving the privacy of people on the platform.
spiked: How would you summarise Parler’s attitude towards freedom of speech?
Peikoff: Parler’s mission has always been to allow people to express themselves freely to the maximum extent possible consistent with the law and with our own business purposes (we don’t want spam, or attempts to circumvent our monetisation model).
At Parler, we agree with Nadine Strossen, the former head of the American Civil Liberties Union, who said the answer to hate speech is more speech.
We also want to respect the privacy of users. Unlike on Twitter and Facebook, there is no data mining, profiling or targeting of ads based on profiles. The people on Parler are not the commodity.
spiked: What’s still in the way of Parler coming back online?
Peikoff: At the moment, as far as I know, Amazon has no intention of allowing us back on its servers. The server is the main thing we need in order to be online and fully functional. We have a static webpage up right now, but without a robust server vendor, we cannot get the full operation back up.
spiked: What does Parler’s cancellation tell us about the power and interests of Big Tech?
Peikoff: It shows how easily somebody can be taken offline, and how easily these larger companies can make it so that there is nowhere to go if you have been removed from Twitter or Facebook.
Section 230 of the Communications Decency Act provides legal immunity for service-providers when they remove content from their platforms. The decision-making process, the engagement-enhancing algorithms, the shadow bans – none of this is subject to scrutiny while they have that immunity.
Politicians are hauling tech CEOs before Congress and urging them to remove more and more content, even when the particular category of speech in question would be protected by the First Amendment or similar laws around the world.
It’s a scary prospect, because we are getting to a stage where we are no longer living in a completely free country.
spiked: With the inauguration of Joe Biden, Big Tech firms have their preferred candidate in the White House. Do you think this means things will get worse for free speech before they get better?
Peikoff: I’m getting a sense of that already. I’m hearing there are people in Congress who are starting to call for Parler itself, an entire nonpartisan platform, to be investigated with respect to its role in supposedly helping to incite the events of 6 January.
Moreover, Mark Zuckerberg supports new regulations under which platforms would be required to issue so-called transparency reports. These are reports in which firms describe what they have done to deal with ‘objectionable content’, including speech that is protected by the First Amendment.
He has gone further, suggesting platforms should be required to prove their effectiveness at dealing with that content. If that ends up being put into law, it would represent the government trying to achieve, via regulation of social-media companies, what it could not achieve through direct censorship.
At Parler, we did not realise the extent to which merely trying to offer this product on the market would itself be activism. Once we do get back online, we hope everyone who is interested in the future of freedom of expression and civil discourse will come and join us on Parler.
Amy Peikoff was speaking to Paddy Hannam.
To enquire about republishing spiked’s content, a right to reply or to request a correction, please contact the managing editor, Viv Regan.