Oakland vs Big Data Brother

One American city took on the surveillance society – and won.

Timandra Harkness
Writer


At one end of San Francisco Bay lies Silicon Valley, birthplace and home of Big Data technology, from Apple to Google. At the other lies the vibrant and prosperous city of San Francisco. And across the water from San Francisco – 15 minutes away by underground BART train – is the city and port of Oakland.

Oakland has a sleepy city centre, hipster bars and an alternative bookshop. It is also regularly listed as one of the three most violent cities in the US. Along with the rest of the San Francisco Bay Area, it is subject to frequent earthquakes, and its port, where two million shipping containers enter or leave the US every year, is at the mercy of potential tsunamis.

When, in 2013, Oakland was offered federal money to install an extensive early warning system that would cover both crime and natural disasters, the authorities found it easy to say yes. Oakland would have a Domain Awareness Center (DAC), bringing together vessel tracking; tsunami and earthquake warning systems; 700 surveillance cameras; automatic licence-plate readers; facial-recognition software; and a technology called ShotSpotter that uses microphones to listen for gunshots. And Oakland would have all this without spending a cent of local taxpayers’ money.

So Oakland city council may have been shocked when a routine vote to approve the first phase of the DAC’s construction was challenged. A few citizens turned up to the council meeting and demanded to know: ‘Where is your privacy policy?’ The vote was passed anyway, but it didn’t stop the activists. They soon started the Oakland Privacy Working Group.

In 2015, I went to Oakland to hear the whole story from one of Oakland Privacy’s key players, Brian Hofer, an attorney who first read about the DAC in the local newspaper, the East Bay Express. ‘I had no idea it was happening’, he told me, as we walked across the peaceful lawn of Frank Ogawa Grand Plaza, in front of the city hall:

‘I pay attention to politics – I read the newspaper. But I didn’t know this was happening until I read a December 2013 article in the East Bay Express, which analysed a lot of the public-record documents that Oakland Privacy had received. By that point the project was already six months old. It just randomly happened that the very next day was an Oakland Privacy meeting so I showed up and said “How can I help?”.’

Why were Oakland residents so concerned? Local context is key here. Relations between the city authorities and police and the local population were already strained. On New Year’s Day 2009, a 22-year-old called Oscar Grant was pulled off the train he was travelling on at Fruitvale Station by a BART police officer, and shot dead while handcuffed, face down on the platform. Grant was African-American. His killer, who claimed the shooting was an accident, was convicted of involuntary manslaughter.

In October 2011, a group of protesters set up an encampment in Frank Ogawa Plaza, renaming it Oscar Grant Plaza. Calling themselves Occupy Oakland, they aligned themselves with other Occupy protests across the US and beyond. City and police efforts to remove the encampment escalated into violence and exacerbated tensions between authorities and the local community. So the proposed DAC was never going to be taken at face value; rather it was seen as part of the authorities’ attempts to control citizens. As Hofer explains:

‘[The DAC] had originally been sold as a port-infrastructure project. Then it was sold to us as this thing for first responders to help with efficiency. The problem being, the only time our previous version of [the DAC] had been activated was in response to protests. So we had some suspicions when this showed up on the city-council agenda.’

Add to this the impact of Edward Snowden’s revelations about just how much data national authorities were collecting on their own populations, and Oakland’s citizens felt they had cause to worry.

Ironically, the very lack of trust between the people and the authorities was one reason the latter wanted the DAC, says Hofer:

‘Because there’s distrust among a lot of the citizens with reporting crimes or being witnesses, Oakland has made this decision that they’re going to use technology to solve their problems. It’s shiny-gadget syndrome. We’re up the road from Silicon Valley, and everybody’s promising us all these wonderful things, selling it as this Big Data-driven solution that is going to solve all our society’s problems, and it’s just not playing out that way.’

Big Data technology does make it very easy to collect information automatically about all sorts of aspects of city life. Those diverse datasets can then be aggregated to build a multidimensional picture of the city, evolving in real time. Such a system can track wind patterns, for example, so that a chemical fire at the docks might be controlled more easily, or schools and houses downwind of it evacuated. ‘That’s wonderful, that’s great’, says Hofer. ‘We have no problem with that. No one has ever made an argument before the city council against those types of warning systems.’

The problem, however, is that the same approach – the multidimensional picture-building, the tracking, the predicting – can be applied to people. Instead of a policeman noting down vehicle licence plates at a particular location, software linked to CCTV cameras can do it automatically. Other pieces of software can recognise faces, and identify that the same person is in certain places at certain times. And the DAC would make it very easy to link all these different datasets together, says Hofer:

‘That ability to aggregate all the data also allows you to create a mosaic, to see the patterns in someone’s daily travel habits or their life. Oh, Brian is going down the marijuana dispensary, now he’s hanging out at the abortion clinic, or he’s at that Occupy protest because we’re also tracking licence-plate numbers.

‘And so the good is also the risk in this type of capability. Of course we had a big 1989 earthquake – freeways fell down, radios weren’t working. It was a bit crazy here. So to be able to coordinate and move resources around faster would be wonderful. But with that you need to have safeguards built in, to address the civil-liberties concerns.’

And that’s why Oakland Privacy emerged. By 18 February 2014, its members had forged a coalition of 20 organisations, including the American Civil Liberties Union (ACLU), and 100 people spoke against the DAC at the council meeting, provoking a postponement of the vote. This gave the coalition more time to build opposition ahead of the next meeting, on 4 March 2014. As Hofer puts it:

‘You’ve got left, right, centre… well, you know, we don’t really have a right in Oakland, but those not so progressive, that were concerned about taxpayer costs. So we had 45 organisations, 200 public speakers showed up, the city council meeting started at 5.30 in the evening and the vote didn’t happen until one in the morning. All those people spoke, unanimously again, and opposed to the project.’

When the city council finally voted, it approved a scaled-back version of the DAC that was mainly about port infrastructure. ‘They got rid of facial recognition, automatic licence-plate readers’, says Hofer, ‘and they removed the city portion from the project, prohibiting retention of any data’.

Hofer thinks that part of the victory was down to educating the elected councillors about what they were being asked to approve:

‘It’s not just understanding the technology. It’s also about thinking: Well, if the technology can do that, and therefore that thing that we do want, like predicting a tsunami, then that means it could also do this, and follow these people around the city, and make a note of who’s been meeting with who, and whose cellphones are in the square.’

Oakland Privacy’s cause was certainly helped by the US Constitution, which gives citizens some clear grounds on which to defend their own privacy. The Fourth Amendment, for example, protects against ‘unreasonable searches and seizures’, ruling out general searches: it’s not acceptable to search every house in a street, for example. Authorities need a specific warrant.

This Fourth Amendment protection clearly bears on any technology that collects data indiscriminately, whether a CCTV camera identifying licence plates or a cell-site simulator such as Stingray, which intercepts all the cellphone signals within range. That means not only identifying the phone but collecting the metadata: the duration of calls; who is phoning whom; and who is sending and receiving messages.

Freedom of speech and of association, safeguarded by the First Amendment, also underpin a defence of privacy. Hofer cites the case of the National Association for the Advancement of Colored People (NAACP), a civil-rights campaign group. ‘There’s a famous Supreme Court case’, says Hofer, in which the authorities were ‘trying to get the NAACP to reveal its membership rolls, so they could target those involved with the NAACP. And the Supreme Court said: “No, they have freedom of association.”’

Without the capacity to organise privately, members of the NAACP could not associate freely. The erosion of privacy is a problem that afflicts both left and right, says Hofer:

‘From the other end of the political spectrum, here in California with our Proposition Eight fight over same-sex marriage, the lefties, the progressives, were trying to force the conservatives to reveal their donors and membership rolls. And we’re like: No, we already decided this issue! It’s like we didn’t learn much. Whether you’re left or right we’re still going after freedom of conscience and trying to get rid of it.’

But when data is collected automatically, by default, these legal precedents are at risk of subversion through a technological back door. As Hofer points out:

‘Nowadays, sure, the NAACP doesn’t have to turn over its membership roll, but the National Security Agency can get in its computer anyway. Or it just uses a Stingray and intercepts phone communications while sitting outside your building. The NSA just uses a licence-plate reader and drives around the building and looks at the licence-plate numbers. So effectively court decisions are meaningless if surveillance equipment is able just to be used indiscriminately.’

This is why oversight, accountability and controls over who uses our data, and for what ends, are so crucial. And it’s why the other part of Oakland Privacy’s campaign is so important. Because the city council made a decision that, says Hofer, the campaigners did not anticipate. As well as scaling back the DAC infrastructure, ‘it created this ad hoc committee, the citizens committee, that would then draft the privacy policy to regulate [the DAC]’.

This ad hoc privacy committee, with help from the ACLU, drafted the policy governing the DAC, which was adopted by the city council in June 2015. It includes these words on privacy:

‘Privacy includes our right to keep a domain around us, which includes all those things that are part of us, such as our body, home, property, thoughts, feelings, associations, secrets and identity. The right to privacy gives us the ability to choose which parts in this domain can be accessed by others, and to control the extent, manner, and timing of the use of those parts we choose to disclose.’

It’s not about never letting anybody know anything; rather, it’s about having control over who knows what and when. The policy continues:

‘The importance of privacy can be illustrated by dividing privacy into three equally significant parts: 1) Secrecy – our ability to keep our opinions known only to those we intend to receive them. Without secrecy, people may not discuss affairs with whom they choose, excluding those with whom they do not wish to converse. 2) Anonymity – secrecy about who is sending and receiving an opinion or message. 3) Autonomy – our ability to make our own life decisions free from any force that has violated our secrecy or anonymity.’

Without privacy, nobody can be fully autonomous or free:

‘This policy is designed to promote a “presumption of privacy” which simply means that individuals do not relinquish their right to privacy when they leave private spaces and that as a general rule people do not expect or desire for law enforcement to monitor, record, and/or aggregate their activities without cause or as a consequence of participating in modern society.’

In other words, just because you can collect this information, it doesn’t mean that you should, or that we will give you permission to do so. Just because Hofer and I sat talking outside a café, it doesn’t mean we expect someone to come by and record everything we said to each other.

Now other organisations turn to Oakland Privacy and the ACLU for help. Santa Clara, Palo Alto, Berkeley and even the regional transport authority, BART, have used Oakland’s ordinance as a template, and asked for help in adapting it to their needs. Hofer is now kept busy speaking at meetings and offering advice. When I spoke to him again this month, his determination was palpable:

‘People are excited that a small volunteer group has been able to achieve so much success. A large part of that success is due to coalition building. By involving civil-liberties organisations, labour, social-justice causes and all sorts of regular folk, we are able to show that a great many people demand reform in this area.’

The one-time ad hoc committee overseeing the DAC is now a statutory Privacy Advisory Commission (PAC), which held its first meeting in July 2016 and elected Hofer as its chair. Its task is to draft laws that will govern the acquisition and use of future surveillance technologies in Oakland, and to define a process through which public discussion will always happen before new technology is bought and used.

The PAC’s August 2016 agenda included the use of cell-site simulators like Stingray, for which it, alongside Alameda County, has drafted a policy allowing limited use, governed by warrant and compatible with the Fourth Amendment. ‘Generally, the police here are solely intending to use it as a locating device’, Hofer told me this month. ‘The policy requires that use be pursuant to a warrant, that no content be intercepted, that no data be retained, and there are strong limitations on information sharing. An annual report must be presented to the public summarising use in the past year.’

It is a level of accountability to which we in the UK should aspire when thinking about, for example, the Investigatory Powers Bill, which is still making its way through the Houses of Parliament.

The Privacy Advisory Commission’s work is only just beginning. But Oakland is an inspiring story of what people can achieve when they demand to have control of the technology being used to gather data about them.

Timandra Harkness is a writer, performer and the author of Big Data: Does Size Matter?, published by Bloomsbury.

Picture by: Mike Fleming, published under a Creative Commons licence.


