
Is Big Data squishing our humanity?

The use of new technologies is turning us into objects of analysis, examination and manipulation.

Norman Lewis


It is almost impossible to read a newspaper, magazine or journal today without coming across claims about how Big Data and the Internet of Things (IoT) are on the brink of revolutionising our lives. The key assertion is that the increased ability of machines to communicate with one another (from computers to smartphones to sensors embedded in our environments at home and at work), combined with our ability to capture, process and visualise all of this data in ways that can inform decisions, amounts to a third industrial revolution. The future is already here, apparently.

Not only is the future already here; this time, it’s big. One report claims that by 2024, the world’s enterprise servers will annually process the digital equivalent of a stack of books extending more than 4.37 light years to Alpha Centauri, our closest neighbouring star system in the Milky Way. According to the CEO of Cisco, this revolution is going to generate trillions of dollars. Depending on whom you listen to, the exact figures differ, but all are agreed that this process is relentless, inevitable and wonderful.

Of course, they would say that, wouldn’t they? After all, they are technologists, people whose mission in life is to develop transformative technologies and sell them. And they are right to stress the potential of these developments. The use of Big Data has already yielded incredible breakthroughs. The Higgs boson would never have been discovered without Big Data. The city of Los Angeles, the most congested city in the USA, has reduced traffic congestion by 17 per cent thanks to a smart traffic-control system that links traffic lights to Big Data sets on weather patterns and sensor-measured traffic flows. From supply-chain monitoring to airlines using embedded sensors and weather data to reduce delays and cut fuel consumption, the combination of Big Data and the IoT is already having a considerable impact.

But where the technologists go wrong is in their assertion that Big Data and the IoT will not only continue to transform our lives, but that everything will be hunky-dory. The first problem is that their perspective is a necessarily technologically determinist one. We can forgive them for this. After all, they are the true believers, the Silicon Valley soothsayers whose belief in the technology (and themselves) is as admirable as it is irritating. But technology adoption does not follow some inner law or inexorable, predetermined path. It is always framed by social, economic and political contexts and choices.

The example of younger people’s embrace of mobile technology is a case in point. This didn’t happen because the technology was so compelling that it seduced young people (who, we are also told, are naturally good with technology) into adopting it into their lives. No, young people internalised this technology, often by doing things with it that were never intended by the technologists (like texting), because it solved a problem for them. In this case, the problem was that they were growing up in a risk-averse society where they were constantly under the gaze of adults. The internet, mobile technology and cyberspace offered a much-needed escape route, an autonomous space where they could express themselves and interact with their peers free from adult supervision. So it was social need, not magical technology, that drove the emergence of billion-dollar enterprises like Facebook. The fact that it is now predominantly adults who are behaving increasingly like children on social media, using Facebook as a vehicle of self-expression and self-affirmation, shouldn’t confuse us as to the initial drivers behind the technology’s rise. No one could have anticipated the emergence of Facebook – or even Google, Amazon or eBay – when the internet emerged.

Social contexts shape technology adoption and, in turn, shape the technology and its subsequent development. This interaction is always mediated by human experience, both at a technological development level and at a broader cultural and economic level. Numerous interests battle to adopt or use these technologies to meet specific ends or needs, often in conflict with one another. And this is where the second and more problematic tendency comes in: namely, the bigging up of data and smart technologies above human consciousness.

The implications of this for the future of free will, human choice, privacy and the distinction between public and private are immense.

Algorithmic fatalism

Fifteen years ago, the very same companies that are now at the forefront of today’s Big Data and IoT developments pioneered the slogan ‘always connected’. Although this has always been more of an aspiration than a reality, ‘always connected’ meant that a person was the subject of a network. But as various devices gained faster processing power and greater storage capacity, the focus of ‘connected’ became the devices rather than the people – these are what we now refer to as ‘smart devices’. This is a subtle but significant shift: the focus and attention moved away from people as the subject of networks to people as the object of networks. The IoT, allied to Big Data, now posits human beings as simply another data point in a network of smart things; human beings, rather than being the subject of connectivity, become merely another ‘thing’ connected to everything else.

In this brave new world, people become passive data points. It is their data that is the active ingredient. Subject and object have been inverted. Algorithms, being by definition more objective, have no human fallibilities; they simply follow ‘if-then’ formulae to plot outcomes, which can now be marshalled to ensure optimal results. If that’s not worrying enough, Big Data analytics, complete with garish infographics, are now asserted to have the capacity to reveal things about our daily lives we weren’t even conscious of. Algorithms will set you free, brothers and sisters.

We have come a long way from Descartes, whose ‘I think therefore I am’ was a landmark breakthrough in the celebration of human consciousness and human-centred history-making. Today’s cogito ergo sum equivalent would read more like this: ‘I think I exist, or at least there’s an app that proves I exist, because it generates data which can be analysed and presented back to me so I can make more sense of my life.’

But data analytics and visualising the patterns of one’s life in the form of an infographic are not the same as self-reflection. Human experience mediates between information and meaning. Information, or data, is the raw material, an undifferentiated stream of sense and nonsense, which we have to interpret and reflect on to arrive at meaning. The journey from information to meaning involves more than simply filtering the ‘signal from the noise’, as the data analysts like to describe it. It is a human-made transformation. It takes skill, time and effort, practice and patience – and there will be mistakes. This is what we gradually become better at as we move from childhood through adolescence to adulthood. And the scary and wonderful thing is that no matter how experienced we become, success cannot be guaranteed. Making mistakes is a human frailty that demands re-thinking and re-adjustment, that demands we improve.

Critically, this activity can never be the monopoly of experts. It is a very basic, deeply human activity, essential to our survival. It is the essence of what it means to be human. And essential to this conscious activity is our ability to withdraw into private space, a space where it is possible to experiment, make errors, reflect and do the kind of processing that allows us to gain insights, wisdom and develop meaning.

But this is far from what is envisaged by the high priests of Big Data and the IoT. From web search to marketing and stock-trading, and even education and policing, the power of computers that crunch data according to complex sets of if-then rules is increasingly presented as a way to make our lives better in every way. Amazon’s algorithm will tell you which book you should read next; dating websites will compute a perfect life-partner; self-driving cars will reduce accidents; and crime will be predicted and prevented algorithmically. In other words, reduce or minimise conscious human activity so that better decisions can be made on our behalf.

What we have here is a new algorithmic fatalism that elevates computing power above human subjectivity. If human activity can now be ‘objectively’ measured by the data trails we leave, and our conscious activity realised through algorithms ‘revealing’ the truth of our decisions, then the concept of free will, the idea of conscious choices and the quest for meaning are emptied of human agency. We are in danger of elevating the precision-tooled power of the algorithm over human judgement. This not only reduces what it means to be human, but also transforms the very concept of private and public space. In fact, private and public will cease to exist.

Human freedom and choice

As stated above, these outcomes are not inevitable. Whether we end up in a society where precision algorithms rule our lives depends on the choices we make about our freedom and our aspirations. But this is not a choice for the future; it is about the choices we need to make right now. The Big Data and IoT evangelists are not acting in a vacuum. The diminishing of subjectivity is well established in Western society, as spiked has documented in its coverage of environmentalism and victim culture.

The reality is that this perspective is already entrenched in Western government policy. Just look at the UK government’s Behavioural Insights Team (BIT), informally known as the ‘nudge unit’. The unit was inspired by the bestselling book Nudge (2008), written by Harvard law professor Cass Sunstein and the economist Richard Thaler. BIT’s approach to policymaking rests on the deliberate bypassing of the reflective or reasoning processes of the masses. BIT policy consists of ‘choice architectures’ in which citizens, seen as a cognitively weak mass, are surreptitiously encouraged to make what the authorities consider to be the correct decision. To help reduce obesity in schools, for example, healthy meals should be placed at eye level, while junk food should be in hard-to-reach places. To get more organ donors, the state should automatically enrol everyone, so that you have to ‘opt out’ if you do not want to be a donor, rather than ‘opt in’ if you do. Government policy becomes a kind of benevolent god, designing a garden maze that leads unconscious sinners to the right exit.

Now nudging and altering behaviour is set to reach new levels with the emergence of the ‘smart home’. Everything we do in our homes, and therefore in private, can now be captured, aggregated, analysed and visualised through sensors that connect our cars, lights and fridges, TVs and music systems, and the personal devices that monitor our bodies. Armed with such data, people’s behaviour can be altered in the ‘appropriate’ direction. We are all to become Pavlovian dogs driven by an algorithm that will reveal the righteous path. The ‘smart’ home can now be used to cajole the ‘dumb public’ into behavioural changes that serve the interests of whoever wields that power. The implications for the future of society, of privacy, free will and freedom are immense.

But there is nothing inevitable about all of this. The machines we have created need not subsume human subjectivity. Whether they do depends on the political, economic and social choices we consciously make today and in the future. The challenge is to ensure that these technological developments are used to benefit human society, not the narrow, short-term interests of political elites or corporations. Both our ambitions as a society and our ability to control the means of data production will determine the future. Socrates, reflecting on the necessity of self-reflection, said ‘the unexamined life is not worth living’. But the over-examined life, promised by the intrusion of data analytics into the minutiae of our lives, is not really living at all.

Norman Lewis works on innovation networks and is a co-author of Big Potatoes: The London Manifesto for Innovation. He is writing in a personal capacity.


