From the Panopticon to the Snooper’s Charter: Surveillance in the Era of Big Data

In late 2016 the UK passed the Investigatory Powers Act, commonly known as the Snooper’s Charter. The law, which has been described as the most extreme surveillance legislation ever passed in a democracy, is the culmination of a series of “national security” and “anti-terrorist” laws that have become a standard of the post-9/11 world, and which the UK, alongside the US, champions. It allows for an Orwellian level of internet surveillance by the government and its branches, which will have the power to hack into phones and computers en masse, and requires Internet Service Providers (ISPs) to store all users’ data for a year. The law comes three years after Edward Snowden’s infamous revelations that GCHQ ran “the largest programme of suspicionless surveillance in human history”. Shocking as it is, it does not inaugurate a new set of practices by the state but cements already existing ones; the surprise here is not internet surveillance itself, but the state’s willingness to legitimize it by making it law. In the world of the Internet of Things, virtually nothing that we do escapes the possibility of being recorded, watched, analyzed and intervened upon. This means, as the Snooper’s Charter reminds us, that we have entered a new paradigm of surveillance, one that shakes the pillars of the traditional liberal values of democracy, privacy, citizenship and inviolability.

In his 1977 book Discipline and Punish, Michel Foucault famously introduced his reading of Bentham’s prison design, the panopticon, to explain how a system of surveillance symbolized a new rationality of power. Power is put to work in such a way that its appropriate display becomes productive; the mere symbols of power generate discipline. In this way he pinpointed the rise of the disciplinary society in eighteenth-century Europe, the social ethos needed to boost the industry-based capitalism of the time. In 1992, however, Gilles Deleuze published the Postscript on the Societies of Control, in which he argued that the disciplinary society described by Foucault was coming to an end to make way for a society of control. Drawing on these authors, together with others such as Bauman and Agamben, as well as scholars writing on subjectivity and surveillance in the internet era, this paper will try to reflect in philosophical terms on this situation: what it means for society, and what there is to be done to resist it.

Panopticism and internet surveillance

The chapter on the panopticon in Foucault’s book has been highly influential across the social sciences in the decades since its publication, generating a whole field of literature stemming from this very idea, what has come to be called panopticism. Reading the chapter today, taking into account the irruption of the internet into all spheres of social life, one can discern clear parallels between the panopticon as described by Foucault and the internet surveillance that culminates in the Snooper’s Charter. I will try to sketch out some of these parallels here.

Firstly, as with the provisions for the government of the plague-stricken town that open the chapter, we see how a period of security crisis is the perfect opportunity for an intensification and actualization of power. In order to save the town’s population from being wiped out by the plague, the government institutes a state of exception in which power relations are strengthened and new modes of surveillance and discipline are implemented. Similarly, the new wave of terror in the Western world inaugurated by 9/11 has provided a scenario of security crisis that has made room for just such an intensification and actualization of power. The plague, like terrorism in the homeland, situates the field of battle at home, and raises the possibility that anyone could be a threat.

Foucault moves on from the plague-stricken town to the panopticon, for he considers Bentham’s project to be the “architectural figure of this composition” of confinement and surveillance. The essence of the panopticon is this: a tower in the middle of the building has complete sight of all the cells, which are arranged in a ring around it, but because the windows of the tower are obscured, those in the cells never know at which moments they are actually being watched; they “must be sure that [they] may always be so” (Foucault 1977: 208). Under this system those in the cells must act as if they were being watched, because of the chance that they might be, thus interiorizing the gaze of those in power. This is how discipline works: one internalizes the relations of power and disciplines oneself to the point where even one’s supposedly intimate desires become a product of that discipline.

Foucault speaks of the cells as “small theatres”, and we can conceive of the proxies through which we connect to the internet as similar to the cells of the panopticon. The IP address, the IMEI and the accounts on social media or any other internet service can be looked at at any given time by the guards in the tower. “In the central tower, one sees everything without ever being seen” (Foucault 1977: 202); something very similar happens in internet surveillance. Those in governmental intelligence agencies can access any kind of datafied information at any time, but they are never revealed (in fact the tower itself was only revealed recently, with the Snowden disclosures). As Foucault points out, in Bentham’s ideal the panopticon was a democratic tool in that the role of observer could be occupied by any member of society; the tower is open not only to guards but to the whole public, who might have an interest in observing for many different purposes (medical, artistic, legal and so on). In internet surveillance, however, the central tower is so obscured that it need not even be physically set against the observed. The observers, the corporations that own the data and the state, need not reveal their surveillance tower and make it transparent, and hence available to society as a whole as in Bentham’s vision. Big data further improves the panopticon: not only is its surveillance “permanent in its effects even if discontinuous in its action”, but action itself (the presence or absence of the observer at any given time) is replaced by algorithms that do the work ceaselessly; they are always active. This is the magic of big data and algorithms for surveillance purposes: the machine records everything and curates the selection of what is interesting in the recorded material.

The rise of panopticism is identified with the effective extension of “power to the most minute and distant elements” (Foucault 1977: 216). Big data and metadata allow for a kind of surveillance from which no element escapes; the machine records everything, without any discretion as to the importance of what is being recorded relative to the purposes of the record. Everything is recorded, so that whatever might be of importance can later be highlighted by studying the bulk of the previously obtained data.

Big data surveillance relies upon control over collection, storage, and processing infrastructures in order to accumulate and mine spectacularly large amounts of data for useful patterns. Big data surveillance is about intervening in that world based on patterns available only to those with access to the data and the processing power[; it] is not selective: it relies on scooping up as much information as possible and sorting out its usefulness later (Andrejevic and Gates 2014: 190)

From discipline to control – from the individual to the dividual

While Foucault’s panopticism can still be a useful metaphor for internet surveillance, as we have seen, more recent theories have stressed that the kind of society described by panopticism, namely the disciplinary society, is fading away to make room for a new configuration of power, that of control (Deleuze 1992); the disciplinary society was itself only a transient model that replaced the society of sovereignty, shifting the object of rule from death to life. The crux of hegemonic power is no longer the discipline that forges the spirit of the worker in the confined spaces of the factory, the school or the hospital, but control, which continuously captures and modulates the new and existing profiles of the consumers, or dividuals, that surf the network. Power thus operates not by confining but by tracking. And to do this it needs to deprive individuals of the entitlement to their own confinement, by penetrating what used to be the most personal and untransferable space: the private sphere. Once this shift has taken place the private sphere is blurred into nothingness; it loses all meaning. Instead of the traditional opposition of mass/individual we have dividuals as nodes in big data. The individual of the disciplinary society is a body with an identity, and in this duality it is indivisible. In the notion of the dividual this indivisibility is violated: identity is no longer bound to the body but to a numerical code, or an aggregation of codes, legible to the computer. Here we find the cause of an important shift in the paradigm of panopticism: surveillance no longer consists in looking, but in analyzing the data trails that the dividual leaves in the ever-growing pool of big data. The primary tool is not the gaze but computation.

Identity is no longer constructed in a spatial and temporal discourse; it is instead made up of aggregated bits of data that form a node in the network, allowing this metadata to be traced back to a specific person. What you “like” from your profile can give a very accurate sense of who you are: “sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender” (Kosinski, Stillwell and Graepel 2013: 1). But the scope to which data defines a person is not limited to personal traits and preferences: with metadata an individual can be closely tracked in every activity she performs outside of the social media feed: where she goes, who she meets, what she buys. On the other hand, power and status are also defined by this code: it is not the physical capacity to travel, or to be in the right place to know, that defines your possibilities, but whether the machine considers your code valid to cross borders, to access a building, to read an academic article, and so on. So we could say we are leaving biopolitics to enter a kind of datapolitics, in which the state is no longer concerned with administering bodies and the life that sustains them but rather with properly managing virtual identities and the data they generate.
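The kind of inference Kosinski, Stillwell and Graepel describe can be illustrated with a toy sketch. The data, page names and count-based model below are entirely hypothetical (their actual study used regression models over millions of real Facebook Likes); the point is only to show how a binary trait can be guessed from which pages a user likes.

```python
# Toy illustration of inferring a trait from "likes" (hypothetical data;
# a naive-Bayes-style count model, not the authors' actual method).
from collections import defaultdict

# Each training user: (set of liked pages, binary trait label to predict).
training = [
    ({"page_a", "page_b"}, 1),
    ({"page_a", "page_c"}, 1),
    ({"page_c", "page_d"}, 0),
    ({"page_d"}, 0),
]

# Count how often each page co-occurs with each trait value.
counts = defaultdict(lambda: [0, 0])  # page -> [count for label 0, label 1]
totals = [0, 0]
for likes, label in training:
    totals[label] += 1
    for page in likes:
        counts[page][label] += 1

def predict(likes):
    """Pick the label whose (smoothed) likelihood best fits the likes."""
    scores = []
    for label in (0, 1):
        score = totals[label] / len(training)
        for page in likes:
            score *= (counts[page][label] + 1) / (totals[label] + 2)
        scores.append(score)
    return max((0, 1), key=lambda lab: scores[lab])

print(predict({"page_a"}))  # resembles the label-1 users, so prints 1
```

Scaled up to millions of users and tens of thousands of pages, even so crude a scheme extracts disturbingly accurate profiles, which is precisely the point the paper makes.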

Google’s claim that it can predict viral diseases (and many other social trends such as unemployment, voting intentions, consumer preferences, and so on) before governments can is a clear breach of discipline in the Foucauldian sense and a move into the Deleuzian notion of control. Where, up to the start of the twenty-first century, the only way to detect and claim knowledge of a viral disease was the expert knowledge of doctors, together with the confinement of the hospital and a high degree of bureaucratization, Google (and other data-rich companies) can supplant this role by spotting trends in the data generated by its users: if a significant number of people are typing symptoms into their search engines, it can detect that a disease is present in a certain area. This means that mighty tech companies such as Google, Facebook, Amazon and the like can rival the state for the hegemony of information. This raises a problem that I will comment on further in the next section.
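The trend-spotting mechanism gestured at above can be sketched in a few lines. The data, region names and threshold here are hypothetical, and Google’s actual models were far more elaborate; the sketch merely flags a region when its latest volume of symptom-related queries spikes far above the recent baseline.

```python
# Toy sketch of disease-trend detection from search-query volumes
# (hypothetical data and threshold; not Google's actual method).
from statistics import mean, stdev

def spike_regions(weekly_counts, threshold=3.0):
    """Flag regions whose latest week of symptom queries exceeds the
    historical mean by more than `threshold` standard deviations."""
    flagged = []
    for region, counts in weekly_counts.items():
        history, latest = counts[:-1], counts[-1]
        baseline, spread = mean(history), stdev(history)
        if spread and (latest - baseline) / spread > threshold:
            flagged.append(region)
    return flagged

queries = {
    "region_a": [100, 98, 103, 101, 99, 240],   # sudden surge in queries
    "region_b": [100, 102, 97, 99, 101, 104],   # business as usual
}
print(spike_regions(queries))  # → ['region_a']
```

No doctor, hospital or bureaucracy appears anywhere in this loop: the “diagnosis” emerges from the aggregate behaviour of users who never intended to report anything.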

The synopticon as described by Bauman is yet another inversion of Foucault’s panopticon. Where in the panopticon the few watch the many, in the synopticon the gaze is turned on the few: the many are not watched but are themselves the ones who watch, through a spectacle of imagery. What there is for them to watch is crafted and manipulated by and for capital, but the act of watching is still carried out by seduction, not coercion. It is in watching that the many acquire values and role models; power and discipline are spread through an imagery that is not fully imposed, but neither fully escapable (Bauman 2000). In the era of social media, however, the public is not a passive entity: it plays an active role in the knowledge production chain. In order to create and improve the content put out for the many to follow, continuous feedback is allowed and demanded. Thus, where the members of the many try to define themselves individually by giving meaning to their consumer preferences (I am what I follow), they are inevitably added up as big chunks of data that are only acted upon in their raw state: view counts, likes, shares, retweets. The few generating the content through which power discourses flow are also deprived of agency. They craft a brand upon their very name and body, and then sell it to capital. The commodified body and branded name, always disposable and replaceable, serve as the masks of corporations that are otherwise void of particular signifiers. The corporation is defined by imagery, not by practices; it is recognized by the public through its intended symbols, not through the actual impact of its modes of production.

Dataveillance

The term dataveillance was coined by Roger Clarke as early as 1988, who defined it as “the systematic monitoring of people's actions or communications through the application of information technology” (Clarke 1988: 1). Since the 2000s a number of scholars have picked up the term to describe the advent of a new kind of surveillance that takes big data as its object and algorithms as its tool. This is already turning into a specific and relevant field of study that requires its own terminology and theoretical framing. In this section I will try to cover some of that literature and highlight its achievements as well as the problems it raises. In her discussion of dataveillance, Sara Degli Esposti distinguishes four kinds of activity (Degli Esposti 2014):

A) Recorded observation: “watching, listening or sensing” carried out by CCTV, sensors, drones and so on.

B) Identification and tracking: identification as the recognition of an object or a person through its unique features (fingerprint, face, iris); tracking as the tracing of the activity of the previously identified object or person.

C) Analytical intervention: taking the information obtained by the first two methods and turning it into knowledge by applying analytical reasoning to it.

D) Behavioural manipulation: influencing objects’ and/or persons’ behaviour by setting up “a policy, a procedure, or a technological artifact” following the knowledge obtained by analytical intervention.

Although a great deal of surveillance technology was first developed by the military (which gives a clue as to how this technology treats those under scrutiny as ever-potential terrorists), it has since spread to be used equally by governmental agencies and corporations (Eubanks 2014). The internet and big data allow unprecedented access to vast, immediate information at very low cost, for which reason they have been labeled democratizing tools. Business-oriented literature has pointed out that big data could mean a new post-bureaucratic phase for organizations: where access to information is made horizontal, replacing the hierarchical flow of information traditionally in place, employees prove to perform better; access to big data is empowering (Berner, Graupner and Maedche 2014). But while this might be true within the business world, as regards public knowledge more generally, access to relevant information is increasingly being privatized by the corporations that own it on the one hand, and used with military secrecy by the state on the other. It has been estimated that no more than 0.03% of the whole internet is readily accessible to the common user. Any attempt to alter this balance and open information to the public, going against corporate or military interests, is met with extreme hostility, as we have seen in the cases of Edward Snowden and Aaron Swartz.

Data-rich companies are accumulating a power that tends towards a kind of oligopoly of knowledge. They are in fact rivaling the state directly in that they know the population better than the state itself does, and increasingly governments seem to accept this; governmental agencies rely on corporate knowledge of the population to monitor people and design specific policy. This places these corporations in a position of power over the state conceivably greater than anything traditional industrial cartels could achieve. And while they claim to be motivated by democratic values, there is little accountability to which they are subject. Companies like Google and Facebook have their own in-house researchers, and the data banks and findings they release are completely arbitrary; there is little commitment to transparency. And of course this mighty power that is data wealth is used for the sole purpose of making money: by selling you tailored goods, by selling your medical information to insurance companies, by selling it to governmental agencies, and so on. This is significant because it means this unprecedented scale of information on people is being used in a quite negative, unconstructive way: for profit on the corporate side, and for paranoia and terror on the state side. The user is left to rely on the good faith of these companies and their governments to use the data ethically, and it really is just faith, because the average user has no idea to what degree she is being observed and manipulated, or for what purposes.

Scholars writing on how big data is analyzed, and on the actions subsequently taken on the basis of the knowledge gained from it, have been concerned with pointing out problems in both the source and the process. This refers to the analytical intervention category mentioned above. With regard to the source, what has been called raw data, a very important part of the material found in big data is faulty or misleading, and because the amount is so vast it is nearly impossible to look at it closely enough to verify its validity. Then there is the problem of the mining of the data: algorithms are needed to extract what is interesting or relevant from big data, but the production of these algorithms is arbitrary, and their functioning, once they are put to work, is not free of mistakes or omissions (Barocas, Hood and Ziewitz 2013; van Dijck 2014). Furthermore, the workings of the algorithms and the findings they provide do not come in a form readily understandable to common sense, and they need not. The patterns and correlations found by algorithms can be random, making it hard to know whether they are relevant or the product of odd coincidence. But patterns and correlations that seem to point nowhere at a given time might do so in the future; there is a chance that to study phenomena that happen in the future we will need to dig into data collected ten years earlier (Andrejevic and Gates 2014). This is the logic supporting the NSA and GCHQ in collecting all the data they can, even if it is more than is really necessary or than they can actually handle.
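The “odd coincidence” problem is easy to demonstrate. The following minimal sketch (synthetic random data, not any agency’s actual pipeline) shows the multiple-comparisons trap behind spurious big-data correlations: among enough entirely unrelated series, some pair will correlate strongly by pure chance.

```python
# Demonstration that mining many unrelated series yields strong
# correlations by chance alone (synthetic data, fixed seed).
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# 60 short series of pure noise: no pair has any real relationship.
series = [[random.random() for _ in range(10)] for _ in range(60)]

# Mine all ~1,770 pairs for the strongest apparent relationship.
best = max(
    abs(pearson(a, b))
    for i, a in enumerate(series)
    for b in series[i + 1:]
)
print(f"strongest |r| among unrelated random series: {best:.2f}")
```

The strongest pair will look impressively correlated despite being noise; an analyst shown only the winning pair, without the 1,769 losing ones, has no way to tell the difference.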

The paper Governing Algorithms: A Provocation Piece likewise tries to demystify some of the quasi-magical attributes given to algorithms in recent years (Barocas, Hood and Ziewitz 2013). Algorithms have gained agency as autonomous decision-makers that can be left to work on their own and produce better results than human minds ever could, especially in the field of data mining and prediction. This alienates power in that they are not bound to any human consciousness with specific motivations and interests, and in this way they take a place in our imagination that is god-like, for they are super-powerful, flawless and unbiased. Once brought into being, they act for power, but they take neither its name nor its face. Of course, algorithms are a product of human design, and even though they are getting better and smarter they are not god-like. Danah Boyd and Kate Crawford published a paper entitled Six Provocations for Big Data in which they scrutinize, in a similar fashion, the much-praised idea of big data, in an attempt to demystify notions about it. They contend that big data is prone to generating faulty and inaccurate data sets precisely because it is so vast and so random. In many cases researchers using data mined from vast, multiple data sets do not know exactly how the data in use was aggregated, and consequently what its possible flaws are (Boyd and Crawford 2011). Furthermore,

“Academics looking at data sets from a social science or humanities perspective may pose very different questions than information scientists; and medical doctors are likely to see different patterns than criminologists.” On the other hand, “big data surveillance is available only to those with access to the databases and the processing power: it is structurally asymmetrical. Likewise, the forms of knowledge it generates are necessarily opaque; they are not shareable understandings of the world, but actionable intelligence crafted to serve the imperatives and answer the questions of those who control the databases” (Andrejevic 2014: 8).

Big data makes the corporate world and government a happy couple

The era of ultra-surveillance is a very illustrative scenario of how governments favor and benefit from private-sector oligopolies. ISPs as well as information-rich tech companies work hand in hand with governments’ intelligence agencies, providing virtually absolute access to information on their customers, all the way down to private conversations, photos and videos. These oligopolistic companies, which amass huge power in the form of unprecedented information on the population, become direct instruments of government. The citizenry is thus left unprotected: who is going to defend people’s rights against abuse by private companies when the government is encouraging these practices, and actually participating in them with even sketchier purposes than the private sector itself?

It is a historical fact that whenever a sector of industry becomes highly concentrated and centralized it becomes a tool of government by the state, all the more so if the industry in question is that of generating knowledge about the population. It is a shared interest of the corporate world and the state to gather the maximum amount of metadata and to maintain public acceptance of this extreme dataveillance. And in this love affair there is a third guest: the elite academy, for top-tier universities are the ones with privileged access to this knowledge.

Scientists, government agencies and corporations, each for different reasons, have a vested interest in datafied relationships and in the development of methods that allow for prediction as well as manipulation of behavior (Van Dijck 2014: 7)

This speaks also to the notion of ordoliberalism, introduced by Foucault as an antecedent of our current neoliberalism and brilliantly explained in an article by Nicholas Gane. Under ordoliberalism, as in postwar Germany, the state no longer acts only as the arbiter of a free market that exists within its purview; rather, it organizes itself according to the market principle of competition, for it is the market that legitimizes the state (Gane 2012). We can see in the era of big data how the state has to bargain with tech companies and ISPs to get a share of their power. It does not act by coercion or try to limit their power, but rather competes with them, granting them the ideal setting in which to continue growing in exchange for information on the citizenry.

To sum up

Dataveillance and its components (big data, internet surveillance, homeland security laws, mighty data-rich corporations) are constituents of a whole new dispositif of power that arranges social relations and subjectivities into the coming society of control as described by Deleuze. In a time when the active gaze of power penetrates the whole of the social fabric, in explicit alliance with the champion industries of ultra-capitalism, it becomes very hard to imagine how any significant social dissidence or criticism of the state might take place. Even though the internet first appeared as a tool that could point in the exact opposite direction (a space free of intermediaries where information could flow costlessly, instantaneously and without censorship), once the forces of capital and power managed to make it work in their favor it became a very powerful tool of governance, as we now see. Some still see in the internet a potential tool to enhance and organize activism in a kind of 2.0 version of political action (Jurgenson 2012), and while that might be true, the possibility seems to grow dimmer with the passing of time. The Snooper’s Charter is the ultimate embodiment of the extension of the penitentiary, not only to the streets of every city but to every traceable action of the common person, who is to the system always a potential terrorist; in the state of exception that has become the norm, democratic rights are suspended. In order to escape the dispositif we must profane it, becoming aware of it and trying to drag ourselves out of it, thereby creating the possibility of becoming again the untamable, ungraspable and unconfinable agents of ourselves (Agamben 2006), for only by reclaiming the subjectivity that the machine has taken from us will we be able to think and exist freely, and not have our decisions predicted and manipulated at every step.


References

Agamben, Giorgio. Che Cos'è Un Dispositivo? Roma: Nottetempo, 2006. Print.
Agamben, Giorgio. Homo Sacer. Torino: G. Einaudi, 1995. Print.
Amoore, Louise. The Politics of Possibility: Risk and Security beyond Probability. Durham: Duke UP, 2013. Print.
Andrejevic, Mark. Surveillance in the Big Data Era. Emerging Pervasive Information and Communication Technologies (PICT) (2013): 55-69. Print.
Andrejevic, Mark, and Kelly Gates. Big Data Surveillance: Introduction. Surveillance & Society 12.2 (2014): 185-196.
Barocas, Solon, Sophie Hood, and Malte Ziewitz. Governing Algorithms: A Provocation Piece. SSRN Electronic Journal (2013).
Bauman, Zygmunt. Liquid Modernity. Cambridge, UK: Polity, 2000. Print.
Boyd, Danah, and Kate Crawford. Six Provocations for Big Data. SSRN Electronic Journal (2011).
Clarke, Roger. Information Technology and Dataveillance. Communications of the ACM 31.5 (1988): 498-512. Print.
Degli Esposti, Sara. When Big Data Meets Dataveillance: The Hidden Side of Analytics. Surveillance & Society 12.2 (2014): 209-225.
Deleuze, Gilles. Postscript on the Societies of Control. October 59 (1992): 3-7. www.jstor.org/stable/778828.
Foucault, Michel. Discipline and Punish: The Birth of the Prison. New York: Vintage Books, 1979. Print.
Gane, Nicholas. The Governmentalities of Neoliberalism: Panopticism, Post-panopticism and Beyond. The Sociological Review 60.4 (2012): 611-634. Print.
Hardt, Michael, and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: Penguin, 2004. Print.
Simon, Bart. The Return of Panopticism: Supervision, Subjection and the New Surveillance. Surveillance & Society 3.1 (2005): 1-20.
van Dijck, José. Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology. Surveillance & Society 12.2 (2014): 197-208.
"Most of the Web Is Invisible to Google. Here's What It Contains." Popular Science. Web. 5 Jan. 2017.
"Want to Predict the Future of Surveillance? Ask Poor Communities." The American Prospect. Web. 6 Jan. 2017.