Surveillance, privacy and public space in the Stratumseind Living Lab. The smart city debate, beyond data

Source: Ars Aequi, AA 2019, afl. 7-8, p. 570


In this contribution I open up the debate on smart cities and living labs, particularly in the field of law and policy, beyond the focus on data. I offer a glimpse of a broader perspective by examining the Stratumseind Living Lab through surveillance and privacy theory, exploring the implications both for privacy and for public space itself. Concerning surveillance, I focus on a particular type of surveillant logic that is novel and prevalent in such initiatives – one that governs on a collective level without needing to identify individuals. I show how this mode of power – conceptualised as ‘security’ by Foucault – can still affect individuals, particularly by limiting their possibilities of action (including via nudging) and potentially limiting their autonomy. Especially if the digital technology of the Stratumseind Living Lab (or similar initiatives) were to become more sophisticated, and thus capable of individually exploiting vulnerabilities and weaknesses – what has been termed ‘hypernudging’ – the risks for manipulation, and thus for autonomy (a key argument for why we value privacy), would increase. Moreover, such surveillance can also affect society and democracy more broadly. In particular, I show how surveillance in public space can adversely affect the possibility and ability of persons to participate in informal social life (forming multiple and various social relations) as well as in formal civic participation (such as expressing dissent in various forms and being part of political associations). This broadening of scope confirms the claim that an adequate regulatory response to a matter as complex as privacy in public space in the context of smart cities requires stepping outside the frame of existing laws and regulations, including going beyond data protection law.

1 Introduction

Smart city and living lab initiatives are by now an almost obligatory feature of any city or town in the developed world, including Europe. The contemporary city is thus increasingly built out of pervasive information and communication technologies (ICTs), including digital cameras, sound and other sensors, adaptive lighting and wifi tracking technology. Within the smart city/living lab discourse – as spoken by businesses, local governments and supra-national entities like the European Union (EU) – such technologies are seen as a solution to all sorts of urban problems, from traffic congestion, high gas emissions and safety to the rising need for economic competitiveness among cities.[1] Beyond the grand promises of the smart city discourse, however, pervasive ICTs rearticulate the urban experience through their data-driven surveillance practices. For instance, advertisement columns with cameras (reclamezuilen in Dutch; as seen in Dutch train stations and public places) capture certain characteristics of persons in front of them (e.g. gender, age, mood) in order to offer them a personalised ad.[2] Combined with thousands of other ‘smart’ devices, such technologies transform cities into extraordinary apparatuses of data capture (of persons’ interests, preferences, desires, emotional states, beliefs, habits and so on), which was previously only possible within research laboratories and high-surveillance institutions like prisons or mental institutions.

As such, legal and policy debates on privacy in smart cities and similar initiatives are today dominated by a focus on data and, thus, on data protection law.[3] Journalistic debates are also almost universally framed around issues of data protection law, while often referring to ‘privacy’.[4] This is not surprising as – so the argument goes – ultimately the smart city is all about data.[5] However, most of the data is captured in public spaces and oftentimes refers to (algorithmic) groups of persons (rather than identified individuals), and is hence seemingly non-personal and falls outside the scope of data protection law. The discussion on ‘privacy’ in this context is thus mainly a discussion on personal versus non-personal data, data ownership and technological solutions, such as data protection (or privacy) by design.[6] A common position held by businesses and local governments is that, if faces on video feeds are blurred and unique identifiers anonymised (or pseudonymised), then the privacy issues stemming from such video surveillance are solved. Sometimes the debate simply assumes that most of the data ‘collected’ from public space (both physical and online) is not personal data at all; instead, it is termed ‘open data’, making it ‘up for grabs’.[7]

A broader perspective on the issue of privacy in public space paints a different picture. Critical debates on smart cities and living labs in several disciplines, particularly urban geography, surveillance studies and privacy theory (distinct from the more specific field of data protection law), have shown that privacy issues are much broader and more diverse than issues relating to data. This is not least because privacy should be seen as consisting of several types (including associational, behavioural and bodily privacy, over which lies a layer of informational privacy),[8] but also because smart cities and living labs shape the public space of cities and citizens’ behaviour there, with broader social and political consequences. The smart city debate – at least one that takes privacy seriously – should thus adopt a broader approach to the matter, including the perspectives of surveillance studies, privacy theory and urban geography, and the insights that these bring.

This contribution tries to push back against the trend of focusing predominantly on informational privacy and data protection (regulating the collection, storage and processing of personal data), which neglects other dimensions and types of privacy that remain worthy of protection even in a digitised world. In this essay, I offer a glimpse of this broader perspective by applying surveillance and privacy theory to a practical example – the Stratumseind Living Lab (SLL) in Eindhoven.[9] First, I briefly describe the SLL, particularly two of its sub-projects (CityPulse and De-escalate). Second, I briefly examine the SLL from a surveillance theory perspective, focusing on a type of surveillance governing through multiplicities. Finally, I combine these insights with a privacy and public space perspective, exploring the implications both for privacy and for public space itself. I conclude that regulation that aims to preserve the possibility of privacy in public space requires a broad approach, consisting of different types of measures (communication, consent, code and law) working together.

2 Stratumseind 2.0 and its living lab

Stratumseind is a busy nightlife street in the centre of Eindhoven. According to Eindhoven municipality, however, the number of visitors has been significantly dropping since 2010, arguably due to the rising criminality and vandalism on the street, giving Stratumseind a bad image.[10] In 2013, the municipality thus initiated the Stratumseind 2.0 project. Stratumseind 2.0 was an umbrella project in the form of a public-private partnership between Eindhoven municipality, the police and a range of private parties, including the Stratumseind establishments association, real-property owners on the Stratumseind street, and Dutch breweries.[11] The main objective of Stratumseind 2.0 was to ‘long-lastingly improve the street from an economic as well as a social point of view’.[12] More concretely, the project wanted to attract more visitors, make them stay longer and spend more money in the establishments, lower the vandalism, police and health-related costs in connection to Stratumseind, and increase the income related to Stratumseind and Eindhoven as a whole.[13] These ambitious goals were planned to be achieved through a variety of means and initiatives, especially through ‘a 365 days, 24/7 scan of all data on Stratumseind’.[14] The key element and the main sub-project of Stratumseind 2.0 was thus its living lab – the Stratumseind Living Lab (SLL).

The SLL has been described as a field lab with a variety of sensors (mostly on the Stratumseind street) and numerous actors, with the intention of measuring, analysing and stimulating the behaviour of people in public places.[15] It comprised various sensors and actuators, including: video and sound cameras with embedded analytical capabilities, special lighting and olfactory technology, CityBeacons,[16] wifi tracking, technology for social media sentiment analysis, and a weather station.[17] Other types of data were also collected and stored, including crime statistics, the amount of beer sold and the amount of trash collected per week. The main goal of the SLL was to gain insight into the ways in which external stimuli can (substantially) influence the escalating as well as de-escalating behaviour of visitors to the street.[18]

The SLL consisted of a growing number of sub-projects with diverse but intertwined actors and goals, ranging from crime prediction (CityPulse) and community policing (Trillion) to de-escalating people’s behaviour via light (De-escalate).[19] The two biggest and longest-lasting projects were CityPulse and De-escalate, which I briefly examine here.

The goal of the CityPulse project was to create ‘a powerful picture of the street and help authorities better predict and react to situations and de-escalate them before they develop’.[20] As such, it was a type of predictive policing project of the kind that has increasingly been taking place in the last decade worldwide, including in the Netherlands.[21] The CityPulse system was designed to analyse the various types of data captured and collected, looking for anomalies in data patterns.[22] If these data sources confirmed the likelihood of an incident, the system would alert the regional police control room, giving the police an opportunity to make better-informed decisions on any action on the Stratumseind street that might be required.[23] In particular, there were four possible alert notifications: ‘nothing wrong’, ‘everything alright’, ‘backup needed’ and ‘high risk situation’.[24] The system was thus seen as a predictive, preventative and ancillary tool for the police in its role of maintaining order and fighting crime in the city.

The De-escalate project developed a special lighting system with the purpose of influencing and defusing ‘escalated’ mood and behaviour through ‘dynamic lighting scenarios’ in public space.[25] It has been described as an intelligent lighting system to control emotion.[26] In this sense, it is a tool that influences people’s behaviour – a nudging tool. The key term employed, ‘escalated behaviour’, was defined in a very broad manner, referring to all types of behaviour of persons who in some way lose self-control, including screaming, getting abusive or aggressive, or crossing other behavioural boundaries that a person would otherwise not cross.[27] When escalated behaviour was detected by the technology, the lighting system would try to affect the ‘atmosphere’[28] on the street in order to de-escalate it. This was to be achieved through lighting scenarios aiming to proactively keep stress levels at ‘acceptable levels’.[29] For instance, dim and warmer-coloured light is associated with lower arousal, with studies showing that exposing people to pulsating orange light at slow frequencies leads to relaxing breathing rhythms.[30]

3 Surveillance within the SLL: even if not individually identifiable, still reachable

As the functioning of smart cities and living labs depends on digital technology (constituting surveillance technology),[31] surveillance has a central place within smart city and living lab projects, so much so that smart cities are often called ‘surveillance cities’.[32]

Scholars have already expressed concerns about individuals’ privacy, autonomy, freedom of choice, and discrimination in smart cities.[33] However, questions of what exactly this means, in which cases these issues arise, how and for whom, often remain unanswered. In fact, many scholars and the media often do not define surveillance at all (as if it were a straightforward notion), or they employ the old-fashioned Panopticon or Big Brother metaphor for the type of surveillance found in smart cities.[34] They do so despite the common criticism that these definitions and metaphors are out of date.[35] The choice of theoretical framework through which one addresses contemporary surveillance practices within smart cities is important. A particular framing of an issue involves the social construction of phenomena, encourages certain interpretations, discourages or obfuscates others, and, consequently, invites a certain kind of criticism. It is therefore insufficient to state that ‘smart city privacy issues reflect traditional concerns about the panoptic [top-down] gaze of a surveillance society’.[36] Contemporary networked surveillance practices – and consequently their implications – are much more diverse and complex than that. This can clearly be seen in the SLL example as well.

Within the SLL, we see that surveillance transcends the public-private divide. It is carried out by both public and private actors,[37] in a type of public-private partnership, for public (safety) and private (profit) purposes, for control and care, over suspects and non-suspects, and within public, semi-public and private spaces. These diverse surveillance practices often support each other in a complex manner that is almost impossible to disentangle.[38] Such surveillance thus takes the form of a ‘surveillant assemblage’.[39] Consisting of a growing number of projects, types of digital technologies, actors, subjects, relations and goals, surveillance has a rhizomatic structure without distinct boundaries – like the roots and shoots of a persistent set of plants, it seems to pop up everywhere.

However, the novel and prevalent logic of surveillance and its dominant mode of power is found elsewhere. This is because within the SLL persons are generally governed as a multiplicity, often in the form of an algorithmic group. This logic is best described by Foucault in his conceptualisation of ‘security’.[40] Surveillance as security does not function in the ‘traditional’ way (as Bentham’s Panopticon) by breaking down multiplicities and identifying individuals.[41] Quite the opposite, security focuses on the multiplicity itself (particularly relationships between persons and the environment) in order to optimise the adjustment of the assembled components of the intertwined multiplicity.[42] In regard to public space, this logic has been used to optimise consumption, while at the same time minimising labour and other expenditures through risk management of multiplicities,[43] which is also the case in smart cities and living labs.

Applied to the SLL, security is sought through control but without stopping or hampering the flow – that is, the consumption – of visitors on the street. Persons are not surveilled and targeted individually; rather, they are targeted collectively. The goal is not to identify every single breach of the norm (in this case, every single type of anti-social or escalated behaviour) and intervene, for example through a pub’s security guard or the police, whether in the form of a warning, exclusion from the pub or the street itself, or arrest. The goal is to let smaller transgressions slide and to manage their relationship with the whole – that is, to try to prevent them from affecting the atmosphere on the street – in such a way that an intervention will not be necessary. For instance, if Freek and Marloes have a lovers’ quarrel that leads to shouting and a broken glass, but lighting scenarios help calm them down (without needing to individually identify them) and prevent the atmosphere on the street from being affected, no other intervention is needed. In this sense, the logic of such surveillance is inclusionary – it is aimed at managing persons (that is, influencing their behaviour) in order to preserve the flow of consumption.

On the collective level, the SLL thus captures and analyses various types of data which in themselves do not directly identify individuals (and would not necessarily be considered personal data) – for example, sound levels, levels of aggression in the sounds, crowding levels and ‘suspicious’ walking patterns. However, it captures them in order to produce risk categories in connection to the ‘situation’ on the street. Remember, the CityPulse system categorises these risks in four general levels: ‘nothing wrong’, ‘everything alright’, ‘backup needed’, and ‘high risk’. Freek and Marloes’ mildly escalated behaviour might count as a low-risk situation, in which only lighting and smell scenarios need to be turned on in order to de-escalate the deviant behaviour and try to prevent it from happening in the future by affecting other persons’ mood. In other words, even if individuals are not individually identified, they are still reachable – they may be ‘nudged’ in this or that way (via design, light, smell) for this or that purpose (e.g. increased consumption of alcoholic beverages or shopping; I will examine nudging as a privacy risk in the next section).

However, the SLL project can also be said to operate on an individual level. As such, it would operate partially according to a securitising logic and partially according to the exclusionary logic of control, which takes place in invisible and opaque networks. The logic of control does not kick in only when the securitising logic fails (e.g. when Marloes and Freek do not calm down and begin to physically fight, so that city guards or the police need to be called); it also operates in parallel to it. Thieves and known troublemakers, for instance, are not desired on the street in the first place. Their presence on the street is not intended to be ‘managed’ through securitising logic; they are simply to be identified and excluded. In regard to the SLL, one can hypothesise that at the first detection of aggression (e.g. a raised voice or a broken glass) by someone deemed a troublemaker through data mining within the CityPulse system, the troublemaker will more likely simply be excluded (by city guards or the police) rather than first being calmed down via lighting scenarios.[44] While this might be desirable on a general level, as no one wants their purse stolen on a night out with friends or their friend assaulted by a bully, how these ‘undesirables’ are determined – through codified proxies indicative of deviance within invisible and opaque networks – is problematic. Through data mining, new proxies might also be ‘discovered’ based on correlations (rather than causal relations). For instance, there might be a correlation between being good at playing darts, or wearing a certain brand of sneakers, and aggressive behaviour.[45] As this has been extensively discussed in surveillance and privacy theory,[46] I will not discuss it further in this essay.

4 Privacy and public space in the SLL

According to the traditional liberal perspective on privacy, the main goal of privacy is ‘to be let alone’,[47] leading to a strong protection of private places and the perception that privacy intrusions in public space are either nonexistent or minor. This was not a practical issue until now, as people were theoretically visible and observable, but practically usually inconspicuous in public space (i.e. ‘just another face in the crowd’). With the rise of surveillance technologies and their ever-increasing application in public spaces, however, the default of moving around in public is no longer inconspicuousness but visibility.

While the need for adequate privacy protection in public space has been highlighted in the literature,[48] it is less clear which dimensions and aspects of privacy exactly need protection, and it remains a challenge to identify suitable means for protecting privacy in public. In this section, I offer brief thoughts on the matter based on the SLL example.

Based on the typology of privacy developed by Koops et al., one can identify the most relevant dimension of privacy in the public zone: the dimension of self-development, that is, the freedom to act autonomously in public space (at least within the broad boundaries of social and legal acceptability).[49] Although one can find solitary places outside of the home, and one can desire to be let alone while among others in public space (for example, by avoiding eye contact with others or immersing oneself in the acoustic cocoon of one’s headphones), privacy in public space does not focus on disengaging the individual from social and political life, despite persistent claims of the traditional liberal perspective on privacy. Quite the opposite: privacy in public space focuses on facilitating the autonomous development of one’s identity and, related to that, a broad range of social relations, sometimes including political associations. As such, the two types of privacy most relevant in this context are behavioural and associational privacy (alongside informational privacy, which is an overlay dimension of all types of privacy).[50] This is something that has been recognised in the case law of the European Court of Human Rights (ECtHR), most recently by acknowledging ‘a right to lead a private social life’, including activities taking place in a public context, allowing individuals to develop their social identity.[51]

Based on the above discussion of surveillance within the SLL, we can identify particular risks both for privacy in public space, as well as for public space itself.

4.1 Security and manipulative nudging

In relation to surveillance as security, the main privacy risk relates to the issue of manipulative nudging and its effect on autonomy, one of the key arguments for why we value privacy.[52] From this perspective, smart cities are transforming cities into large laboratories, where a key question posed is: how can the behaviour of persons be rendered predictable and externally controllable? In this sense, the SLL can be seen as manipulating the environment in order to discover and influence how people behave, so as to make the place safer and more attractive for visitors. This type of development has often been called nudging.[53]

A nudge is commonly defined as ‘any aspect of the choice architecture [that is, the context in which people make decisions] that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives’.[54] The concept of nudging is based on the fact that much individual decision-making occurs subconsciously, passively and unreflectively rather than through active, conscious deliberation. The environment can thus be intentionally designed in ways that systematically influence human decision-making in particular directions. As such, nudges have been commonly criticised in the literature for having manipulative effects that bypass autonomous decision-making, thus posing a risk to our autonomy.[55]

However, there are different types of nudges, some manipulative and others not. Some nudges are forthright and aim to appeal to people’s deliberative capacities. For example, simple disclosures (a type of ‘informational nudge’), such as nutritional labels on food, try to engage persons’ deliberative capacities. By contrast, if nudges employ a hidden influence by covertly subverting another person’s decision-making power through the exploitation of that person’s cognitive (or affective) weaknesses and vulnerabilities, they can be described as manipulative.[56] Yet manipulation is usually targeted – in order to exploit one’s vulnerabilities, one must know what they are and how to leverage them.[57] Most nudges, however, are not targeted at particular individuals; they are applied to everyone in the same way. A speed bump on the road affects all who pass it. However, ICTs are well suited to facilitate nudging that allows for ‘fine-grained microtargeting’, turning manipulative nudging into what Yeung calls ‘hypernudging’.[58] Hypernudges are not only hidden but also target and exploit individual vulnerabilities, making them very difficult to resist.

Within the SLL, the use of lighting scenarios and of smell with the aim of influencing people’s behaviour can be classified as nudging.[59] Firstly, they are hidden. While we can detect special lighting (e.g. a reddish or blueish colour instead of the usual ‘white’ or ‘yellow’) and smell (the smell of oranges) on the street, we do not necessarily notice them, especially when we are focusing our attention on something else and are potentially intoxicated. Moreover, we would also likely not know the rationale behind the special light or smell, or how they function. Secondly, these nudges do not appeal to people’s deliberative capacities; on the contrary, they exploit our cognitive weaknesses, albeit for a generally positive purpose – ‘de-escalating’ aggressive behaviour. If we are getting angry, for example, colder light (in shades of blue and green) and the pleasant smell of oranges supposedly make us feel calmer.[60] Finally, these nudges function in a rather crude way, trying to influence people as a group – all of the visitors to the Stratumseind street – and are thus unable to precisely target and exploit individual vulnerabilities.

As such, it seems fitting to describe the nudging taking place within the SLL as a manipulative practice – but one that is (relatively) easy to resist – used for a benevolent purpose, at least insofar as it serves the de-escalation of aggression.[61] This is in line with the results of the De-escalate project, which did not lead to any demonstrable effect on persons’ escalated behaviour.[62] Were the nudging practices on the Stratumseind street to become capable of individually exploiting vulnerabilities and weaknesses through more sophisticated digital technology, however, the risks for manipulation – and thus for autonomy – would become much higher. Such a development might not take place within the SLL, but it is imaginable that further developments in digital technology would make it a real possibility. Given the enthusiasm for adopting technological ‘solutions’ in cities and a lack of legal restraint (apart from data protection law), technologies capable of having a targeted effect on our decision-making processes might well be found in future (smart) cities.

4.2 Public space conducive to consumption

Surveillance within smart city and living lab initiatives also shapes the texture of public places in functionally specific ways that may be detrimental to both political and informal community life. It does this particularly by valencing multiple-purpose public spaces, which can be used for different purposes and in different ways, for particular patterns of use.[63] In the SLL, for instance, surveillance of public space (particularly surveillance as security) creates an environment that is conducive to consumption – in this case, consumption (of alcohol, beer in particular)[64] in the pubs on the Stratumseind street. This orientation towards consumerism (which also excludes potential sources of nuisance) is neither conducive to encounters and debate nor to political participation. This means that sociability and political participation, key functionalities of public space, are hindered.

It is sometimes assumed that protecting privacy in public places fails to address the social and cultural significance of public places, which provide a basis for informal sociality and civic life.[65] This claim, however, is based on a narrow perception of privacy, understood as the right to be let alone, disengaging the individual from her public and social role. Yet privacy is an infrastructural condition allowing for a fuller exercise of other rights and freedoms, including those connected to public space, such as the right to access, the right to representation, and the freedoms of assembly and association.[66] For instance, masking and other forms of anonymity preservation can be important to ensure that protesting citizens can be themselves as political agents.[67]

This connection between privacy and public space has been well described by Staeheli and Mitchell: ‘Public space is not only a space of politics, it is also a space constituted in and through privacy.’[68] For individuals to develop into autonomous beings, both personally and politically, they need to venture out of their private places into public space and interact with others. That is, they need to form various social relations with strangers and near-strangers and to participate politically in public space. Privacy in public space thus needs to protect the possibility of developing and maintaining social and political relations, which has value not only for the individual herself but also for society and democracy more broadly. While private and public are commonly understood as exclusionary opposites (e.g. a space or an activity can either be public or private, with nothing in between), this is not so in the context of privacy in public space.

As such, protection of privacy in public space also serves to protect those aspects of public space connected to political participation and sociability. Certain attributes connected to these dimensions of public space thus merit protection, including openness of access, absence of exclusive control, multiple use, tolerance of a level of disorder, anonymity and room for dissent.[69] The tightening of social control, where de-escalation (or exclusion) is the norm, can also be said to be antithetical to the practice of critical citizenship. ‘Liberal citizenship requires a certain amount of discomfort – enough to motivate citizens to pursue improvements in the realization of political and social ideals. The modulated citizenry lacks the wherewithal and perhaps even the desire to practice that sort of citizenship’.[70] In a tightly scripted public space, where difference is either deterred or excluded, there is no room for discomfort or for the development of multiple publics with different views. Pervasive networked surveillance that profiles and sorts persons is, after all, aimed at eliminating the diversity and serendipity required to build robust communal ties.

Public spaces of contemporary (smart) cities might thus be conducive to consumption but they can also be disadvantageous to social and political participation. As such, it is vital to protect public space as shared space, with mutual rights and responsibilities, rather than as a realm where privacy and other individual rights are subject to the whim of others, in particular the state and corporations.

5 Conclusion

In this essay I have tried to open up the debate on smart cities and similar initiatives, particularly in the field of law and policy, going beyond the focus on data. I have examined the Stratumseind Living Lab (SLL) through the perspective of surveillance, focusing on a particular type of surveillant logic that is novel and prevalent in such initiatives – one that governs on a collective level without needing to identify individuals. I have shown how this mode of power (conceptualised as ‘security’ by Foucault) can still affect individuals, particularly by limiting their possibilities of action (including via nudging) and potentially limiting their autonomy. Especially if the digital technology of the SLL (or similar initiatives) were to become more sophisticated, and thus capable of individually exploiting vulnerabilities and weaknesses – what has been termed ‘hypernudging’ – the risks for manipulation, and thus for autonomy (a key argument for why we value privacy), would increase. Moreover, such surveillance can also affect society and democracy more broadly. In particular, I have shown how surveillance in public space can adversely affect the possibility and ability of persons to participate in informal social life (forming multiple and various social relations) as well as in formal civic participation (such as expressing dissent in various forms and being part of political associations).

It would be going too far, of course, to claim that the broader privacy risks concerning society and democracy are imminent in the SLL example. Stratumseind is merely one street in the city of Eindhoven, and a nightlife street at that. However, were the surveillant logics of the SLL to expand throughout the city of Eindhoven – not completely unlikely given the promises of smart city transformation – these broader risks to democracy would come into play. As such, it is important to be aware of these risks in advance, while they still stem from hypothetical scenarios, rather than becoming aware of them only when the ‘train’ is moving at full speed and hitting the brakes is much harder.

An adequate regulatory response to a matter as complex as privacy in public space in the context of smart cities requires stepping outside the frame of existing laws and regulations, including going beyond data protection law. As scholars have already pointed out, regulators should consider the whole repertoire of governance and regulation tools (including consent, communication and architecture) and think more broadly about how these could and should apply to surveillance in public spaces and its privacy risks.[71]

6 Endnotes

* Dr. Maša Galič is an assistant professor in privacy and criminal (procedure) law at the Vrije Universiteit Amsterdam; she is also an editor of the IT & Recht section of the Open Recht platform.

[1] See, for instance, initiatives in Barcelona, one of the most famous examples of a smart city (, accessed 12 June 2019).

[2] These cameras in the Netherlands were allegedly switched off after an opinion of the Dutch data protection authority stating that such a practice amounts to processing of personal data without a legitimate basis; see Autoriteit Persoonsgegevens, ‘AP informeert branche over norm camera’s in reclamezuilen’, 2018 (accessed 12 June 2019).

[3] E.g. Lorenzo Dalla Corte, Bastiaan van Loenen & Colette Cuijpers, ‘Personal Data Protection as a Nonfunctional Requirement in the Smart City’s Development’, in: Managing Risk in the Digital Society (Proceedings of the 13th International Conference on Internet, Law & Politics, Universitat Oberta de Catalunya, Barcelona, 29-30 June, 2017), Barcelona: Huygens Editorial 2017; Lilian Edwards, ‘Privacy, Security and Data Protection in Smart Cities: A Critical EU Law Perspective’, European Data Protection Law Review 28-58.

[4] E.g. Tinus Kanters, ‘Living Lab Stratumseind’ 2016, content/uploads/2016/07/LLTrillion2015.pdf; Norma Möllers & Jens Hälterlein, ‘Privacy Issues in Public Discourse: The Case of “Smart” CCTV in Germany’, Innovation: The European Journal of Social Science Research (26) 2013, issue 1-2, p. 57-70; Saskia Naafs, ‘“Living Laboratories”: The Dutch Cities Amassing Data on Oblivious Residents’ The Guardian 1 March 2018.

[5] Dalla Corte, van Loenen & Cuijpers 2017.

[6] Dalla Corte, van Loenen & Cuijpers 2017; Igor Calzada, ‘(Smart) Citizens from Data Providers to Decision- Makers? The Case Study of Barcelona’, Sustainability (10) 2018, issue 9, 3252; Antoni Martinez-Balleste, Pablo A. Perez-Martinez & Agusti Solanas, ‘The Pursuit of Citizens’ Privacy: A Privacy-Aware Smart City Is Possible’, IEEE Communications Magazine, (51) 2013, issue 6, p. 136-141.

[7] E.g. John Brandon, ‘Sensor Data up for Grabs’, MIT Technology Review April 18, 2011.

[8] Bert-Jaap Koops and others, ‘A Typology of Privacy’, University of Pennsylvania Journal of International Law (38) 2017, issue 2, p. 483-575.

[9] While smart cities are highly contextual (they are a by-product of several factors, such as geographical setting, history, demographics, social context, economy and laws) and thus distinctive, they also have characteristics that are comparable to other cities. As such, I use the Stratumseind Living Lab as an illustrative example of a smart city-related initiative in the Netherlands.

[10] Eugene van Gerwen, ‘Stratumseind 2.0: Plan van Aanpak’, 2013.

[11] Grolsch, Heineken, Bavaria and the InBev conglomerate.

[12] van Gerwen 2013.

[13] Kanters 2016.

[14] Tinus Kanters, ‘Living Lab, Onderdeel van Stratumseind 2.0, Smart Sensors, Smart Interfaces, Smart Actors, Smart Lights, Smart Data, Smart Design, Augmented Reality, Gaming’, August 2013

[15] The concept of living labs refers to a special methodology and experimentation platform, where users (citizens) and producers co-create, test and validate innovations in a real-life and open environment (see Pieter Ballon, ‘Living Labs’, in: Robin Mansell & Peng Hwa Ang (eds.), The international encyclopedia of digital communication and society, Hoboken, NJ: John Wiley & Sons 2015). Within the SLL there was little co-creation by citizens, so the initiative can be classified as a ‘real’ living lab only with difficulty (see T. Maas, J. van den Broek & J. Deuten, ‘Living labs in Nederland. Van open testfaciliteit tot levend lab’, The Hague: Rathenau Instituut 2017).

[16] Large objects combining the functions of cameras, information signs, signposts, antennas, advertising spaces and video screens.

[17] Several of these technologies, including the video cameras, sound sensors and lighting technology, were developed for the SLL, either by multinational technology companies (Atos, Intel, Philips) or smaller local businesses (Vinotion, Sorama), sometimes in co-operation with Dutch universities (Technical University Eindhoven).

[18] Clara Kuindersma, ‘De Openbare Ruimte Als Proeflab Voor Nudging’, Stadszaken 29 November 2018.

[19] The SLL officially lasted from mid-2014 to mid-2018; however, some of the projects seem to be continuing with several of the same parties still in 2019 (see ‘Living Lab, Stratumseind 2.0’ Facebook page, retrieved 19 March 2019).

[20] Atos, ‘CityPulse - using big data for real time incident response management’ April 2015, content/uploads/2016/06/atos-ph-eindhoven-city-pulse-case-study.pdf (accessed 23 March 2019).

[21] Marc Schuilenburg, ‘Predictive Policing: de opkomst van een gedachtenpolitie?’, Ars Aequi 2016, afl. 12, p. 931-936 (AA20160931).

[22] For example, the video cameras of the CityPulse system had an embedded capability of tracking walking patterns. As such, the software could single out an individual with a ‘suspicious walking pattern’ on the street. Such a suspicious walking pattern could refer to someone walking up and down the street numerous times at a slow pace, indicating the possibility of a thief. See Albert Seuber, ‘CityPulse Eindhoven – Netherlands’ April 2015, il_2015.pdf (accessed 12 June 2019).

[23] Atos 2015.

[24] It is unclear to the author what the difference between ‘nothing wrong’ and ‘everything alright’ is.

[25] This project ran from 2014 to 2018 and was led by researchers from the Technical University of Eindhoven (TU/e) and Philips, a large Dutch technology company from Eindhoven, which designed and provided the lighting system. Other parties to the project included the Eindhoven municipality, the police, DITSS and smaller technology companies.

[26] Yvonne de Kort, ‘Spotlight on Aggression’, ILI magazine 2014, edition 1, p. 10-11.

[27] De Kort 2014.

[28] Atmosphere was seen as a characteristic of social context that can be transformed into measurable data, coloured through interactions with other visitors. It was defined as ‘people’s attitudes, mood, behaviours and interactions with one another as well as with their immediate environment’ (see Indrė Kalinauskaitė and others, ‘Atmosphere in an Urban Nightlife Setting: A Case Study of the Relationship between the Socio-Physical Context and Aggressive Behavior’, Scandinavian Journal of Psychology (59) 2018, issue 2, p. 223-235, on p. 228).

[29] H. den Ouden & A.C. Valkenburg, ‘Smart Urban Lighting’, in: A. Nigten (ed.), Real projects for real people, Volume 3, Rotterdam: The Patching Zone 2013, p. 156.

[30] K.C.H.J. Smolders, Y.A.W. de Kort & P.J.M. Cluitmans, ‘A Higher Illuminance Induces Alertness Even during Office Hours: Findings on Subjective Measures, Task Performance and Heart Rate Measures’, Physiology & Behavior (107) 2012, p. 7-16.

[31] J.K. Petersen, Handbook of Surveillance Technologies, Boca Raton, FL: CRC Press 2012, p. 10.

[32] E.g. David Lyon, The Culture of Surveillance: Watching as a Way of Life, Cambridge: Polity Press 2018; Torin Monahan, ‘The Image of the Smart City: Surveillance Protocols and Social Inequality’, in: Yashushi Watanabe (ed.), Handbook of Cultural Security, Cheltenham: Edward Elgar Publishing 2018; David Murakami Wood, ‘Smart City, Surveillance City’, SCL – The Society for Computers and Law; Jathan Sadowski & Frank Pasquale, ‘The Spectrum of Control: A Social Theory of the Smart City’, First Monday (20) 2015, issue 7.

[33] Kelsey Finch & Omer Tene, ‘Smart Cities: Privacy, Transparency, and Community’, in: Evan Selinger, Jules Polonetsky & Omer Tene (eds.), The Cambridge Handbook of Consumer Privacy, Cambridge: Cambridge University Press 2018.

[34] Finch & Tene 2018; e.g. Peter de Graaf, ‘Een biertje met Big Brother erbij op Stratumseind’, de Volkskrant 23 November 2015; Bert van Doorn, ‘Stratumseind 2.0: Big Brother kijkt niet alleen mee, maar beïnvloedt je ook’, Omroep Brabant 5 August 2018.

[35] There are significant differences in the particular logics or techniques of operation, for instance, between Foucault’s discipline or Deleuze’s control; for more on this see Maša Galič, Tjerk Timan & Bert-Jaap Koops, ‘Bentham, Deleuze and Beyond: An Overview of Surveillance Theories from the Panopticon to Participation’ Philosophy and Technology (30) 2017, issue 1, p. 9-37.

[36] Kelsey Finch & Omer Tene, ‘Welcome to the Metropticon: Protecting Privacy in a Hyperconnected Town’ Fordham Urban Law Journal (41) 2016, p. 1581-1615, on p. 1594.

[37] Including individuals in the Trillion community policing project.

[38] This also leads to issues in determining which data protection regime is applicable: the General Data Protection Regulation or the Law Enforcement Directive (with a special regime for data processing in the context of law enforcement); this and connected issues will be examined in a forthcoming paper exploring smart cities from a data protection perspective.

[39] Kevin D. Haggerty & Richard V. Ericson, ‘The Surveillant Assemblage’, British Journal of Sociology (51) 2000, p. 605-622.

[40] Michel Foucault, Security, Territory, Population: Lectures at the Collège de France 1977-1978, London: Picador 2007.

[41] Jeremy Bentham, The Panopticon Writings, New York/London: Verso 2010; Michel Foucault, Discipline and Punish: The Birth of the Prison, London: Penguin Books 1991.

[42] Francisco Klauser, Till Paasche & Ola Söderström, ‘Michel Foucault and the Smart City: Power Dynamics Inherent in Contemporary Governing through Code’, Environment and Planning D: Society and Space (32) 2014, p. 869-885, on p. 873.

[43] Bernard E. Harcourt, ‘Digital Security in the Expository Society: Spectacle, Surveillance, and Exhibition in the Neoliberal Age of Big Data’ (Columbia Public Law Research Paper No. 14-404), 2014.

[44] The author does not have knowledge of any such practice actually occurring within the SLL. However, this scenario is in line with the general goal of the CityPulse system, that is, the detection and prevention of deviant behaviour, as well as with the general trend in risk-oriented law enforcement and predictive policing practices.

[45] And they might be based on spurious correlations (see e.g. Mireille Hildebrandt, ‘From Data to Knowledge: The Challenges of a Crucial Technology’, Datenschutz und Datensicherheit (30) 2006, p. 548-552).

[46] E.g. Bart W. Schermer, ‘The Limits of Privacy in Automated Profiling and Data Mining’, Computer Law and Security Review (27) 2011, p. 45-52; Monahan 2018; John Cheney-Lippold, ‘A New Algorithmic Identity’, Theory, Culture & Society (28) 2011, issue 6, p. 164-181; Clive Norris, ‘From Personal to Digital: CCTV, the Panopticon, and the Technological Mediation of Suspicion and Social Control’, in: David Lyon (ed.), Surveillance as social sorting: privacy, risk and digital discrimination, Abingdon: Routledge 2003.

[47] As Warren and Brandeis famously put it (Samuel D. Warren & Louis D. Brandeis, ‘The Right to Privacy’, Harvard Law Review (4) 1890, p. 193-220).

[48] E.g. Margot E. Kaminski, ‘Regulating Real-World Surveillance’, Washington Law Review (90) 2015, p. 1113-1165; Helen Nissenbaum, ‘Toward an Approach to Privacy in Public: Challenges of Information Technology’, Ethics & Behavior (7) 1997, issue 3, p. 207-219; Christopher Slobogin, ‘Public Privacy: Camera Surveillance of Public Places and the Right to Anonymity’, Mississippi Law Journal (72) 2002, p. 213-315.

[49] Koops and others 2017.

[50] As an increasing amount of data is captured, stored and analysed, potentially revealing much more about us than previously possible, regulating data capture and processing – including in public space – is indeed very important and is thoroughly discussed in scholarship today. However, a focus on informational privacy needs to be complemented with a focus on other types of privacy (particularly associational and behavioural in this context), which receive far less attention. This is my aim in this essay.

[51] ECtHR 18 January, 2018, 48151/11 and 77769/13 (National Federation of Sportspersons’ Associations and Unions (FNASS) and Others v. France); ECtHR 5 September 2017, 61496/08 (Bărbulescu v. Romania).

[52] See Priscilla Regan, ‘Privacy and the Common Good: Revisited’, in: Beate Roessler & Dorota Mokrosinska (eds.), Social dimensions of privacy: interdisciplinary perspectives, Cambridge: Cambridge University Press 2015; Corey Brettschneider, Democratic Rights: The Substance of Self-Government, Princeton, NJ: Princeton University Press 2007; Beate Roessler & Dorota Mokrosinska, ‘Privacy and Social Interaction’, Philosophy and Social Criticism (39) 2013, p. 771-791.

[53] E.g. Marc Schuilenburg & Rik Peeters, ‘From Biopolitics to Mindpolitics: Nudging in Safety and Security Management’, Culture of Control 2015, p. 1-7; José van Dijck & Thomas Poell, ‘Social Media and the Transformation of Public Space’, Social Media and Society (1) 2015, issue 2, p. 1-5.

[54] Richard H. Thaler & Cass R. Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness, Penguin Books 2009, p. 6.

[55] E.g. Karen Yeung, ‘“Hypernudge”: Big Data as a Mode of Regulation by Design’, Information, Communication and Society (20) 2017, issue 1, p. 1-19.

[56] Daniel Susser, Beate Roessler & Helen Nissenbaum, ‘Online Manipulation: Hidden Influences in a Digital World’, 2019.

[57] Susser, Roessler & Nissenbaum 2019, p. 21.

[58] Yeung 2017.

[59] The actors employ the generic term ‘influence’ (invloed), a term without normative baggage.

[60] Yvonne de Kort, Light on and in Context (inaugural address TU/e), Eindhoven: Technische Universiteit Eindhoven 2015.

[61] Here I speak of a ‘manipulative practice’ because manipulation is a ‘success concept’ – it presupposes the effect of manipulation (Allen W. Wood, ‘Coercion, Manipulation, Exploitation’, in: Christian Coons & Michael Weber (eds), Manipulation: Theory and Practice, Oxford: Oxford University Press 2014). In the case of the SLL, this success has not been shown, so it is more appropriate to speak of manipulative practices. Nevertheless, some people are more easily influenced by such means and might still be affected, while many others would not be, at least not to a significant degree.

[62] Diede Hoekstra, ‘Netwerk van hypermoderne camera’s op Stratumseind in Eindhoven gaat politie helpen’, Eindhovens Dagblad, December 2017. Although there are, of course, differences between people – some are influenced more easily than others.

[63] Jason W. Patton, ‘Protecting Privacy in Public? Surveillance Technologies and the Value of Public Places’, Ethics and Information Technology (2) 2000, p. 181-187, on p. 186.

[64] Remember that breweries are parties to the Stratumseind 2.0 umbrella project.

[65] Patton 2000, p. 181.

[66] Evelyn S. Ruppert, ‘Rights to Public Space: Regulatory Reconfigurations of Liberty’, Urban Geography (27) 2006, p. 271-292.

[67] This is so because freedom to protest as political actors requires freedom from others’ judgement that would implicate other parts of identity; in other words, the worker and the student should not fear that they will be dismissed or expelled because they have engaged in political protest; see Bert-Jaap Koops, ‘Privacy Spaces’, West Virginia Law Review (121) 2018, issue 2, p. 611-665, on p. 652.

[68] Lynn A. Staeheli & Don Mitchell, ‘Spaces of Public and Private: Locating Politics’ in Clive Barnett & Murray Low (eds.), Spaces of democracy: geographical perspectives on citizenship, participation and representation, Thousand Oaks, CA: Sage Publications 2004, p. 149-150. Emphasis in original.

[69] I have identified these attributes based on a detailed examination of literature on the public sphere and public space (primarily from perspectives of human geography and political theory) in my doctoral dissertation (see Maša Galič, Surveillance and Privacy in Smart Cities and Living Labs: Conceptualising Privacy for Public Space, diss. Tilburg University, Optima Grafische Communicatie 2019).

[70] Julie E. Cohen, ‘Surveillance vs. Privacy: Effects and Implications’, in: David Gray & Stephen E. Henderson (eds.), The Cambridge Handbook of Surveillance Law, Cambridge: Cambridge University Press 2017, p. 555; emphasis in original.

[71] Colin J. Bennett & Charles D. Raab, The Governance of Privacy: Policy Instruments in Global Perspective, Cambridge, MA: The MIT Press 2006; Michael Katell and others, ‘Seeing the Whole Picture: Visualising Socio-Spatial Power through Augmented Reality’, Law, Innovation and Technology (11) 2019.
