Hello! I am an Assistant Professor in AI Ethics & Society at the University of Cambridge Professional and Continuing Education, where I co-direct the MSt AI Ethics & Society. I am a Science and Technology Studies scholar, researching issues of power, culture and inequality in AI, data and related technologies.
This paper introduces a method of critically mapping AI as an assemblage of social relations. This approach is rooted in the principles of centring the margins and highlighting power structures. Viewing an AI or algorithmic system as a wide-reaching network of social relations, including their impacts on different groups and contexts, enables consideration of how AI is embedded within different discourses and domains of power while emphasising the impact on those most affected. The paper provides a discussion of the critical framing of the project, the principles, processes and templates for mapping AI in this way, and three examples of algorithmic systems in public sector contexts that have been discontinued, are in use now or are being proposed for future use. A discussion of uses and limitations is provided, situating the method beyond a descriptive or analytical tool towards a critical approach to identifying locations for intervention in harmful or unjust uses of algorithmic systems.
The term "stakeholder" features prominently in discourses surrounding tech policy. It is used as a marker for engaging with wider organisations and publics beyond government and the companies that make technologies. But what does this term do in practice? What roles does it create or deny, what power structures does it open or close? This study of 194 tech policy documents produced by the UK government over a span of five years uncovers the different ways the term "stakeholder" is used in practice. By comparing the use of the word with whose ideas are actually cited in the documents, it highlights the discursive gaps between what is claimed and what is represented in the voices that shape policy. The results are analysed through queer performativity, including institutional non-performativity and peri-performative framings, to assess the roles that are imposed on different groups, and the different hierarchies and power structures the stakeholder constructs in current UK policy practices.
With Maisy Taylor, Sarah Vollmer and Zaynab Ravat
Cute videos are everywhere online. Many of these videos increasingly come from footage taken by doorbell cameras. Amazon’s Ring, and related connected camera devices, introduce new sociotechnical relations into domestic environments. First, I outline “squeeveillance” as the affective and performative dimensions of cuteness within surveillance. I explore the Ring surveillant assemblage and why it needs the power of cuteness. Then, I examine squeeveillance as the use of cuteness in the way Ring operates. I use the TV show Ring Nation (2022–present) to discuss the remediation of cute footage from doorbell cameras onto other media, before discussing the ways in which cuteness is performed as a normalisation of surveillance power. The article draws on theories of cuteness in conjunction with surveillance studies of power relations. In presenting squeeveillance as a lens through which to assess the expanding scope of Ring, I offer a discussion of the interconnected role of surveillance in contemporary domestic and media settings and its relation to current forms of power in surveillant assemblages.
The ubiquity of digital technologies has changed the ways people work. Whether it is the type of work people do or the tools they use to do it, work can mean something very different in a world of platforms, data, and artificial intelligence. But many familiar issues persist. This chapter discusses digitalised work, from the work of making digital systems to the changes those systems make to other jobs. The introduction starts by giving some key concepts, before the first section discusses the hidden types of work that go into making digital technologies. The following section focuses on the role of platforms in managing labour, before a discussion of the additional types of work required to organise and resist in digitalised work. A case study presents Indian content moderators, drawing together these different forms of work. Throughout the chapter, there is a focus on how digital work reinforces existing inequalities and creates new inequalities by distributing work globally and hiding those most affected.
We are often expected to trust technologies, and how they are used, even if we have good reason not to. There is no room to mistrust.
Exploring relations between trust and mistrust in the context of data, AI and technology at large, this book defines a process of ‘trustification’ used by governments, corporations, researchers and the media to legitimise exploitation and increase inequalities.
Aimed at social scientists, computer scientists and those working in public policy, the book reveals how trust is operationalised and converted into a metric in order to extract legitimacy from populations and support the furthering of technology to manage society.
This paper applies and extends the concept of algorithmic imaginaries in the context of political resistance to sociotechnical injustice. Focusing on the 2020 UK Ofqual protests, the role of the “fuck the algorithm” chant is examined as an imaginary of resistance to confront power in sociotechnical systems. The protest is analysed as a shift in algorithmic imaginaries amidst evolving uses of #FuckTheAlgorithm on social media as part of everyday practices of resistance.
This article addresses the problematic perspectives of drone culture. In critiquing focus on the drone’s apparent ‘autonomy’, it argues that such devices function as part of a socio-technical network. They are relational parts of human–machine interaction that, in our changing geopolitical realities, have a powerful influence on politics, reputation and warfare. Drawing on Žižek’s conception of parallax, the article stresses the importance of culture and perception in forming the role of the drone in widening power asymmetries. It examines how perceptions of autonomy are evoked by drones, to claim that this misperception is a smokescreen that obscures the relational socio-technical realities of the drone. The article therefore argues that a more critical culture of the drone emerges by shifting the focus and perception from autonomy to anonymity. This allows us to engage more fully with the distributed agency and decision-making that define how drones are developed and deployed. Rather than focusing on the drone as a singular, fetishised, technical object, a relational approach to the drone assemblage is proposed that highlights the competing human interests that define and resist drones in global politics and culture.
Data collection is everywhere. It happens overtly and behind the scenes. It is a specific moment of legal obligation, the point at which the purpose and conditions of the data are legitimised. But what does the term data collection mean? What does it say or not say? Does it really capture the extraction or imposition taking place? How do terms and practices relate in defining the norms of data in society? This article undertakes a critique of data collection using data feminism and a performative theory of privacy: as a resource, an objective discovery and an assumption. It also discusses alternative terms and the implications of how we describe practices of ‘collecting’ data.
At the start of 2021, Twitter launched a closed US pilot of Birdwatch, seeking to promote credible information online by giving users the opportunity to add context to misleading tweets. The pilot shows awareness of the importance of context, and the challenges, risks and vulnerabilities the system will face. But Birdwatch's mitigations against these vulnerabilities can exacerbate wider societal vulnerabilities that the system itself creates. This article examines how Twitter presents the Birdwatch system, outlines a taxonomy of potential sociotechnical vulnerabilities, and situates these risks within broader social issues. We highlight the importance of watching the watchers, not only in terms of those using and potentially manipulating Birdwatch, but also the way Twitter is developing the system and their wider decision-making processes that impact on public discourse.
This short paper presents a sketch for mapping computer vision (CV) as a field and set of sociotechnical practices. It takes a relational approach, centring those who are most affected by an AI system, identifying points of unfairness, situating systems in wider discourses and narratives, and thereby locating points of intervention towards the redistribution of power.
Privacy is increasingly important in an age of facial recognition technologies, mass data collection, and algorithmic decision-making. Yet it persists as a contested term, a behavioural paradox, and often fails users in practice. This article critiques current methods of thinking privacy in protectionist terms, building on Deleuze's conception of the society of control, through its problematic relation to freedom, property and power. Instead, a new mode of understanding privacy in terms of performativity is provided, drawing on Butler and Sedgwick as well as Cohen and Nissenbaum. This new form of privacy is based on identity, consent and collective action, a process to be performed individually and together to create new structures that instil respect at the heart of our sociotechnical systems.
This short paper presents a sketch for mapping AI along relational lines, centring those who are most affected by an AI system, identifying asymmetric power relations, situating systems in wider discourses and narratives, and thereby locating points of resistance, refusal and the redistribution of power.
'After Humanity' approaches issues of technological futures, environmental collapse and human agency. The eponymous future after the fall of humanity imagines a world built through relations between machines and ecology, in order to critique human priorities and the impact of constant expansion on the environment. This is situated in the context of developing a combined speculative media method, fusing design, fiction and ethics. The work is offered as a proof of concept for the method in developing a specific context using machine learning and human creative practices.
Gamification has entrenched constant monitoring throughout society. From education to work to shopping, our activities are tracked, our progress is monitored, and rewards are meted out. But this enforced acceptance of constant surveillance constructs a social narrative in which privacy ceases to exist, and the technological tools at work can easily be shifted from reward to control. This is furthered through the shift from a Bentham–Foucault model of power and the threat of surveillance to the actualisation of complete protocological surveillance enabled by cloud computing, data centres, and machine learning. It is no longer the case that anything we do might be surveilled; we can be fairly certain that everything we do probably is being monitored, judged, and recorded. How can we negotiate these changing narratives? Of what fictions do we convince ourselves when we play the “game” called digital society? This article uses the work of Cory Doctorow, Charles Stross, Dave Eggers, and Ernest Cline to assess how fictionality can act as thought experiments for the social conditions of surveillance technologies. Through stories such as Halting State and Walkaway, we explore the collisions between the control-based society of tech companies and the disciplinary structures of traditional states—the points of tension between illusions of freedom, guided game paths, and the exercise of power over users’ data and behaviours. The article argues for expanding our perspectives on the reach of game analysis to the broader connected networks of cultural and political systems, to assess ways of responding to the idea that we are being played with, turned into characters in the gamified narratives of control-based surveillance societies.
This project examining the links between technology and ecology resulted in a journal article and a media practice output.
Fractal Media and Ecology
The scars of humanity can be seen across the Earth. However, observing such ecological violence and their implications often requires finding the right perspective, moving beyond the spatial and temporal limits of individual humans. This article builds on discussion of the Anthropocene as a term and the anthropocentrism it implies to critique the relations between humanity, technology and ecology through posthuman perspectives. Focusing on Google's widely available tools, its problematic relation to the environment as a company and critical interventions by media artists Mishka Henner, Paolo Cirio and Geraldine Juárez, the article examines technologies that enable a 'posthuman' position from which to view the fractal activities of humanity: Google Maps and Earth; Street View; and the Google search engines. Fractals are offered as a mode of assessing the self-similar processes of mediation that define not only humanity's scalar expansion but also its shift into informational dimensions and the virtualization of ecology.
The scars of humanity can be seen across the Earth. However, observing such ecological violence often requires the right perspective. At every scale, humans make their mark, expressions of the rapid expansion of creative and destructive collective consciousness. This process is aided by technology, from the history of written language that enabled larger settlements and the agricultural revolution to contemporary computer technologies that create an alternative hyperspace within the Earth. In all corners of the globe the human biotech virus spreads. Chaotic and self-replicating, the fractal technological processes that enable human society provide self-similar mediations of physical space and the exploitation of the planet. Perhaps digital technology is the endpoint of this process, the full fractalisation of human consciousness heralding an apocalyptic conclusion to the Anthropocene. Yet this apocalypse is not only in the sense of global annihilation but in its original meaning as revealing. The same technologies that push human influence over the planet to a critical level also provide the means with which to critique our activities and their consequences. The Anthropocene represents the idealisation of material reality by the subjective intentionality of human creation and destruction. It is the virtualisation of ecology – and therefore a post-ecological era, a disruption of the system of ecology by the inclusion and impact of humanity. Thus we require new, posthuman modes of viewing ecology in relation to humanity in the Anthropocene. Enter Google – the ultimate virtualisation of the universe, not merely its digitisation but also its embedding within the global collective human consciousness via digital technology.
Technology is both the means by which we can view the Anthropocene and the means through which it has been brought into being. Google as a company embodies this dilemma – a problematic relationship between the startup ideals, the marketing rhetoric, and the needs of the organisation. However, through Google’s tools – such as Maps, Earth and Street View – we can adopt new forms of viewing our ecological situation. Scanning the planet from above in Earth or Satellite view, taking atemporal walks through Street View, expanding physical space with hypertextual maps – all these methods allow us to combine a data-driven and ecologically driven need to break free of our conventional human position. These post-human perspectives allow for spatiotemporal detachment from the Anthropocene and therefore a position from which to conceive of post-human techno-ecologies. The post-human here is taken as the expansion of human perception or thought but only through the medium of technology. It is the new trends and codings of behaviour influenced by technology. If written language enabled humans to form as a cognitive entity and spread across the planet as a collective species, then computers are creating post-human attitudes and modes of being, with even greater global impact. Meanwhile, digital technologies in general and Google in particular allow us to externalise not only memory (in vast data centres) but also the thinking process itself (in offloading decision-making to algorithms and a machinic assemblage of collective patterns).
The work exposes these frameworks of mediation through its presentation within specific frames. These are literal, technical frames – loaded in HTML iframes using JavaScript and the Google API – as well as metaphorical frames – built upon the constraints of the work. The spectator can attempt to manipulate their perspective, to take control of how they are viewing the Earth through Google interfaces, but the work itself constrains this. The windows appear only for limited periods of time, too short to allow full interactivity. Thus, while we are exposed to views beyond our normal individual human grasp of the world, we are also manipulated in the external and automated control of our usual access to information provided by Google. This process not only highlights the limits of post-humanity by its technologies, but the bombardment of changing views demonstrates the volume of data available and the sheer scale of views necessary to comprehend the Anthropocene as a whole. The entangled and embedded nature of individual humans, mediating technological structures, and corporations such as Google unveils itself in the breaking down of the work. Sooner or later, systems crash, connections get overloaded, and the metaphorical frameworks we construct are revealed as the human constructs they are. While we can stage post-human technical demonstrations, to attain a truly post-human perspective would be to transcend the current limits of our individual and collective, technocultural and biopolitical structures, motives and responses.
This film is one recording of a real-time, autonomous and interactive web-based visual experience that highlights the role of Google as a contributor to and tool for overcoming the impact of humanity via post-human methods of viewing and rethinking our relation to ecology.
This book outlines a new conception of the cyborg in terms of consciousness as the parallax gap between physical and digital worlds. The contemporary subject constructs its own internal reality in the interplay of the Virtual and the Real. Reinterpreting the work of Slavoj Žižek and Gilles Deleuze in terms of the psychological and ontological construction of the digital, alongside the philosophy of quantum physics, this book offers a challenge to materialist perspectives in the fluid cyberspace that is ever permeating our lives. The inclusion of the subject in its own epistemological framework establishes a model for an engaged spectatorship of reality. Through the analysis of online media, digital art, avatars, computer games and science fiction, a new model of cyborg culture reveals the opportunities for critical and creative interventions in the contemporary subjective experience, promoting an awareness of the parallax position we all occupy between physical and digital worlds.