continent. maps a topology of unstable confluences and ranges across new thinking, traversing interstices and alternate directions in culture, theory, biopolitics and art.
Issue 5.2 / 2016: 42-47

Lucy A. Suchman

a continent. inter-view

(image by Nina Jäger, continent.)

Lucy A. Suchman is Professor of Anthropology of Science and Technology in the Department of Sociology at Lancaster University. Suchman’s early work on corporate ethnography was influential in identifying the nuances of human expectations of technology, toward an understanding of the responses of machines to touch. More specifically, in "When User Hits Machine," she examined the now legendary struggle between office workers and the big green button on copy machines. Her analysis showed that the idea that machines could ever completely know what the people using them were trying to do, based on models of, and sense data from, those users, was deeply erroneous as a starting point for the design of human-computer interactions. Her research interests within the field of feminist technoscience and technology studies focus on technological imaginaries and materiality, practices of technology design, and developments at the interface of bodies and machines. Suchman’s Human-Machine Reconfigurations: Plans and Situated Actions (Cambridge University Press, 2nd Edition, 2007) examines the performativity of human identity through digital interfaces and their material technologies. Suchman’s more recent research, as in her article "Situational Awareness: Deadly Bioconvergence at the Boundaries of Bodies and Machines,"[1] extends her longstanding critical engagement with the field of human-computer interaction to contemporary warfare, including the figurations that inform immersive simulations, and problems of “situational awareness” in remotely controlled weapon systems. Lucy spent twenty years as a researcher at Xerox's Palo Alto Research Center, during the historic period in which this institution produced many of the paradigms of computation, interfacing and digital design practice we use to this day. For her work she has received numerous honours, including the Benjamin Franklin Medal in Computer and Cognitive Science.[2]

How did you get here?
LS: I got to these topics because I was a student of anthropology in the 1970s and 80s at Berkeley. This was a very politicised time for American anthropology, and I was interested in what went under the name of "studying up" at the time—that anthropology as a field should be looking at powerful institutions and actors. I was interested in doing a critical ethnography of a large corporation, and through a series of circumstances I ended up at Xerox's Palo Alto Research Center (PARC) in Silicon Valley, where I encountered computing, human-computer interaction, and artificial intelligence.

I got diverted and became really interested in issues around the premise of interactive machines, artificial intelligence and the models of human interaction and human intelligence that were informing these projects. I went to PARC as a research intern, wrote my PhD thesis on these topics, and then worked there for the next 20 years as a researcher until I went to Lancaster. My work has always been informed by STS (science and technology studies) and so I came here through that kind of engagement with technologies, and their imaginaries and politics.


Lucy Suchman: Restoring Information's Body - Remediations at the human-machine interface

What technical systems are operating on us right now?
LS: We are definitely immersed in and infused by information and communications technology, but I think for me the main preoccupation is the relationship between the discourses and rhetorics around technology, and the actual material practices.[3] I am very interested in recovering some of the mess, and slippage, and non-coherence, and multiplicity of technology. I am working against the grand narratives[4] of technology and the idea that we have entered new eras, so engaging with the technosphere is kind of an interesting challenge in that regard. We were invited by the organisers to engage with that term, but also to trouble it, to explode it in many ways.[5]

What part of the technosphere do you rely upon the most?
LS: I spend enormous amounts of time in intense interaction with people who are geographically far away or temporally displaced, so in terms of everyday life, that is where I am deeply affected. But the technologies that I am most preoccupied with at the moment are military technologies, in particular the incorporation into military imaginaries of networked infrastructures. For example, I am a U.S. citizen, so I am plagued with guilt about that, and a lot of my work is a critical engagement with that responsibility.[6]

The developments in the U.S. military around remote control are very troubling to me, and I try to bring some of my previous academic and research work to bear on more activist interventions. I am looking at drones and remotely controlled ground robots, which are under development: their discourses foreground the saving of lives, for example through the detection of improvised explosive devices, but these robots are also clearly going to be mounted with guns. There are increasing projects in these areas on the part of the U.S., in terms of its arms production as well as an increasing orientation to the projection of force from a safe space. We can see all sorts of fallacies in that, including the idea that we can stay safe in our homeland while we project this force and that it is not going to come back. This is clearly impossible to sustain, even if we believed it was morally supportable, which I do not.[7] That is what is preoccupying me at the moment.

What is the technosphere?
LS: I think it is an orientation to the extent to which we are living in a time where technological imaginaries are tremendously dominant frames of reference for us[8], as well as actual technological infrastructures; it is a way of orienting to those, and to their agencies and effects. But then also, many of us are trying to make that specific. Some of the most interesting projects take a particular site or particular artifact and unpack it, to show both the specificity of that site or artifact and its interconnectivity, and the ways in which you can read out of it the cultural imaginaries, political economies and so forth that connect things.[9] This Technosphere programming is full of lovely case studies that demonstrate this in ways that talking about it generally could never convey as powerfully.

Please pick one image that resonates with your idea of the technosphere.[10]
LS: I would pick the 3D printed skull. Initially, I read it as a measuring device, a classical anthropological project of measuring the human. But actually it is a technology for making a facsimile of a skull, which makes it even more interesting. For me, it is very much about things that I am engaged with around relations of nature and culture, the organic and the artifactual, and the ambiguity of those things; also what constitutes the human, the problematic place of the brain in the constitution of the essential human, the attempt to categorize and classify and capture that, and anthropology’s complicated relations to that historically; and of course now, the kind of inversion to the reproduction of the human through technological means.

And what the skull represents in relation to the human is, I think, a very interesting question. It is an iconic object of death, but also of human origins and remains, and potentials, and of course human exceptionalism. It is a fabulous picture.

[1] Excerpt from Lucy Suchman’s video documentation, Man Against Machine:

[2] Beth Adelson. "Bringing Considerations of Situated Action To Bear on the Paradigm of Cognitive Modeling: the 2002 Benjamin Franklin Medal in Computer and Cognitive Science Presented to Lucy Suchman" Journal of the Franklin Institute 340, no. 3 (2003): 283-292.

[3] EDITORS’ NOTE: Suchman’s work relates to the rhetoric around technologies and the role of science-as-culture, revealing how deeply embedded our anthropocentric systems and imaginaries are within technological creation, and how they determine not only the form of expression, but the very definition of “intelligence,” “consciousness,” and “awareness.” In her examination of how roboticists imagine humanness for replication through artificial intelligence, Suchman writes: “One form of intervention into current practices of technology development, then, is through a critical consideration of how humans and machines are currently figured in those practices and how they might be figured—and configured—differently. This effort engages with the broader aim of understanding science as culture, as a way of shifting the frame of research—our own as well as that of our research subjects—from the discovery of universal laws to the ongoing elaboration and potential transformation of culturally and historically specific practices, to which we are all implicated rather than modest witnesses.” Lucy Suchman. Human-Machine Reconfigurations: Plans and Situated Actions. (Cambridge: Cambridge University Press, 2006), 227. [See also: footnotes to Mark Hansen, on digital identity, limits of humanness.]

[4] EDITORS’ NOTE: Grand narratives (grands récits: ‘big stories’): “Lyotard's term for the totalizing narratives or metadiscourses of modernity which have provided ideologies with a legitimating philosophy of history. For example, the grand narratives of the Enlightenment, democracy, and Marxism. Hayden White (b.1928), an American historian, suggests that there are four Western master narratives: Greek fatalism, Christian redemptionism, bourgeois progressivism, and Marxist utopianism. Lyotard argues that such authoritarian universalizing narratives are no longer viable in postmodernity, which heralds the emergence of ‘little narratives’ (or micronarratives, petits récits): localized representations of restricted domains, none of which has a claim to universal truth status. Critics suggest that this could be seen as just another grand narrative, and some have seen it as Eurocentric.”

[5] EDITORS’ NOTE: Just as Suchman encourages us to question what actually defines humanness as a model for artificial intelligence, our discussion of the technosphere encourages a “troubling” of the dominant figuration of technologies as being complicit in Western colonial and industrial histories, and their ongoing iterations. “The effects of figuration are political in the sense that the specific discourses, images, and normativities that inform practices of figuration can work either to reinscribe existing social orderings or to challenge them. In the case of the human, the prevailing figuration in Euro-American imaginaries is one of autonomous, rational agency, and projects of artificial intelligence reiterate that culturally specific imaginary.” Lucy Suchman. Human-Machine Reconfigurations: Plans and Situated Actions. (Cambridge: Cambridge University Press, 2006), 227.

[6] EDITORS’ NOTE: We find it productive for our thinking on the matter to consider the ways in which this violence is articulated in American racial discourse: “Too-often subscribing to idealist theories of power, these approaches prioritize practices aimed at increasing cultural hegemony or positive symbolic representation of marginal groups, rather than seeing race as reproduced through differential regimes of ballistic and carceral material violence like police and prisons and strategizing on this basis. [...] the entire liberal discourse of ‘ethics’—inasmuch as it takes place within the White discourses framed by the ‘ignorability’ of police and carceral terror—renders it totally irrelevant to Black existence. [...] How can non-Black persons who are struggling against the miserable lives they are offered do so in ways that do not, as Wilderson puts it, ‘fortify and extend the interlocutory life’ of the anti-Black existential commons?” K. Aarons. “No Selves to Abolish: Afropessimism, Anti-Politics, and the End of the World.” Ill Will Editions.

[7] EDITORS’ NOTE: Military technology imparts distance between the mechanisms and products of its devastation, seemingly displacing moral reprehensibility from the clear actions of individuals onto the autonomous functions and logic of machines. Distance may create the illusion of human nonparticipation in these algorithmic machinations; however, considering Suchman’s discussion around figuration, the very ‘agency’ of these apparently sterilised computations is based upon replicating and responding to human interests. Susan Schuppli argues that weaponised algorithms should thus be accountable to legal frameworks that are both reactive and anticipatory—a criminal liability: “When algorithms are being enlisted to out-compute terrorism and calculate who can and should be killed, we need to produce a politics appropriate to these radical modes of calculation and a legal framework that is sufficiently agile to deliberate over such events.” Susan Schuppli. “Deadly Algorithms.” continent. 4, no. 4 (2015): 25.

This necessity, however, reorients our attention towards the military industrial systems that demand the autonomous machinations and serve as models for this figuration of weaponised algorithms. In his portrait of military imaging technology, Harun Farocki writes, “If there is a relation between production and destruction, it is also true that it is not so much hardware that needs to be disposed of, as it is the controlling and targeting. But selling more control depends precisely on differentiating between friend and foe. The economy, at least that of the weapons manufacturer, demands war for humanitarian aims.” Harun Farocki. “War Always Finds a Way.” continent. 4, no. 4 (2015): 59.

[8] EDITORS’ NOTE: “We have probably never paid so much attention as we have today, we have undoubtedly never been so attentive to the cosmodynamic movements, but these are transfigured by a subliminal light, it is this light and no longer the light of the solar star that extends across space in a compensated time which is no longer exactly that of Kronos, but rather that of the energetic ruse of motors.”  Paul Virilio. Negative Horizon. (New York: Continuum, 2008): 126.

[9] EDITORS’ NOTE: “According to the Axis of Evil-against-Evil, the first task of warmachines is to perceive War not as a consequence of collisions between warmachines or crisscrossing lines of tactics, but as an autonomous machine spawning warmachines in order to hunt them down […] War is fueled by the terminal fusions of strategy and tactical multiplicities; everything that emerges from war is a devastating disruption for the configurations, guiding systems and probe-functions of war machines.” Reza Negarestani. Cyclonopedia: Complicity with Anonymous Materials. (Melbourne: Re-press, 2008): 77.

[10] EDITORS’ NOTE: During the discussions, interviewees were asked to pick from a set of somewhat random images. This collection of different phenomena served as a prompt for thought on the forms of appearance and the visuality of the technosphere. You can view the set here. The discussion here refers to the image of the 3D printed skull.