Andrew C. Dwyer
Ecological computationality
Cognition, recursivity, and a more-than-human political actor

Popularly, computation – incorporating hardware, software, and sensors – has rarely been understood as more-than-human and ecological, but rather as a technology under human control and assumed to be knowable. In this chapter, I advocate, alongside previous work outside of geography, for conceptualising computation as ecological. Through an examination of computationality as a performative interaction with an ecology of materials supported by capacities for (re)cognition, I suggest that computation is productive of a more-than-human politics. Using vignettes from an autoethnography of the analysis and detection of malicious software, I discuss how computation affords certain properties as much as it is generative of new interpretations of security that are threaded together across environments, big data, and human decisions in cybersecurity. Through recursive logics, enabled by computationality, computation is then engaged in shared (political) processes of choice-making that both shape, and are shaped by, various ecologies. Using cybersecurity as an exemplar, I explore how its drive to render detection at greater speed and against ‘unseen’ threats means that it has increasingly leveraged computation’s capacity to recognise and reason whether software is malicious or not. Computation, in this collective reading, is then a political actor in-formation that reads, interprets, and acts. I use this to complicate more-than-human ecologies, as well as their application to perspectives on digital geography and cybersecurity. This is to suggest that various ‘digital’ ecologies are not simply hosted on computation, but are intertwined with and actively reworked through computationality, producing new ecologies that may be incommensurable to representational scrutiny.

Computation has become a key mediator of, and contributor to, almost all aspects of collective everyday life. It is, in various places and at differing temporalities, at work. Computational processes and their associated outputs are engaged in the retrieval of information over the Internet, the geolocation infrastructure that mobile devices are entangled with, and the ‘background’ databases and algorithms utilised across numerous industries and governments, as much as in the monitoring and cataloguing of animals and plants for conservation and ecological restoration. In this chapter, I pursue an argument – drawing on research from software studies and media studies1 – that computation should be considered ecological, albeit distinct from its (in)organic counterparts. To do so, I develop the concept of computationality as a performative collective of materials, consisting of hardware, software, and networks with a capacity for (re)cognition, which topologically intersects with certain people, places, and communities.2 Rather than understanding computation as a tool, interface, surface, or network upon which to appreciate the (digital) ecologies of plants, animals, and other forms of organic and inorganic materials, I argue that computation is a recursive and cognitive political actor. By this, I mean that computation is engaged in forms of reasoning that are sustained by iterative, and often non-linear, feedback loops that are formative of a recursivity across, in, and through multiple ecologies. Computation, from this perspective, then, not only represents ecologies but actively reworks them too. Machine learning algorithms have become perhaps the clearest expression of recursive computation’s capacity to optimise and make inferences: whether in the (mis)identification of faces, in assessing the likelihood of a person committing fraud, or – crucially, in the context of this book – in suggesting the best areas for ecological intervention. Hence, exploring the role of computationality in societies is essential for understanding how it is productive of, and shaped by, the ecologies that we collectively live in.

To articulate the complex relationships between computationality, ecology, and computation as entangled political actors, I provide two vignettes from an (auto)ethnography of the analysis and detection of malicious software. My analysis draws on seven months of fieldwork, training to become an analyst at an analysis and detection laboratory that relied on computationality and ecological methods for practising cybersecurity. Malware, as it is frequently known, has historically been known through its derivatives, the computational virus and worm. Today, malware has become a seemingly persistent and ever-present part of contemporary interconnected societies, with recent cases affecting the Irish Health Service Executive3 and the software provider SolarWinds, the latter part of a supply-chain compromise of the US government.4 From the late 1980s, privatised responses to securing against computational viruses materialised in anti-virus engines. Today, anti-virus technologies have been integrated into endpoint detection products that offer a broader range of techniques and strategies for malware detection, sold by a range of household names such as Norton and McAfee. Unlike in the era of the ‘script kiddies’ of the latter part of the twentieth century,5 malware is now primarily written by states and organised criminal gangs to exploit computational vulnerabilities, with ransomware the most egregious contemporary example.6 However, in this chapter, I do not focus on the essential analysis ongoing elsewhere on how to attend to such threats, but instead conceptualise how malware is analysed and detected by thinking of endpoint detection – and cybersecurity generally – as engaging with computation ecologically.

By arguing for an ecological appreciation of computation through computationality, I embrace perspectives from across software studies, media studies, and more-than-humanism, as well as the philosophy of computation.7 In emphasising computationality, I wish to extend, twist, and bend the ‘outputs’ of computation so that it is rendered a societal and political thing: one which cannot simply be disentangled from its relations or presented as an abstracted mathematical machine, but which is embedded in complex, recursive, political, and more-than-human ecologies. In this chapter, I demonstrate, through the exemplar of endpoint detection, how software is categorised as malicious and then becomes detected in ways that embrace the cognitive capacity of computation in various ecologies. The chapter thus proceeds by (1) giving a brief background to how computation has been conceptualised as a technology incapable of exerting agency and politics; (2) showing how conceptual developments have repositioned the human and can assist in assessing computation and recursivity in more-than-human and ecological forms; before (3) engaging more deeply with computational (re)cognition as a more-than-human political ecology that distances itself from readings centred on technological affordances, complexity, and agency; and (4) concluding with openings onto what computation as a political actor means for the study of digital ecologies.

Computation: from technology to ecology

Within the (computing) sciences, the dominant interpretation of computation has long regarded it as an axiomatic and logical system bound to certain mathematical rules, through which it can be controlled and rendered knowable. This frequently assumes that computation’s agency is limited to its authors, with reference to software authors’ capacity to generate ever-more sophisticated forms of reasoning. The central figure of the (human) author is sustained by a ‘tech bro’ culture in which resolutions to our problems are exclusively the preserve of a supposed elite of coders constructing new worlds – such as through the recent branding and materialisation of the ‘metaverse’.8 McKenzie Wark’s critique of such Silicon Valley culture identifies the importance of coders in the formation of a new iteration of capitalism, ‘vectorialism’, which splices society and power according to the ability to code and create new societal ‘vectors’.9 In such accounts of the importance of software authors and coders, from Wark to Zuckerberg, however, there is a pervasive centring of the human – or of coding elites – as being in control of computation. Such a position strips computation of political agency, or else ascribes that agency to complexity,10 where it manifests, in some accounts, in bugs, glitches, and errors.11 Together, these perspectives suggest that computation’s agency emerges either from an inherent environmental complexity or from an intending and political human.

Computation as a complex tool, or as one with certain technological affordances or affects, rests on multiple genealogies of our relationship to technology,12 which extend back into the (Western) Enlightenment and its attempt to categorise and offer a world of rationality and order.13 Although there is not space to fully explore the often-conflicting historical lineages of technological thought, Yuk Hui has noted how the body became understood as mechanical at the same time as René Descartes made a distinction between humans and non-human animals.14 Animals – and bodies generally – became considered ‘machine-like’ as they were ‘devoid of mind and consciousness, and hence lacking in sentience’.15 In this privileging of the human mind through a mind–body dualism, it is not only computation that is denied an ability to participate in politics, but also all non-human organic animals and plants. Gottfried Wilhelm Leibniz likewise created a further division between natural and artificial machines, with the former more complex than the latter.16 In this reading, animals and organic matter rank above technology. Collectively, these produce an implicit hierarchy with the dominance of the human mind at the top, followed by human bodies, ‘natural’ machines, and, at the bottom, artificial, technological machines: subservient to humans and less sophisticated than other organic matter.

In contrast, cybernetics, led by individuals such as Norbert Wiener,17 advocated for understanding animals, humans, and machines by abstracting their interactions to information.18 Although N. Katherine Hayles has been critiqued for over-simplifying the complex emergence of the study of cybernetics and its various orders,19 early cybernetics’ abstraction and equivalence of things, permitted by an emphasis on information and feedback systems, enabled computational and human intelligence to be routinely compared and contrasted. Cybernetic equivalence has allowed computation – most prolifically in discussions of machine learning algorithms – to be compared to human forms of intelligence.20 However, within the dominant lineages of Western thinking identified above, computation is still rendered a tool, beneath human forms of intelligence. This is not to claim that the abstraction and equivalence of things should be ascribed to all cyberneticists, or indeed to all technological thought, but, I argue, it offers one foundation for understanding how and why computation can be perceived as more ‘intelligent’ than humans even as it remains relegated to being another technology on the lowest rung of the hierarchy of politics and agency. Although such distinctions are often unsettled in popular anxieties about robotic domination and control in post-anthropogenic landscapes, these derive from an anxiety about our relationship to computation as a political actor that we cannot place: either computers are seeking to control us, or we them. Together, this leaves computation lacking a more-than-human politics or agency and, when it is granted one, it is shrouded in the hyperbole of domination.

More-than-humanism, ecology, and recursivity

Since the 1990s, there have been moves, sometimes unrelated to technological thinking, across the social sciences, humanities, and beyond to address the position of the human in contemporary thought and practice. These have sought, often through enmeshed and interlinked genealogies, to understand and recognise the role of non-human agencies and their impact upon our societies and politics, whether through, for example, new materialisms,21 Actor–Network Theories,22 or animal studies.23 In geography, these have often consolidated and been promoted through a focus on ‘more-than-humanism’24 that advocates for an expanded ‘we’ as much as for experimentation in research praxis.25 Likewise, the philosophy of computation has attended to computational agency, logics, and the capacities for inductive processes of artificial thought.26 Collectively, these bodies of thought have productively questioned the central role of the human and continued (feminist) traditions of deconstructing the dualism of mind and body, both within and outside of geography.

In veins similar to more-than-humanism within geography, the concept of ecology within media studies and software studies frequently pursues and develops related paths in continental philosophy, noting how software works across and through various materials and in different socio-cultural interactions.27 This suggests that technology is generative of affordances on, through, and in response to societal dynamics. Ecological thinking has likewise been adopted in human geography through the feminist thinking of Donna Haraway,28 in thinking about the ‘Anthropocene’,29 and in association with the concept of Gaia.30 This has been accompanied by questioning of the Eurocentrism – and thus the positionality of the human – of such approaches and its impact on other ways of living.31

When considering ecological computationality within geography, I combine two streams of thought: software studies, media studies, and the philosophy of computation on the one hand, and more-than-humanism on the other. This permits two things. First, it takes seriously the former’s empirical and conceptual contributions on the technological affordances and logics through which computation shapes society, in ways similar to core work in geography on the matter;32 second, the chapter draws on geography’s complementary repositioning of differently positioned peoples and communities, experimental practice, and working alongside the agencies of animals and plants in more-than-human ways.

As software studies and media studies have explicitly examined, the ecologies of computation must attend to the sheer volume of computational materialities exhibiting ever greater interdependencies, which have enhanced the complexity, negotiation, translation, and intertwining of human subjectivities and technological affordances. This deepening spatial distribution – and environmental complexity – of computation, alongside growing processing capacities, has been accompanied by the decreasing cost of computing hardware. This has enabled sensors, actuators, cameras, and other forms of computation to become commonplace in the monitoring and assessment of a range of ecologies.33 In the process, computational interdependencies, supported by big data collection, have enabled the possibility of enhanced recursion, especially with machine learning algorithms, allowing affects and effects to be generated at scale.34 However, this does not only occur at supposedly large scales. For example, when one types into a text document on a computer, it is not simply a word appearing on a screen. Instead, a whole host of processes must first read the electronic signal from a keyboard, interpret it, and find and access a place in memory for a stored value, which in turn requires various other processes to retrieve, translate, and display this through a text-processing program on a screen. Often, such interactions function as expected; however, there is always a potential for a mistranslation, a misreading within the ecology of processes, which often expresses itself when a program crashes or glitches.35
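To make this chain of translations concrete, the sketch below caricatures, in a few lines of Python, the passage from keyboard signal to stored value to rendered text. It is a deliberately simplified illustration with invented names and mappings, not a description of any actual operating system or editor; its point is only that even ‘typing a word’ involves an ecology of readings and translations, each a possible site of misreading.

```python
# A deliberately simplified sketch (all names and mappings invented) of the chain
# of translations through which a key press becomes a character on screen. Real
# systems involve many more layers (drivers, input methods, font rendering),
# each a further site of possible mistranslation.

SCANCODE_TO_CHAR = {0x04: "a", 0x05: "b", 0x2C: " "}  # tiny, made-up keyboard map


def read_scancode(raw_byte: int) -> str:
    """Interpret the electronic signal (here, a single byte) sent by the keyboard."""
    return SCANCODE_TO_CHAR.get(raw_byte, "?")  # '?' stands in for a misreading


def store_in_buffer(buffer: list[str], char: str) -> None:
    """Find a place in memory (a list, here) and store the typed character."""
    buffer.append(char)


def render(buffer: list[str]) -> str:
    """Translate the stored values back into something a screen can display."""
    return "".join(buffer)


document: list[str] = []
for signal in (0x04, 0x05, 0x2C, 0x04):  # a short burst of key presses
    store_in_buffer(document, read_scancode(signal))

print(render(document))  # -> "ab a"
```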

One common embrace of more-than-humanism in human geography engages with various new materialisms, which have often built upon, and under-acknowledged, similar veins of thought.36 New materialisms emphasise the capacity of things to have agency, with the philosopher Jane Bennett making a distinction between the ‘big agency’ of humans and the ‘small agency’ of worms.37 Yet this distinction suggests a primacy of the human as ‘big’, or as hierarchically atop other things. Turning to discussions of the agencies of algorithms, with their interconnections, big data relations, and capacities for recursive reasoning,38 such a hierarchical distinction begins to decompose. Thus, it may be pertinent to consider how there are various gradients of capacity for agency, dependent on the capacity for recursive reasoning, where computation allows for ecologies to be deconstructed and reconstituted by ‘deep’ machine learning algorithms.39 To summarise: in contemporary new materialist thinking, computation – and other technologies, as well as organic and inorganic ecologies – can be understood either through environmental complexity (such as slippages, errors, and bugs) or through an expanded view of agency, which emphasises the capacity of inorganic technologies to be generative of affordances and to perform in ways that extend and complicate human-centric notions of the term. Hence, ecologies of computationality consist of the environmental complexities of computation as much as the affordances they permit, shaping and contouring societies through the performances computation engages in. This could, for example, involve computational materials such as sensors with certain material qualities, generating big data, and software that affords a capacity to render an environment knowable in certain ways. Yet, in developing such productive lines of thought, new materialisms have not explicitly attended to computation’s capacity for recursive reasoning and have relegated computation to being another inorganic agent.

Analysing and detecting malware

When conducting an autoethnography of a malware analysis and detection laboratory, I often encountered the affordances of computational materials as much as I glimpsed ecologies of computational materials as distinctive political actors. The majority of the labour I performed in the laboratory consisted of the production of malware detections using techniques that have existed since the earliest days of anti-virus technologies. The detection ‘signatures’ I crafted sought to prevent software identified as malicious from executing, using code matching against unique attributes found in the software and its code. Contemporary endpoint detection now utilises a much wider range of techniques and strategies in addition to detection signatures, including ‘behavioural’ analyses that monitor computational environments for changes, as well as machine learning algorithms that use big data from a range of sources to create features that identify malicious attributes in analysed software. The detection engines of endpoint detection enterprises are today installed on millions, if not billions, of devices, analysing environments and detecting malware and suspicious behaviour in an industry worth billions of US dollars each year. Endpoint detection is widely used, and perhaps forms one of the most data-intensive, spatially complex, yet under-studied areas of cybersecurity. It is, in many forms, an attempt to monitor, transform, and shape a planetary ecology of computation to limit malicious and suspicious activity.
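A rough sense of what signature-based detection involves can be given in a short sketch. The Python below is illustrative only: the signature names and byte sequences are invented and do not reflect the format or matching engine of Sophos or any other vendor; it simply shows the general logic of flagging a file when a set of unique attributes all appear within it.

```python
# An illustrative sketch of signature-based detection: a 'signature' here is a set
# of byte sequences that must all appear in a file for it to be flagged. The names
# and byte strings are invented and do not reflect any vendor's detection format.

SIGNATURES = {
    "Illustrative/Agent-A": [b"\xde\xad\xbe\xef", b"connect_to_c2"],
    "Illustrative/Agent-B": [b"encrypt_files", b".onion"],
}


def match_signatures(file_bytes: bytes) -> list[str]:
    """Return the names of every signature whose sequences all occur in the file."""
    return [
        name
        for name, sequences in SIGNATURES.items()
        if all(sequence in file_bytes for sequence in sequences)
    ]


sample = b"...header...connect_to_c2...\xde\xad\xbe\xef...payload..."
print(match_signatures(sample))  # -> ['Illustrative/Agent-A']
```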

At Sophos, in Oxfordshire, England, where I conducted my (auto)ethnography, multiple different forms of analysis and detection were in concurrent use, intermingling and threaded across one another. When sitting at my desk, I was confronted with a bewildering amount of data from various databases, both internal and external (such as the Alphabet-owned VirusTotal, a malware repository). This also included learning how to analyse software at the lowest level of human-readable code (known as ‘assembly language’) so that code instructions could be intricately read, understood, and sometimes incorporated into detection signatures. Although I do not have space to go into further detail here, what I seek to highlight is how the environments of the laboratory were already ecological – drawing together various tools for analysis, multiple, competing streams of big data generation and analysis, as well as analysts’ affective and embodied relations to maliciousness honed through experimentation and time spent conducting analysis of malware ‘in-depth’.40 However, such an ecology, where one uses computation as both the ‘tool’ of detection and the object of analysis, made it exceptionally difficult to ascertain who and what was doing the analysis and thus practising cybersecurity. For example, when I was sitting at my desk in the laboratory, looking at my screens, I was presented with ‘automated’, colour-coded percentages of the likelihood that the artefact being analysed was malicious, as well as with tools that could execute software in a simulated environment to assess whether or not it was malicious. Hence, it was not only me or the other analysts partaking in the protection of computers, but also computers in performance in ecologies, engaging in crucial political choices about what is or is not malicious.

Vignette 1: A false positive, 22 June 2017

I had just returned from lunch at the cafeteria on the ground floor of Sophos’ headquarters, wondering what I was going to do that afternoon. Before I had left for lunch, I had been working with a large group of software samples that had been tagged by other endpoint detection vendors as SupTab – a browser modifier.41 SupTab was variously tagged as either malicious or, in some cases, as a ‘potentially unwanted application’. The different tags applied to the same software are reflective of the social construction of ‘maliciousness’. Such tags are also the product of an affective and financial economy: affective through what is considered normal by analysts, and financial through the time and labour that laboratories can devote to the production of detections. However, big data analysis and sharing has enabled an ecological practice that identifies samples for further detection and structures and directs the limited labour of the laboratory. In this case, I was allocated a range of software samples that had been identified by other vendors as SupTab and needed to be detected by Sophos. To do so, I had generated a method to identify various ‘missed’ SupTab detections through an unusual set of strings intended to uniquely detect the software (or so I thought).42 However, it was after lunch, when my detection encountered the curated ecologies of quality assurance processes – intended to ensure that it would execute as expected and not detect ‘benign’ or ‘clean’ software out ‘in the wild’43 – that things went wrong.

Due to the increasing reliance on big data across cybersecurity to develop ecological awareness, there are persistent issues with data quality, especially when one relies on data feeds from, and comparisons to, other endpoint detection vendors. One such issue quickly destroyed any post-lunch malaise. When I unlocked my computer, a message was flashing from Elliott, another malware analyst, stating that I had submitted a detection for SupTab that was producing quality assurance errors. This was preventing the release and distribution of a range of detections packaged as part of that afternoon’s release to computers that had Sophos’ endpoint detection installed on them. I therefore quickly turned my attention to the analysis screens on my computer. What Sophos called the ‘false positive rig’ contained a data store of software deemed ‘benign’ or ‘clean’ by other endpoint detection vendors. The rig had identified my detection as ‘detecting on software’ tagged as non-malicious. The rig was used by quality assurance to ensure that detections produced by Sophos did not detect ‘clean’ software. In the case of my detection, it appeared I had not written a sequence of instructions precise enough to refer only to the unique attributes of SupTab. I had to investigate, and quickly.
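The logic of such a rig can be sketched in simplified form. The Python below is an illustration under assumptions, not Sophos’ quality assurance system: the corpus, the vendor tags, and the overly broad candidate signature are all invented, and the point is simply that a release is blocked whenever a candidate detection fires on anything the shared data tags as clean.

```python
# An illustrative sketch of a 'false positive' check: before release, a candidate
# detection is run over a corpus of files that other vendors have tagged as clean,
# and any hit blocks the release for investigation. The corpus, tags, and overly
# broad signature below are invented; the check is only as reliable as the tags.

clean_corpus = {
    "installer_setup.exe": b"...benign installer bytes...",
    "updater.exe": b"...shared search provider strings...",  # tagged 'clean' upstream
}

candidate_signature = [b"search provider strings"]  # hypothetical and too broad


def false_positive_hits(signature: list[bytes], corpus: dict[str, bytes]) -> list[str]:
    """Return the 'clean' files that the candidate signature would wrongly detect."""
    return [
        name
        for name, content in corpus.items()
        if all(sequence in content for sequence in signature)
    ]


hits = false_positive_hits(candidate_signature, clean_corpus)
if hits:
    print("Release blocked; detection fires on files tagged clean:", hits)
```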

As I dug into the case and worked line by line through my detection and examples of the supposedly ‘clean’ software, I found that the rig contained samples that had been incorrectly identified by other endpoint detection vendors as non-malicious. The misidentification was down to one endpoint detection vendor incorrectly tagging files as ‘clean’ – and therefore claiming they were neither malicious nor potentially unwanted – a claim that then propagated into Sophos’ quality assurance infrastructure. Such infrastructures, as much as they are reliant on big data sharing, offer a baseline of comparison to produce detections that are less likely to incorrectly detect software that is not malicious. In focusing on the infrastructural condition of big data sharing, I do not seek to single out Sophos. Rather, all endpoint detection vendors are dependent on similar infrastructures, which are indicative of the great interdependencies and complexities of embracing ecological computationality in cybersecurity. By engaging in greater recursivity through big data analysis, cybersecurity is not only shaping conditions on customer computers; cybersecurity itself is dependent on an ecology of unknown environments and infrastructures, as well as on computational analysis ‘tools’ that have been constructed from the affective engagements of analysts at other endpoint detection vendors. Thus, the ecology of Sophos’ laboratory had been disrupted by the complexity of the ecologies of big data sharing and computational networks, as well as by its reliance on ‘automated’ quality assurance techniques dependent on computationality. The false positive identified by the rig on my detection was one rupture, a caesura, on a warm June afternoon, exposing the complex ecologies at work in cybersecurity today.

Recognition and politics

However, as much as these ecological ruptures could plausibly be attributed to computational complexity, complexity did not appear to wholly explain what was going on in the laboratory. Drawing on the work of N. Katherine Hayles,44 and associated work on recursive societies,45 I found fruitful the argument that computation has a capacity to (re)cognise: that it is a cognitive actor supported by its recursive capacities. Such thinking allows for more-than-human political hybrids, which variously enable facial features to be matched with databases of ‘suspicious’ individuals at state borders, often folding in entrenched forms of injustice.46 This approach also extends to identifying plants through mobile applications, such as PictureThis, or to ascertaining whether software is malicious or not in cybersecurity. Hayles’ work specifically engages with materials according to their capacity to be either cognisers or non-cognisers, in contrast to a hierarchy based on human centrism. Cognisers form a broad polity of different things: organic beings such as people, animals, plants, and even viruses, but crucially also computation. In contrast, non-cognisers are things that do not read, interpret, or act upon signs47 – including rocks, ocean waves, and plastics – which could nonetheless be understood to dynamically interact and afford properties, as in much new materialist thought.

Hayles’ expansive notion of cognition opens up a plural, and political, ‘we’ by exploring how cognisers make choices and collectively develop meaning and communities. This makes computation distinct from other forms of technology in its capacity to recursively read the environments and ecologies it is within, interpret these, and then act by shifting bits, producing ‘outputs’, and more. Computation is not simply acting according to the forces exerted upon it; it is making choices. As I have expressed elsewhere with the notion of grammars of malware,48 this thinking is productive of complex ecological computationalities, involving people writing software and various computational materials making choices, alongside infrastructures and more. Certain grammars of cognitive capability are embedded in computational materials, enabling malware to make choices that lead to particular transformations in computation (as much as other computational materials and people are making choices too!). Focusing on the capacity for cognition also opens up polities in which all organic matter makes choices, albeit at different gradients and with different affects. For some plants, the cognitive scope and choices may be exceptionally limited, such as how to orientate towards light or seek nutrients in soils. Such capacities of both plants and animals to cognise, to be intelligent, and otherwise have been widely debated within geography and elsewhere.49 However, I claim that this broadening of choice-making is intimately political, without recourse to saying that all forms of cognition are equal, comparable, or commensurable with one another. So, rather than conceptualising computation through the figure of a tool, computationality enables a political acknowledgement of the complex processes that occur during the various readings, interpretations, and choices that emerge in such a process. This also suggests an alternative view of computation beyond the glitch, bug, or error: rather, an expression of choices that do not align with our humanly, representational aspirations.

Vignette 2: Machine learning algorithms, colours, and percentages

The importance of the recursive cognitive capacities of computationality was most clearly expressed to me in what was a relatively new addition to Sophos’ capabilities to both analyse and detect malware: convolutional neural network machine learning algorithms.50 Although algorithms have pervaded endpoint detection for many years, machine learning algorithms offer the promise of identifying ‘unknown’ malware, as has been promised for ‘threats’ in other security domains.51 This is because, most simply, machine learning algorithms leverage the cognitive capacity of computation to recursively iterate over digital data in feedback loops. This process (re)constitutes various ecologies, establishing new abstract features to categorise and render the world in new formations. In one use at Sophos, an algorithm processed a software sample’s byte distribution, presented as an image, to identify malicious features.52 This was achieved through training on the big data of previously identified malware that had been shared as detections (itself an ecology of contemporary cybersecurity).
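The general shape of this technique – though not Sophos’ model, training data, or architecture – can be sketched as follows. In this minimal Python illustration, assuming PyTorch, a file’s bytes are laid out as a small greyscale ‘image’ and passed through a tiny, untrained convolutional network that emits a single ‘likelihood of maliciousness’; in practice such a model would be trained on large corpora of previously labelled software and be far deeper.

```python
# A minimal sketch, assuming PyTorch, of classifying software from its bytes laid
# out as an image. The architecture, sizes, and sample bytes are invented for
# illustration; a real system would be trained on large labelled corpora.

import numpy as np
import torch
import torch.nn as nn


def bytes_to_image(file_bytes: bytes, width: int = 64) -> torch.Tensor:
    """Arrange a file's byte values row by row as a greyscale image tensor."""
    values = np.frombuffer(file_bytes, dtype=np.uint8)
    height = len(values) // width  # short files would need padding; omitted here
    values = values[: height * width].reshape(height, width).copy()
    return torch.from_numpy(values).float().unsqueeze(0).unsqueeze(0) / 255.0


class ByteImageClassifier(nn.Module):
    """A tiny convolutional network emitting a 'likelihood of maliciousness'."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.score = nn.Sequential(nn.Flatten(), nn.Linear(8 * 8 * 8, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.score(self.features(x))


model = ByteImageClassifier()  # untrained here: scores are arbitrary until training
sample = bytes(range(256)) * 64  # a stand-in for a program's bytes
likelihood = model(bytes_to_image(sample)).item()
print(f"likelihood of maliciousness: {likelihood:.0%}")  # the kind of figure shown to analysts
```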

However, to a malware analyst, recognising human-sensible features in this byte distribution is exceptionally difficult, and it remains largely incommensurable to them. I may be able to recognise some patterns, but not in the way that the algorithm did, creating new features of maliciousness that are broadly nonsensical to the analysts I worked with. Thus, such machine learning algorithms produce new formations of what is malicious or not. They literally perform new forms of security (and of what is ‘normal’, a profoundly political act). Such algorithms provide the basis for a new form of recognition to take place, where the algorithm is ‘grounded’ in the affective and embodied forms of maliciousness practised by malware analysts through the data used to train it. This produced, ultimately through computational recognition, new forms of what may be considered malicious, albeit still infused with norms established by malware analysts at Sophos and elsewhere. In many ways, this is a more-than-human, communal ‘we’ in which different political actors work together in often incommensurable forms.

Computational cognitive capacities also permit a speeding up, and an expanded reach, of cybersecurity and endpoint detection. Yet this is also perilously reliant on a more-than-human politics with alien forms of recognition embedded within it. When I was at my desk in the laboratory, the outputs of the machine learning algorithm were presented on my screen as the potential likelihood of maliciousness, given as a percentage with an associated hue of colour on the interface before me. When conducting the autoethnography, becoming-analyst, those colours and percentages obscured the recognition performed by the machine learning algorithm, infusing my affective relations with malware as computational material, and thus with what maliciousness was. It changed how I viewed the software I was analysing, transforming my perception of, and relationality to, this structured arrangement of code. I was being trained not solely by people, but by an ecology of computation itself. In this sense, malware analysis and detection are truly more-than-human endeavours.

However, computation was not only transforming my perception in the laboratory; it was actively constructing other ecologies by deeming what features were malicious or not. This leads to an increasing standardisation of what is considered ‘good’ software practice and establishes precedents over what a ‘good’ cybersecurity ecology may look like. Yet, when computation recognises and performs its choices, is it truly recognising ‘malware’, or something rather more unsettling: simply anomalous software? Anomalous to what? To our society’s capitalist response to computational vulnerability? I therefore assert that computation is not simply a tool with a unidirectional form of control and knowability from its authors. Computation actively participates in the political through its choices – which can intersect with our more humanly concerns. This occurs in ways that make its study all the more difficult, yet equally fascinating – whether that be through models of climate change or in addressing concerns over disinformation on social media. Such political capacities occur across all forms of computationality, not just machine learning algorithms – including in the translations required to write in a text document – albeit with a varying gradient of the choices that are available to be made. It is only with machine learning algorithms, however – and increasingly their integration with robotics – that the need to address computation’s explicit political capacities has become pressing, as it becomes imbricated across vast digital ecologies.

A political digital ecology?

In The Three Ecologies, Félix Guattari argues for ecologies to be understood across environments, social relations, and human subjectivity as three registers of an ‘ecosophy’.53 Guattari was aware, even in 1989, of the potential ‘of the technological and data-processing revolutions, as prefigured in the growth of a computer-aided subjectivity, which will lead to the opening up or, if you prefer, the unfolding [dépliage], of animal-, vegetal-, Cosmic- and machinic-becomings’.54 In contemporary thinking on ecologies of, and digitised through, computation, its recursive cognitive ability has radically transformed our societies by rearticulating forms of knowing and politics. Rather than as some backdrop for understanding animal or vegetal matter – to use Guattari’s phraseology – computation has been presented in this chapter as an active interlocutor and political actor enmeshed in communal more-than-human polities. Drawing upon insights on ecology from software studies and media studies, alongside work on more-than-humanism and the philosophy of computation, I have argued that computation is a cognising political actor, making choices aided by its recursive technological capacity. There is thus an incessant negotiation and translation going on all around us. It is not just an unfolding but a recursive folding, engaging in contested terrains, with alien relations of computational recognition,55 and with multiple forms of subjectivity that do not wholly ascribe to humanly ways of recognising, thinking, and doing.

In my reading of computation, unlike Hayles’, choice is a foundation for understanding a performative, recursive, and, crucially, political ecological computationality. This situates computation as distinct from other technologies that do not make choices, and places it within a more-than-human ecological polity alongside other forms of organic life. This does not neglect that there are important technological affordances – exerted, indeed, by all things – but these affordances should not be confused with political actors: those which actively choose how to read, interpret, and act in the world, as opposed to those which merely afford certain properties that shape those engaging in choice-making (albeit political actors can afford certain properties too, such as through what is written in malware). Such choices, as political negotiations and frictions across terrains in ecologies, may be incommensurable to our humanly representations, and the even greater capacity for choice-making in more abstracted computational forms, such as machine learning algorithms, increases this. I thus advocate for a more-than-human, expanded ‘we’, in which computation is one of a broader range of political actors at work in (digital) ecologies. Even where computational representations are not as complex as in machine learning algorithms, drawing on software studies and media studies’ close empirical attention to the fissures, ruptures, and interrelationships with our capitalist societies makes it possible to glimpse how more-than-human relationships,56 between computation, people, other technologies, organic life, and more, can enable a more expansive appreciation of our contemporary ecologies.

In this chapter, I have used two vignettes from the Sophos malware analysis and detection laboratory to demonstrate how ecological computational practices – computationality – are at work in cybersecurity. In identifying software as malicious, computation affords certain properties to collect and store vast amounts of big data; but this is also made ‘useful’ through the politics of recognition and choice. This means that computation is not just a tool to be set up for analysis. Rather, as an analyst, I was experimentally negotiating with a political actor that is diffuse: not whole, known, nor localisable, as much as it is distributed across customer endpoints and the sites of big data generation and collection. Cybersecurity, as one example, is then a complex hybrid of ecologies, with various political actors and complex environments in more-than-human collectives. Thus, researching digital ecologies, as this book proposes, also means studying the computationality upon which other ecologies may interface, interact, and be articulated afresh through computation’s capacity to be an entangled political actor. It is not enough simply to understand how computational media and interfaces may shape our perceptions, or how sensors may be able to understand other places in new ways; computationality must also be studied to understand the emergence of more-than-human modes of knowing, thinking, and politics.

Notes

1 Taffel, Digital Media Ecologies.
2 I have developed this thinking elsewhere in relation to security studies through topology and scene in Dwyer et al., ‘Topologies of security.’
3 PwC, Conti Cyber Attack on the HSE.
4 FireEye, ‘Highly evasive attacker.’
5 ‘Script kiddies’ refers to novice, or relatively unskilled, individuals who use common techniques and tools to gain access to, or attack, other computer systems.
6 Ransomware is a category of malware that encrypts – or mathematically ‘locks’ – files on a computer so that the user cannot access such files without the decryption key to ‘unlock’ them. This is accompanied by a ransom ‘note’ that informs the victim that they must pay a fee to gain access to their files.
7 Fazi, Contingent Computation; Fuller and Goffey, Evil Media; Parikka, Digital Contagions.
8 Zuckerberg, ‘Founder’s letter.’
9 Wark, Capital Is Dead.
10 Such a position on complexity can be considered with regard to the ‘black box’ of machine learning algorithms, where it is simply the complexity of recursive practices that needs to be ‘opened’ to observe the rationality of an algorithm’s outputs.
11 Parikka and Sampson, The Spam Book. However, see Leszczynski and Elwood, ‘Glitch epistemologies’ for a geographical perspective on the openings that glitches offer, albeit in a different context but which could offer glimpses into similar arguments I make regarding ecological computationality.
12 Pedwell, ‘Speculative machines and us.’
13 Bowker and Star, Sorting Things Out.
14 Hui, Recursivity and Contingency.
15 Hatfield, ‘René Descartes,’ n.p.
16 For more on this, see Raymont, ‘Leibniz’s distinction between natural and artificial machines.’
17 Wiener, Cybernetics.
18 Hayles, How We Became Posthuman.
19 Taffel, Digital Media Ecologies, p. 33.
20 Pedwell, ‘Speculative machines and us.’
21 Devellennes and Dillet, ‘Questioning new materialisms.’
22 Law, ‘After ANT.’
23 Barua, ‘Volatile ecologies.’
24 Dowling et al., ‘Qualitative methods II’; Greenhough, ‘More-than-human geographies.’
25 Whatmore, ‘Materialist returns.’
26 Fazi, ‘Can a machine think (anything new)?’; Parisi, ‘Critical computation.’
27 See Fuller, ‘Media ecologies’; Fuller and Goffey, Evil Media; Montfort et al., 10 PRINT CHR; Spencer, ‘Creative malfunction.’ Ecological thinking has multiple different avenues that this chapter cannot delve into, but it uses the position of theoretical media studies (particularly software studies) rather than that of social movement media studies (for a genealogy of the latter, see Treré and Mattoni, ‘Media ecologies and protest movements’).
28 Haraway, Staying with the Trouble.
29 Castree, ‘The epistemology of particulars.’
30 Latour, Facing Gaia; Stengers, In Catastrophic Times.
31 Panelli, ‘More-than-human social geographies’; Povinelli, Geontologies; Yusoff, A Billion Black Anthropocenes or None.
32 Kitchin and Dodge, Code/Space.
33 Gabrys, Program Earth.
34 Beer, ‘The problem of researching a recursive society.’
35 This is not to say that computation cannot be very tightly articulated and programmed, especially in high-assurance software. Yet this is because each step of the translation process is heavily scrutinised. Typically, however, this limits software to being highly attuned to a particular task and unable to reason – a capacity which, more often than not, is the allure and value of our contemporary attention towards machine learning algorithms.
36 Rosiek et al., ‘The new materialisms and Indigenous theories.’
37 Bennett, Vibrant Matter.
38 Beer, ‘The problem of researching a recursive society’; Hui, Recursivity and Contingency.
39 Amoore, Cloud Ethics.
40 Dwyer, Malware Ecologies.
41 For more information, see Microsoft Security Intelligence, Microsoft Security Intelligence Report 22.
42 These are concatenations of alphanumerical symbols. I do not provide any further detail on what these were, as it is highly likely that these are still used in active detections.
43 ‘In the wild’ is a common term in cybersecurity to refer to spaces – and ecologies – outside of a closed computational network or environment.
44 Hayles, ‘Can computers create meanings?’; Hayles, Unthought.
45 Beer, ‘The problem of researching a recursive society’; Pedwell, ‘Speculative machines and us.’
46 Amoore, ‘Machine learning political orders.’
47 For more on signs and information in computational cognition, see the conversation between N. Katherine Hayles and Tony Sampson in Hayles and Sampson, ‘Unthought meets the assemblage brain.’ This thinking should not, however, be directly associated with the cognitive sciences.
48 Dwyer, ‘Cybersecurity’s grammars.’
49 Lawrence, ‘Listening to plants.’
50 Although the details of these algorithms are too complex to note here, these types of algorithms analyse images.
51 Amoore, ‘The deep border.’
52 The bytes (8 bits) of a software program were transformed into a distribution that could then be visualised, with various colours, depending on whether it was encrypted or not, for instance.
53 Guattari, The Three Ecologies.
54 Ibid., p. 25.
55 Fazi, ‘Beyond human.’
56 Mackenzie, ‘Machine learners.’

Bibliography

Amoore, L. 2020. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Durham, NC: Duke University Press.
Amoore, L. 2021. The deep border. Political Geography, 110: 102547.
Amoore, L. 2023. Machine learning political orders. Review of International Studies, 49(1): 20–36.
Barua, M. 2014. Volatile ecologies: Towards a material politics of human–animal relations. Environment and Planning A, 46(6): 1462–1478.
Beer, D. 2022. The problem of researching a recursive society: Algorithms, data coils and the looping of the social. Big Data & Society, 9(2): 20539517221104996.
Bennett, J. 2010. Vibrant Matter: A Political Ecology of Things. Durham, NC: Duke University Press.
Bowker, G.C. and Star, S.L. 1999. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
Castree, N. 2005. The epistemology of particulars: Human geography, case studies and ‘context’. Geoforum, 36(5): 541–544.
Devellennes, C. and Dillet, B. 2018. Questioning new materialisms: An introduction. Theory, Culture & Society, 35(7–8): 5–20.
Dowling, R., Lloyd, K., and Suchet-Pearson, S. 2016. Qualitative methods II: ‘More-than-human’ methodologies and/in praxis. Progress in Human Geography, 41(6): 823–831.
Dwyer, A.C. 2019. Malware Ecologies: A Politics of Cybersecurity. PhD thesis, University of Oxford. Available from: https://ora.ox.ac.uk/objects/uuid:a81dcaae-585b-4d5b-922f-8c972b371ec8/.
Dwyer, A.C. 2023. Cybersecurity’s grammars: A more-than-human geopolitics of computation. Area, 55(1): 10–17.
Dwyer, A.C., Langenohl, A., and Lottholz, P. 2023. Topologies of security: Inquiring in/security across postcolonial and postsocialist scenes. Critical Studies on Security, 11(1): 1–13.
Fazi, M.B. 2018. Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics. London: Rowman & Littlefield International.
Fazi, M.B. 2019. Can a machine think (anything new)? Automation beyond simulation. AI & Society, 34(4): 813–824.
Fazi, M.B. 2021. Beyond human: Deep learning, explainability and representation. Theory, Culture & Society, 38(7–8): 55–77.
FireEye, 2020. Highly evasive attacker leverages SolarWinds supply chain to compromise multiple global victims with SUNBURST backdoor. Threat Research [online], 13 December. Available from: https://web.archive.org/web/20210312010850/www.fireeye.com/blog/threat-research/2020/12/evasive-attacker-leverages-solarwinds-supply-chain-compromises-with-sunburst-backdoor.html.
Fuller, M. 2005. Media Ecologies: Materialist Energies in Art and Technoculture. Leonardo (Series). Cambridge, MA: MIT Press.
Fuller, M. and Goffey, A. 2012. Evil Media. Cambridge, MA: MIT Press.
Gabrys, J. 2016. Program Earth: Environmental Sensing Technology and the Making of a Computational Planet. Minneapolis, MN: University of Minnesota Press.
Greenhough, B. 2014. More-than-human geographies. In A. Paasi, N. Castree, R. Lee, S. Radcliffe, R. Kitchin, V. Lawson, and C. Withers (Eds.) The SAGE Handbook of Progress in Human Geography (pp. 94–119). London: Sage.
Guattari, F. 2014. The Three Ecologies. Bloomsbury Revelations. London: Bloomsbury.
Haraway, D.J. 2016. Staying with the Trouble: Making Kin in the Chthulucene. Experimental Futures. Durham, NC: Duke University Press.
Hatfield, G. 2018. René Descartes. In E.N. Zalta (Ed.) The Stanford Encyclopedia of Philosophy, Summer 2018. Metaphysics Research Lab, Stanford University [online]. Available from: https://plato.stanford.edu/archives/sum2018/entries/descartes/.
Hayles, N.K. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago, IL: University of Chicago Press.
Hayles, N.K. 2017. Unthought: The Power of the Cognitive Nonconscious. Chicago, IL: University of Chicago Press.
Hayles, N.K. 2019. Can computers create meanings? A cyber/bio/semiotic perspective. Critical Inquiry, 46(1): 32–55. https://doi.org/10.1086/705303.
Hayles, N.K. and Sampson, T. 2018. Unthought meets the assemblage brain: A dialogue between N. Katherine Hayles and Tony D. Sampson. Capacious: Journal for Emerging Affect Inquiry, 1(2): 60–84.
Hui, Y. 2019. Recursivity and Contingency. Media Philosophy. London: Rowman & Littlefield International, Ltd.
Kitchin, R. and Dodge, M. 2011. Code/Space: Software and Everyday Life. Cambridge, MA: MIT Press.
Latour, B. 2017. Facing Gaia: Eight Lectures on the New Climatic Regime. Cambridge, UK: John Wiley & Sons.
Law, J. 1999. After ANT: Complexity, naming and topology. The Sociological Review, 47(S1): 1–14.
Lawrence, A.M. 2022. Listening to plants: Conversations between critical plant studies and vegetal geography. Progress in Human Geography, 46(2): 629–651.
Leszczynski, A. and Elwood, S. 2022. Glitch epistemologies for computational cities. Dialogues in Human Geography, 12(3): 361–378.
Mackenzie, A. 2017. Machine Learners: Archaeology of a Data Practice. Cambridge, MA: MIT Press.
Microsoft Security Intelligence. 2017. Microsoft Security Intelligence Report 22 [online]. Available from: www.microsoft.com/en-us/security/business/security-intelligence-report.
Montfort, N., Baudoin, P., Bell, J., Bogost, I., Douglass, J., Marino, M.C., Mateas, M., Reas, C., Sample, M., and Vawter, N. 2012. 10 PRINT CHR$(205.5+RND(1)); : GOTO 10. Cambridge, MA: MIT Press.
Panelli, R. 2009. More-than-human social geographies: Posthuman and other possibilities. Progress in Human Geography, 34(1): 79–87.
Parikka, J. 2016. Digital Contagions: A Media Archaeology of Computer Viruses. 2nd ed. New York: Peter Lang.
Parikka, J. and Sampson, T.D. 2009. The Spam Book: On Viruses, Porn, and Other Anomalies from the Dark Side of Digital Culture. Cresskill, NJ: Hampton Press.
Parisi, L. 2019. Critical computation: Digital automata and general artificial thinking. Theory, Culture & Society, 36(2): 89–121.
Pedwell, C. 2022. Speculative machines and us: More-than-human intuition and the algorithmic condition. Cultural Studies, 38(2): 188–218.
Povinelli, E.A. 2016. Geontologies: A Requiem to Late Liberalism. Durham, NC: Duke University Press.
Raymont, P. 1998. Leibniz’s distinction between natural and artificial machines. The Paideia Archive: Twentieth World Congress of Philosophy, 11: 148–152.
Rosiek, J.L., Snyder, J., and Pratt, S.L. 2020. The new materialisms and Indigenous theories of non-human agency: Making the case for respectful anti-colonial engagement. Qualitative Inquiry, 26(3–4): 331–346. https://doi.org/10.1177/1077800419830135.
Spencer, M. 2021. Creative malfunction: Finding fault with Rowhammer. Computational Culture, 8. http://computationalculture.net/creative-malfunction-finding-fault-with-rowhammer/.
Stengers, I. 2015. In Catastrophic Times: Resisting the Coming Barbarism (A. Goffey, Trans.). London: Open Humanities Press.
Taffel, S. 2019. Digital Media Ecologies: Entanglements of Content, Code and Hardware. London: Bloomsbury Academic.
Treré, E. and Mattoni, A. 2016. Media ecologies and protest movements: Main perspectives and key lessons. Information, Communication & Society, 19(3): 290–306. https://doi.org/10.1080/1369118X.2015.1109699.
Wark, M. 2019. Capital Is Dead: Is This Something Worse? London: Verso.
Whatmore, S. 2006. Materialist returns: Practising cultural geography in and for a more-than-human world. Cultural Geographies, 13(4): 600–609.
Wiener, N. 1948. Cybernetics: Or Control and Communication in the Animal and the Machine. New York: J. Wiley.
Yusoff, K. 2018. A Billion Black Anthropocenes or None. Minneapolis, MN: University of Minnesota Press.
Zuckerberg, M. 2021. Founder’s Letter, 2021. Meta [online], 29 October. Available from: https://web.archive.org/web/20220115154749/https://about.fb.com/news/2021/10/founders-letter/.