Anti-computing explores forgotten histories and contemporary forms of dissent – moments when the imposition of computational technologies, logics, techniques, imaginaries, and utopias has been questioned, disputed, or refused. It also asks why these moments tend to be forgotten. What is it about computational capitalism that means we live so much in the present? What has this to do with computational logics and practices themselves?
This book addresses these issues through a critical engagement with media archaeology and medium theory and by way of a series of original studies: explorations of Hannah Arendt and early automation anxiety, witnessing and the database, the Two Cultures from the inside out, bot fear, and singularity and/as science fiction. Finally, it returns to remap long-standing concerns against new forms of dissent, hostility, and automation anxiety, producing a distant reading of contemporary hostility.
At once an acute response to urgent concerns around toxic digital cultures, an accounting with media archaeology as a mode of medium theory, and a series of original and methodologically fluid case studies, this book spans an interdisciplinary research field that includes cultural studies, media studies, medium studies, critical theory, literary and science fiction studies, media archaeology, medium theory, cultural history, and the history of technology.
Anti-computing punctuates computational advances: the rush ahead, the demand to be wanted, the claims for progress, and for progress as automatically good. Anti-computing is a pause, a stop, a refusal, an objection, a sense, an emotion, a response, a popular campaign, a letter, an essay, a code-work, a theorist, a sensibility, an ambience, an absolute hostility, a reasoned objection, a glitch, a hesitation, an ambient dislike. It may be articulated by a human, a crowd, a network, or by a program that refuses to run. It may also be an element
Long before the invention of the mechanical clock, the monastic computus offered a model of time that was visible, durable, portable, and objectifiable. The development of ‘temporal literacy’ among the Anglo-Saxons involved not only the measurement of time but also the ways in which the technologies used to measure and record time — from sundials and church bells to calendars and chronicles — worked to create and reorder cultural capital, and to add new scope and range to the life of the imagination. Techniques of time measurement are deeply implicated in historical consciousness and the assertion of identity; this paper proposes some avenues of exploration for this topic among the Anglo-Saxons.
violence to economic prosperity, are ‘presented as unambiguous and objective’ because they ‘are grounded in the certainty of numbers’. Such a conception of numbers is encapsulated by Desrosières (2001: 348) when he talks of ‘metrological realism’. This viewpoint holds that ‘computed moments (averages, variances, correlations) have a substance that reflects an underlying macrosocial reality, revealed by those computations’. In other words, numbers reveal something about the
analysis to current discussions about data relations. Couldry and Mejias (2019: 337, 340) suggest that ‘Data colonialism combines the predatory extractive practices of historical colonialism with the abstract quantification methods of computing’; they explain that ‘this rests on the construction of data as a “raw material” with natural value’. Segura and Waisbord (2019) voice concern about the danger that the data-colonialism critique will become
Police Control Systems focuses on the way that British police institutions have controlled the individual constable on the ‘front line’. This control has been exercised by a variety of different institutions and individuals, ranging from direct day-to-day input from ‘the community’ and responsibility under Common Law, through bureaucratic systems built around exacting codes of rules (gradually modified to accommodate a growing professionalism), to the real-time control of officers by radio, coupled with the increasing use of surveillance techniques. This is the first book on police history to look at how police institutions worked on a day-to-day level. It challenges the idea that the reformed police of the early nineteenth century were automatically ‘professional’, asserting instead that in most respects they were de-professionalised. It describes the role played in police organisations by books, forms, clerks, and telephones, and looks at how some of this technology was derived from military precedents. It argues that many, but not all, technical milestones in these institutional developments were precipitated by national security concerns. It ends with an analysis of the development of the Police National Computer in the 1960s and 1970s: a milestone in policing and computing history which has never been explored before.
In the 2020s we live in an era of automation anxiety and automation fever (Bassett and Roberts, 2020). There are rising concerns around extant computational cultures, near-horizon developments, and longer-term predictions about computational futures. Anti-computing of various kinds is back with a vengeance. A new moment of urgency arises. Once again the contemporary moment is proclaimed as the time of make or break, the time for decisive action. Once again it is argued that technology will in short order crystallize into a good or bad
; this came to seem essential to understanding and constructing anti-computing. But why look back? Following the turn of the decade, as I complete this book, there is abundant hostility to computing, anxiety about its impacts, and rejection of its visions in the here and now. Automation anxieties around the future of work are fuelling a new anti-computational turn, data-surveillance issues haunt formal politics, and there is rising concern over screen ‘addiction’ in the young. Limit points are being declared and last-chance saloons announced. The massacre in
This book offers a practical introduction to digital history with a focus on working with text. It will benefit anyone who is considering carrying out research in history that has a digital or data element, and will also be of interest to researchers in related fields within the digital humanities, such as literary or classical studies. It offers advice on the scoping of a project and the evaluation of existing digital history resources, together with a detailed introduction to working with large text resources, managing digital data, and approaching data visualisation. After placing digital history in its historiographical context and discussing the importance of understanding the history of the subject, this guide covers the life-cycle of a digital project from conception to digital outputs. It assumes no prior knowledge of digital techniques and shows you how much you can do without writing any code. It will give you the skills to use common formats such as plain text and XML with confidence. A key message of the book is that data preparation is a central part of most digital history projects, but that this work becomes much easier and faster with a few essential tools.