Police Control Systems focuses on the way that British police institutions have controlled the individual constable on the ‘front line’. This control has been exercised by a variety of different institutions and individuals, ranging from direct day-to-day input from ‘the community’, responsibility under Common Law, through bureaucratic systems built around exacting codes of rules – and the gradual modification of this process to accommodate a growing professionalism – to the real-time control of officers by radio, coupled with the increasing use of surveillance techniques. This is the first book on police history which looks at how police institutions worked on a day-to-day level. It challenges the idea that the reformed police of the early nineteenth century were automatically ‘professional’, asserting instead that in most respects they were de-professionalised. It describes the role played in police organisations by books, forms, clerks, and telephones, and looks at how some of this technology was derived from military precedents. It argues that many – but not all – of the technical milestones in these institutional developments were precipitated by national security concerns. It ends with an analysis of the development of the Police National Computer in the 1960s and 1970s: a milestone in policing and computing history which has never been explored before.
Long before the invention of the mechanical clock, the monastic computus offered
a model of time that was visible, durable, portable and objectifiable. The
development of ‘temporal literacy’ among the Anglo-Saxons involved not only the
measurement of time but also the ways in which the technologies used to measure
and record time — from sundials and church bells to calendars and chronicles —
worked to create and reorder cultural capital, and add new scope and range to
the life of the imagination. Techniques of time measurement are deeply
implicated in historical consciousness and the assertion of identity; this paper
proposes some avenues of exploration for this topic among the Anglo-Saxons.
violence to economic prosperity, are ‘presented as unambiguous and
objective’ because they ‘are grounded in the certainty of
numbers’. Such a conception of numbers is encapsulated by Desrosières (2001: 348) when he
talks of ‘metrological realism’. This viewpoint holds that
‘computed moments (averages, variances, correlations) have a substance that
reflects an underlying macrosocial reality, revealed by those computations’.
In other words, numbers reveal something about the
analysis to current discussions about data
relations. Couldry and Mejias (2019: 337,
340) suggest that ‘Data colonialism combines the predatory
extractive practices of historical colonialism with the abstract quantification
methods of computing’; they explain that ‘this rests on the
construction of data as a “raw material” with natural value’.
Segura and Waisbord (2019) voice
concern about the danger that the data-colonialism critique will become
This book offers a practical introduction to digital history with a focus on working with text. It will benefit anyone who is considering carrying out research in history that has a digital or data element and will also be of interest to researchers in related fields within digital humanities, such as literary or classical studies. It offers advice on scoping a project, evaluating existing digital history resources, working with large text resources, managing digital data, and approaching data visualisation. After placing digital history in its historiographical context and discussing the importance of understanding the history of the subject, this guide covers the life-cycle of a digital project from conception to digital outputs. It assumes no prior knowledge of digital techniques and shows you how much you can do without writing any code. It will give you the skills to use common formats such as plain text and XML with confidence. A key message of the book is that data preparation is a central part of most digital history projects, but that work becomes much easier and faster with a few essential tools.
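The kind of data preparation the blurb describes – turning an XML-encoded source into plain text ready for analysis – can be sketched in a few lines of Python. This is a generic illustration, not code from the book; the sample document and its element names are invented for the example.

```python
# Minimal sketch of a common digital-history preparation step:
# extracting plain text from an XML-encoded source for further analysis.
# The sample letter and its element names are invented for illustration.
import xml.etree.ElementTree as ET
from collections import Counter

sample_xml = """<letter>
  <head>London, 14 March 1851</head>
  <body>
    <p>The committee met to discuss the census returns.</p>
    <p>The returns for the parish were incomplete.</p>
  </body>
</letter>"""

root = ET.fromstring(sample_xml)

# Gather the text of every paragraph element into one plain-text string.
plain_text = " ".join(p.text for p in root.iter("p"))

# A first, crude analysis step: a simple word-frequency count.
words = plain_text.lower().replace(".", "").split()
counts = Counter(words)

print(plain_text)
print(counts.most_common(2))  # → [('the', 4), ('returns', 2)]
```

Even a toy pipeline like this shows why preparation matters: decisions such as which elements to keep (here, only `<p>`, discarding `<head>`) and how to normalise case and punctuation shape every count that follows.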
reframing in the context of contemporary digital networks, the power-laden dynamics of which are epitomised in the increasingly ubiquitous technology of cloud computing.
In the following, I interrogate how dynamics of capturing clouds in digital domains (in both possible meanings) interfere with borders and state power, and how they are resisted and rearticulated in and through contemporary works of art. Do digital networks and data clouds subvert state power and borders? Or do they, rather, reiterate and reinforce received structures of dominance by
Why does the US view China’s progress in
dual-use AI as a threat to its first-mover advantage? How might the US respond to this
perceived threat? This chapter considers the intensity of US–China strategic
competition playing out within a broad range of AI and AI-enabling technologies (e.g.
machine learning (ML), 5G networks, autonomy and robotics, quantum computing, and big-data
analytics). It describes how great-power
competition is mounting within several dual-use high-tech fields, why these
… contradicts itself, cuts itself off, grinds itself up’.
Television’s self-legible autonomy helps to explain its uncanniness to its early users. Indeed, even in their mid-century infancy, cybernetics and computing were already instigating new questions about the relationship between reality, consciousness and codes of meaning, complicating the definition of human subjectivity and – in the case of Grey Walter’s uncannily lifelike robots – of human psychopathology.
‘Jigging like a clumsy Narcissus’: William Grey
time imaging has been increasingly applied in the study of human and, to a lesser extent, animal mummies (Adams and Alsop 2008: 22; McKnight 2010).
Computed tomography (CT) was introduced in 1972, initially to image the
human brain, and it transformed the practice of neuroradiology (Hounsfield
1973). Whole-body CT scanners with larger gantries subsequently became available from 1975. CT has had a significant impact on clinical diagnosis and patient
management and is extensively used for these purposes; it is now the
imaging technique most widely
Alan Warde, Jessica Paddock, and Jennifer Whillans
might be conceptualised as requiring a complex imaginary equation which computes costs of money, time, quality and personal reputation. Considerable variation between households should be anticipated, as the possible permutations are vast. We therefore use some detailed vignettes of interviewees (drawing on their survey responses as well as the testimonies from in-depth interviews) to show how each finds different solutions to the problem of permuting varied types of events to construct the platform for their eating arrangements. From these individual