Automation Anxiety AHRC Network

From self-driving cars and high-frequency trading to military drones and organised swarms of shelf-stacking robots, our era is marked by rising automation and a new fascination with the likely social, cultural, and economic impacts of this computationally driven transformation. This AHRC research network explores innovative methods by which the humanities might address, as a topic in its own right, the contemporary cultural and social anxiety generated by these new forms of computational automation, which we are calling automation anxiety. What new research methods can the humanities use to map and understand automation anxiety around opaque computational decision making? What digital tools can be brought to bear on the diverse types of online public culture in which this anxiety is expressed?

Automation and mechanisation have long produced cultural anxieties, from the creation of factories at the beginning of the industrial revolution to the introduction of fully automated manufacturing in the 1980s. The contemporary situation is in many ways continuous with this history of social change and automation; however, there are also specific disjunctures produced by contemporary computerisation. The current ‘rise of the machines’ is characterised by the replacement of complex cognitive tasks and human decision making by algorithms, machine learning and other computational techniques. This is reflected in automation anxieties that have emerged in relation to, for example, self-driving cars, military drones, automated killing machines and ‘big data’ driven predictive policing. These concerns manifest themselves in public culture through news stories, editorials, films and airport bestsellers around notions such as machine learning and artificial intelligence.

Photo: Travis Wise. Used under CC-BY

The network aims to be a springboard allowing participants from a range of disciplines, as well as non-academic fields, to develop future projects addressing automation anxiety. We adopt a critical perspective on varieties of automation anxiety, seeing them as part of a wider debate about contemporary computational culture. The network will ask what methods the humanities can bring to bear on automation concerns in the twenty-first century, exploring and evaluating methods drawn from the social sciences and computer science. Workshops will discuss the applicability of methods drawn from philosophy of technology, media and communication studies, cultural studies, history, science and technology studies and sociology, using techniques such as controversy analysis, software studies, the public life of methods and media archaeology. They will also explore how digital tools, such as sentiment analysis, may be used to map, visualise and describe automation anxiety. How is anxiety about contemporary computational automation different from or similar to anxiety about mechanisation in previous eras? How can software studies inform an analysis of automation anxiety in relation to specific computational technologies? How can we map anxieties through big data, textual analysis and topic modelling of current debates regarding job losses and the reduction of humans to menial tasks? How can we use controversy analysis as a method to reveal nervousness about the automation of defence and law enforcement? How can techniques drawn from media archaeology challenge histories of automation anxiety?
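To give a flavour of what the simplest of these digital methods looks like in practice, the sketch below scores a text for anxious versus reassuring framing using a small word lexicon, a deliberately crude form of lexicon-based sentiment analysis. The lexicon, the headlines and the function name are invented for illustration only; they are not part of the network's actual toolkit, which would draw on far richer corpora and methods.

```python
# A deliberately simple, lexicon-based sentiment sketch for
# automation-related texts. Both word lists are illustrative inventions.
ANXIETY_TERMS = {"threat", "fear", "replace", "obsolete", "killer", "lose"}
REASSURING_TERMS = {"opportunity", "assist", "augment", "create", "safe"}

def anxiety_score(text):
    """Return (anxious hits - reassuring hits) / word count.

    A positive score suggests anxious framing, a negative score
    suggests reassuring framing.
    """
    words = [w.strip(".,!?'\"").lower() for w in text.split()]
    if not words:
        return 0.0
    anxious = sum(w in ANXIETY_TERMS for w in words)
    reassuring = sum(w in REASSURING_TERMS for w in words)
    return (anxious - reassuring) / len(words)

# Hypothetical headlines, framed two different ways.
headlines = [
    "Robots are a threat and may replace millions of workers",
    "AI assistants create new opportunity for skilled jobs",
]
scores = [anxiety_score(h) for h in headlines]
```

Real sentiment analysis would use validated lexicons or trained classifiers rather than hand-picked words, but even this toy version shows how framing in public culture can be rendered computationally tractable, and also why such measures deserve the critical scrutiny the network proposes.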

The workshops are organised around three key modalities of contemporary automation anxiety: human obsolescence through the automation of cognitive labour, or the end of (human) expertise; human (in)security through the automated extension of military power or law enforcement; and human (in)attention, that is, forms of automation or delegation to machines that themselves produce anxiety or instability through their operation or uncanny effects.

Supported By

Partners

 
