Throwback Thursday: The Ironies of Automation
Don't worry, our Throwback Thursday doesn't involve embarrassing pictures of me or Arathi from 5 years ago. Instead, it is more cerebral. The social science behind automation and autonomy is long and rich, and although it was one of the earliest topics of study in engineering psychology, it has even more relevance today.
Instead of re-inventing the wheel, why don't we look at the past literature to see what is still relevant?
In an effort to honor that past but also inform the future, the inaugural "Throwback Thursday" post will highlight scientific literature from the past that is relevant to modern discussion of autonomy.
Both Arathi and I have taught graduate seminars in automation and autonomy so we have a rich treasure trove of literature from which to draw. Don't worry: while some of the readings can be complex and academic, in deference to our potentially diverse readership, we will focus on key points and discuss their relevance today.
The Ironies of Automation
In this aptly titled paper, written back in 1983(!), Bainbridge discusses the ironies that arise when humans interact with automation. Her words ring especially true today, when the design strategy of some companies is to treat the human as an error term to be eliminated:
But is this design strategy sustainable? Bainbridge later wisely points out that:
The paper then discusses how, under such an approach, many unintended problems arise. The ultimate irony, however, is that implementing very high levels of automation (including eliminating the driver in a self-driving car) ultimately places a higher workload burden on the "passenger."
Citation (warning: link is behind a paywall): Bainbridge, L. (1983). The ironies of automation. *Automatica*, 19(6), 775-779.