Exploring the Social Science of Human-Autonomy Interaction

[Reprint] Automation: Friend or Foe

This is a reprint of an article, authored by Arathi Sethumadhavan, part of a series of articles originally published in Ergonomics in Design in April 2011.

With advancements in technology, automated systems have become an integral part of our society. Modern humans interact with a variety of automated systems every day, ranging from the timer in a microwave oven to the global positioning system in a car to the buttons in an elevator.

Just as automation plays a pivotal role in improving the quality of living, it also helps reduce operator errors in safety-critical domains. For example, using a simulated air traffic control task, Rovira and Parasuraman (2010) showed that the conflict detection performance of air traffic service providers was higher with reliable automation than with manual control.

Although automated systems offer several benefits when reliable, the consequences associated with their failure are severe. For example, Rovira and Parasuraman (2010) showed that when the primary task of conflict detection was automated, even highly reliable (but imperfect) automation resulted in serious negative effects on operator performance. Such performance decrements when working with automated systems can be explained by a phenomenon called automation-induced complacency, which refers to lower-than-optimal monitoring of automation by operators (Parasuraman & Manzey, 2010).

High operator workload and high automation reliability contribute to complacency. Experts and novices as well as individuals and teams are prone to automation-induced complacency, and task training does not appear to completely eliminate its effects. However, performance decrements arising from automation-induced complacency can be addressed by applying good design solutions.

In this issue of the Research Digest, John D. Lee, professor of industrial and systems engineering at the University of Wisconsin–Madison, and Ericka Rovira, assistant [Eds. now associate] professor of engineering psychology at West Point, provide automation design guidelines for practitioners based on their research and expertise in the area of human-automation interaction.


What factors need to be taken into consideration when designing automated systems?

Ericka Rovira

  • Determine the level of operator involvement. This should be the first step, as discussed in Rovira, McGarry, and Parasuraman (2007). How engaged should the operator be? Is the operator expected to take over control in the event of an automation failure?
  • Determine the degree of automation appropriate for the domain. The appropriate degree of automation is closely tied to the level of operator involvement. The degree of automation in a programmable stopwatch can be very different from the degree of automation in a military reconnaissance task. In the latter task, the failure of the automated aid can have disastrous consequences. As a rule of thumb, as the degree of automation increases, operator involvement declines, and as a result, there is less opportunity for the operator to recover in the face of an automation error.
  • Design automated aids in such a way that operators have adequate time to respond to an automation failure.
  • Make the automation algorithm transparent so that operators are able to build a mental picture of how the automation is functioning. Providing operators with information on the uncertainties involved in the algorithm can help them engage in better information sampling and consequently help them respond quickly to automation failures.
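As a loose illustration of the transparency guideline above (purely hypothetical and not from the original article), an automated aid might report its confidence and an intermediate rationale alongside its recommendation, so the operator knows when to sample the raw data more closely. The thresholds and the `detect_conflict` function below are invented for the sketch:

```python
from dataclasses import dataclass

@dataclass
class AidOutput:
    """What a transparent automated aid reports to the operator."""
    conflict_predicted: bool
    confidence: float  # 0.0-1.0, the aid's own uncertainty estimate
    rationale: str     # intermediate result the operator can verify

def detect_conflict(predicted_separation_nm: float) -> AidOutput:
    """Toy conflict-detection rule with a hypothetical 5 nm threshold.

    Confidence shrinks near the decision boundary, where the aid is
    least reliable, cueing the operator to check the underlying data.
    """
    threshold_nm = 5.0
    conflict = predicted_separation_nm < threshold_nm
    margin = abs(predicted_separation_nm - threshold_nm)
    confidence = min(1.0, margin / threshold_nm)
    rationale = (f"predicted separation {predicted_separation_nm:.1f} nm "
                 f"vs {threshold_nm:.1f} nm threshold")
    return AidOutput(conflict, round(confidence, 2), rationale)

# A near-threshold case is flagged, but with low confidence:
print(detect_conflict(4.5))   # conflict_predicted=True, confidence=0.1
# A clear case comes back with high confidence:
print(detect_conflict(10.0))  # conflict_predicted=False, confidence=1.0
```

The point of the sketch is only the shape of the output: exposing confidence and a checkable rationale is one concrete way to let operators build a mental picture of how the automation functions.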

John Lee

Make automation trustable. Appropriate trust and reliance depends on how well the capabilities of the automation are conveyed to the operator. Specific design considerations (Lee & See, 2004) include the following:

  • Design the automation for appropriate trust and not simply greater trust.
  • Show the past performance of the automation.
  • Illustrate the automation algorithms by revealing intermediate results in a way that is comprehensible to the operators.
  • Simplify the algorithms and operation of the automation to make it more understandable.
  • Show the purpose of the automation, design basis, and range of applications in a way that relates to operators’ goals.
  • Evaluate any anthropomorphizing of the automation to ensure appropriate trust.
  • Show how the context affects the performance of the automation and support operators’ assessment of situations relative to the capabilities of the automation.
  • Go beyond the individual operator working with the automation. That is, take into consideration the cultural differences and organizational structure when designing automated systems, because this can influence trust and reliance on automation.

In conclusion, whether automation is an operator's friend or foe depends largely on how well practitioners apply the design principles that are paramount for effective human-automation interaction, some of which are outlined here, when designing these systems.

Original article link: http://journals.sagepub.com/doi/pdf/10.1177/1064804611411409