Throwback Thursday: Use, Misuse, Disuse, and Abuse of Automation

In this throwback post, I will introduce some important but similar-sounding terms from the automation literature, along with their causes: use, misuse, disuse, and abuse. I will highlight the key takeaways from the article, followed by some light commentary.

Until recently, the primary criteria for applying automation were technological feasibility and cost. To the extent that automation could perform a function more efficiently, reliably, or accurately than the human operator, or merely replace the operator at a lower cost, automation has been applied at the highest level possible.
— Parasuraman & Riley, p. 232

The authors made this statement 20 years ago, and the irony is that it still seems to be the dominant design and engineering philosophy: apply the highest levels of automation wherever technologically feasible, without much regard for the consequences for human performance.

Automation use: Automation usage and attitudes towards automation are correlated. Often these attitudes are shaped by the reliability or accuracy of the automation.
— Parasuraman & Riley, p. 234

I'm not very good with directions and find myself relying heavily on Google Maps when I am in an unfamiliar city. I use it because I know I can rely on it most of the time. In other words, my positive attitude toward and use of Google Maps (i.e., the automated navigation aid) are shaped by its high reliability and by my greater confidence in it than in my own navigational skills (Google Maps has come to my rescue on numerous occasions!).

Similarly, operators in complex environments will tend to defer to automation when they think it is highly reliable and when their confidence in the automation exceeds their confidence in their own abilities to perform the task.

Misuse: Excessive trust can lead operators to rely uncritically on automation without recognizing its limitations or fail to monitor the automation’s behavior. Inadequate monitoring of automated systems has been implicated in several aviation accidents.
— Parasuraman & Riley, pp. 238-239

While it is true that, when reliable, automation may be better than humans at some tasks, the costs associated with failure are high for these highly reliable automated systems. As Rich described in his last throwback post, a potential consequence of highly reliable, automated systems is the out-of-the-loop performance syndrome: the inability of operators to take over manual control in the event of an automation failure, both because of their overreliance on the automation and because of the degradation of their manual skills.

High trust in automation can also make human operators less attentive to contradictory sources of information; operators become so fixated on the notion that the automation is right that they fail to examine other information in the environment that suggests otherwise. In research, we call this automation bias (more on this in a later post).

Misuse can be minimized by designing automation that is transparent about its state and its actions and that provides salient feedback to human operators.  Next week's throwback post will elaborate on this point.

Disuse: If a system is designed to minimize misses at all costs, then frequent device false alarms may result. A low false alarm rate is necessary for acceptance of warning systems by human operators.
— Parasuraman & Riley, p. 244

This means that automation with a high propensity for false alarms is less likely to be trusted. For example, if the fire alarm in my building goes off all the time, I am less likely to respond to it (the cry-wolf effect). It's not as simple as saying, "just make it less sensitive!" Designing an automated system with a low false alarm rate is a bit of a conundrum, because a lower false alarm rate comes with a higher miss rate; that is, the fire alarm may not sound when there is a real fire.
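To make this conundrum concrete, here is a minimal sketch of the classic signal detection framing in Python. It assumes, purely for illustration, that the alarm's sensor reading follows one normal distribution when there is no fire and another, overlapping one when there is; the means and thresholds below are hypothetical, not values from the paper.

```python
from scipy.stats import norm

# Hypothetical signal-detection model: sensor readings are normally
# distributed around 0 when there is no fire and around 2 when there
# is one (means and unit variances are illustrative assumptions).
NOISE_MEAN, SIGNAL_MEAN = 0.0, 2.0

def alarm_rates(threshold):
    """Return (false_alarm_rate, miss_rate) for a given alarm threshold."""
    false_alarm = norm.sf(threshold, loc=NOISE_MEAN)  # P(alarm | no fire)
    miss = norm.cdf(threshold, loc=SIGNAL_MEAN)       # P(no alarm | fire)
    return false_alarm, miss

# Raising the threshold ("making it less sensitive") cuts false alarms
# but inflates misses -- the conundrum described above.
for t in (0.5, 1.0, 1.5, 2.0):
    fa, miss = alarm_rates(t)
    print(f"threshold={t:.1f}  false alarms={fa:.1%}  misses={miss:.1%}")
```

Because the two distributions overlap, no threshold can drive both error rates to zero; a designer can only trade one for the other.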

While the cost of distrusting and disusing automation is high, the cost of missing an event can also be high in safety-critical domains. Designers should therefore weigh the comparative costs of high false alarm rates and high miss rates when designing automation. The right balance will obviously depend on the context in which the automation is used.
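One way to reason about those comparative costs is to pick the threshold that minimizes expected cost: the false alarm rate weighted by the cost of a false alarm, plus the miss rate weighted by the cost of a miss. The sketch below builds on the model above; the event probability and costs are hypothetical stand-ins, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical figures for illustration only: real fires are rare, but
# a missed fire costs far more than a nuisance alarm.
P_FIRE = 0.001
COST_FALSE_ALARM = 1.0    # annoyance, evacuations, eroded trust
COST_MISS = 10_000.0      # property damage, injury

def expected_cost(threshold, noise_mean=0.0, signal_mean=2.0):
    fa = norm.sf(threshold, loc=noise_mean)      # P(alarm | no fire)
    miss = norm.cdf(threshold, loc=signal_mean)  # P(no alarm | fire)
    return (1 - P_FIRE) * fa * COST_FALSE_ALARM + P_FIRE * miss * COST_MISS

# Sweep candidate thresholds and keep the cheapest one.
thresholds = np.linspace(-2.0, 4.0, 601)
best = min(thresholds, key=expected_cost)
print(f"cost-minimizing threshold: {best:.2f}")
```

With these particular numbers, the rare-but-catastrophic miss pulls the optimal threshold down, accepting more false alarms; flip the cost ratio and the optimum moves the other way, which is exactly the context dependence noted above.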

Abuse: Automation abuse is the automation of functions by designers and implementation by managers without due regard for the consequences for human (and hence system) performance and the operator’s authority over the system.
— Parasuraman & Riley, p. 246

My earlier throwback post discusses the importance of considering the human performance consequences associated with automation use. Completely eliminating the human operator from the equation, on the assumption that this will eliminate human error entirely, is not a wise choice. It can leave operators with a higher workload and in a position to perform tasks for which they are not suited, an irony discussed by Bainbridge in 1983. In short, operators' responsibilities should be based on their capabilities.


Citation: Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.

Downloadable link here.