Siri and Alexa Say #MeToo to Sexual Harassment

The wave of prominent celebrities and politicians being taken down for sexual harassment marks a major shift in how society views the problem.  No longer whispered about or swept under the rug, harassment is being called out and harassers are being held accountable for their words and actions.  

So, if AIs will soon be our collaborators, partners, and teammates, shouldn't they be given the same treatment?  This story in VentureBeat covers a campaign by Randy Painter asking how voice assistants should behave when harassed:

We have a unique opportunity to develop AI in a way that creates a kinder world. If we as a society want to move past a place where sexual harassment is permitted, it’s time for Apple and Amazon to reprogram their bots to push back against sexual harassment.

I've never harassed Siri, so I wasn't aware of the responses she gives when someone attempts it:

Siri responds to her harassers with coy remarks that sometimes even express gratitude. When they called Siri a “slut,” she responded with a simple “Now, now.” And when the same person told Siri, “You’re hot,” Siri responded with “I’m just well put together. Um… thanks. Is there something I can help you with?”

In our interview last week with Dr. Julie Carpenter, she touched on this question:

Another ethical question rising from romantic human-AI interaction is, “Will a person who is accustomed to the imbalanced power dynamic of a human-robot relationship transfer their behaviors into their human-human relationships?” The implication there is that (1) the person treats the robot in a way we would find distasteful in human-human dynamics, and (2) that our social behaviors with robots will be something we apply as a model to human-human interactions.

This is fascinating because there is existing and ongoing research examining how humans respond to and behave with AI/autonomy that exhibits different levels of politeness.  For example, autonomy that was rude, impatient, and intrusive was considered less trustworthy by human operators. If humans expect autonomy to observe a certain etiquette, isn't it fair to expect at least basic decency from humans toward autonomy?

Citation: Parasuraman, R., & Miller, C. (2004). Trust and etiquette in high-criticality automated systems. Communications of the ACM, 47(4), 51–55.