Have you ever caught yourself screaming in frustration because Siri keeps confusing Guardian Ad Litem with Guardians of the Galaxy?
If so, you might want to pause a moment to consider what this says about you.
More than 35 million Americans are using voice-activated apps like Apple’s Siri, Microsoft’s Cortana or Amazon’s Alexa. And at any given moment, a good number of them are losing their cool at their invisible assistants.
“[A]s tech companies introduce more advanced assistants and chatbots, some of these conversations inevitably turn hostile,” according to this article in CNN Money. “Maybe Siri mistakenly calls your boss at night instead of your girlfriend. Or you’re upset that your meeting is being rescheduled and decide to shoot the automated messenger. Call it version 2.0 of banging on your keyboard or throwing a mouse at the wall. And now, the machine talks back.”
All in good fun, you might say. But problems arise when rude behavior toward digital assistants begins to seep into your human interactions.
“Venting at machines could lead to a coarsening of how people treat each other,” Sherry Turkle, director of the MIT Initiative on Technology and Self, says here. “You yell at Alexa, you know Alexa is a machine. We treat machines as though they were people. And then, we are drawn into treating people as though they were machines.”
Rage At The Machines
There is little data on what is being called technology abuse. But one study found that in 10 to 50 percent of all human-computer interactions, the digital assistant or call agent was mocked, taunted or treated in a way that would be unacceptable in the analog world.
“Are there ethical objections to using offensive language, to verbally abusing digital technology?” asks digital ethicist Frank Buytendijk. “One would say there are none. You can’t really offend technology, so it doesn’t matter. You could compare it to shouting really loud to a wall, but it wouldn’t be impressed.”
No harm, no foul, right? Maybe not.
“I think it does show something about one’s inner civilization,” Buytendijk goes on to say. “As if you would only behave because others expect you to. Or because it may have consequences. Isn’t good behavior self-evident, a reward in itself and part of building character? Furthermore, your behavior could be offensive to other people that might witness it. And maybe you could start behaving badly to others, because you got used to it in dealing with Siri. We can at least conclude that verbal abuse of technology doesn’t make you a better person.”
Siri Fights Back
Meanwhile, tech companies are spending money to curb the abuse.
When users curse at Cortana, for example, she has been taught to respond: “Well, that’s not going to get us anywhere.” Siri answers with “I’d blush if I could,” “I’ll pretend I didn’t hear that,” or something similar.
Computer engineers say the next big thing will be machines that are able to probe deeper into the user’s frustration and suggest possible remedies.
“As someone who has witnessed a balky laptop hurled out of a second-floor window, I understand the frustration with technology,” writes Barb Darrow in Fortune Tech. “But bad behavior toward Siri or Alexa or Cortana shows more about a person’s character than the technology itself. Which is why it’s so endearing to see parents of young children insist that those kids say please and thank you to Siri or Alexa. In other words, do unto Siri as you would have Siri do unto you.”
Sources: CNN Money http://money.cnn.com/2017/08/22/technology/culture/personal-voice-assistants-anger/index.html?sr=recirc090717rageagainsthemachine1030vodtop · Fortune http://fortune.com/2016/09/29/dont-swear-at-siri/ · Gartner Blog (Frank Buytendijk) http://blogs.gartner.com/frank-buytendijk/2016/09/26/can-you-abuse-technology-it-certaintly-works-the-other-way-around/