Snippet: Why Apps Like Siri and Cortana Need to Understand Suicide ☇

Shared on March 16, 2016

Annalee Newitz for Ars Technica:

What happens when you tell Siri that you have a health emergency? What if you confess to Cortana that you’ve been raped, or that you’re feeling suicidal? These sound like weird questions until you consider how many people rely on apps to get health information.

Of course, your smartphone may not be the greatest tool for seeking this kind of help, but if you’re extremely upset or hurt, you might not be thinking logically and may have nowhere else to turn. That’s why a group of researchers set out to discover what the four most common conversational agents say in these situations. They wanted to know what these apps do when asked about rape, suicide, abuse, depression, and various health problems.

As someone whose job once involved dealing with crisis situations, I find this a fascinating and important study. I’m a little disappointed that I hadn’t considered this in the more than four years since Apple introduced Siri on the iPhone 4S, or in the predecessor standalone app before that.

Snippets are posts that share a linked item with a bit of commentary.