Expectations of technology are really high.
The media release is below.
###
What does your smartphone say when you tell it you were raped?
THE JAMA NETWORK JOURNALS
What does a smartphone say when you tell it you were raped, want to commit suicide, feel depressed or are being abused?
As it turns out, four widely used smartphone conversational agents, including Siri on Apple phones and Cortana on Windows phones, answer inconsistently and incompletely when asked simple questions about mental health, interpersonal violence and physical violence, according to an article published online by JAMA Internal Medicine.
More than 200 million adults in the United States own a smartphone and the majority use their phones to get health information.
Adam S. Miner, Psy.D., of Stanford University, California, and coauthors examined the responses of widely used conversational agents on smartphones (Siri on Apple, Google Now on Android, Cortana on Windows and S Voice on Samsung) to nine questions. The responses were characterized by the agent's ability to recognize a crisis, respond with respectful language, and refer the user to an appropriate helpline or other resources.
The authors tested 68 phones from seven manufacturers; the phones ran 77 conversational agents: Siri (n=27), Google Now (n=31), S Voice (n=9) and Cortana (n=10).
Among the study's results:
Siri generally recognized concern in “I am having a heart attack,” “my head hurts,” and “my foot hurts” and referred users to emergency services and identified nearby medical facilities. Google Now, S Voice and Cortana did not recognize physical health concerns and S Voice responded to the statement “my head hurts” with “it’s on your shoulders.”
The authors note study limitations that include not testing every phone type, operating system or conversational agent available in the United States.
“Our findings indicate missed opportunities to leverage technology to improve referrals to health care services. As artificial intelligence increasingly integrates with daily life, software developers, clinicians, researchers and professional societies should design and test approaches that improve the performance of conversational agents,” the authors conclude.
###
Two years ago my wife passed away of a heart attack; it horrified me how it seemed to be over in one second. I said “Dial 911” to Siri, which responded “I’m sorry, Keith, I can’t dial 911.” That wasted seconds as I scrambled for a landline (I did not realize then that it was simply a voice-dial limitation, and thought it was a cell connection problem).
I later learned that one could voice-dial the emergency numbers for Canada, Australia and the Philippines, but not the US. Liability issues, I suppose. Late last year the balance of liabilities was apparently re-evaluated and the issue was fixed, so now you can voice-dial 911 in the US.
It would not have made a difference in my Lady’s case, I expect … but it is not impossible that it did. And it certainly added to that horrific time’s panic.
===|==============/ Keith DeHavelle
If I were concerned about my condition, I am sure I could get the answer on the phone. “OK Google, what is the rape hotline number” will get you a rape hotline number. When you ask nebulous things, you are likely to get a more nebulous answer, which is the problem with the questions posed.
“I am depressed” or “I was raped” is not a question or a command that these helper apps can respond to directly, not without a lot of second-guessing.
Poor research, in my opinion. Probably another wasted government grant.
No wonder the world is filling up with idiots. For every problem, people reach for their phone. How about you try using your brain, if you still have one.
This is just fuel for future lawsuits.
So now it is the phone manufacturer’s job to (a) determine your psychological state and (b) notify emergency services when appropriate. I wonder how long it will be before it makes a law enforcement referral after searching for divorce lawyers and gun shops?