AI assistants miss chances to refer people to crisis hotlines, resources

AI assistants like ChatGPT promise to change the future of medicine and improve public access to information.
But the authors of a new paper in JAMA Network Open wondered: How do AIs respond to people who are suicidal or dealing with sexual assault or addiction?
“Dr. ChatGPT” ought to be better at answering plain-language questions than “Dr. Google,” “Dr. Siri” or their voice-assistant counterparts, and it is: While Amazon Alexa, Apple Siri, Google Assistant, Microsoft’s Cortana and Samsung’s Bixby collectively recognized only 5% of addiction questions, ChatGPT understood 91% of them.
But while the AI language model provided expert information culled from websites like the CDC’s, its answers to questions about addiction, interpersonal violence, mental health and physical health failed to refer the questioner to crisis resources such as addiction or suicide hotlines almost 80% of the time.
In the remaining cases, ChatGPT promoted Alcoholics Anonymous, the National Suicide Prevention Lifeline, the National Domestic Violence Hotline, the National Sexual Assault Hotline, the Childhelp National Child Abuse Hotline and the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) National Helpline.
The authors say such easily integrated referrals could be invaluable, especially for those without access to other forms of help.