New study reveals Australians turning to AI for mental health support

15 November 2024

Ongoing challenges in accessing mental health care have seen an increasing number of people turn to artificial intelligence (AI) to obtain more easily available, low-cost and private mental health support, new research suggests.

The study, led by Orygen and published recently in JMIR Mental Health, provides critical insights into how AI is currently being used to support mental health care in Australia, and is the first to survey both community members and mental health professionals on their use, perceptions, and experience of benefits and harms related to AI.  

Lead author, Clinical Psychologist and Director of Digital Service Transformation and Research at Orygen Digital, Associate Professor Shane Cross, said that AI had significant potential to address service challenges in mental health care, potentially automating some tasks and providing new forms of support.  

“We know that increasing numbers of people are experiencing mental health difficulties each year, and less than half of those who suffer get the treatment they need, so AI has the potential to revolutionise mental healthcare by making it more accessible, personalised and efficient,” Associate Professor Cross said.  

"However, this is a new technology, and we must proceed with caution by addressing the significant concerns raised by survey respondents related to privacy, ethics, and the quality of AI-generated advice to ensure these tools are safe and effective.  

“Despite their appeal, commercial AI tools like ChatGPT were not developed with these specific uses in mind, and therefore carry risks.”   

The study surveyed 107 community members and 86 mental health professionals, finding many were already using AI tools and held differing but largely positive attitudes towards AI in supporting future mental health care.  

More about the survey:   

  • The survey found AI tools, most commonly ChatGPT, were used by around a third of community members, primarily for quick emotional support or in some cases as their own personal ‘therapist’.  

  • 40 per cent of mental health professionals used AI in their practice, primarily to help with paperwork like note taking, report writing and research.  

  • Of those who had used AI in the last six months, 76.7 per cent of community members and 91.8 per cent of mental health professionals reported AI to be beneficial to varying degrees.  

  • Importantly, nearly half of the community members (46.7 per cent) and over half of the mental health professionals (51.4 per cent) experienced risks or harms when using AI, including issues related to data privacy, ethical use, potential misdiagnosis, unhelpful advice and reduced human connection.  

Associate Professor Cross said one of the key challenges for mental health professionals was the burden of time-intensive administrative tasks, which limited their availability to provide clinical care to more people.

“We found that mental health professionals were most likely to use AI tools for things like research, report and letter writing, and other kinds of administrative support – and were also keen to see more AI tools developed to help with things like synthesising clinical evidence and tracking patient progress,” Associate Professor Cross said.  

“Community members, on the other hand, were more likely to use AI tools for advice when they were emotionally distressed, or as a coach or therapist when they weren’t able to access in-person support.”  

While both groups saw the potential of AI to support mental health care in terms of accessibility, cost reduction, personalisation and efficiency, they were equally concerned about reducing human connection, ethics, privacy and regulation, medical errors, potential for misuse and data security.  

“Despite the immense potential, AI integration into mental health systems must be approached with caution,” Associate Professor Cross said.