industry & research
campusreview.com.au
I, Robot, MD
University of Melbourne
professor says AI is the future
for mental health assistance.
By Wade Zaglas
Thankfully we’ve come a long way
since people with mental health
issues – such as depression, bipolar
disorder or psychosis – were locked up in
large, austere institutions or subjected to
barbaric treatments.
With the introduction of more targeted
drugs, such as selective serotonin reuptake
inhibitors (SSRIs) and mood stabilisers,
including lithium, the lives of people with
mental illness have undoubtedly improved.
There have also been many
psychological therapies trialled and
approved, the most common being
cognitive behavioural therapy (CBT).
However, while these developments
have helped many mental health patients
reintegrate into society, more needs to
be done. Devising effective mental health
treatments is a pressing issue. One in five
Australians will experience a mental illness
at some point in their lives, and services
in rural and remote areas struggle to offer
effective, lifelong treatment.
The cost to the country from mental
illness is also sobering. According to the
Royal Australian and New Zealand College
of Psychiatrists, in 2014 mental illness cost
the Australian economy $98 billion, or 6 per
cent of Australia’s GDP.
So, when you consider the vastness of
Australia, the prevalence of mental health
conditions in the community, and the cost
to the country, funding research in this area
becomes critical.
Enter AI, or artificial intelligence.
Grant Blashki, an associate professor
from the Nossal Institute for Global Health
at the University of Melbourne, said recently
that “if we can navigate the ethical and
privacy concerns”, artificial intelligence
may help us keep up with the increasing
demand for mental health services.
Despite his enthusiasm, he cautioned
that developing trust in AI will be harder
than achieving it with a qualified health
professional such as a psychologist or
psychiatrist.
Algorithms are a source of anxiety too,
he added, particularly when they reach
conclusions that are “incorrect, biased or
even discriminatory”.
What can’t be ignored, however, is that
these algorithm-based digital platforms for
accessing mental health care have the potential
to reach a much broader segment of the
population, particularly those who feel
stigmatised by their condition, live remotely,
or whose socioeconomic status makes
accessing traditional services difficult.
There is a plethora of internet-based
mental health interventions, often referred
to as e-therapies or e-counselling, such
as moodgym and myCompass, and many
have proven to be effective. However,
Blashki makes a distinction between these
approaches and AI-based mental health
solutions, which are “designed to learn and
to adjust to change based on experience to
make better decisions in the future”.
As he points out, AI technology is already
being trialled to ascertain a person’s
predisposition to mental health conditions
through the comments they make on
Facebook. In that sense, it could help to
detect depression before its full onset.
In one 2018 study, researchers analysed
700 users’ Facebook records, with their
consent, and found correlations
between the types of language used online
and the existence of a depressive disorder.
Recurrent themes also popped up, with
hostility, isolation, rumination and self-
reference permeating many of the posts
belonging to the group with depressive
disorders.
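The kind of language analysis described above can be sketched, very loosely, as keyword counting against theme lexicons. This is only an illustration, not the researchers’ actual method (which relied on far richer language modelling); the lexicons, function names and threshold below are invented for the example.

```python
# Illustrative sketch: count words associated with themes such as
# self-reference and isolation, and flag posts that lean on them heavily.
# The lexicons and threshold are invented for this example.
import re
from collections import Counter

THEMES = {
    "self_reference": {"i", "me", "my", "myself"},
    "isolation": {"alone", "lonely", "nobody"},
}

def theme_counts(post: str) -> Counter:
    """Count how many words in a post fall under each theme."""
    words = re.findall(r"[a-z']+", post.lower())
    counts = Counter()
    for theme, lexicon in THEMES.items():
        counts[theme] = sum(1 for w in words if w in lexicon)
    return counts

def flag_post(post: str, threshold: int = 2) -> bool:
    """Flag a post whose total theme hits meet the threshold."""
    return sum(theme_counts(post).values()) >= threshold
```

For example, `flag_post("I feel so alone, nobody calls me")` would be flagged, while a neutral post about the weekend would not. A real system would work across a user’s whole posting history, not single posts.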
As research into the area of AI and mental
health progresses, it is becoming apparent
that it holds a lot of promise as a tool for
both early detection and diagnosis.
“Clinicians have for many decades utilised
mood-tracking tools to help monitor
patient progress, but AI brings a much more
comprehensive – and perhaps intrusive –
approach to tracking patient trajectories,”
Blashki says.
Singapore is forging ahead with an
AI-inspired intervention for people with
a mental health condition. The program,
called Cogniant, uses phone data to
monitor patient behaviour and inform the
clinician of progress or relapse.
The program can also monitor an
individual’s daily routines and activities,
which helps indicate how the person is
coping with their condition.
Of course, ethical and privacy concerns
will need to remain at the forefront of
researchers’ minds, as such information in
the wrong hands could be dangerous, even
life-destroying. ■