Digital | Catalyst
Looking to the human side of AI
Professor Greg Whitwell, chair of the CEMS global alliance in management education and dean of the University of Sydney Business School, explores the need to look beyond artificial intelligence (AI) and question the human drivers behind it.
In a recent global survey of CEMS
graduates, the majority in their
mid-to-late twenties, 81% agreed with
technology leaders such as Facebook’s
Mark Zuckerberg that access to the
internet, and therefore unlimited
knowledge, should be a basic human
right. The principle behind this was
that more knowledge leads to better
decisions. However, while we are
awash with knowledge today, we don't
necessarily have greater understanding,
because we don't have the skills to evaluate
the quality or veracity of the information
freely available.
Bias-free knowledge?
Most of us assume that AI- or
machine-generated knowledge is free
from bias because it is created by a
machine, without the emotion or
prejudice attached to the human condition.
CEMS’ corporate partners (more
than 70 global organisations that belong
to the CEMS alliance and influence
the curriculum taught on the CEMS
Master’s in International Management
in 32 top business schools across the
world) are increasingly realising that
it’s dangerous for their employees to
accept, without question, knowledge
created this way. They are questioning
what we should be teaching about
the use of tech in business, and
how we can train our
current workforces and future leaders
to optimise their use of tech while
recognising its limitations.
The humans behind the machines
It is crucial to understand how
AI-generated knowledge is created
and the human reality behind it. We
need to understand the nature of the
people who have created these systems,
and to recognise that those systems are
fundamentally human creations which
tend to incorporate the unthinking biases
of their creators. For example, what kind
of people are coders (traditionally
introverted) and what might this
mean for the impact their creations
have globally? How should that change
our thinking around how we use tech
in business, especially when trying to
create diverse, inclusive workforces?
Curriculum reform (be it
undergraduate, postgraduate, executive
education or in-house training)
therefore requires something much
more profound than the introduction
of an understanding of coding. This is
a praiseworthy initiative, but it doesn’t
go far enough.
It is one thing to learn about the
technicalities of writing code; it is
another to be able to critically appraise
issues such as privacy in a digital world;
the nature, assumptions and biases of
algorithms and those who write them;
the power and responsibilities of those
organisations that collect, own and
sell the personal data extracted every
time someone uses a smart device;
the means by which misinformation,
‘fake news’ and ‘alternative facts’
are generated, disseminated and
accepted; cyber bullying and the rise of
'call-out culture'; and cyber security
and data integrity.
A need for continuous education
A well-rounded education is now more
important than ever. With that comes
a responsibility for educators to help
learners challenge assumptions, to
rethink what they had previously taken
for granted and to question norms, as
part of a determination to see if there is
a better way of doing things.