On Topic | Catalyst
A global perspective
Instead, build inclusivity and diversity into the initial
product requirements as a vital parameter alongside
high potential and great performance.
“If you are clear about your objectives, you can
build something that’s accurate and fair, as well as
being true to your particular organisation’s values.
With clear objectives and parameters, you’ll also
ensure explainability, not only to candidates, but
also if decisions were to be legally challenged. Once
a product is built and works for you, machine learning
can then be applied to make it vastly more scalable.”
Similarly, expanding the breadth of data used to
power a hiring algorithm can transform outcomes.
Diversity-recruiting-software company Headstart
– like many AI companies – benchmarks ‘what good
looks like’ by using data from existing employees. But
on top of this, it analyses millions of job descriptions
and roles to see how a candidate’s attributes might
predict their future performance.
“You don’t base a hiring decision on just one
factor,” says founder Nicholas Shekerdemian. “We
review our model in real time so we’re looking at
thousands of data points across lots of organisations
– we start to see a networked effect.”
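The idea of weighing many signals rather than basing a decision on a single factor can be illustrated with a minimal sketch. All feature names and weights below are invented for illustration; a real system such as Headstart’s would learn its weights from data across many organisations rather than hard-coding them:

```python
# Illustrative only: a toy multi-signal candidate score.
# Feature names and weights are hypothetical, not Headstart's model.

def score_candidate(features, weights):
    """Combine many weighted signals into one score,
    rather than relying on any single factor."""
    return sum(weights[k] * features.get(k, 0.0) for k in weights)

# Hypothetical weights; in practice these would be learned from
# thousands of data points across lots of organisations.
weights = {
    "structured_interview": 0.40,
    "work_sample_test": 0.35,
    "relevant_experience": 0.25,
}

candidate_a = {"structured_interview": 0.9,
               "work_sample_test": 0.8,
               "relevant_experience": 0.3}
candidate_b = {"structured_interview": 0.4,
               "work_sample_test": 0.9,
               "relevant_experience": 0.9}

print(round(score_candidate(candidate_a, weights), 3))  # 0.715
print(round(score_candidate(candidate_b, weights), 3))  # 0.7
```

A single strong or weak signal no longer dominates the outcome: candidate A’s weaker experience score is offset by other signals, which is the point of reviewing many data points at once.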
alexandermannsolutions.com
Globally, there are different levels of maturity and
acceptance of AI technologies, so multi-national
hiring strategies need to cover a lot of bases.
Roger Philby, founder of The Chemistry Group,
says: “The Asian market is so fragmented. Japan will
be very different to Malaysia, for example. In Africa,
candidates have gone straight to mobile, so any tech
development for this region needs to take this into
consideration. If you can’t offer a dynamic mobile
solution you can forget the emerging markets.”
Recruiters also need to be mindful of sharing and
collecting data across global markets, as they may be
in breach of data protection regulations, particularly
the EU’s General Data Protection Regulation (GDPR).
“Collecting data and sorting it into algorithms could
end up creating highly sensitive categories of data
under GDPR or other international data protection
laws,” explains Jonathan Maude, partner at law firm
Vedder Price.
Nilsson adds that different cultures’ acceptance of,
or desire to use, AI often depends on their attitudes
towards data privacy: “Certain countries such as
Germany are stronger on data protection. That
doesn’t mean they’re against using it; there’s just
a greater need to show there’s a benefit to
the individual.”
With all this in mind, relying on AI alone to offer the
fairest and most efficient way of plugging skills gaps is
not the answer. Adding elements such as assessments
to the screening process can further reduce the risk
of bias, adds Philby: “Candidates want the process
to be as frictionless as possible, so some companies
are running trials where they use Facebook or social
media data to see how it compares with people
who are successful in their business. But they’re
cross-referencing that with assessment data, so over
time we can say what a successful person’s profile
looks like.”
Dorothée El-Khoury, HR practice leader at
consulting company The Hackett Group, believes
one of the issues with AI is that some recruiters
expect it to solve all their problems. “It’s still early
days – the number of companies using it on a
large scale is small,” she says.
“Some organisations are segmenting their talent
based on how business-critical those roles are and
how easy they are to find. Those that are critical and
difficult to source are where the focus is.”
In addition, some companies are tapping into
passive candidates’ social media behaviour and
tailoring their communications to them, taking an
‘external’ talent approach to internal candidates and
making the most of their alumni communities.