Study finds risk and reward in AI-powered job search

Posted on 14 Nov 2023

By Greg Thom, journalist, Institute of Community Directors Australia


New research has raised fears the rise of artificial intelligence tools could have a negative impact on workplace diversity.

Diversity Council Australia (DCA) has warned that AI-powered recruitment has the potential to mirror society’s inequalities and “bake in bias”.

A three-year study by DCA, in partnership with Hudson RPO and Monash University, has revealed that although there are benefits to using AI in the recruitment process, there is also a risk the burgeoning technology could reinforce systemic bias and discrimination if used incorrectly.

To guard against this, DCA has released guidelines for employers designed to ensure inclusivity and reduce the risk of bias in the recruitment process.

Key elements include:

  • a five-step process called TREAD (Team Up, Reflect, Educate, Acquire, Decide) that encourages employers to tread carefully through the process of deploying AI recruitment.
  • a reflective assessment checklist that enables employers to make an informed decision about how they can best proceed with deploying an AI recruitment tool, so that it helps rather than harms workforce diversity.

The guidelines for employers represent the third and final stage of the long-running Inclusive AI at Work in Recruitment project.

The guidelines were developed in consultation with a panel of stakeholders representing marginalised job seekers, employers with experience using AI, academics, and technology experts.

AI applications in recruitment include:

  • anonymising candidates
  • conducting video interviews via a virtual recruiter
  • powering psychometric testing
  • sourcing candidates on platforms such as LinkedIn via predictive software

The use of AI tools such as ChatGPT has nearly doubled in the past year, with spending on AI systems predicted to surge 24% to more than $3.6 billion by 2025.

The publication of DCA's research coincides with the release of a report by the technology-focused social enterprise Infoxchange, which reveals that one in four not-for-profit and charity organisations are already using artificial intelligence, with two-thirds on track to adopt it within a year.


DCA CEO Lisa Annese said that if used correctly, AI technology could reduce costs, save time, and create fairer outcomes for minority groups.

“We know that unless AI is deployed with a focus on diversity and inclusion it has the potential to mirror society's inequalities and bake in systemic biases.

“Conversely, if it’s used with D&I front of mind, the benefits can be astounding.”

Ms Annese said DCA's AI guidelines were the result of three years of trailblazing research distilled into practical steps for critical reflection and action.

“These guidelines will help provide employers with the tools they need to take advantage of this incredible technology in a way that reduces bias and helps foster a more inclusive and diverse Australian workforce.”

Kimberley Hubble, CEO of recruitment firm Hudson RPO, which sponsored the research, agreed.

“While AI offers us immense opportunities in recruitment and many areas of HR, we need to use it purposefully and carefully to ensure we improve, not hinder, diversity outcomes in the workplace,” she said.

Ms Hubble said DCA’s evidence-based guidelines would give employers practical advice on using AI inclusively during recruitment and selection to maximise diversity in their businesses.

More information

Diversity Council Australia, Inclusive AI at Work in Recruitment project, stage one (March 2022)

Diversity Council Australia, Inclusive AI at Work in Recruitment project, stage two (January 2023)
