Sapia founder and chief executive Barb Hyman says AI hiring is the only way to give someone a fair interview.

Michael Scott, the protagonist from the US version of The Office, is using an AI recruiter to hire a receptionist.

Guardian Australia applies.

The text-based system asks applicants five questions that delve into how they responded to past work situations, including dealing with difficult colleagues and juggling competing work demands.

Potential employees type their answers into a chat-style program that resembles a responsive help desk. The real – and unnerving – power of AI then kicks in, sending a score and traits profile to the employer, and a personality report to the applicant. (More on our results later.)

This demonstration, by Melbourne-based startup Sapia.ai, resembles the initial structured interview process used by its clients, which include some of Australia’s biggest companies, such as Qantas, Medibank, Suncorp and Woolworths.

The process would typically create a shortlist an employer can follow up on, with insights on personality markers including humility, extraversion and conscientiousness.


For customer service roles, it is designed to help an employer know whether someone is amiable. For a manual role, an employer might want to know whether an applicant will turn up on time.

“You basically interview the world; everybody gets an interview,” says Sapia founder and chief executive Barb Hyman.

The selling points of AI hiring are clear: such tools can automate costly and time-consuming processes for businesses and government agencies, especially in large recruitment drives for non-managerial roles.

Sapia’s biggest claim, however, might be that it is the only way to give someone a fair interview.

“The only way to remove bias in hiring is to not use people right at the first gate,” says Hyman. “That’s where our technology comes in: it’s blind, it’s untimed, and it doesn’t use résumé data or your social media data or demographic data. All it is using is the text results.”

Sapia founder and chief executive Barb Hyman says AI hiring is the only way to give someone a fair interview. Photograph: Ellen Smith/The Guardian

A patchy track record

Sapia is not the only AI company claiming their technology will reduce bias in the hiring process. A host of companies around Australia are offering AI-augmented recruitment tools, including not just chat-based models, but also one-way video interviews, automated reference checks, social media analysers and more.

In 2022, a survey of Australian public sector agencies found at least a quarter had used AI-assisted tech in recruitment that year. Separate research from the Diversity Council of Australia and Monash University suggests that a third of Australian organisations are using it at some point in the hiring process.

Applicants, though, are often unaware that they will be subjected to an automated process, or on what basis they will be assessed.

The office of the Merit Protection Commissioner advises public service agencies that when they use AI tools for recruitment, there should be “a clear demonstrated connection between the candidate’s qualities being assessed and the qualities required to perform the duties of the job”.

The commissioner’s office also cautions that AI may assess candidates on something other than merit, raise ethical and legal concerns about transparency and data bias, produce biased results or cause “statistical bias” by erroneously interpreting socioeconomic markers as indicative of success.

There’s good reason for that warning. AI’s track record on bias has been worrying.

In 2017, Amazon quietly scrapped an experimental candidate-ranking tool that had been trained on CVs from the mostly male tech industry, effectively teaching itself that male candidates were preferable. The tool systematically downgraded women’s CVs, penalising those that included phrases such as “women’s chess club captain”, and elevating those that used verbs more commonly found on male engineers’ CVs, such as “executed” and “captured”.

Research out of the US in 2020 demonstrated that facial-analysis technology created by Microsoft and IBM, among others, performed better on lighter-skinned subjects and men, with darker-skinned women most often misgendered by the programs.

Last year a study out of Cambridge University showed that AI is not a benign intermediary, but that “by constructing associations between words and people’s bodies” it helps to produce the “ideal candidate” rather than merely observing or identifying it.

Natalie Sheard, a lawyer and PhD candidate at La Trobe University whose doctorate examines the regulation of and discrimination in AI-based hiring systems, says this lack of transparency is a huge problem for equity.

At least a quarter of Australian public sector agencies used AI-assisted tech in recruitment in 2022. Photograph: JYPIX/Alamy

“Messenger-style apps are based on natural language processing, similar to ChatGPT, so the training data for those systems tends to be the words or vocal sounds of people who speak standard English,” Sheard says.

“So if you’re a non-native speaker, how does it deal with you? It might say you don’t have good communication skills if you don’t use standard English grammar, or you might have different cultural traits that the system might not recognise because it was trained on native speakers.”
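
To see how that skew can arise, consider a toy scorer (purely a hypothetical illustration, not Sapia’s system) that rates an answer by how familiar its wording is relative to training data drawn only from standard English. Fluent but non-standard phrasing scores lower simply because the model has rarely seen it:

```python
# Toy illustration of training-data skew (hypothetical; not Sapia's system).
# A scorer built only from "standard English" answers rates fluent but
# non-standard phrasing lower, simply because its word patterns are rarer
# in the training data.
from collections import Counter
import math

# Hypothetical training corpus: standard-English interview answers only.
training_words = (
    "i worked with a difficult colleague and we resolved the conflict "
    "i managed competing deadlines by planning my week carefully "
    "i stayed calm and asked my manager for help when priorities clashed"
).split()

counts = Counter(training_words)
total = sum(counts.values())

def familiarity_score(answer: str) -> float:
    """Average log-probability of each word under the training distribution,
    with light smoothing so unseen words get a tiny, not zero, probability."""
    words = answer.lower().split()
    if not words:
        return float("-inf")
    logp = [math.log((counts[w] + 0.1) / (total + 0.1 * len(counts)))
            for w in words]
    return sum(logp) / len(logp)

# Two answers with the same substance: the non-standard one scores lower
# because the model never saw that variety of English, not because the
# candidate communicates badly.
print(familiarity_score("i stayed calm and planned my week carefully"))
print(familiarity_score("me and my colleague done sorted the conflict dead quick"))
```

The gap between the two scores reflects the training data, not communication skill, which is precisely Sheard’s point.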

Another concern is how physical disability is accounted for in something like a chat or video interview. And with the lack of transparency around whether assessments are being made with AI and on what basis, it’s often impossible for candidates to know that they may need reasonable adjustments to which they are legally entitled.

“There are legal requirements for organisations to adjust for disability in the hiring process. But that requires people to disclose their disability straight up when they have no trust with this employer. And these systems change traditional recruitment practices, so you don’t know what the assessment is all about, you don’t know an algorithm is going to assess you or how. You might not know that you need a reasonable adjustment,” says Sheard.


Australia has no laws specifically governing AI recruitment tools. While the department of industry has developed an AI ethics framework, which includes principles of transparency, explainability, accountability and privacy, the framework is voluntary.

“There are low levels of understanding in the community about AI systems, and because employers are very reliant on these vendors, they deploy [the tools] without any governance systems,” says Sheard.

“Employers don’t have any bad intent, they want to do the right things but they have no idea what they should be doing. There are no internal oversight mechanisms set up, no independent auditing systems to ensure there is no bias.”

A question of diversity

Hyman says client feedback and independent research show that the broader community is comfortable with recruiters using AI.

“They need to have an experience that is inviting, inclusive and attracts more diversity,” says Hyman. She says Sapia’s untimed, low-stress, text-based system meets these criteria.

“You are twice as likely to get women and keep women in the hiring process when you’re using AI. It’s a complete fiction that people don’t want it and don’t trust it. We see the complete opposite in our data.”

‘You are twice as likely to get women and keep women in the hiring process when you’re using AI,’ says Sapia founder Barb Hyman. Photograph: Ellen Smith/The Guardian

Research from the Diversity Council of Australia and Monash University is not quite so enthusiastic, showing a “clear divide” among both employers and candidates between those “converted” to AI recruitment tools and those “cautious” about them: 50% of employers were converted to the technology, but only a third of job applicants. First Nations job applicants were among those most likely to be worried.

DCA recommends recruiters be transparent about the due diligence protocols they have in place to ensure AI-supported recruitment tools are “bias-free, inclusive and accessible”.

In the Sapia demonstration, the AI quickly generates brief personality feedback for the interviewee at the end of the application.

This is based on how someone rates on various markers, including conscientiousness and agreeableness, which the AI matches with pre-written phrases that resemble something a life coach might say.
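
As a rough sketch of what that matching step could look like (hypothetical code, not Sapia’s), trait scores can be bucketed into bands, with each band tied to a pre-written phrase:

```python
# Hypothetical sketch of matching trait scores to canned feedback phrases;
# not Sapia's code, just the general pattern the article describes.
FEEDBACK = {
    "conscientiousness": {
        "high": "You set high standards for yourself and follow through.",
        "low": "You may prefer flexibility over rigid routine.",
    },
    "agreeableness": {
        "high": "You work well with others and value harmony.",
        "low": "You are comfortable challenging ideas directly.",
    },
}

def personality_feedback(scores, threshold=0.5):
    """Map each trait score (0 to 1) to the high or low band's phrase."""
    phrases = []
    for trait, score in scores.items():
        band = "high" if score >= threshold else "low"
        phrases.append(FEEDBACK[trait][band])
    return " ".join(phrases)

print(personality_feedback({"conscientiousness": 0.8, "agreeableness": 0.4}))
```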

A more thorough assessment – not visible to the applicant – would be sent to the recruiter.

Sapia says its chat-interview software analyses language proficiency and includes a profanity detector, which the company says are important considerations for customer-facing roles.
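
At their simplest, those two checks might look like the sketch below; this assumes nothing about Sapia’s actual implementation beyond the company’s description, and a production system would be far more sophisticated:

```python
# Hypothetical sketch of a profanity screen plus crude proficiency signals;
# assumes nothing about Sapia's implementation beyond the description above.
import re

PROFANITY = {"damn", "hell"}  # illustrative placeholder word list

def screen_answer(answer: str) -> dict:
    """Flag profanity and compute rough language-proficiency proxies."""
    words = re.findall(r"[a-z']+", answer.lower())
    flagged = [w for w in words if w in PROFANITY]
    # Crude proxies only: answer length and vocabulary variety.
    variety = len(set(words)) / len(words) if words else 0.0
    return {
        "profanity_flagged": flagged,
        "word_count": len(words),
        "vocab_variety": round(variety, 2),
    }

print(screen_answer("I stayed damn calm and kept the customer happy."))
```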

Hyman says the language analysis is based on the “billion words of data” collected from responses in the years since the tech company was founded in 2013. The data itself is proprietary.

You’re (not) hired!

So, could Guardian Australia work for Michael Scott at fictional paper company Dunder Mifflin?

“You are self-assured but not overly confident,” the personality feedback says in response to Guardian Australia’s application in the AI demonstration.

It follows with a subtle suggestion that this applicant might not be a good fit for the receptionist role, which requires “repetition, routine and following a defined process”.

But it has some helpful advice: “Potentially balance that with variety outside of work.”

Looks like we’re not a good fit for this job.


