Companies are now using AI and facial recognition to identify the best job candidates.
Unilever is among the companies using the technology to analyse the language, tone and facial expressions of candidates as they answer a set of identical interview questions, which they film on a mobile phone or laptop.
The algorithms select the best applicants by assessing their performance in the videos against about 25,000 pieces of facial and linguistic information compiled from earlier interviews with candidates who went on to prove good at the job.
Hirevue, the US company that developed the interview technology, claims it enables hiring firms to interview far more candidates at the initial stage than relying on CVs alone, and that it provides a more reliable and objective indicator of future performance, free of human bias.
However, academics and campaigners warned that any AI or facial recognition technology would inevitably have in-built biases in its databases that could discriminate against some candidates and exclude talented applicants who might not conform to the norm.
“It is going to favour people who are good at doing interviews on video and any data set will have biases in it which will rule out people who actually would have been great at the job,” said Anna Cox, professor of human-computer interaction at UCL.
Hirevue, which last month received a major investment injection from the multi-billion pound Carlyle Group, says it has already used its technology for 100,000 interviews in the UK. Worldwide, it claims to deliver one million interviews and more than 150,000 pre-hire assessments every 90 days.
Loren Larsen, Hirevue’s chief technology officer, told The Daily Telegraph that 80 to 90% of the predictive assessment was based on the algorithms’ analysis of candidates’ use of language and verbal skills.
“There are 350-ish features that we look at in language: do you use passive or active words? Do you talk about ‘I’ or ‘we’? What is the word choice or sentence length? In doctors, you might expect a good one to use more technical language,” he said.
“Then we look at the tone of voice. If someone speaks really slowly, you are probably not going to stay on the phone to buy something from them. If someone speaks at 400 words a minute, people are not going to understand them. Empathy is a piece of that.”
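To make the kind of analysis Mr Larsen describes more concrete, here is a toy sketch of extracting a few transcript features: speaking rate, sentence length and first-person word choice. Hirevue has not published its actual metrics; the feature names and formulas below are illustrative assumptions, not the company's 350-feature model.

```python
import re

def language_features(transcript: str, duration_minutes: float) -> dict:
    """Toy interview-transcript features, loosely modelled on the kinds of
    signals described in the article. All thresholds and feature names are
    hypothetical; the real system's features are proprietary."""
    words = re.findall(r"[A-Za-z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    first_person_singular = sum(w in {"i", "me", "my"} for w in words)
    first_person_plural = sum(w in {"we", "us", "our"} for w in words)
    return {
        # Larsen's example: 400 words a minute would be too fast to follow
        "words_per_minute": len(words) / duration_minutes,
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        # 'I' versus 'we' usage, one of the word-choice signals mentioned
        "we_to_i_ratio": first_person_plural / max(first_person_singular, 1),
    }
```

A real system would combine hundreds of such features with audio and video signals before any scoring takes place.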
Facial expressions assessed by the algorithms include brow furrowing, brow raising, eye widening or closing, lip tightening, chin raising and smiling, traits the company says are important in sales and other public-facing jobs.
“Facial expressions indicate certain emotions, behaviours and personality traits,” said Nathan Mondragon, Hirevue’s chief psychologist.
“We get about 25,000 data points from 15 minutes of video per candidate. The text, the audio and the video come together to give us a very clear analysis and rich data set of how someone is responding, the emotions and cognitions they go through.”
Candidates are ranked on a scale of one to 100 against the database of traits of previous “successful” candidates, with the process taking days rather than weeks or months, says the company. It claims one firm had a 15% uplift in sales.
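One simple way such a 1-to-100 ranking could work is to compare each candidate's feature vector against an averaged profile of previously successful hires. The sketch below uses cosine similarity rescaled to that range; this is a hypothetical illustration, since Hirevue's actual scoring method is not public.

```python
import math

def score_candidate(candidate: list[float], benchmark: list[float]) -> float:
    """Hypothetical 1-100 score: cosine similarity between a candidate's
    feature vector and the averaged feature profile of previous 'successful'
    candidates, rescaled to the 1-100 range reported in the article."""
    dot = sum(a * b for a, b in zip(candidate, benchmark))
    norm = (math.sqrt(sum(a * a for a in candidate))
            * math.sqrt(sum(b * b for b in benchmark)))
    similarity = dot / norm if norm else 0.0        # ranges from -1 to 1
    return round(1 + 99 * (similarity + 1) / 2, 1)  # map onto 1 .. 100
```

Note that any such benchmark inherits the traits of past hires wholesale, which is exactly the bias mechanism the critics quoted below are warning about.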
“I would much prefer having my first screening with an algorithm that treats me fairly rather than one that depends on how tired the recruiter is that day,” said Mr Larsen.
Griff Ferris, Legal and Policy Officer for Big Brother Watch, said: “Using a faceless artificial intelligence system to conduct tens of thousands of interviews has really chilling implications for jobseekers.
“This algorithm will be attempting to recognise and measure the extreme complexities of human speech, body language and expression, which will inevitably have a detrimental effect on unconventional applicants.
“As with many of these systems, unless the algorithm has been trained on an extremely diverse dataset there’s a very high likelihood that it may be biased in some way, resulting in candidates from certain backgrounds being unfairly excluded and discriminated against.”