A new web platform is taking the bias out of government hiring by using behavioural science and machine learning to help recruiters find candidates based on talent, regardless of gender, ethnicity, or background. Of the successful candidates chosen through the “Applied” platform thus far, 60% would not have been hired through a traditional recruitment process. It is the first tech venture of the UK’s Behavioural Insights Team – the world’s first “nudge” unit – and has government clients in the UK, Australia and Singapore.
Results & Impact
Graduate hiring through the platform has so far achieved full gender parity. Of the successful candidates chosen through the Applied platform, 60% would not have been hired with a typical CV. Thus far, more than 16,000 applicants and 6,000 managers have used the platform. Multiple studies suggest people from minority backgrounds are less likely to be interviewed or given jobs than non-minority applicants with equal qualifications.
Applied, Behavioural Insights Team, UK Cabinet Office, Nesta, Governments of Australia and Singapore
The platform uses text analytics to find and remove bias and stereotype threat from job descriptions, and also links the final hiring outcome with that job description so future improvements can be made with machine learning. Once people have applied for the job, the technology is used to: anonymise the applications, break them into chunks, randomise the order for review, and share them across the hiring team. Applications are then scored by question, not by candidate, with each question being given a weighted points score. The platform also provides real-time diversity data and analytics on who is applying for the job and who is dropping off at each stage of the application, so that organisations can identify which stage of their hiring is preventing greater diversity.
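The review steps described above can be sketched in code. This is a minimal illustration only, with invented field names, question weights, and data shapes; it is not Applied's actual code or API:

```python
import random
from statistics import mean

def anonymise(application):
    """Strip demographic fields so reviewers see only the answers."""
    return {k: v for k, v in application.items()
            if k not in {"name", "email", "address", "gender", "ethnicity"}}

def chunk_by_question(applications):
    """Regroup applications so all answers to one question are reviewed together."""
    chunks = {}
    for app_id, app in applications.items():
        for question, answer in app["answers"].items():
            chunks.setdefault(question, []).append((app_id, answer))
    return chunks

def randomised_order(chunk, seed=None):
    """Shuffle the order in which answers appear to a reviewer."""
    order = list(chunk)
    random.Random(seed).shuffle(order)
    return order

def weighted_totals(scores, weights):
    """Average each question's reviewer scores, then combine using question weights."""
    return {app_id: sum(mean(reviewer_scores) * weights[q]
                        for q, reviewer_scores in per_question.items())
            for app_id, per_question in scores.items()}
```

Scoring by question rather than by candidate means a reviewer never sees one person's whole application at once, which is the point of the chunking step.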
Women and girls, ethnic minorities, LGBTQ+ people
Cost & Value
The cost of using the platform as a hiring tool is competitive: the cost per position to be filled can be as low as $260, even with a large volume of applicants.
Running since 2017
One risk is that for organisations that already hire with a strong commitment to diversity, blind selection can actually produce more homogenous teams, as it eliminates the possibility of positive discrimination. The biggest challenge the platform has faced so far is another application of behavioural science: humans' status quo bias, our preference for stability. Convincing organisations to change the comfortable systems and technology they have relied on previously is substantially harder than getting them to run a diversity training course, even if the first option is immensely more effective.
The platform has been used for hiring by the UK Civil Service. It is also being piloted by the Australian government, in the Department of the Prime Minister and Cabinet, and by the Singaporean government, in the Ministry of Manpower.
A new platform is using insights from behavioural science and artificial intelligence to remove unconscious bias from job recruitment.
“There was persistent frustration that this was an area in which nothing that was being done was working, that we were falling short with traditional hiring and promoting methods in the workplace,” said Kate Glazebrook, the CEO and co-founder of the platform. “There is no evidence that unconscious bias training works, and yet corporates spend billions every year on it.”
Applied was developed by the UK Behavioural Insights Team (BIT) – or ‘nudge’ unit – the world’s first government institution focused on applying behavioural science to policy. Applied is the first tech product spinout of the ventures arm of the BIT, and is already being used by the UK Civil Service and departments within the Australian and Singaporean governments.
“There is no evidence that unconscious bias training works, yet corporates spend billions every year on it”
“The reason the BIT has a ventures arm, and also the premise on which we developed the platform, is that sometimes the best solution to a problem we care about is not really policy, but being in the middle of an interaction between two people and improving the outcome,” said Glazebrook, who used to work as a public servant in the BIT herself.
The platform is essentially a selection tool that removes the risk of bias from the application procedure. It dispenses with traditional CVs, and enables recruiters to find the best candidate regardless of background.
“All the data shows that the average length of time someone spends looking at a CV is about 10 seconds,” said Glazebrook. “And 20-30 years’ worth of studies have demonstrated that those seconds are spent paying attention unconsciously to things like name, where you live; all the demographic information at the top of a CV that directly harms diversity.”
“Sometimes the best solution to a problem we care about is not really policy, but being in the middle of an interaction between two people”
The first step to debiased hiring is in the advertising: Applied uses text analytics to find and remove bias and stereotype threat from job descriptions, and also links the final hiring outcomes with that job description so future improvements can be made with machine learning.
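One common form of this kind of text analysis flags gender-coded wording in job adverts. The sketch below is a toy version of that idea: the word list is a tiny illustrative sample invented for the example, not the platform's actual lexicon or method:

```python
import re

# Illustrative sample word lists (assumptions, not Applied's data).
MASCULINE_CODED = {"competitive", "dominant", "ninja", "rockstar", "aggressive"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def flag_coded_language(job_description):
    """Return words in the advert that skew masculine- or feminine-coded."""
    words = set(re.findall(r"[a-z]+", job_description.lower()))
    return {"masculine": sorted(words & MASCULINE_CODED),
            "feminine": sorted(words & FEMININE_CODED)}
```

A production system would also feed hiring outcomes back into the model, as the article describes, rather than rely on a static list.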
The platform then has a tried and tested algorithm that removes distracting and irrelevant demographic data, and finds talent by comparing answers on a variety of skills-based questions. Each feature is piloted and tested before it is added to the platform, and the results of this long history of scientific experimentation are publicly available online.
“We have our own review algorithm that does four things. Firstly, it anonymises the candidate applications. It then chunks those applications up into smaller pieces so reviewers compare candidates on a given dimension directly – just like when teachers mark, they look at all students’ ‘question one’ answers together and then the ‘question two’ answers, as it’s easier to compare,” said Glazebrook.
“If you leave hiring to one person, you often just end up replicating the way they like to see the world”
“We then allow for the wisdom of the crowd, by getting multiple reviewers to score individually and then averaging that out,” she said. “We’ve shown that if you leave hiring to one person, you often just end up replicating the way they like to see the world.”
The final feature of the platform is a randomisation of the order in which candidates appear to reviewers. “That’s so that we can accommodate for quirky things we’ve found in the data, like that we tend to be more generous when we first start scoring,” said Glazebrook.
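The averaging step can be shown with a toy example using invented numbers: a single reviewer, scoring alone, would rank two candidates differently from the averaged crowd score:

```python
from statistics import mean

# Invented scores for illustration: reviewer r1 is unusually harsh on cand_a.
scores = {"cand_a": {"r1": 2, "r2": 5, "r3": 5},
          "cand_b": {"r1": 4, "r2": 3, "r3": 3}}

solo = {c: s["r1"] for c, s in scores.items()}          # r1 deciding alone
crowd = {c: mean(s.values()) for c, s in scores.items()}  # averaged across reviewers
```

Here r1 alone would pick cand_b, while the averaged scores favour cand_a; one reviewer's idiosyncratic view is dampened by the crowd.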
Clients are also provided with real-time diversity data and analytics on who is applying for the job and who is dropping off at each stage of the application, to identify which stage of their hiring process is blocking diversity.
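That funnel analysis amounts to counting, per demographic group, how many candidates reach each stage of the process. A minimal sketch, with stage names and the record shape assumed for illustration:

```python
from collections import Counter

# Illustrative stage names; a real pipeline would define its own.
STAGES = ["applied", "sift", "interview", "offer"]

def reach_by_group(records):
    """records: list of {"group": ..., "last_stage": ...} dicts.
    Returns, per group, how many candidates reached each stage."""
    reached = {}
    for r in records:
        counts = reached.setdefault(r["group"], Counter())
        for stage in STAGES[:STAGES.index(r["last_stage"]) + 1]:
            counts[stage] += 1
    return reached
```

Comparing the per-stage counts across groups shows where one group's drop-off rate diverges, i.e. which stage is blocking diversity.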
Applied only started running this year, but already has some impressive results from the more than 16,000 applicants and 6,000 managers that have used it. Of the successful candidates chosen through the platform, 60% would not have been hired with a typical CV. There has also been gender parity in graduate hiring.
“The public sector is at the front end of this relative to corporates”
However, blind selection processes do pose risks for organisations that already hire with a determined commitment to diversity, as they can actually produce more homogenous teams by eliminating any possibility of positive discrimination.
“The overall evidence suggests that, far more often than not, blinding will lead to more diverse teams,” said Glazebrook. “But blinding may well hurt you if you’re always very, very proactive in choosing the most diverse candidates, because it might mean that occasionally the best-qualified person for the job, as assessed by you blindly, is actually the white male.”
Interestingly, while Applied has several high-profile private sector clients, much interest has come from within the public sector where it was first developed.
“The public sector is at the front end of this relative to corporates; the UK public sector is especially forward thinking on this stuff,” said Glazebrook. “From my experience, the degree of thinking and the passion for doing something about it is as much there within government as it is in corporates.”
“Change is always hard, and changing systems is always harder”
Governments and corporations are charged the same for the service, and the cost is comparable to other selection tools on the market; the cost per position or role to be filled can be as low as $260, even with a large volume of applicants.
However, the challenge for the platform will come in translating interest and pilots into full-scale implementation; another instance of behavioural science at work. Convincing organisations to change the comfortable systems and technology they have relied on so far is substantially harder than getting them to run a diversity training course, even if the first option is immensely more effective and at no higher cost.
“The biggest challenges are the anticipated behavioural ones. Change is always hard, and changing systems is always harder. Billions each year are spent on unconscious bias training – which dwarfs what you’d be spending on the platform – but people find that easier than spending on technology that uproots their ways of working,” said Glazebrook.
“The challenge for many people is still letting go of what has been a comfortable process they’ve used before, and they’ve been hired on before.”
(Picture credit: Flickr/WOCinTech Chat)