Scotiabank looks to employee data to tackle gender diversity

When Scotiabank was looking for new ways to close the gender gap, it turned to what it had already been deploying in nearly every other facet of its business: technology.

It analyzed myriad data sets for its Canadian employees, including internal social media interactions and past performance data, to identify variables that correlated with success, said Permpreet Sidhu, the bank’s vice-president of performance and inclusion.

The result was a set of key metrics and an emerging leader index the bank has used since November to identify employees who should be encouraged to move up the corporate ladder, while helping to strip out the unconscious bias that can sometimes drive promotion decisions.

“What we are trying to do is interrupt some of the bias that comes in, even when we are identifying who to develop 12 months before that job even becomes available,” said Sidhu. “The index will put a broader slate of candidates on the radar for leaders.”

It’s the latest example of technology being used to combat unconscious bias in the workplace and increase diversity among the ranks. Other strategies in the corporate sector include a tool that detects biased language in online workplace chats, software that strips gender identifiers from LinkedIn Corp. pages during recruiting and artificial intelligence deployed to neutralize bias in the promotion process.

“We’re at the frontier right now, where people are running all these experiments,” said Sarah Kaplan, director of the University of Toronto’s Institute for Gender and the Economy. “And we really don’t know what’s effective yet, but everyone is trying and they have this hope that AI or other automated algorithms might help.”

In March, Catalyst, an organization dedicated to promoting equal rights for women in the workplace, launched a plug-in for the Slack messaging platform called BiasCorrect. It’s programmed to detect 25 words and phrases with a gender bias, such as “she is so pushy.” Once installed, BiasCorrect automatically flags the potentially problematic phrase to the Slack user and offers an alternative such as “she is so persuasive.”
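
As a rough illustration of the behaviour Catalyst describes, and not the plug-in’s actual code, a phrase-matching check might look something like the sketch below. The phrase list and function name here are hypothetical; the article cites only the “pushy”/“persuasive” pairing.

```python
# Illustrative sketch only; this is not Catalyst's BiasCorrect code.
# It mimics the described behaviour: match a known biased phrase in a
# chat message and suggest a bias-free alternative.
import re

# Hypothetical phrase list; the article cites "she is so pushy" ->
# "she is so persuasive" as one example of the 25 pairs.
BIAS_ALTERNATIVES = {
    "she is so pushy": "she is so persuasive",
}


def flag_bias(message: str) -> list[tuple[str, str]]:
    """Return (flagged phrase, suggested alternative) pairs found in a message."""
    hits = []
    for phrase, alternative in BIAS_ALTERNATIVES.items():
        if re.search(re.escape(phrase), message, flags=re.IGNORECASE):
            hits.append((phrase, alternative))
    return hits


if __name__ == "__main__":
    for phrase, alternative in flag_bias("Honestly, she is so pushy in meetings."):
        print(f'Flagged "{phrase}"; consider "{alternative}" instead.')
```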

Serena Fong, Catalyst’s vice-president of strategic engagement, said the organization has made the underlying code public so users can add more phrases or adapt it for other messaging platforms. Catalyst hopes the plug-in will help people become more aware of their unconscious biases and the impact of their words, and contribute to a more inclusive workplace overall. “There is no quick fix to the problem of unconscious bias,” she said.

However, there’s concern the technology can be just as biased as the person who programmed it or the underlying information used. “AI is not some panacea,” said Kaplan. “We just have to be very thoughtful about it. It’s not like you can remove the hand of the human just by applying these bots. The bots might actually be amplifying bias.”

Two professors, one from the Massachusetts Institute of Technology and one from the London Business School, conducted a real-world experiment in which a gender-neutral online ad for a science, technology, engineering and math job was shown more frequently to men than to women.

MIT’s Catherine Tucker and LBS’s Anja Lambrecht ran an ad on Facebook Inc., Instagram, Twitter Inc. and other sites through Google’s ad display network. On each site, an algorithm optimized the ad to get the most views, which resulted in more male eyeballs than female, according to an article in Scientific American.

Because women generally make more household purchasing decisions than men, marketing algorithms place a premium on female views of an ad, making women more expensive to reach. Showing the ad to men was simply more cost-effective, Tucker told Scientific American.

Kaplan said stripping out names and other identifiers that would signal an applicant’s gender may also have unintended consequences.

The underlying data, such as the schools on a candidate’s resume, may themselves be influenced by gender, she added. “In fact, their gendered experiences throughout their entire careers have shaped all their steps along the way, and you really need to account for that in the selection process.”

Scotiabank has been mindful of this risk and has eliminated some data from its index as a result. For example, the bank chose to remove education from its model and instead use on-the-job experiences, said Sidhu.

“You have to be aware and you have to be responsible with the data, and eliminate those fields inherently having bias, so that you’re not continuing to exacerbate the challenges.”
