
Artificial Intelligence and bookcases

Creating inclusion through AI

Humans host over 140 cognitive biases. These range from excessive reliance on the printed or digital word to confirmation bias – the tendency to recall information that confirms our pre-existing beliefs. How, labouring under such a bias burden, can companies overcome the human factor to create a more inclusive and diverse world?

As we celebrate International Women’s Day, and the advances that have been made in the corporate world, it is worth noting that not everyone believes that we are prone to prejudices (the brain’s use of shortcuts to help us deal with days filled with stimulation and choices). Only last month, at a virtual town hall meeting of KPMG UK staff, Bill Michael said that discrimination caused by unconscious bias was “complete and utter crap.” The consulting firm’s Chairman has since resigned.

Mary Beard, the iconic Cambridge professor, writes that centuries of conditioning mean that “our mental, cultural template for a powerful person remains resolutely male.” She uses the example of closing one’s eyes to conjure up the image of a professor. It is not just that most of us see a man; so does she: “the cultural stereotype is so strong that…it is still hard for me to imagine me, or someone like me, in my role.”

Margaret Thatcher, she notes, took voice training as she rose to power to lower her pitch by an octave and thus gain the (male) authority that her advisors felt she lacked when speaking in a high-pitched, female voice.

To counter such cultural hard-wiring, Artificial Intelligence (AI) is key. It is, however, programmed by humans and prone to biases of its own, arising from the data it is fed and from what one might call algorithmic quirks. In June 2020, public opinion forced Amazon to bar the police from using its controversial facial recognition technology, which was prone to racial bias.

Recruitment algorithms have been criticised for favouring white, male candidates, given that the data sets fed to them consist of successful candidates from the past. Even more subtle factors can influence the result. A Munich company’s algorithm was shown to favour candidates who had a bookcase behind them, helping them score more highly for “conscientiousness” and “agreeableness” and lower for “neuroticism,” according to a German public broadcaster’s analysis of the recruitment tool.

Yet it is not all bad news.

“We are in the early days of AI applications,” notes Richard Nesbitt, the former CEO of the Toronto Stock Exchange, speaking at a recent roundtable organised by the LSE’s The Inclusion Initiative (TII). “They will become more relevant and useful as they improve.”

Technological advances, including in Natural Language Processing via quantum computing – in essence the first step towards true ‘Machine Intelligence’ – are one way. Another is more evidence of when AI works, since negative examples tend to dominate the news.

“A study of the academic literature on AI gives us evidence that it can improve efficiency in the hiring process, and likely also make better hiring decisions in terms of job performance and diversity of hires, depending on the type of AI,” says Paris Will, Researcher in Behavioural Science at TII.

One such type of AI is produced by MeVitae, an Oxford-based start-up which partners with Microsoft and Oracle, among others. Its algorithms anonymise CVs and cover letters by automatically removing over 15 key personal identifiers (such as gender, age and socio-economic background) directly from applicant resumes. This enables the human recruiter to focus on what matters: skills and capabilities.
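MeVitae does not publish its implementation, but the general idea of pre-screening redaction can be sketched in a few lines. The sketch below is a toy illustration built on assumed patterns and labels; it is not the company’s actual method or identifier list, and a real system would also handle names, photographs, school names and much more.

```python
import re

# Toy redaction pass over resume text. The categories and regular
# expressions below are illustrative assumptions, not MeVitae's
# actual identifier list or matching logic.
REDACTION_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "date_of_birth": re.compile(r"\b(?:born|date of birth)[:\s]+\d{4}", re.I),
    "gendered_title": re.compile(r"\b(?:Mr|Mrs|Ms|Miss)\.?(?=\s)"),
}

def anonymise(cv_text: str) -> str:
    """Replace matched identifiers with neutral placeholders, leaving
    skills and experience untouched for the human reviewer."""
    redacted = cv_text
    for label, pattern in REDACTION_PATTERNS.items():
        redacted = pattern.sub(f"[{label.upper()} REMOVED]", redacted)
    return redacted

if __name__ == "__main__":
    sample = ("Mrs Jane Doe, born 1985\n"
              "jane.doe@example.com, +44 7700 900123\n"
              "Skills: Python, SQL, team leadership")
    print(anonymise(sample))
```

On this sample the skills line is left intact, while the title, birth year, email address and phone number are replaced with neutral placeholders before a screener ever sees the document.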

“We have mapped out that 60% of the 140 human cognitive biases come into play during the screening process for candidates, i.e. the moment someone sees a CV or cover letter,” says Riham Satti, the clinical neuroscientist who is Co-founder and CEO of MeVitae.

AI can also be useful in tracking the words used in job descriptions and how they influence interviews and hiring.

Speaking on the TII panel, Deborah Lorenzen, Head of Enterprise Data Governance at State Street, points out that from existing evidence we can probably guess that using words like “battle” and “driven” will lead to more men applying for a job. “What is most surprising to many though is that using the word ‘manage’ is also more likely to attract male candidates to your job description.”

“Most hiring managers would probably shake their heads at that, since we are so used to the word. AI can help us understand which of the things we are sure we know don’t turn out to be true when you have the data,” she concludes.
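As a rough illustration of how such word-level auditing might work, the sketch below counts masculine-coded terms in a job advert. The word list is a tiny, assumed sample (it includes “manage”, per the point above); real tools rely on larger, validated lexicons and on evidence of how wording affects who actually applies.

```python
import re
from collections import Counter

# Illustrative, assumed word list: not a validated lexicon.
MASCULINE_CODED = {"battle", "driven", "manage", "competitive", "dominant"}

def flag_coded_words(advert: str) -> Counter:
    """Count occurrences of flagged words in a job description."""
    tokens = re.findall(r"[a-z]+", advert.lower())
    return Counter(t for t in tokens if t in MASCULINE_CODED)

if __name__ == "__main__":
    ad = "We need a driven engineer to manage releases and battle technical debt."
    print(flag_coded_words(ad))  # Counter({'driven': 1, 'manage': 1, 'battle': 1})
```

A fuller audit would also stem variants (“managing”, “drive”), weight each term by its measured effect on application rates, and suggest neutral alternatives, but the counting step above is the core of it.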

Despite this rather promising outlook, AI hiring is still perceived more negatively than human hiring on almost every outcome.

Yet given that the financial and professional services industry has among the worst diversity numbers in business, a reassessment of AI applications is due. A recent UK survey found that between 25% and 50% of senior managers in the global City came from independent or grammar schools, and almost 90% from higher socio-economic backgrounds.

The push for change is coming from many sources, including respected bodies like McKinsey, whose studies show that companies in the top quartile for ethnic diversity are 33% more likely to have industry-leading profitability. Regulators and governments are another source. In the UK, the Financial Conduct Authority (FCA) has linked diversity to its campaign on firms’ culture. It has reiterated that non-financial misconduct – meaning culture – remains a key factor in its supervision.

Meanwhile, the UK Government has launched a drive to increase the number of people from poorer backgrounds in senior positions in financial and professional firms, while the German government in January approved a bill to boost the number of women on the country’s corporate boards.

Creating a more inclusive corporate world, with more sustained profitability and innovation, will take many years and much effort. Artificial Intelligence has an important role to play, with the caveat that it must be used in conjunction with the human factor. In Satti’s words: “Technology is designed to enhance the way we live and work, not replace it.”


 Along with Associate Professor Grace Lordan, I am Director of The Inclusion Initiative at the LSE, which uses behavioural science and data to create inclusive company cultures. Feel free to sign up to our monthly newsletter which details events and research.

 