Insights for Organisations

Balancing act: finding the missing 10% and 34% in tech

Jonathan Young
Published: 25.03.2024 | Modified: 27.03.2024

The digital skills shortage remains a key, unresolved issue in the UK, and I was lucky enough to be invited to speak on a panel of industry experts on this subject in the Houses of Parliament earlier this year. I strongly believe that, if we are to address this issue, we should be thinking about the “missing 10% and the missing 34%”. By this I mean the missing 10% of BAME employees in the technology industry and the missing 34% of women.

I also believe that “thinking about” the missing 10% and 34% means finding a way to listen to their collective voice and to understand how they think and feel about working in technology. Only when we understand how people think about something, and how they feel, can the industry start to change things to make all groups feel welcome in this amazing profession. We also need to be listening to the well-represented groups (pale and male, like me) and how they feel because, without doing this, we have no way of comparing and contrasting what different groups think about ‘a career in technology’.

In collaboration with Durham University, we have started down an exciting path of research to listen to all groups in society and to try to understand how they feel about working in technology. We are collecting large amounts of text and speech from people applying to join the technology industry in the UK and USA. We are capturing what they say and what they write and are starting to use AI and linguistic analysis engines to interrogate that data and look for patterns.
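To give a flavour of what ‘looking for patterns’ can mean at its very simplest, the short Python sketch below counts the most common content words across a couple of made-up applicant answers. It is purely illustrative, assuming the transcripts already exist as plain text; it is not our actual analysis engine.

    import re
    from collections import Counter

    # Illustrative only: assume each applicant's written answer is a plain-text string.
    responses = [
        "I want a career in technology because I enjoy solving problems",
        "Technology feels like an industry where I can keep learning and growing",
    ]

    STOP_WORDS = {"i", "a", "an", "the", "in", "and", "because", "can", "where", "like"}

    def content_words(text):
        # Lowercase, keep alphabetic tokens, drop common stop words.
        tokens = re.findall(r"[a-z']+", text.lower())
        return [t for t in tokens if t not in STOP_WORDS]

    word_counts = Counter()
    for answer in responses:
        word_counts.update(content_words(answer))

    print(word_counts.most_common(10))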

If the text can reveal to us how different groups think and talk and feel about technology careers, maybe we can use that to intervene and change parts of our recruitment process or advertising to encourage more women and people of colour to consider joining the IT industry. Maybe we could also take steps to encourage people from those same underrepresented groups to stick with their technology careers – as too many people start in technology and then leave too early.

The missing 10% and 34%? You made that up…

Underrepresented groups in IT remain underrepresented despite various efforts to fix this. Estimates vary on the proportion of the UK technology workforce which is black, Asian or minority ethnic (BAME). ColorinTech.org (with PwC and the Guardian, 2019) stated that only 4% of the technology workforce is BAME, versus 14.5% of the UK population (Office for National Statistics). Further, government data from 2019 shows that whilst the number of women in technology had increased, so had the overall number of people in technology, and so the percentage of women in technology had remained largely stable at 16% for the preceding 10 years.

How are you proposing to listen to the missing people?

It is not possible to listen to people who are missing because, by definition, they are not present. However, as we have 100,000 people applying to start a career in technology at FDM each year, if we ask them how they feel when they apply, then we are starting to get somewhere. We cannot speak to the missing people themselves, but we can at least listen to people from the same underrepresented groups.

We do this in two main ways: firstly, by asking the direct question during the interview process; and secondly, through our video interviews, which provide further data about how people from different groups think.

Asking individuals what they think is one way to listen, but not a very efficient one. If, however, we can look at the corpus of text comprising everything the applicants write and say, en masse, then we can start to identify patterns. As part of our commitment to DEI, we encourage our applicants to provide their diversity markers, and many do. Whilst the data is anonymised before analysis, the diversity markers are retained, which means we should be able to spot sentiment and topic clusters, and patterns in feelings, within specific groups of applicants.
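As a rough illustration of what spotting patterns ‘in specific groups’ could look like, the sketch below groups some made-up, anonymised answers by a self-reported diversity marker and scores each group with a deliberately naive word-list sentiment measure. The field names, word lists and records are invented for the example; they are not our real data model or method.

    from collections import defaultdict

    # Hypothetical anonymised records: free text plus a retained diversity marker.
    records = [
        {"gender": "female", "text": "I am excited about a career in technology"},
        {"gender": "male", "text": "I worry the industry might not be for me but I enjoy coding"},
    ]

    POSITIVE = {"excited", "enjoy", "welcoming", "love"}
    NEGATIVE = {"worry", "unwelcoming", "leave", "difficult"}

    def simple_sentiment(text):
        # Naive word-list score: count of positive words minus count of negative words.
        words = text.lower().split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    scores = defaultdict(list)
    for record in records:
        scores[record["gender"]].append(simple_sentiment(record["text"]))

    for group, values in scores.items():
        print(group, sum(values) / len(values))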

If we want to understand the effect of gender, say, on the way people think about a subject, we could compare the comments of one man and one woman. However, if the woman is white, 55 and living in Surrey, UK, and the man is African American, 18 years old and living in Baltimore, then there are likely to be many other factors (besides gender) affecting the way they think and speak about the chosen subject.

The good news is that our applicants (who make up our dataset) are very similar in their stage of life: most are university graduates, aged 21-30, interested in a career in technology, just starting their careers, and so on. So the ‘noise’ that exists in public datasets is to some degree limited, and the influence of their diversity characteristics should be more pronounced.

To give you an idea of the amount of data they are contributing to this research, we have about 10 million words to the end of 2022, and the dataset continues to grow. That’s about 22,000-25,000 pages of A4 of people talking about technology and technology careers.
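If you want to sanity-check that page count: assuming a typical 400-450 words per A4 page (an assumption, not a measured figure), 10 million words does indeed come out at roughly 22,000-25,000 pages.

    words_total = 10_000_000
    words_per_page_low, words_per_page_high = 400, 450  # assumed A4 word density

    print(words_total // words_per_page_high)  # ~22,222 pages at the denser estimate
    print(words_total // words_per_page_low)   # 25,000 pages at the lighter estimate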

That’s great – people are talking, and you are capturing what they say. But how do you ‘listen’ to 10M words?

AI is big in the news and is getting a bad press at the moment. Are people going to lose their jobs to AI? What chance is there for artists if AI can replicate their music style in seconds?

A question which I was asked recently as a governor of a school was, “Will students cheat and pass exams using ChatGPT?”.

The questioner knew that ChatGPT technically knows nothing about the GCSE subject, but has access to vast amounts of data and an AI engine to exploit that data. They also knew that ChatGPT had been shown to be capable of generating answers to GCSE exams which examiners would mark as a “pass”. Whilst it may seem natural that this negative question is the one being asked, shouldn’t we also be taking advantage of what ChatGPT has revealed to us and thinking differently? Perhaps we could also ask:

‘Why do we test children’s understanding of a subject using examinations which “something” (an AI engine, in this case) that knows nothing about the topic can pass?’

Similarly, we are looking at exploiting the positive power of AI to interrogate our amazing dataset. Examples could include identifying the topics different groups talk about most, comparing sentiment between groups, and spotting patterns in the language applicants use when they describe a career in technology.

That is how we are proposing to listen to what is being said.
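As one concrete (and deliberately simplified) example of that kind of interrogation, the sketch below fits a small topic model to a few placeholder answers using scikit-learn. The documents, topic count and parameters are illustrative, and this is a generic, off-the-shelf technique rather than the specific engines we are working with.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Placeholder documents; in practice these would be anonymised applicant answers.
    docs = [
        "I want to build software and grow a long career in technology",
        "The industry can feel unwelcoming but I love solving problems with data",
        "I am switching from teaching into IT and hope to stay for the long term",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(X)

    # Print the top terms for each discovered topic.
    terms = vectorizer.get_feature_names_out()
    for topic_idx, weights in enumerate(lda.components_):
        top_terms = [terms[i] for i in weights.argsort()[-5:][::-1]]
        print(f"Topic {topic_idx}: {', '.join(top_terms)}")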

Wow – what next?

It’s early days in our work, but the initial data gathering is extremely encouraging, and some initial ‘passes’ with AI engines have shown that the data is rich and starts to reveal some interesting patterns. I guess it’s a case of ‘watch this space’.

The Holy Grail is that we find themes which lead us to change how we in the technology profession engage with the public, and thereby encourage underrepresented groups to apply, stick with the application process, and join this great profession.

For me personally I am just delighted to be heading up this work and to have a great and supportive employer who shares and respects my interest in this space.


We at FDM have partnered with Microsoft to organise an event ‘Artificial Intelligence for Real-life Business Challenges’ that brings together industry experts who will share their thoughts on how AI can shape your tech workforce.
