Unfortunately, no one is exempt from having biases, and therefore we need to implement systems and structures to reduce the negative and prejudicial impact they can have.
During our panel discussion on the unconscious bias battle, it became obvious that everything from how we behave, look and communicate down to the language we use has a huge impact on creating unconscious biases. These biases inadvertently feed into key decision-making processes and snap judgements – whether at work or in our everyday life choices – and can be quite detrimental to the recruitment process when you want to hire exceptional talent.
It's not easy to set aside our biases. As Andrew Marcinko, Assistant Professor, Behavioural Science, Durham University, said: "We are all biased, I'm biased, you're biased and we go through life with these biases," meaning that we need help to remove them.
As a result of these biases, the gender pay gap has become a fact of life, and we are not making progress fast enough to level the playing field. The average gender pay gap in April 2021 was 18.2%, a 7.9% increase since April 2020.
The primary reason for this is that there are more men in senior roles than women, largely due to the 1950s pattern of men going to work and women being expected to shoulder the responsibilities at home. Senior roles have not been designed to be inclusive enough and nearly 90% of the workforce want to work more flexibly. And this is where technology plays a role.
There are ways we can address the gender pay gap using technology – and applications of AI helping to reduce bias within recruitment have been gaining traction in the past five years. The main benefit of AI is the fact that it can be more objective than humans. For example, there are now tools to help organisations write a 'bias-free' job description, and AI empowers us to make more data-driven decisions to support our gut and human instinct.
Furthermore, AI-smart matching can ensure that candidates are matched to employers on the grounds of merit and experience, not gender. It works by comparing and scoring objective facts from a candidate's CV against the objective requirements of a job spec, so that a logical, impartial comparison is conducted.
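To make the idea concrete, here is a minimal sketch of that kind of merit-based scoring. All field names, weights and thresholds are illustrative assumptions, not a description of any real matching engine: the point is simply that only objective fields (skills, years of experience) enter the score, never demographic attributes.

```python
# Hypothetical merit-based matcher: scores a candidate against a job spec
# using only objective, job-relevant fields. Field names and the 50/50
# weighting are illustrative assumptions, not a real product's logic.

def match_score(candidate: dict, job_spec: dict) -> float:
    """Return a 0-1 score comparing candidate facts to job requirements."""
    required = set(job_spec["required_skills"])
    have = set(candidate["skills"])
    # Fraction of required skills the candidate actually lists.
    skill_score = len(required & have) / len(required) if required else 1.0

    # Experience is capped at the requirement, so extra years add nothing.
    min_years = job_spec["min_years"]
    exp_score = min(candidate["years_experience"] / min_years, 1.0) if min_years else 1.0

    # Equal weighting of skills and experience; a real system would tune this.
    return 0.5 * skill_score + 0.5 * exp_score


candidate = {"skills": ["python", "sql", "airflow"], "years_experience": 4}
job_spec = {"required_skills": ["python", "sql"], "min_years": 5}
print(match_score(candidate, job_spec))  # 0.9: full skill match, 4/5 years
```

Because the score is a pure function of CV facts and spec requirements, two candidates with identical qualifications receive identical scores regardless of gender.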
One approach that has been impactful for Juggle Jobs is anonymising all candidate data until candidates progress to the interview stage. Employers fundamentally don't need to know whether a candidate is male or female when they apply for a role at their company, and it shouldn't be a factor in whether they are successful or not. Although it's important to note that Juggle Jobs uses AI to enrich the recruitment process and qualify a candidate's CV, it does not replace a human when onboarding candidates onto the platform. Human augmentation needs to sit alongside the technology to enrich the experience.
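The anonymisation step itself can be sketched very simply. This is not Juggle Jobs' actual implementation; it is an assumed record shape showing the principle: identifying fields are stripped from the record so that screening decisions before the interview stage rest only on merit-related data.

```python
# Illustrative anonymisation sketch (assumed record shape, not a real
# platform's code): identifying fields are redacted until the interview
# stage, leaving only merit-related data for screening.

IDENTIFYING_FIELDS = {"name", "gender", "photo_url", "date_of_birth"}

def anonymise(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}


record = {
    "name": "Jane Doe",
    "gender": "F",
    "skills": ["python", "sql"],
    "years_experience": 6,
}
print(anonymise(record))
# {'skills': ['python', 'sql'], 'years_experience': 6}
```

Keeping the redaction in one place (a single allow/deny list) also makes it auditable: it is easy to verify exactly which fields a screener can ever see.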
Ultimately, I think it's a universal mission to increase diverse representation at the leadership level across the UK through technology and flexibility, and this in turn will positively impact and phase out our unconscious biases, and ultimately the gender pay gap.
Written by Romanie Thomas, Founder and CEO at Juggle Jobs