So, for instance, you can’t ask the age of someone applying for a job, but an AI could use the year they graduated as a proxy. This might then result in discriminatory decisions based on age, even though age was never included among the parameters. Bias can be built into the system by the AI designers too: ask one person to set criteria for selecting a non-executive director, and they will probably produce quite a different list from someone with a different background, gender or career path. To stay on the right track, ensure a clear delineation between AI and human roles in decision-making, and implement continuous monitoring and feedback loops to refine the AI’s performance and maintain human oversight.
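The graduation-year example can be made concrete with a short sketch. The data below is entirely made up for illustration: it simply shows that even when age is excluded from a model’s inputs, a feature like graduation year can carry almost the same information, which is how a model can end up discriminating by age without ever seeing it.

```python
# Hypothetical applicant data (invented for illustration only).
# "age" is the protected attribute we exclude from the model;
# "grad_year" is an innocuous-looking feature we keep.
applicants = [
    {"age": 55, "grad_year": 1992},
    {"age": 48, "grad_year": 1999},
    {"age": 35, "grad_year": 2012},
    {"age": 27, "grad_year": 2020},
]

ages = [a["age"] for a in applicants]
years = [a["grad_year"] for a in applicants]

# Pearson correlation between the excluded attribute and the proxy.
n = len(applicants)
mean_a = sum(ages) / n
mean_y = sum(years) / n
cov = sum((a - mean_a) * (y - mean_y) for a, y in zip(ages, years))
sd_a = sum((a - mean_a) ** 2 for a in ages) ** 0.5
sd_y = sum((y - mean_y) ** 2 for y in years) ** 0.5
r = cov / (sd_a * sd_y)

# A strong negative correlation means grad_year predicts age:
# any model using grad_year is implicitly using age.
print(f"correlation(age, grad_year) = {r:.2f}")
```

A strongly negative correlation here means graduation year reconstructs age almost perfectly, so dropping the age column alone does not remove age from the decision.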
AI trained on historical data magnifies and reinforces the patterns in that data. So training on historical data that is already biased compounds the bias.
AI AND ETHICS | PART FOUR: BIAS IN AI