UK data watchdog investigates whether AI systems show racial bias
The UK data watchdog is to investigate whether artificial intelligence systems show racial bias when handling job applications.
The Information Commissioner’s Office said AI-driven discrimination could have “damaging consequences for people’s lives” and lead to someone being rejected for a job or being wrongfully denied a bank loan or a welfare benefit.
It will examine the use of algorithms to sift job applications, amid concerns that they are harming employment opportunities for people from ethnic minorities.
Source: UK data watchdog investigates whether AI systems show racial bias | Artificial intelligence (AI) | The Guardian