The Netherlands Court of Audit found that only 3 of the 9 algorithms it audited met all the basic requirements; the other 6 did not, exposing the government to a range of risks: from inadequate control over an algorithm’s performance and impact to bias, data leaks and unauthorised access.
The audit covered both simple and complex algorithms. Some played a supporting role, such as those that send traffic fines to the right address or check whether foreign nationals are already registered in the Netherlands. Others took decisions partly automatically, for instance awarding housing benefits, determining whether a business was eligible for financial support under the TVL scheme introduced during the COVID-19 pandemic, and deciding whether an applicant was medically fit to drive a motor vehicle.
The audited algorithms used by the police, the Ministry of Justice and Security’s Directorate-General for Migration and the National Office for Identity Data fell short of the basic requirements on several counts. Two organisations had outsourced the development and management of their algorithms but had not made agreements on who was responsible for what. The National Office for Identity Data could not independently verify that its algorithm correctly assessed the quality of passport photographs, nor had it assessed the implications for data protection. The Criminality Anticipation System, used by the police to forecast where and when there is a high risk of incidents, was not checked for bias.