CNIL Tests Tools to Audit AI Systems
The French Supervisory Authority (CNIL) has recently tested tools that could help its auditors understand how an AI system functions. The CNIL tested two different tools, IBEX and Algocate.
While IBEX aims at explaining an AI system, Algocate seeks to justify the decisions made by an AI system by checking each decision against specific standards. Both tools enable “black box” audits, meaning that they focus on the inputs and outputs of an AI system rather than on its internal functioning.
The tools also rely on local explanatory methods, which explain a single decision tied to a particular data input, rather than on global explanatory methods, which would attempt to explain all of the system's possible decisions at once.
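Neither IBEX nor Algocate is described in implementation detail, so the following is only a generic, minimal sketch of what a local, black-box explanation looks like in practice: perturb each feature of one specific input and measure how much the model's output shifts. The model, its weights, and all function names here are hypothetical illustrations, not the CNIL's tools.

```python
def black_box_model(features):
    """Stand-in for an opaque scoring model (hypothetical weights)."""
    weights = [0.6, 0.1, 0.3]
    return sum(w * x for w, x in zip(weights, features))

def local_explanation(model, x, delta=1.0):
    """Per-feature sensitivity of the model's output at one input x.

    This is a local method: it explains only the decision for x,
    using nothing but the model's inputs and outputs (black box).
    """
    base = model(x)
    sensitivities = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] += delta          # nudge one feature at a time
        sensitivities.append(model(perturbed) - base)
    return sensitivities

# Explain a single decision for one data point:
decision_input = [2.0, 5.0, 1.0]
print(local_explanation(black_box_model, decision_input))
```

A global method, by contrast, would have to characterise the model's behaviour over the whole input space rather than around one point, which is why local methods are the more tractable choice for case-by-case audits.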
Source: CNIL Tests Tools to Audit AI Systems