A recent document, dated 12 May, reveals that the majority of the EU Council of Ministers are in favor of expanding the scanning of private messages to audio communications for detecting child sexual abuse material.
The Swedish presidency posed a series of key questions to EU member states concerning detection orders, voluntary detection, end-to-end encryption, and the scope of the legislative proposal. Responses have been gathered from all but three countries, and a clear majority favors including audio communications in the regulation's scope.
The proposed law would empower judicial authorities to issue detection orders against communication services, such as email or messaging apps, deemed at high risk of being used to disseminate child sexual abuse material (CSAM). It would require providers of online services to deploy AI-powered tools to detect and remove CSAM, along with newer forms of exploitation such as grooming.
However, member states are more hesitant when it comes to these newer forms of abusive communication, such as grooming, as they tend to be harder to identify than known CSAM.
Extending the proposal to cover audio content would also have far-reaching implications: it could compromise user privacy and undermine the security of entire networks. Moreover, the document does not clarify whether number-dependent or number-independent communications would fall under this measure.