The European Commission’s plans to combat the spread of child sexual abuse material (CSAM) have sparked vehement debate over their potential to invade privacy. The proposed regulation would give judicial authorities the power to issue detection orders to messaging and email services deemed at high risk of being used to distribute this type of illegal content.
Upon receiving such an order, a communications service provider would be required to deploy automated systems to detect known CSAM, as well as any new abusive material or acts of grooming. These obligations prompted the European Parliament to commission a supplementary evaluation of the proposal’s implications, which was presented to the Parliament’s Committee on Civil Liberties, Justice and Home Affairs on April 13th.
However, the research shows that current technology is not mature enough to scan messages reliably: detection systems would produce a high rate of erroneous results, potentially affecting all messages on a platform. The proposal also conflicts with end-to-end encryption, which ensures that only the intended recipient can read a message.
According to the analysis, no technological solution currently exists that could fulfill detection orders without jeopardizing end-to-end encryption, and it is doubtful that one will be engineered within the two-to-five-year window in which the new rules would come into force.