Digital Omnibus: What’s proposed and why it matters
The European Commission’s draft “Digital Omnibus for the digital acquis” proposes a sweeping technical consolidation of EU digital law and targeted amendments to the General Data Protection Regulation (GDPR) and ePrivacy Directive, alongside changes to the Data Act, Data Governance Act and Free Flow of Data Regulation. The stated goal is simplification: cutting compliance costs, clarifying how the instruments interact, and streamlining incident reporting via a single-entry point operated by ENISA. Critics warn that several GDPR amendments would materially reduce protections and should not proceed through a fast-track omnibus without a full impact assessment. This article distils the official proposal and key commentary, with a focus on GDPR impacts.
First, the data acquis would be consolidated. The Data Governance Act (DGA), the Open Data Directive (ODD), and the Free Flow of Non-Personal Data Regulation would be repealed, and their relevant provisions merged into the Data Act. This creates one instrument for re-use of public sector data (both “open data” and certain categories of protected data), a voluntary EU label regime for data intermediation services and data altruism organisations, and retention of the prohibition on unjustified data localisation requirements in the EU. The Data Act’s cloud switching chapter would be “calibrated,” adding lighter regimes for custom‑made services and providers that are SMEs or small mid‑caps (SMCs), while still phasing out switching and egress fees. Trade secret safeguards would be strengthened to allow refusal to share where there is a high risk of unlawful disclosure to third countries or entities under their control.
Second, GDPR changes. The proposal introduces:
- a clarification to “personal data” that focuses on whether a given entity has “means reasonably likely” to identify a person;
- a narrower scope for “special categories” (Article 9), so that enhanced protection applies only when data directly reveal sensitive attributes in relation to the specific individual, while keeping genetic and biometric data protections untouched;
- explicit derogations to allow residual processing of special categories for AI training and biometric verification under strict safeguards;
- alignment of breach notification with the “high risk” threshold (Article 33), extension of the deadline to 96 hours, and mandatory use of the EU single-entry point;
- EU-level harmonised lists of when DPIAs are and are not required, with the EDPB preparing proposals and the Commission adopting implementing acts;
- eased information duties (Article 13) in low-risk, clear relationships where individuals reasonably already have the core information; and
- clarification of automated decisions under Article 22, confirming that “necessity” for a contract does not require that only automation could take the decision.
Third, GDPR–ePrivacy interplay and cookies. The Commission proposes that the lawful grounds for processing personal data obtained via terminal equipment (cookies and similar technologies) be governed by the GDPR alone, simplifying the current dual regime. A new path would enable automated, machine-readable signals of users’ choices (consent refusal and objection to direct marketing), obliging websites and apps to honour them once standards exist, with a six-month grace period. The Commission could require browser/OS vendors to provide these interfaces if uptake is insufficient. Media service providers would be excluded from the obligation to respect these signals. Separately, Article 4 of the ePrivacy Directive (security/breach provisions) would be repealed, given its overlap with NIS2 and the GDPR.
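The proposal does not specify what these machine-readable signals will look like; the standards are still to be developed. As a purely illustrative sketch, an existing mechanism such as Global Privacy Control (which uses the `Sec-GPC: 1` request header) shows how honouring an opt-out signal might work server-side. The function names and the set of processing purposes below are assumptions, not anything defined in the draft.

```python
# Illustrative sketch only: the draft Omnibus does not define a signal format.
# We assume a GPC-style opt-out header ("Sec-GPC: 1") as a stand-in for the
# machine-readable refusal/objection signals the proposal envisages.

def carries_optout_signal(headers: dict) -> bool:
    """Return True if the request includes a machine-readable opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def allowed_purposes(headers: dict) -> set:
    """Drop consent-based purposes when an opt-out signal is present.

    The purpose taxonomy here is a hypothetical example, not drawn from
    the proposal.
    """
    purposes = {"strictly_necessary", "analytics", "direct_marketing"}
    if carries_optout_signal(headers):
        purposes -= {"analytics", "direct_marketing"}
    return purposes

print(allowed_purposes({"Sec-GPC": "1"}))  # {'strictly_necessary'}
```

Once standards exist, the practical work for operators would sit in exactly this layer: detecting the signal and suppressing consent-based processing automatically, rather than routing every visitor through a consent banner.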
Fourth, incident reporting. A single-entry point hosted by ENISA would allow “report once, share many” notifications under NIS2, GDPR, DORA, eIDAS and the Digital Identity Regulation (and potentially CER). Entities would submit via one interface; competent authorities would receive the right data for their instrument. This is expected to cut reporting burdens and improve timeliness and completeness.
Fifth, platform regulation. The Platform-to-Business Regulation would be repealed, as the Commission considers its remaining value to be covered by the Digital Services Act and Digital Markets Act, with a transitional period to avoid legal gaps.
Key takeaways
- One data instrument: DGA, ODD and Free Flow of Data Regulation would be repealed and their key rules merged into the Data Act, with new voluntary EU labels for data intermediation and data altruism.
- Trade secrets safeguarded: Data holders could refuse sharing where there’s a high risk of unlawful disclosure to third countries or entities under their control.
- GDPR shifts: Narrower “special categories” scope; personal data test tied to an entity’s means; AI training under legitimate interests with safeguards; breach reporting threshold raised to “high risk” and 96 hours; DPIA lists/templates harmonised.
- Cookies and signals: Processing from terminal equipment governed solely by GDPR; automated machine-readable refusal/objection signals to be honoured once standards exist; browser/OS duties possible if market uptake is poor; media providers exempt from signal honouring.
- Incident reporting: ENISA single-entry point to “report once, share many” across NIS2, GDPR, DORA, eIDAS; expected cost and burden reduction.
- Platform rule repeal: P2B Regulation would be repealed, relying on DSA/DMA coverage; transitional clauses to prevent gaps.
- Critics’ warning: Civil society and experts argue several GDPR changes amount to deregulation, lack impact assessment, and risk undermining Charter rights; legislators should proceed with caution.
Practical implications for controllers:
- Personal data test: Controllers would assess identifiability based on the means reasonably likely to be at their disposal; data might be non-personal for one entity but personal for another. This could shrink the set of data a given controller treats as personal, although onward sharing with a party that has the means to identify would render the data personal for that recipient.
- Special categories: Enhanced protection would apply where data directly reveal sensitive traits. Controllers relying on inference of sensitive traits (e.g., political leaning from browsing) may face fewer Article 9 constraints, but must still meet Articles 5 and 6, and be mindful of fairness, transparency and the risk of harm.
- AI training: Legitimate interests could support processing for training/testing/validation, with required safeguards (minimisation, removal of special category data where identified, protection against regurgitation and leakage, enhanced transparency, unconditional right to object). This would need robust governance and documentation.
- DPIAs and breach reporting: Expect harmonised EU lists/templates and use of the single-entry point; breach notifications shift to the high-risk threshold and 96 hours.
- Information duties: Potentially reduced in low-risk, clear-cut relationships (e.g., craftspeople, local clubs), but layered transparency remains encouraged where the derogation does not apply.
- Cookies and signals: Prepare to support machine-readable consent refusal and direct marketing objections; review consent frameworks, CMPs, and privacy UX. If you operate media services under the Media Freedom Act, note the exemption from honouring signals.
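On breach reporting, the proposed shift to a 96-hour window (from the GDPR’s current 72 hours) is simple to operationalise as a deadline check. The sketch below is a minimal helper under that assumption; the 96-hour figure comes from the draft and may change in the final text.

```python
from datetime import datetime, timedelta, timezone

# Proposed window under the draft Omnibus: 96 hours from becoming aware of a
# high-risk breach (up from the GDPR's current 72 hours). Draft figure only.
NOTIFICATION_WINDOW = timedelta(hours=96)

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest time to notify via the single-entry point."""
    return became_aware + NOTIFICATION_WINDOW

def hours_remaining(became_aware: datetime, now: datetime) -> float:
    """Hours left before the deadline (negative if already overdue)."""
    return (notification_deadline(became_aware) - now).total_seconds() / 3600

aware = datetime(2026, 3, 2, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2026-03-06 09:00:00+00:00
```

Timezone-aware timestamps matter here: incident response teams are often distributed, and a deadline computed in local time can silently differ from one computed in UTC.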
For public sector bodies and data re-users:
- A unified framework for open data and protected categories in the Data Act should simplify reuse processes, fees, and licences, with new discretion to set higher fees/conditions for very large enterprises or DMA gatekeepers, and incentives for SMEs/SMCs.
- Competent bodies and single information points remain to assist with protected data reuse, secure processing environments, and third-country transfer safeguards.
Many elements of the Digital Omnibus advance legitimate simplification: consolidating data laws, retaining the free flow of data, cutting layered incident reporting, and bringing clarity to cloud switching. The GDPR proposals aim for harmonisation and proportionality, but several would materially change core privacy concepts and enforcement practice. Before adoption, legislators should scrutinise the effects on inferred sensitive data, advertising ecosystems, AI training governance, and media exceptions to signals. Organisations should track the legislative process, engage with consultations, and prepare for single-entry incident reporting, machine-readable privacy signals, and updated DPIA and breach templates.
That said, this is only a proposal. It may take a long time to be adopted, and we do not know how the final version will look. But we can anticipate what is coming and prepare accordingly.