EMEA - Data Privacy, Digital and AI Round-Up 2025/2026
Privacy Speaks | 18 December 2025
Summary
As anticipated in our 2024 privacy round-up, 2025 has proven to be a defining year for data privacy and the broader digital landscape. Significant developments in AI regulation and cybersecurity have emerged, with legislative updates and regulatory activity accelerating as expected. Geopolitical dynamics continue to influence the adoption of new technologies, and questions remain over whether the EU will advance its tech regulation agenda, particularly following steps to delay certain implementation phases (most notably in AI) through its Digital Omnibus.
With global data protection developments continuing at pace and further changes expected in 2026, now is an opportune moment to reflect on what 2025 delivered for businesses and to consider what 2026 may hold for the EMEA region.
KEY DATA PROTECTION DEVELOPMENTS IN 2025
In this round-up, we look back over 2025 and consider the main developments across key themes including artificial intelligence, cybersecurity and online advertising in the UK, EU and the Middle East.
DATA USE AND ACCESS ACT 2025
The Data (Use and Access) Act 2025 (DUAA) received royal assent on 18 June 2025, introducing significant amendments to UK data protection law. The Act aims to simplify compliance and support innovation while maintaining strong privacy standards. Some of the key changes include: greater flexibility for use of automated decision-making; enhanced protections for children’s personal data; clarification of rules for scientific research and international transfers; new lawful bases for processing; simplified cookie requirements; and expanded ICO powers and duties that will affect enforcement approaches across all sectors (we have covered this in more detail).
To reduce the compliance burden on businesses, the DUAA is being implemented in phases between August 2025 and June 2026, giving organisations time to adapt their processes, with updated ICO guidance to reflect the changes.
ARTIFICIAL INTELLIGENCE
Tasked by the UK Government with prioritising growth, the ICO outlined in March 2025 a package of measures in support of this agenda, focusing on AI, online advertising and international data transfers. On the topic of AI, the ICO committed to introducing simpler guidance for businesses developing or deploying AI.
The Department for Science, Innovation and Technology (DSIT) also published a voluntary Code of Practice for the Cyber Security of AI, designed to protect AI systems from cyber threats and inform global standards. The Code applies to AI systems incorporating deep neural networks and provides lifecycle security guidance, including implementing AI-focused cybersecurity training, developing recovery plans, and conducting robust risk assessments.
The ICO launched a new AI and Biometrics Strategy (AIBS) in June 2025, marking its first dedicated framework for GDPR compliance in these areas. The strategy is positioned within the ICO’s broader commitment to supporting economic growth and responsible innovation while safeguarding individuals, particularly vulnerable groups. The AIBS adopts an outcome-based, risk-focused regulatory approach, prioritising cooperative engagement with compliant organisations and enforcement against serious breaches. It aims to address two main challenges: the lack of regulatory certainty for organisations using AI and biometrics, and public concerns over transparency and trust.
CYBER SECURITY/ CYBER RESILIENCE
The UK Government published its Cyber Governance Code of Practice, which was co-designed with technical experts from the National Cyber Security Centre to support company boards and directors in navigating cyber risks within their organisations. The Code covers the following principles: risk management; strategy; people; incident planning, response and recovery; and assurance and oversight. In addition, the UK Government published a consultation paper containing measures to address the escalating threat of ransomware attacks, which have grown significantly in scale and sophistication in recent years. The consultation set out three proposals: a targeted ban on ransom payments by public sector bodies and operators of critical national infrastructure; a ransomware payment prevention regime; and a mandatory reporting regime for intended payments. These measures are intended to remove the financial incentives for cybercriminals and align the UK with international efforts under the Counter Ransomware Initiative. We have covered this in more depth.
In November 2025, the UK issued its proposals for reform of the Network and Information Systems Regulations 2018 (which we discuss) to bring more businesses into scope, to give the Secretary of State wide powers to issue regulations, codes of practice and national security directions, and to set strategic cyber security priorities. The Cyber Security and Resilience (Network and Information Systems) Bill extends the regulatory reach of the NIS Regulations to data centres, managed services, critical suppliers and electricity "load controllers", and introduces enhanced incident reporting, customer notification, information-sharing, enforcement powers and cost recovery by regulators.
The UK Government has also introduced a Cyber Growth Action Plan, backed by up to £16 million in funding for startups and academic spinouts. The Plan aims to create high-quality jobs, foster business innovation, and bolster digital and economic security amid rising cyber threats. For businesses, this initiative marks the start of a broader policy and legal framework that will shape future compliance obligations.
CHILDREN’S DATA
The ICO published a review in April 2025 into children's data in financial services, conducted under the ICO25 strategic plan. The review, which ran from March to September 2024 and gathered information from over 40 organisations offering products such as current accounts and savings accounts to children, examined how the sector processes children's personal data. Key findings revealed that whilst most organisations had robust age verification processes, only half provided age-appropriate privacy information, and there was limited monitoring of compliance with children's data policies. The review identified significant risks around children agreeing to terms they do not understand, with many organisations passing transparency responsibilities onto parents. Important changes are also reshaping the online environment for children, with Ofcom monitoring the age verification measures required under the Online Safety Act.
ANONYMISATION/ PSEUDONYMISATION
In March 2025, the ICO published guidance on anonymisation and pseudonymisation, recognising the benefits that sharing data can bring to organisations, people and society, whilst acknowledging the inherent risks. The guidance positions effective anonymisation as achievable when organisations use appropriate techniques to reduce identification risks to a sufficiently remote level, emphasising a risk-based, contextual approach. Pseudonymisation is outlined as a complementary measure that reduces risk whilst keeping data within regulatory protections, supporting compliance with data protection by design and security obligations.
Following this, in May 2025, the ICO published updated encryption guidance under a "must, should, could" framework, positioning encryption as an appropriate technical measure under Article 32 of UK GDPR. The guidance sets clear expectations for encryption in transit, at rest, and on removable media, urging organisations to adopt modern configurations and emphasising that encrypted data typically remains personal data, so data protection law continues to apply.
Together, these publications provided much-needed clarity on privacy-enhancing techniques that balance data utility with regulatory compliance, particularly as organisations navigate increasing pressure to share data for research, innovation and public benefit.
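For illustration only, the sketch below shows one common pseudonymisation technique of the kind contemplated by the ICO guidance: replacing a direct identifier with a keyed hash, with the secret key held separately from the dataset so that re-linking requires additional information. The example, its function names and its data are hypothetical and are not drawn from the guidance; as the guidance makes clear, data pseudonymised in this way remains personal data.

```python
# Illustrative sketch only: keyed pseudonymisation with the key held
# separately from the dataset, so re-identification is only possible
# with additional information. Names and data here are hypothetical.
import hmac
import hashlib
import secrets

def generate_key() -> bytes:
    """Generate a secret key, to be stored separately from the data."""
    return secrets.token_bytes(32)

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, deterministic token (HMAC-SHA256)."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

if __name__ == "__main__":
    key = generate_key()  # keep this in a separate, access-controlled store
    record = {"customer_email": "jane@example.com", "balance": 1250}
    record["customer_email"] = pseudonymise(record["customer_email"], key)
    # The dataset no longer contains the direct identifier, but it remains
    # personal data under UK GDPR because re-linking is possible with the key.
    print(record)
```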
ONLINE ADVERTISING
In March 2025, the ICO unveiled pro-growth pledges following a meeting between Information Commissioner John Edwards and Chancellor Rachel Reeves. The regulator committed to relaxing enforcement of PECR consent rules for privacy-preserving online advertising, ahead of government exemptions for non-intrusive cookies and trackers. As part of this initiative, the ICO pledged to pilot an experimentation regime for innovative advertising models and publish free data essentials training for small businesses.
DIGITAL OMNIBUS
The European Commission has presented a new package of measures under its Digital Omnibus initiative aimed at reducing administrative burdens and enabling businesses to focus on innovation. The proposals seek to simplify rules governing AI, cybersecurity, and data while maintaining Europe’s high standards for fundamental rights, data protection, safety, and fairness.
The Commission intends to cut compliance costs by 25% for all companies and 35% for SMEs, in line with its Competitiveness Compass target. To achieve this, the Digital Omnibus will streamline procedures, reduce paperwork, and eliminate overlapping requirements, while maintaining fairness and safety standards.
Key reforms include innovation-friendly AI rules, such as clearer guidance and support for implementing the EU AI Act, targeted amendments to ease compliance, and opportunities for real-world testing. Companies will only need to apply rules for high-risk AI systems once support tools and standards are in place, giving them up to 16 months to comply. Cybersecurity reporting will be streamlined through a single-entry point, replacing multiple reporting obligations under different laws. Amendments to the GDPR will harmonise and clarify certain provisions without lowering data protection standards, while modernised cookie rules aim to improve user experience.
On data, the initiative consolidates EU data rules through the Data Act, introduces exemptions for SMEs, and provides new compliance guidance. It also seeks to boost European AI companies by unlocking access to high-quality datasets. The Data Union Strategy will expand access to data through tools like data labs, create a legal helpdesk for the Data Act, and strengthen Europe’s data sovereignty. The European Business Wallet will allow companies to digitalise operations, securely sign and exchange documents, and communicate with other businesses or public administrations across the EU.
In Germany, the Consent Management Ordinance (EinwV), in force since 1 April 2025, establishes a framework for recognised consent management services under the Telecommunications Digital Services Data Protection Act (TDDDG) to combat "cookie consent fatigue". These services, which must be recognised by the Federal Commissioner for Data Protection and Freedom of Information (BfDI), allow users to manage, store and revoke consent centrally, rather than repeatedly interacting with cookie banners. The BfDI recently recognised the first such service, "Consenter". However, the Ordinance has faced criticism from stakeholders for not eliminating cookie banners, as users still encounter initial consent prompts, and its scope is limited to TDDDG consents (cookies and tracking technologies), leaving broader GDPR-related consents untouched. Integration of these services by website operators remains voluntary, so many businesses may continue to rely on traditional consent banners, limiting the Ordinance's practical relevance and user benefits.
INTERNATIONAL DATA TRANSFERS
Following the European Commission's six-month extension of the UK adequacy decisions, the European Data Protection Board (EDPB) issued largely positive opinions on the Commission's draft adequacy decisions for the UK. The EDPB welcomed continued alignment between UK and EU data protection frameworks, which is critical for businesses relying on cross-border data flows between the UK and EU. However, it called for further clarifications on areas such as UK-to-third-country transfers and changes introduced by recent UK legislation. The EDPB also issued its opinion on the Commission's draft adequacy decision for the European Patent Organisation, which would be the first adequacy decision granted in respect of an international organisation.
The CJEU confirmed the validity of the EU-U.S. Data Privacy Framework (DPF) by rejecting an annulment challenge in September 2025 (we have covered this in more detail). This ruling (although subject to an appeal) provides crucial certainty for businesses with transatlantic operations, ensuring data transfers under the DPF remain lawful and avoiding the operational disruption seen after previous framework invalidations.
ARTIFICIAL INTELLIGENCE
In France, in July 2025, the CNIL finalised its first set of recommendations on applying the GDPR to the development of AI systems, to help professionals reconcile innovation with respect for individuals' rights. It has also provided a comprehensive set of guidelines on web scraping, notably for AI system training purposes.
The recommendations cover the entire development phase, from system design through database creation to training, and provide practical guidance on key GDPR obligations including defining purposes, determining legal bases, minimising data, ensuring security, and respecting individuals' rights.
In Germany, the German Conference of Federal and State Data Protection Authorities (Datenschutzkonferenz – DSK) issued two guidance papers that further clarify how data protection requirements apply to AI systems. The first is the Recommended Technical and Organisational Measures for AI Systems published in June 2025. This guidance builds on the DSK’s earlier May 2024 guidance on AI and Data Protection, which was primarily addressed to controllers planning to deploy AI applications. The June 2025 guidance is aimed mainly at providers and developers of AI systems. It offers practical recommendations for implementing data protection by design throughout the AI lifecycle, covering design, development, deployment, and operation. The June 2025 guidance draws on the Standard Data Protection Model (SDM), provided by the DSK for GDPR-compliant design of processing activities, to translate GDPR principles into concrete technical and organisational measures.
The second is AI Systems Using Retrieval-Augmented Generation (RAG) published in October 2025. RAG is a technique in which a user query to a large language model (LLM) is supplemented with relevant information retrieved from additional sources, such as knowledge bases or document repositories. The enriched query, together with the supplemental data, is then processed by the LLM to improve accuracy and reduce hallucinations. While this approach can enhance reliability, the DSK notes that it also introduces challenges regarding purpose limitation, data minimisation, and the handling of personal data in embeddings and vector databases.
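By way of a simplified technical illustration (not drawn from the DSK paper), the sketch below shows the basic RAG pattern the guidance describes: a user query is matched against a document store, the most relevant passages are retrieved, and the enriched prompt (query plus retrieved context) is passed to an LLM. The document store, the crude word-overlap scoring and the call_llm placeholder are hypothetical stand-ins; real systems use embedding models and vector databases, which is precisely where the DSK's concerns about personal data in embeddings arise.

```python
# A deliberately simplified sketch of the RAG pattern described by the DSK.
# The document store, scoring function and call_llm placeholder are hypothetical;
# production systems use embedding models and vector databases instead.
from collections import Counter

DOCUMENTS = [
    "Employees may request access to their personal data via the HR portal.",
    "The canteen opening hours are 8am to 3pm on weekdays.",
    "Data subject access requests must be answered within one month.",
]

def score(query: str, document: str) -> int:
    """Crude relevance score: count of overlapping words (stand-in for vector similarity)."""
    q_words = Counter(query.lower().split())
    d_words = Counter(document.lower().split())
    return sum((q_words & d_words).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most relevant documents for the query."""
    return sorted(DOCUMENTS, key=lambda d: score(query, d), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a large language model (hypothetical)."""
    return f"[LLM response based on a prompt of {len(prompt)} characters]"

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    # The enriched prompt combines retrieved context with the user query -
    # the step the DSK flags for purpose limitation and data minimisation.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("How quickly must a data subject access request be answered?"))
```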
On a wider EU note, on 28 October 2025, the European Data Protection Supervisor (EDPS) released an updated version of its guidance on the use of generative AI systems by EU institutions under the EU Data Protection Regulation. This update builds on the 2024 framework and reflects the rapid evolution of AI technologies and insights from EDPS oversight activities. The guidance focuses on applying core data protection principles throughout the AI lifecycle rather than prescribing technical measures, aiming to ensure lawful and ethical use of generative AI.
The revised guidance emphasises several key compliance priorities. Organisations must define clear purposes and legal bases for processing personal data at each stage of development and deployment, and identify roles and responsibilities across complex AI supply chains, distinguishing controllers, processors, and joint controllers. It mandates risk assessments and Data Protection Impact Assessments for high-risk processing and reinforces the need for data protection by design and by default.
This update is especially relevant as use of generative AI tools introduces unique privacy challenges such as large-scale personal data processing during training, transparency gaps, and difficulties in exercising data subject rights. The EDPS urges EU bodies to adopt trustworthy AI practices and maintain compliance throughout the entire AI lifecycle, signalling a growing regulatory focus on ethical and secure AI deployment.
CYBER SECURITY/ CYBER RESILIENCE
In June 2025, the European Union Agency for Cybersecurity (ENISA) published detailed guidance on NIS2 cybersecurity risk management measures. The guidance is intended to support the many types of entities falling within NIS2's digital infrastructure sectors and covers the technological and methodological requirements contained in NIS2, along with guidance, examples of evidence and practical tips for each requirement.
The European Commission released draft implementing regulations under the Cyber Resilience Act (CRA), setting out the technical descriptions of the categories of important and critical products that fall within scope of the CRA's stricter conformity assessments. See our briefing.
CHILDREN’S DATA
At the beginning of 2025, the CNIL published its 2025-2028 strategic plan focusing on four key areas: AI, the protection of minors online, cybersecurity, and personal data in the context of mobile applications and digital identity. The plan emphasises continued dialogue with stakeholders whilst indicating an intention to increase both the number and variety of enforcement actions and sanctions. We have covered the CNIL’s strategic plan in more detail.
In February 2025, the EDPB adopted Statement 1/2025 on Age Assurance, providing specific guidance and high-level principles stemming from the GDPR for processing personal data for the purposes of age assurance. The statement recognised that age assurance posed specific risks to data protection with the potential to adversely impact fundamental rights including non-discrimination, integrity of the person, and free expression. The proposed principles sought to reconcile the protection of children with the protection of personal data, prioritising requirements concerning the main principles in Article 5 GDPR including lawfulness, fairness, transparency, purpose limitation, data minimisation and accountability.
ENFORCEMENT ACTION/INDIVIDUAL SANCTIONING
October 2025 saw the ICO launch a consultation on new enforcement procedural guidance, providing significantly more detail on its investigatory processes and its powers under the Data (Use and Access) Act 2025. The guidance aims to increase transparency about the process the ICO follows when it suspects an organisation has failed to comply with its legal obligations to protect people's personal information under the UK GDPR and the Data Protection Act 2018. This initiative arrived amid criticism from academics and legal practitioners calling for a parliamentary inquiry into the ICO's enforcement approach, citing concerns over declining public sector investigations and the decision not to formally investigate the Ministry of Defence's data breach affecting Afghan nationals.
We are also seeing other UK regulators leverage UK data protection laws when challenging misuse of personal data, with the Financial Conduct Authority bringing its first case under the Data Protection Act 2018 in relation to the obtaining and disclosure of confidential customer data, contrary to s.170(1) of the DPA.
DIFC AMENDMENTS
On 8 July 2025, the DIFC amended its Data Protection Law (No.5 of 2020), with changes effective from 15 July 2025. These amendments introduced a private right of action for data subjects, clarification and broadening of extra‑territorial scope of the data protection rules, tighter controls around responding to public authority requests (including validity and proportionality checks before disclosure), and other compliance refinements.
CYBER SECURITY/ CYBER RESILIENCE
In May 2025, the UAE Cybersecurity Council published the UAE's first national guidelines for drone cybersecurity, developed alongside partners Reach Digital and Shieldworkz. These standards seek to bolster cyber defences for unmanned aerial vehicle operations across multiple industries, including agriculture, logistics and environmental monitoring. The framework tackles cyber vulnerabilities in drone technology, with a particular emphasis on safeguarding national airspace, critical infrastructure and information security in line with international standards.
The Financial Services Regulatory Authority of Abu Dhabi Global Market (ADGM) introduced a Cyber Risk Management Framework in July 2025, applicable from 31 January 2026 following a six-month transition period. The framework seeks to bolster cyber defences across ADGM's financial services sector and supports UAE-wide initiatives against cyber threats and financial crime. Regulated entities, including banks, insurers and investment firms, must develop written frameworks covering risk assessment, prevention measures, continuous monitoring, incident response and recovery procedures. Firms face a 24-hour reporting obligation for material cyber incidents, assessed by operational impact and regulatory significance. Organisations will need to integrate these requirements into existing governance structures, review third-party arrangements, and conduct annual framework assessments to ensure ongoing compliance.
Saudi Arabia signed the UN Convention Against Cybercrime. The new treaty criminalises a range of cyber-dependent and cyber-enabled offences, facilitates the sharing of electronic evidence across borders and establishes a 24/7 cooperation network among States. It also contains an article on the protection of personal data (Article 36), which broadly provides that contracting States may transfer personal data to each other only in accordance with their domestic laws and applicable international law, may impose conditions to ensure compliance, are encouraged to establish data-sharing arrangements, must safeguard the data they receive, and require authorisation before sharing it with third countries or organisations.
ARTIFICIAL INTELLIGENCE
Global AI Hub Law
The highly anticipated Draft Global AI Hub Law was issued in April 2025, building upon the concept of 'data embassies' in Saudi. The law creates three types of AI/data hubs in Saudi (private, extended and virtual) and sets the rules for how they would be approved, operated, overseen and wound down.
Its aim is to make Saudi a go-to destination for advanced technology by attracting governments and companies, using the Kingdom's location to offer secure and reliable 'data embassy-style' infrastructure, and boosting innovation and R&D. Outside these hubs, normal Saudi laws would apply; inside them, however, the foreign laws of countries with bilateral agreements with Saudi may apply.
In June 2025, Saudi joined the OECD’s Recommendation on Artificial Intelligence, standards that aim to foster innovation and build trust in AI technologies through responsible governance, including generative and general AI technologies. The recommendations are built upon the principles of: (a) inclusive growth, sustainable development and well-being; (b) respect for the rule of law, human rights and democratic values, including fairness and privacy; (c) transparency and explainability; (d) robustness, security and safety, and (e) accountability. Recommendations include: (i) investing in AI research and development; (ii) fostering an inclusive AI-enabling ecosystem; (iii) shaping an enabling interoperable governance and policy environment for AI; (iv) building human capacity and preparing for labour market transformation; and (v) international co-operation for trustworthy AI.
WHAT TO EXPECT IN 2026?
Given the breadth of developments in 2025, we have set out below our key predictions for another busy year in the data protection space in 2026.
- The impact of the Cyber Security and Resilience Bill will be significant, introducing stricter requirements for digital services and supply chains, expanded regulatory oversight of the digital supply chain and increased enforcement powers. This will necessitate greater board-level focus on cyber security issues, especially given 2025's high-profile cyber incidents affecting two major UK retailers and the attendant concerns about the security of customer personal data.
- It remains to be seen whether the ICO will leverage its enhanced enforcement powers to issue more substantial fines for breaches of the Privacy and Electronic Communications Regulations, now that it is empowered under the Data (Use and Access) Act 2025 to issue fines of up to £17.5 million or 4% of global turnover (whichever is higher) for breaches of the direct marketing rules.
- Whilst collective actions by data subjects challenging misuse of personal data face hurdles, the appetite amongst claimant law firms to mount these types of claims on behalf of data subjects remains undimmed, and dealing with them is time-consuming. We see this trend continuing as the volume of cyber incidents affecting personal data rises, driven by the increased use of AI tools to mount ever more sophisticated cyber-attacks.
- With the renewed focus on child safety online and parallel developments in Australia, we anticipate that the ICO will work closely with Ofcom to ensure appropriate deployment of age verification measures and that businesses which engage with children secure appropriate consents and manage children's personal data with due care.
- Whilst we may not get a decision in 2026, the question of whether a threshold of seriousness applies to claims for damages under the GDPR and the DPA 2018 has now been appealed to the UK Supreme Court, which should ultimately provide guidance on the regime for compensation under UK data protection rules.
- The majority of the EU AI Act's requirements start to apply from August 2026, with enforcement also beginning during that period at both national and EU level. This milestone represents a significant regulatory shift, and organisations should begin preparing now to ensure compliance and mitigate the risks associated with non-compliance.
- On the international data transfer front, the European Commission has adopted a draft adequacy decision for Brazil. The EDPB in its opinion found that Brazil's data protection framework is closely aligned with the GDPR, but identified points requiring further clarity, such as onward transfers and international agreements, law enforcement processing and redress. The Commission is expected to incorporate the EDPB's clarifications into the decision and seek approval from the Member States; if endorsed, the decision will then enter into force.
- We anticipate that the world of AI will continue to develop next year, both in terms of technological advancement and lawmaking. In the UAE, the National AI Charter and Ethics Guidelines are likely to continue to develop and mature in alignment with the UAE Strategy for AI, which aims to position the UAE as a leading nation in AI by 2031.
- In Saudi, we are awaiting the coming into force of the Global AI Hub Law. That said, it is also possible that more comprehensive legislation on the use and development of AI within Saudi may be introduced.
- In terms of enforcement, we are likely to see more instances of PDPL enforcement, whether in the form of published alerts or decisions or parties being approached directly by SDAIA, as well as some initial jurisdiction adequacy decisions for cross-border data transfers. SDAIA may also issue rules to govern the registration of controllers processing the personal data of Saudi residents from outside the Kingdom, as already trialled in the "Rules Governing the National Register of Controllers Within the Kingdom".
- Lastly, we anticipate more legislation or regulation for data centres. Given the Kingdom's potential use cases and its ability to fund its tech ambitions, there is a strong emphasis on increasing data centre capacity within Saudi. To facilitate this, specific data centre legislation, guidance and regulatory procedures may be introduced.
CONCLUSION
If 2025 is anything to go by, 2026 is set to be another busy year in the data privacy and digital world. With developments continuing across the EMEA region, 2026 will see organisations grappling with varying (and often very divergent) global requirements. What is clear is that conversations around AI, cybersecurity, AdTech, child safety online and technology more widely will continue, with regulatory and political developments in this space set to ramp up significantly.