
Data and Cyber Update - March 2026

Data Protection | 31/03/2026

Welcome to the latest edition of the Stephenson Harwood Data and Cyber Update, covering the key developments in data protection and cyber security law in March 2026.

In data regulation news, the UK government publishes its long-awaited Copyright and Artificial Intelligence report and impact assessment, and the European Data Protection Board and European Data Protection Supervisor issue a joint Opinion on the European Biotech Act proposal.

In cybersecurity news, the European Commission publishes draft guidance on the Cyber Resilience Act, Companies House confirms a cyber incident affecting its WebFiling platform that potentially exposed UK company directors' personal data, and the ICO and Ofcom release a joint statement on age assurance setting out clear expectations for protecting children online.

In our enforcement and civil litigation update, the Court of Justice of the European Union confirms that even a first subject access request can be refused as excessive where it amounts to an abuse of rights, the UK ICO issues its first monetary penalty explicitly focused on data minimisation and privacy by design, and we take a look inside the ICO’s new enforcement toolkit and what organisations need to know.
 


Data Regulation

UK government publishes report and impact assessment on Copyright and Artificial Intelligence

On 18 March 2026, the UK government published its long-awaited Copyright and Artificial Intelligence report ("Report"), together with an economic impact assessment addressing the policy options under consideration. The Report, required pursuant to the Data (Use and Access) Act 2025, seeks to address the complex relationship between copyright law and the development and deployment of AI systems, particularly with regard to the use of training data for generative AI models. It follows a consultation held at the end of 2024, which outlined four policy options on copyright and the training of AI models and received 11,520 responses.

The Report is of interest to data protection professionals: the ability to rely on legitimate interests as a basis for scraping and using personal data for model training depends on the general lawfulness of that activity under other laws, including copyright. We have covered the points most relevant to the use of personal data below.

Key takeaways:

  • No immediate legal reform: The government has decided not to introduce immediate reform to UK copyright law regarding AI training, opting instead to "take the time needed to get this right". This approach reflects a lack of consensus among stakeholders and significant gaps in the evidence base, particularly regarding the economic impact of potential reforms. The government instead commits to further evidence-gathering and to assessing the developing impact of AI across the UK economy.
  • Rejection of TDM exception: A notable outcome is that the government has moved away from its earlier preferred proposal for a broad "text and data mining" ("TDM") exception under copyright law to allow the training of AI models, subject to an opt-out for rightsholders. Most consultation responses - particularly from the creative industries - strongly opposed this, citing concerns that a broad exception would allow generative AI to learn from their works without compensation and in direct competition with them. Technology sector responses were also mixed, with some arguing the opt-out would be impractical or would leave the UK insufficiently competitive internationally. No specific alternative approach is proposed, with the government acknowledging that there is "no consensus on how these objectives should be achieved".
  • Focus on transparency: Instead of legislative change, the government commits in the Report to prioritising measures that improve transparency around the data used to train AI models and encourage the development of licensing markets. A mandatory disclosure regime will not be pursued, despite strong support from the creative industries for this approach. There is support for the labelling of AI-generated content and for the adoption of technical tools and standards to help rightsholders control and license their works, but the government highlights that its approach must "promote clarity and enforcement for right holders, without disproportionate effects on AI development or deployment".
  • Attention on deepfakes: Given the growing risks to individuals posed by non-consensual "digital replicas" (i.e. deepfakes), the government will explore new protections, considering the case for greater commercial protections and whether these should form part of any wider safeguards or rights to personality within the UK. It will launch a consultation in the summer to explore options for addressing the associated risks.

The outcome of the Report demonstrates that the government’s current position is one of caution, adopting a "wait and see" stance. While no immediate legislative changes are planned, the Report emphasises the government’s commitment to further evidence-gathering, continued stakeholder consultation and monitoring of international developments, with a view to balancing the interests of creators, AI developers and users. In the meantime, uncertainty remains over reliance on legitimate interests for model training using personal data.
 

EDPB and EDPS support clinical trial harmonisation but call for stronger safeguards for health data

The European Data Protection Board ("EDPB") and the European Data Protection Supervisor ("EDPS") have issued a Joint Opinion (the "Opinion") on the European Commission’s Proposal for a European Biotech Act (the "Proposal"), which seeks to strengthen Europe’s biotechnology and biomanufacturing sectors. The Proposal aims to streamline the regulatory landscape and modernise the rules governing clinical trials, particularly where health data is involved.

Both authorities strongly support the Proposal’s aim of promoting legal clarity by establishing a single EU-wide basis for the processing of personal data by sponsors and investigators. This represents a shift away from the fragmented approach currently applied under the Clinical Trials Regulation ("CTR"), under which individual EU member states regulate on a domestic basis. Implementation of the Proposal is expected to reduce complexity and promote consistency across the EU.

However, in their Opinion, the EDPB and EDPS note that the Proposal must establish stronger safeguards for highly sensitive health and genetic data, particularly in the context of clinical trials, given the heightened level of protection this type of personal data requires. The authorities set out several key recommendations to address these risks, including:

  • Clarifying the roles under data protection law for stakeholders involved in the funding and running of clinical trials;
  • Limiting the mandatory 25-year personal data retention period so that it applies only to data in the clinical trial master file;
  • Tightening rules on further processing of trial data through stronger safeguards; and
  • Ensuring that AI‑enabled biotechnology is promoted in a way that remains coherent with existing requirements under the AI Act.

While the EDPB and EDPS are both broadly supportive of the Proposal’s objectives, the Opinion underlines that EU-wide harmonisation must be supported by robust and clearly defined data protection measures. We will continue to monitor developments and provide updates as the Proposal progresses.
 

Cybersecurity

European Commission publishes draft guidance on the Cyber Resilience Act

On 3 March 2026, the European Commission published its first draft guidance ("Guidance") to help businesses and regulators interpret the Cyber Resilience Act ("CRA"). As we reported in our January 2026 update, reporting obligations under the CRA will apply from 11 September 2026, and the CRA will be fully applicable from 11 December 2027.

The CRA aims to ensure "products with digital elements" are secure throughout their lifecycle by imposing mandatory cybersecurity requirements on manufacturers, importers, and distributors when making them available in the EU. The Guidance is non-binding and addresses several central aspects of the CRA with the aim of supporting compliance. It clarifies the rationale of certain key provisions and provides practical examples to illustrate how they could be implemented in practice.

Key takeaways include the following:

  • Scope and applicability: The Guidance clarifies the scope of the CRA, and the concept of “placement on the market” when making products with digital elements available. Acknowledging that further clarification is needed for intangible products supplied via digital means (i.e. software), the Guidance confirms that a standalone software product should be considered to have been placed on the market when its manufacturing phase is complete and it is first supplied for distribution or use on the EU market in the course of a commercial activity. Subsequent iterations of a software product are only considered as newly placed on the market when there has been a “substantial modification”. 
  • Software as part of a product: Software is considered part of a product if it is necessary for the product’s intended functions, regardless of how or when it is delivered to the user. This means that combined hardware and its essential software together constitute a single product, even if the software is obtained separately after the hardware is placed on the market.
    • Example: A fitness wearable is placed on the market, and a companion smartphone application is required to display the measurements, show history and allow configuration of the device. Although downloaded separately they together constitute a single product, because they are designed and intended to operate together to deliver the product’s functionality.
  • Support periods: The Guidance emphasises the five-year minimum support period. This is a floor rather than a default standard that can be applied to all products: if a product is expected to be used for longer than five years, it should have a correspondingly longer support period. Each iteration of software must also have its own declared support period.
    • Example: A smartphone manufacturer declares an eight-year support period during which it provides regular security updates and new operating system versions that users can install for free without needing new hardware. Under the CRA, the manufacturer only needs to fix vulnerabilities in the latest version, as long as users can upgrade at no extra cost, but must still follow other requirements like coordinated vulnerability disclosure and information-sharing measures.
  • FOSS and open-source software stewards ("OSS Stewards"): Free and open-source software (FOSS) is only covered by the CRA if it is supplied as part of a commercial activity, such as charging for use, monetising related services, or conditioning access on personal data processing. FOSS that is openly shared and not monetised will not be considered "placed on the market" and remains outside the scope of the CRA, but legal entities publishing and supporting such software may have limited obligations as OSS Stewards.

The Guidance also addresses practical implementation issues, such as the classification of products (including the distinction between important and critical products), cybersecurity risk assessment and due diligence, remote data processing, and reporting and incident notification obligations.

Currently, the Guidance is still in draft form and is open for stakeholder feedback until 31 March 2026. Whilst further revisions are expected before its final adoption, this draft resource provides valuable support for companies supplying digital products to the EU as they prepare for CRA compliance.
  

Cyber incident at UK Companies House exposes director personal data

Companies House has confirmed that a security flaw in its WebFiling platform allowed logged-in users to view and potentially change elements of another company’s private records without consent. The vulnerability was introduced during a system update in October 2025 and remained active for around five months until its discovery on 13 March 2026. Once the flaw was identified, the service was promptly taken offline; it was later restored following independent testing.

The flaw enabled access to company dashboards without the required authentication code. As a result, personal data not visible on the public register (including directors’ dates of birth, residential addresses and company email addresses) may have been exposed. It may also have been possible to submit unauthorised filings, including director changes or accounts, on behalf of another company. 

Companies House has reported no evidence so far of data extraction at scale, noting that exploitation required an authenticated session and could affect only one record at a time. Companies House also confirmed that information such as passwords, passport details and previously filed documents was not compromised. Nonetheless, the scale of the potential impact is significant, as the Companies House register holds data on more than five million companies and many more individual appointments.

The incident has been reported to the UK Information Commissioner’s Office ("ICO") and the National Cyber Security Centre. Companies House has contacted all registered companies with guidance on checking their records and has issued an apology to all stakeholders affected. All UK organisations are being urged to review their filings and raise any concerns with Companies House via email.
 

Joint ICO–Ofcom Statement on Age Assurance: clear expectations for protecting children online

On 25 March 2026, the ICO and Ofcom published a joint statement on age assurance, marking a significant step in their collaborative efforts to protect children from online harm. We summarise the key takeaways from the statement here.
 

Enforcement and Civil Litigation

CJEU confirms abuse of rights can justify refusing DSARs

In a notable judgment on 19 March 2026, the Court of Justice of the European Union ("CJEU") in Case C-526/24 (Brillen Rottler GmbH & Co. KG v TC) confirmed that even a first data subject access request ("DSAR") can be refused as "excessive" under Article 12(5) GDPR where it is pursued with an abusive intention.

The case concerned a family-run German optician, Brillen Rottler, which held only a small amount of personal data about the individual concerned. The individual had voluntarily subscribed to its newsletter via an online form and, just 13 days later, submitted a DSAR. Brillen Rottler replied within the one-month period but refused to act on the request, treating it as abusive after identifying public reports that the individual systematically subscribed to newsletters, issued DSARs and then pursued GDPR compensation claims. The individual maintained his request and added a claim for €1,000 in non-material damages for the alleged infringement of his right of access arising from that refusal.

The CJEU held that the purpose of a DSAR is to enable individuals to be aware of, and verify, the lawfulness of processing, not to "artificially create the conditions" for compensation. Controllers may therefore refuse a DSAR as excessive – including a first request – where they can unequivocally demonstrate such abusive intent. Relevant factors include the voluntary provision of data, the short lapse of time between provision and request, the data subject’s overall conduct, and publicly available information evidencing a pattern of DSAR-driven claims.

While the judgment interprets the EU GDPR rather than the UK GDPR, it is likely to be influential in the UK. The judgment does not establish a broad right to refuse DSARs – controllers must still assess requests case by case and carefully justify any refusal – but it arguably strengthens controllers' ability to refuse a DSAR where they can demonstrate that the data subject has artificially created the conditions necessary for obtaining compensation. It also broadens the scope for reliance on the "excessive" ground for refusal in scenarios extending beyond the "repetitive character" of such requests.
  

ICO issues first minimisation and privacy by design fine

The ICO has issued its first monetary penalty explicitly grounded in the UK GDPR principles of data minimisation and privacy by design and default, fining Police Scotland £66,000. The case arose after an individual reported an alleged crime and the police force extracted the entire contents of the individual’s mobile phone, including highly sensitive information, despite only a small portion being relevant to the investigation. That full dataset was then disclosed to a third party during a misconduct investigation, and Police Scotland failed to report this incident to the ICO within the statutory timeframe.

This enforcement action provides several concrete examples of what organisations should avoid under the minimisation and data protection by design principles: extracting all data from a device when only limited information is needed; sharing full, unfiltered datasets with third parties instead of disclosing only what is strictly relevant; and failing to embed technical and organisational controls that restrict access and disclosure by default. The ICO’s decision makes clear that privacy by design is not optional: systems and processes must be configured so that excessive collection and disclosure cannot happen as a matter of course.

The ICO is clearly signalling that a lack of accountability – such as not having clear rules, safeguards and training around data extraction and sharing – will be treated as a serious matter where it leads to a breach with a significant impact on an individual. Organisations should take steps to analyse their processing activities and identify which activities require specific policies, processes and training to promote UK GDPR compliance.
 

Inside the ICO’s new enforcement toolkit: what organisations need to know

The ICO has published its draft Data Protection Enforcement Procedural Guidance. To read more about the guidance and the ICO's new powers introduced by the Data (Use and Access) Act 2025, please see our article here.
