Welcome to the latest edition of the Stephenson Harwood Data and Cyber Update, covering the key developments in data protection and cyber security law in March 2026.
In data regulation news, the UK government publishes its long-awaited Copyright and Artificial Intelligence report and impact assessment, and the European Data Protection Board and European Data Protection Supervisor issue a joint Opinion on the European Biotech Act proposal.
In cybersecurity news, the European Commission publishes draft guidance on the Cyber Resilience Act, Companies House confirms a cyber incident affecting its WebFiling platform that potentially exposed UK company directors' personal data, and the ICO and Ofcom release a joint statement on age assurance that clarifies expectations for protecting children online.
In our enforcement and civil litigation update, the Court of Justice of the European Union confirms that even a first subject access request could be refused for being excessive if it was an abuse of rights, the UK ICO issues its first monetary penalty explicitly focused on data minimisation and privacy by design, and we take a look inside the ICO’s new enforcement toolkit and what organisations need to know.
On 18 March 2026, the UK government published its long-awaited Copyright and Artificial Intelligence report ("Report"), together with an economic impact assessment addressing the policy options under consideration. The Report, required pursuant to the Data (Use and Access) Act 2025, seeks to address the complex relationship between copyright law and the development and deployment of AI systems, particularly with regard to the use of training data for generative AI models. It follows a consultation held at the end of 2024 which outlined four policy options on copyright and the training of AI models, receiving 11,520 responses.
The consultation is of interest to data protection professionals, as the ability to rely on legitimate interest for scraping and using personal data for model training depends on the general lawfulness of the activity under laws including copyright. We have covered the points most relevant to the use of personal data below.
Key takeaways:
The outcome of the Report demonstrates that the government's current position is one of caution, adopting a "wait and see" stance. While no immediate legislative changes are planned, the Report emphasised the government's commitment to further evidence-gathering: continuing to consult stakeholders and monitor international developments in order to balance the interests of creators, AI developers, and users. This means that there is still uncertainty as to reliance on legitimate interests for model training using personal data.
The European Data Protection Board ("EDPB") and the European Data Protection Supervisor ("EDPS") have issued a Joint Opinion (the "Opinion") on the European Commission’s Proposal for a European Biotech Act (the "Proposal"), which seeks to strengthen Europe’s biotechnology and biomanufacturing sectors. The Proposal aims to streamline the regulatory landscape and modernise the rules governing clinical trials, particularly where health data is involved.
Both authorities strongly support the Proposal’s aim to promote legal clarity in establishing a single EU-wide basis for processing personal data by sponsors and investigators. This represents a shift away from the fragmented approach currently applied under the Clinical Trials Regulation ("CTR") by individual EU member states on a domestic basis. Implementation of the Proposal is expected to reduce complexities and promote consistency across the EU.
However, in their Opinion, the EDPB and EDPS note that the Proposal must establish stronger safeguards for highly sensitive health and genetic data, particularly in the context of clinical trials, given the heightened level of protection this type of personal data requires. The authorities set out several key recommendations to address these risks. These include:
While the EDPB and EDPS are both broadly supportive of the Proposal’s objectives, the Opinion underlines that EU-wide harmonisation must be supported by robust and clearly defined data protection measures. We will continue to monitor developments and provide updates as the Proposal progresses.
On 3 March 2026, the European Commission published its first draft guidance ("Guidance") to help businesses and regulators interpret the Cyber Resilience Act ("CRA"). As we reported in our January 2026 update, reporting obligations under the CRA will apply from 11 September 2026, and the CRA will be fully applicable from 11 December 2027.
The CRA aims to ensure "products with digital elements" are secure throughout their lifecycle by imposing mandatory cybersecurity requirements on manufacturers, importers, and distributors when making them available in the EU. The Guidance is non-binding and addresses several central aspects of the CRA with the aim of supporting compliance. It clarifies the rationale of certain key provisions and provides practical examples to illustrate how they could be implemented in practice.
Key takeaways include the following:
The Guidance also addresses practical implementation issues, such as the classification of products (including the distinction between important and critical products), cybersecurity risk assessment and due diligence, remote data processing, and reporting and incident notification obligations.
Currently, the Guidance is still in draft form and is open for stakeholder feedback until 31 March 2026. Whilst further revisions are expected before its final adoption, this draft resource provides valuable support for companies supplying digital products to the EU as they prepare for CRA compliance.
Companies House has confirmed that a security flaw in its WebFiling platform allowed logged-in users to view and potentially change elements of another company’s private records without consent. The vulnerability was introduced during a system update in October 2025 and remained active for around five months until its discovery on 13 March 2026. Once identified, the service was promptly taken offline and later restored following independent testing.
The flaw enabled access to company dashboards without the required authentication code. As a result, personal data not visible on the public register (including directors’ dates of birth, residential addresses and company email addresses) may have been exposed. It may also have been possible to submit unauthorised filings, including director changes or accounts, on behalf of another company.
Companies House has reported no evidence so far of data extraction at scale, noting that exploitation required an authenticated session and could affect only one record at a time. Companies House also confirmed that information such as passwords, passport information and existing filed documents were not compromised. Nonetheless, the scale of potential impact is significant, as Companies House contains data on more than five million companies and many more individual appointments.
The incident has been reported to the UK Information Commissioner’s Office ("ICO") and the National Cyber Security Centre. Companies House has contacted all registered companies with guidance on checking their records and has issued an apology to all stakeholders affected. All UK organisations are being urged to review their filings and raise any concerns with Companies House via email.
On 25 March 2026 the ICO and Ofcom published a joint statement on age assurance, marking a significant step in their collaborative efforts to protect children from online harm. We summarise key takeaways from the statement here.
In a notable judgment on 19 March 2026, the Court of Justice of the European Union ("CJEU") in Case C-526/24 (Brillen Rottler GmbH & Co. KG v TC) confirmed that even a first data subject access request ("DSAR") can be refused as "excessive" under Article 12(5) GDPR where it is pursued with an abusive intention.
The case concerned a family-run German optician, Brillen Rottler, which held only a small amount of personal data about the individual concerned. The individual voluntarily subscribed to its newsletter via an online form and, just 13 days later, submitted a DSAR. Brillen Rottler replied to the individual within the one-month period but refused to act, treating the DSAR as abusive after identifying public reports that the individual concerned systematically subscribed to newsletters, issued DSARs, and then pursued GDPR compensation claims. The individual maintained his request and added a claim for €1,000 in non-material damages for alleged infringement of his right of access arising from that refusal.
The CJEU held that the purpose of a DSAR is to enable individuals to be aware of, and verify, the lawfulness of processing, not to "artificially create the conditions" for compensation. Controllers may therefore refuse a DSAR for being excessive – including a first one – where they can unequivocally demonstrate such abusive intent. Relevant factors include the voluntary provision of data, the short lapse of time between provision and request, the data subject's overall conduct, and publicly available information evidencing a pattern of DSAR-driven claims.
While the judgment interprets the EU GDPR (not the UK GDPR), it is likely to be influential in the UK. The judgment does not establish a broad right to refuse DSARs – controllers must still assess requests case by case and carefully justify any refusal – but it arguably strengthens the ability of controllers to refuse a DSAR where they can demonstrate that the data subject has artificially created the conditions necessary for obtaining compensation. It also broadens the scope for reliance on the "excessive" ground for refusal in scenarios that extend beyond the "repetitive character" of such requests.
The ICO has issued its first monetary penalty explicitly grounded in the UK GDPR principles of data minimisation and privacy by design and default, fining Police Scotland £66,000. The case arose after an individual reported an alleged crime and the police force extracted the entire contents of the individual’s mobile phone, including highly sensitive information, despite only a small portion being relevant to the investigation. That full dataset was then disclosed to a third party during a misconduct investigation, and Police Scotland failed to report this incident to the ICO within the statutory timeframe.
This enforcement action by the ICO illustrates several concrete examples of what organisations should avoid in connection with the minimisation and data protection by design principles: extracting all data from a device when only limited information is needed; sharing full, unfiltered datasets with third parties instead of disclosing only what is strictly relevant; and failing to embed technical and organisational controls that restrict access and disclosure by default. The ICO’s decision makes clear that privacy by design is not optional: systems and processes must be configured so that excessive collection and disclosure simply cannot happen as a matter of course.
The ICO is clearly signalling that a lack of accountability, such as not having clear rules, safeguards and training around data extraction and sharing, will be treated as a serious matter where it leads to a breach with a significant impact on an individual. Organisations should consider analysing their processing activities to identify which activities require specific policies, processes and training to promote UK GDPR compliance.
The ICO has published its draft Data Protection Enforcement Procedural Guidance. To read more about the draft guidance and the ICO's new powers introduced by the Data (Use and Access) Act 2025, please see our article here.