The upcoming EU Artificial Intelligence Act ("AI Act", "Act") imposes varying degrees of obligations on different operators in the AI value chain. One of the lesser-known operators is the authorised representative, which must be appointed by non-EU based providers of high-risk AI systems or general-purpose AI models.
The authorised representative acts as a crucial intermediary between the provider and regulatory authorities, ensuring that all necessary documentation is prepared, compliance is maintained, and cooperation with authorities is facilitated. The authorised representative role is particularly relevant for UK providers (since Brexit) and others based outside of the EU, who may need to appoint an authorised representative to comply with the AI Act.
In this insight we explore the role, obligations and appointment of authorised representatives, and the commercial considerations when documenting your appointment in a written mandate.
An authorised representative is defined under Article 3(5) of the AI Act as a natural or legal person located or established in the EU who has received and accepted a written mandate from a provider of a high-risk AI system or a general-purpose AI model to perform and carry out on the provider's behalf the obligations and procedures established by the AI Act.
Recital 82 of the Act sets out the rationale for appointing an authorised representative: to enable enforcement of the Act, to create a level playing field for operators and, taking into account the different forms in which digital products are made available, to ensure that in all circumstances a person established in the EU can provide authorities with all the necessary information on the compliance of an AI system. The recital further states that the authorised representative plays a pivotal role in ensuring the compliance of high-risk AI systems placed on the market or put into service in the EU by providers that are not established in the EU, and in serving as their contact person established in the EU.
Prior to making a high-risk AI system or general-purpose AI model available on the EU market, providers established outside of the EU must, by written mandate, appoint an authorised representative (under Article 22 for high-risk AI systems and Article 54 for general-purpose AI models). The authorised representative may be an individual or a corporate entity, and must be established in the EU. The provider must enable its authorised representative to perform the tasks specified in the mandate.
The identity, address and contact details of a high-risk AI system provider's authorised representative must be included in the instructions for use (Article 13), the EU declaration of conformity (Article 47) and for registration (Article 49).
Considering the role at a high level, an authorised representative acts as a key point of contact within the EU, is responsible for verifying and keeping all mandatory documentation that demonstrates the compliance of the high-risk AI system or general-purpose AI model with the AI Act, receives reports of serious incidents relating to AI systems and models, and must cooperate with the EU authorities. There are also additional actions that the authorised representative may need to take on behalf of the provider, such as registering itself and the AI system in the EU database under Article 49.
While these may appear to be documentation and “mailbox” requirements, the authorised representative has strict responsibilities to step into the position of the provider to ensure its compliance with the AI Act. The authorised representative must also terminate its mandate with a provider if it considers or has reason to consider the provider to be acting contrary to its obligations pursuant to the AI Act. In such a case, it must immediately inform the relevant market surveillance authority, as well as, where applicable, the relevant notified body, about the termination of the mandate and the reasons.
Under Article 22(3), the authorised representative must perform the tasks specified in the mandate received from the provider of a high-risk AI system and provide a copy of the mandate to the market surveillance authority upon request. The mandate must empower the authorised representative to carry out the following tasks:
Similarly, under Article 54(3), the authorised representative must perform the tasks specified in the mandate received from the provider of a general-purpose AI model and provide a copy of the mandate to the market surveillance authority upon request. Note that these obligations do not apply to general-purpose AI models released under a free and open-source licence unless the general-purpose AI model presents systemic risk.
The mandate must empower the authorised representative to carry out the following tasks:
An appointment letter setting out the mandate would be sufficient under the requirements of the Act. However, we would advise that the parties put in place a separate services agreement to clarify responsibilities, liabilities, compensation, termination rights and other commercial terms in more detail, particularly if appointing an external individual or firm.
Outside of the normal contractual terms to be considered in any engagement of this nature, key issues and topics to address include:
Authorised representatives will play a pivotal role in bridging the gap between high-risk AI system and general-purpose AI model providers outside the EU and the EU regulatory environment, ensuring that the relevant AI system or model is introduced into the EU market in a compliant and responsible manner. The complexity of this role is significant, and it is clear that the authorised representative must be well-versed in the AI Act, other EU law, and the specific technicalities of the AI systems and models they represent.
It is therefore of critical importance, to the provider, the authorised representative, and those that the AI Act is designed to protect, that the role and obligations between the provider and the authorised representative are properly considered and documented in detail.
Appointing an authorised representative should not be viewed as an innocuous exercise to fulfil a bureaucratic requirement.