Chapter V — General-purpose AI models · Article 53

Article 53: Obligations for Providers of General-Purpose AI Models

In effect since 2 Aug 2025 · EUR-Lex verified Apr 2026

Article 53 is the operational backbone of Chapter V: it sets the baseline obligations that every provider of a GPAI model must meet when placing the model on the Union market—regardless of systemic-risk status. The four pillars: (1)(a) prepare technical documentation per Annex XI; (1)(b) provide downstream AI system providers with information and documentation to understand capabilities and limitations and comply with their own obligations; (1)(c) establish a policy to comply with Union copyright law, including the text and data mining opt-out regime under Directive (EU) 2019/790; (1)(d) make publicly available a sufficiently detailed summary of training content using a template provided by the AI Office. Article 53(2) provides a narrow exception for free and open-source GPAI models that are not classified as having systemic risk.

Who does this apply to?

  • Providers of GPAI models placing them on the Union market (open-weights or closed)
  • Providers of GPAI models whose output is used in the Union via downstream integrations
  • Engineering, legal, and compliance teams at foundation-model companies assembling Annex XI documentation and copyright policies
  • Authorised representatives acting under Article 54 on behalf of third-country GPAI providers

Scenarios

A foundation model provider publishes a model card covering architecture, parameters, training data provenance, compute resources, and energy—mapped to every Annex XI Section 1 heading.

Aligned with Article 53(1)(a) documentation duty.
Ref. Art. 53(1)(a)

A GPAI provider supplies downstream deployers with integration instructions, known limitations, safety-relevant failure modes, and an acceptable use policy.

Aligned with Article 53(1)(b) downstream information duty.
Ref. Art. 53(1)(b)

A model training pipeline scrapes web data without honouring robots.txt opt-out signals from content creators under Directive 2019/790.

Potential Article 53(1)(c) breach—providers must establish a copyright compliance policy respecting text and data mining opt-outs.
Ref. Art. 53(1)(c)

An open-source model is released under a permissive licence with no commercial restrictions and is not classified as systemic risk.

May benefit from the Article 53(2) narrow exception for free open-source models—but the copyright policy (Article 53(1)(c)) and training content summary (Article 53(1)(d)) obligations still apply.
Ref. Art. 53(2)

The four Article 53(1) pillars (in plain terms)

Every GPAI provider must:

Pillar (a) — Technical documentation (Annex XI): Prepare and maintain technical documentation covering model architecture, training process, data, compute, and energy—scaled to the model's size and risk profile. This is the private-facing dossier the AI Office can request.

Pillar (b) — Downstream information: Provide enough information and documentation to downstream AI system providers so they can understand the model's capabilities and limitations and comply with their own regulatory obligations (e.g. Annex IV for high-risk systems).

Pillar (c) — Copyright compliance policy: Establish and implement a policy to respect Union copyright law, in particular the text and data mining (TDM) opt-out regime under Article 4 of Directive (EU) 2019/790. Identify and honour rights reservations expressed by rightsholders, including through machine-readable means.
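In engineering terms, honouring rights reservations "through machine-readable means" implies a check in the data-ingestion pipeline before any URL enters a training corpus. The sketch below is a minimal illustration, assuming two common signals: a robots.txt Disallow rule and the `tdm-reservation` HTTP header from the W3C TDM Reservation Protocol draft. The exact set of signals a policy must honour is an open question to resolve against the authentic text and AI Office guidance, not something this sketch settles.

```python
# Hypothetical sketch: check machine-readable rights reservations before
# ingesting a URL into a training corpus. The signal set (robots.txt plus
# the TDMRep "tdm-reservation" header) is an assumption to verify against
# your own copyright policy.
import urllib.robotparser

def tdm_allowed(url: str, user_agent: str,
                robots_txt: str, response_headers: dict) -> bool:
    """Return True only if no machine-readable opt-out applies."""
    # 1. robots.txt: a Disallow rule covering our crawler counts as an opt-out.
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch(user_agent, url):
        return False
    # 2. TDMRep HTTP header: "1" means text-and-data-mining rights are reserved.
    headers = {k.lower(): v for k, v in response_headers.items()}
    if headers.get("tdm-reservation") == "1":
        return False
    return True

robots = "User-agent: *\nDisallow: /private/\n"
print(tdm_allowed("https://example.org/articles/1", "research-bot",
                  robots, {"TDM-Reservation": "1"}))  # opt-out via header
print(tdm_allowed("https://example.org/private/x", "research-bot",
                  robots, {}))                        # opt-out via robots.txt
print(tdm_allowed("https://example.org/articles/1", "research-bot",
                  robots, {}))                        # no reservation found
```

A real pipeline would also need to log each decision, since the Article 53(1)(c) policy must be demonstrable to the AI Office, not merely applied.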

Pillar (d) — Public training content summary: Make publicly available a sufficiently detailed summary of the content used for training, following a template provided by the AI Office. This is the public-facing transparency layer.

Article 53(2) — Open-source exception (narrow)

The obligations in Article 53(1)(a) and (b) (documentation and downstream info) do not apply to providers of free and open-source GPAI models—unless the model is classified as having systemic risk under Article 51.

Critical limits of the exception:

  • The copyright policy (1)(c) and training content summary (1)(d) still apply even for open-source models.
  • The exception requires the model to be released under a free and open-source licence allowing access, usage, modification, and distribution, with the parameters (including weights), model architecture information, and model usage information made publicly available.
  • The Act's definition of "free and open-source" for this purpose must be checked against the authentic text.
  • If the model later meets the systemic-risk criteria, the exception falls away and the full obligations activate.

How Article 53 connects to the rest of the Act

  • Annex XI — The checklist Article 53(1)(a) references: minimum content of technical documentation for GPAI models.
  • Annex XII — Transparency information that may overlap with the Article 53(1)(d) public training summary.
  • Article 51 — Systemic-risk classification criteria: if met, Article 55 duties layer on top of Article 53.
  • Article 52 — Procedure for classification (notification, rebuttal, publication).
  • Article 54 — Authorised representatives may discharge certain Article 53 duties on behalf of third-country providers.
  • Article 55 — Additional obligations for GPAI models with systemic risk (model evaluation, adversarial testing, incident reporting, cybersecurity).
  • Article 56 — Codes of practice as a compliance pathway.
  • Article 11 + Annex IV — High-risk system documentation (downstream providers rely on Article 53(1)(b) information to populate parts of their Annex IV file).
  • Directive (EU) 2019/790, Article 4 — The TDM opt-out regime that Article 53(1)(c) requires providers to respect.
  • Article 101 — Penalties specific to GPAI providers.
  • Article 113 — Application dates (Chapter V applies from 2 August 2025; Article 111(3) gives models placed on the market before that date until 2 August 2027 to comply).

Official wording: Article 53 (English)

1. Providers of general-purpose AI models shall:

(a) draw up and keep up-to-date the technical documentation of the model, including its training and testing process and the results of its evaluation, which shall contain, at a minimum, the information set out in Annex XI for the purpose of providing it, upon request, to the AI Office and the national competent authorities;
(b) draw up, keep up-to-date and make available information and documentation to providers of AI systems who intend to integrate the general-purpose AI model into their AI systems. Without prejudice to the need to observe and protect intellectual property rights and confidential business information or trade secrets in accordance with Union and national law, the information and documentation shall:
(i) enable providers of AI systems to have a good understanding of the capabilities and limitations of the general-purpose AI model and to comply with their obligations pursuant to this Regulation; and

(ii) contain, at a minimum, the elements set out in Annex XII;

(c) put in place a policy to comply with Union law on copyright and related rights, and in particular to identify and comply with, including through state-of-the-art technologies, a reservation of rights expressed pursuant to Article 4(3) of Directive (EU) 2019/790;
(d) draw up and make publicly available a sufficiently detailed summary about the content used for training of the general-purpose AI model, according to a template provided by the AI Office.
2. The obligations set out in paragraph 1, points (a) and (b), shall not apply to providers of AI models that are released under a free and open-source licence that allows for the access, usage, modification, and distribution of the model, and whose parameters, including the weights, the information on the model architecture, and the information on model usage, are made publicly available. This exception shall not apply to general-purpose AI models with systemic risks.
3. Providers of general-purpose AI models shall cooperate as necessary with the Commission and the national competent authorities in the exercise of their competences and powers pursuant to this Regulation.
4. Providers of general-purpose AI models may rely on codes of practice within the meaning of Article 56 to demonstrate compliance with the obligations set out in paragraph 1 of this Article, until a harmonised standard is published. Compliance with European harmonised standards grants providers the presumption of conformity to the extent that those standards cover those obligations. Providers of general-purpose AI models who do not adhere to an approved code of practice or do not comply with a European harmonised standard shall demonstrate alternative adequate means of compliance for assessment by the Commission.
5. For the purpose of facilitating compliance with Annex XI, in particular points 2 (d) and (e) thereof, the Commission is empowered to adopt delegated acts in accordance with Article 97 to detail measurement and calculation methodologies with a view to allowing for comparable and verifiable documentation.
6. The Commission is empowered to adopt delegated acts in accordance with Article 97(2) to amend Annexes XI and XII in light of evolving technological developments.
7. Any information or documentation obtained pursuant to this Article, including trade secrets, shall be treated in accordance with the confidentiality obligations set out in Article 78.

Recitals (preamble) on EUR-Lex

The recitals in the same consolidated AI Act on EUR-Lex contextualise GPAI provider duties, copyright and TDM, the open-source exception rationale, and downstream information sharing. Use the official preamble on EUR-Lex; do not rely on unofficial recital lists without checking sequence and wording against the authentic text.

Compliance checklist

  • Prepare Annex XI Section 1 documentation (architecture, parameters, data provenance, compute, energy) for every model release on the Union market.
  • Establish a downstream information package per Article 53(1)(b) / Annex XII for AI system integrators.
  • Implement a copyright compliance policy honouring TDM opt-outs under Directive 2019/790 Article 4(3), including machine-readable opt-out detection.
  • Publish a training content summary using the AI Office template (once available).
  • If open-source: verify you meet the exception conditions and remember that (1)(c) copyright policy and (1)(d) training summary still apply.
  • If designated systemic risk: add Article 55 obligations and Annex XI Section 2 documentation on top of Article 53 baseline.
  • Version-control documentation with each model release or significant update.
  • Track AI Office implementing acts, templates, and codes of practice that operationalise Article 53.
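The version-control item above can be enforced mechanically: gate each model release on a completeness check of its documentation manifest. The sketch below is a minimal illustration; the field names are hypothetical placeholders, not the official Annex XI schema, which must be taken from the Annex itself and any AI Office templates.

```python
# Hypothetical release gate: fail the build if the Annex XI-style
# documentation manifest is missing required fields. The field names
# below are illustrative placeholders, not the official Annex XI schema.
REQUIRED_FIELDS = {
    "model_architecture",
    "parameter_count",
    "training_data_provenance",
    "compute_used",
    "energy_consumption",
    "licence",
    "training_content_summary_url",
}

def missing_fields(manifest: dict) -> set:
    """Return required fields that are absent or empty in the manifest."""
    return {f for f in REQUIRED_FIELDS
            if not str(manifest.get(f, "")).strip()}

manifest = {
    "model_architecture": "decoder-only transformer",
    "parameter_count": "7B",
    "compute_used": "1.2e23 FLOP",
}
gaps = missing_fields(manifest)
if gaps:
    print("Documentation incomplete:", sorted(gaps))
```

Wiring a check like this into CI makes the "version-control documentation with each release" item self-enforcing: an incomplete dossier blocks the release rather than surfacing later in an AI Office request.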



Related annexes

  • Annex XI — GPAI technical documentation (Article 53(1)(a) reference)
  • Annex XII — Transparency information for downstream providers (Article 53(1)(b) reference)
  • Annex XIII — Criteria for designating GPAI models with systemic risk (Article 51 reference)

Frequently asked questions

Do open-source models avoid all Article 53 duties?

No. The Article 53(2) exception only removes the documentation (1)(a) and downstream info (1)(b) duties for free open-source non-systemic-risk models. The copyright policy (1)(c) and training summary (1)(d) still apply.

What is the 'AI Office template' for the training summary?

The AI Office is mandated to provide a template for the publicly available training content summary. Track AI Office publications for the latest version and format requirements.

Does fine-tuning a base model trigger Article 53?

Fine-tuning can create a new provider relationship. If you substantially modify a GPAI model and place it on the market as your own, you may be treated as a GPAI provider with Article 53 obligations. Assess your specific facts against the definitions.

How does Article 53 relate to Article 11 (high-risk documentation)?

Article 53 covers GPAI model documentation (Annex XI). Article 11 covers high-risk AI system documentation (Annex IV). They operate at different layers—upstream model vs downstream system. The downstream provider uses Article 53(1)(b) information from the GPAI provider to help populate their Annex IV file.

What happens if I cannot share trade secrets with downstream providers?

Article 53(1)(b) explicitly states that information sharing is 'without prejudice to the need to observe and protect intellectual property rights and confidential business information or trade secrets'. You must share enough for compliance but can protect genuinely confidential material.