Article 2: Scope
Article 2 is the EU AI Act’s scope map: it lists who is covered (providers, deployers, importers, distributors, and more), territorial hooks for third countries, partial application for certain high-risk systems tied to Annex I products, and a series of exclusions (military, R&D, personal use, FOSS in specified cases, etc.). Pair it with Article 1 (subject matter) and Article 3 (definitions). Always verify paragraph-level wording on EUR-Lex.
Who does this apply to?
- Providers placing AI systems or GPAI models on the Union market or putting them into service—including third-country providers caught by Article 2(1)(a) or (c)
- Deployers established or located in the Union, and third-country deployers where AI output is used in the Union
- Importers, distributors, product manufacturers integrating AI, authorised representatives, and affected persons in the Union as specified in Article 2(1)
- Legal and compliance teams triaging military, R&D, FOSS, employment, and intermediary-service edge cases under Article 2(3)–(12)
Scenarios
A Singapore-based company hosts a general-purpose model API and contracts with customers in Italy; weights are served from outside the EU.
A Polish bank deploys a credit-scoring AI only for its Warsaw operations; the vendor is Canadian.
A U.S. image generator’s outputs are consumed only inside the Union by a marketing team for EU ad campaigns.
An AI component is embedded in a medical device already regulated under Union harmonisation legislation listed in Annex I, Section B.
What Article 2 does (in plain terms)
Article 2 answers who must comply and which situations fall in or out of the AI Act. Paragraph 1 is the core affirmative list—from Union-established deployers to third-country actors linked by placing on the market, putting into service, or use of output in the Union.
Paragraph 2 is a partial application rule for a subset of high-risk AI systems tied to Annex I, Section B products: only certain articles apply unless sector legislation has integrated the AI Act’s high-risk requirements.
Paragraphs 3–12 layer exclusions and carve-outs (national security / defence lines, R&D, pre-market testing with limits, personal non-professional use, FOSS conditions, labour-law autonomy, GDPR primacy for personal data, etc.). None of this replaces Article 3 role labels or operational chapters—Article 2 is the boundary provision you read immediately after Article 1.
Article 2(1): the coverage checklist
Use EUR-Lex for exact definitions of each actor; in practice, triage in this order:
- (a) Is someone providing an AI system or GPAI model to the Union market (including from abroad)?
- (b) Is a deployer established or physically in the Union?
- (c) Is output of a third-country AI system used in the Union?
- (d)–(f) Importers, distributors, product manufacturers placing AI on the market together with their product, and authorised representatives
- (g) Affected persons in the Union where the provision creates standing or procedural hooks
If none of the hooks fire and no exclusion applies, the AI Act may not govern the scenario—but document the reasoning and re-check when facts change.
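The triage order above can be sketched as a simple decision function. This is an illustrative model only, not legal advice: the field names and the string outcomes are hypothetical modelling choices, not terms from the Regulation, and any real assessment needs the exact EUR-Lex wording and counsel.

```python
# Illustrative Article 2 scoping triage (hypothetical model, not legal advice).
# Walks the Article 2(1) hooks in the order suggested above, then flags any
# claimed exclusions under Article 2(3)-(12) for verification.
from dataclasses import dataclass, field

@dataclass
class Scenario:
    provides_to_union_market: bool = False  # hook (a): provider placing/putting into service in the Union
    deployer_in_union: bool = False         # hook (b): deployer established or located in the Union
    output_used_in_union: bool = False      # hook (c): third-country actor, output used in the Union
    other_union_role: bool = False          # hooks (d)-(g): importer, distributor, manufacturer, rep, affected person
    exclusions_claimed: list = field(default_factory=list)  # e.g. ["2(3) military", "2(10) personal use"]

def triage(s: Scenario) -> str:
    hooks = [
        ("2(1)(a)", s.provides_to_union_market),
        ("2(1)(b)", s.deployer_in_union),
        ("2(1)(c)", s.output_used_in_union),
        ("2(1)(d)-(g)", s.other_union_role),
    ]
    fired = [name for name, hit in hooks if hit]
    if not fired:
        return "no hook fired: likely out of scope; document reasoning and re-check when facts change"
    if s.exclusions_claimed:
        return f"hooks {fired} fired, exclusions claimed {s.exclusions_claimed}: verify each on EUR-Lex"
    return f"in scope via {fired}: proceed to Article 3 role mapping"

# Example: Singapore-based GPAI provider contracting with Italian customers (hook (a) fires)
print(triage(Scenario(provides_to_union_market=True)))
```

Dating each run of such a triage, as the compliance checklist below suggests, matters more than the mechanism: the same facts can flip hooks on or off when licensing, hosting, or usage location changes.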
How Article 2 connects to the rest of the Act
- Article 1 — Subject matter: what the Act is for; Article 2 decides who sits inside the tent.
- Article 3 — Definitions: who counts as provider, deployer, importer, distributor, etc.
- Article 4 — AI literacy: once in scope, providers and deployers must build staff competence proportionately.
- Article 5 — Prohibited practices (still subject to scope).
- Article 6 + Annex III — High-risk classification (Article 2(2) narrows the AI Act slice for certain Annex I Section B intersections).
- Article 50 — Transparency obligations for certain AI systems (the FOSS carve-out in Article 2(12) explicitly references Article 5 or Article 50).
- Article 113 — When obligations become applicable.
For personal data, Article 2(7) preserves primacy of Regulation (EU) 2016/679 (GDPR), (EU) 2018/1725, Directive 2002/58/EC, and Directive (EU) 2016/680; align DPIA and AI conformity workstreams.
Practical checklist (scoping)
- Map every entity in the value chain to Article 2(1)(a)–(g) before arguing “we are not in scope.”
- Third-country + output scenarios: treat Article 2(1)(c) as a deliberate jurisdictional hook—validate use location and data flows.
- Defence / military lines: read Article 2(3) carefully; the national-security competence language is narrow and politically sensitive—obtain specialist legal advice for classified programmes.
- FOSS: Article 2(12) is not a blanket exemption—high-risk placement, Article 5, or Article 50 systems can still be caught.
- R&D vs market: Article 2(6) and 2(8) distinguish scientific R&D from pre-market research, testing, and development; testing in real-world conditions is explicitly excluded from the pre-market safe harbour.
- Staggered law: keep Article 113 beside Article 2 in every implementation memo.
Official wording (excerpt): Article 2 in full
Note: The following paragraphs reproduce Article 2 as commonly cited from the English consolidated text of Regulation (EU) 2024/1689 on EUR-Lex. Always re-open EUR-Lex for the definitive wording, numbering, and any later amendments before compliance decisions.
1. This Regulation applies to:
(a) providers placing on the market or putting into service AI systems or placing on the market general-purpose AI models in the Union, irrespective of whether those providers are established or located within the Union or in a third country;
(b) deployers of AI systems that have their place of establishment or are located within the Union;
(c) providers and deployers of AI systems that have their place of establishment or are located in a third country, where the output produced by the AI system is used in the Union;
(d) importers and distributors of AI systems;
(e) product manufacturers placing on the market or putting into service an AI system together with their product and under their own name or trademark;
(f) authorised representatives of providers, which are not established in the Union;
(g) affected persons that are located in the Union.
2. For AI systems classified as high-risk AI systems in accordance with Article 6(1) related to products covered by the Union harmonisation legislation listed in Section B of Annex I, only Article 6(1), Articles 102 to 109 and Article 112 apply. Article 57 applies only in so far as the requirements for high-risk AI systems under this Regulation have been integrated in that Union harmonisation legislation.
3. This Regulation does not apply to areas outside the scope of Union law, and shall not, in any event, affect the competences of the Member States concerning national security, regardless of the type of entity entrusted by the Member States with carrying out tasks in relation to those competences.
This Regulation does not apply to AI systems where and in so far as they are placed on the market, put into service, or used with or without modification exclusively for military, defence or national security purposes, regardless of the type of entity carrying out those activities.
This Regulation does not apply to AI systems which are not placed on the market or put into service in the Union, where the output is used in the Union exclusively for military, defence or national security purposes, regardless of the type of entity carrying out those activities.
4. This Regulation applies neither to public authorities in a third country nor to international organisations falling within the scope of this Regulation pursuant to paragraph 1, where those authorities or organisations use AI systems in the framework of international cooperation or agreements for law enforcement and judicial cooperation with the Union or with one or more Member States, provided that such a third country or international organisation provides adequate safeguards with respect to the protection of fundamental rights and freedoms of individuals.
5. This Regulation shall not affect the application of the provisions on the liability of providers of intermediary services as set out in Chapter II of Regulation (EU) 2022/2065.
6. This Regulation does not apply to AI systems or AI models, including their output, specifically developed and put into service for the sole purpose of scientific research and development.
7. Union law on the protection of personal data, privacy and the confidentiality of communications applies to personal data processed in connection with the rights and obligations laid down in this Regulation. This Regulation shall not affect Regulation (EU) 2016/679 or (EU) 2018/1725, or Directive 2002/58/EC or (EU) 2016/680, without prejudice to Article 10(5) and Article 59 of this Regulation.
8. This Regulation does not apply to any research, testing or development activity regarding AI systems or AI models prior to their being placed on the market or put into service. Such activities shall be conducted in accordance with applicable Union law. Testing in real world conditions shall not be covered by that exclusion.
9. This Regulation is without prejudice to the rules laid down by other Union legal acts related to consumer protection and product safety.
10. This Regulation does not apply to obligations of deployers who are natural persons using AI systems in the course of a purely personal non-professional activity.
11. This Regulation does not preclude the Union or Member States from maintaining or introducing laws, regulations or administrative provisions which are more favourable to workers in terms of protecting their rights in respect of the use of AI systems by employers, or from encouraging or allowing the application of collective agreements which are more favourable to workers.
12. This Regulation does not apply to AI systems released under free and open-source licences, unless they are placed on the market or put into service as high-risk AI systems or as an AI system that falls under Article 5 or 50.
Recitals (preamble) on EUR-Lex
The recitals in the same consolidated AI Act on EUR-Lex often illuminate why scope, territorial, and exclusion rules are framed as they are. Use the official preamble on EUR-Lex when interpreting Article 2—do not rely on unofficial recital lists without checking sequence and wording against the authentic text.
Compliance checklist
- Maintain a dated Article 2 decision tree (affirmative hooks in Article 2(1), then exclusions in Article 2(3)–(12)) for each product line.
- For Annex I Section B intersections, flag Article 2(2) partial application and track sector regulation integration.
- Where output is consumed cross-border, document where “use in the Union” occurs and which actor is provider vs deployer under Article 3.
- Align GDPR records (lawful basis, DPIA) with AI Act documentation—Article 2(7) preserves data-protection law.
- Re-run scope analysis when licensing model changes (FOSS vs commercial, on-prem vs API, military dual-use narratives).
Unsure whether your setup falls within EU AI Act scope? Start the free assessment.
Related Articles
Article 1: Subject matter
Article 3: Definitions
Article 4: AI literacy
Article 5: Prohibited AI Practices
Article 6: Classification Rules for High-Risk Systems
Article 50: Transparency Obligations for Providers and Deployers of Certain AI Systems
Article 51: Classification of GPAI Models with Systemic Risk
Article 113: Entry into Force and Application Dates
Related annexes
- Annex I — Union harmonisation legislation (Section B referenced in Article 2(2))
Frequently asked questions
Does Article 2 exempt all open-source AI?
No. Article 2(12) excludes AI systems released under free and open-source licences unless they are placed on the market or put into service as high-risk systems or as systems falling under Article 5 or 50. Read the paragraph as a conditional carve-out, not blanket permission.
We only do R&D—are we automatically out of scope?
Article 2(6) and 2(8) exclude certain research and pre-market development, but the carve-outs are specific (e.g. scientific R&D, activities before placing on the market). Testing in real-world conditions is explicitly not covered by the R&D exclusion—verify each activity on EUR-Lex.
Does Article 2 replace Article 3 definitions?
No. Article 2 sets scope; Article 3 defines actors and core concepts. Compliance memos should cite both: scope first, then role-specific duties in later chapters.