Article 27: Fundamental Rights Impact Assessment
Article 27 requires certain deployers of high-risk AI systems listed in Annex III to conduct a Fundamental Rights Impact Assessment (FRIA) before first use. It applies to bodies governed by public law, private operators providing public services, and deployers of certain credit-scoring and life/health insurance pricing systems. The FRIA must describe the deployment processes, the persons and groups affected, the risks to fundamental rights, and the mitigation measures. The deployer must notify the market surveillance authority of the results. Where a DPIA under GDPR Article 35 already covers some of the required elements, the FRIA complements that DPIA rather than duplicating it. Always verify the current text on EUR-Lex.
Who does this apply to?
- Bodies governed by public law
- Private operators providing public services
- Private deployers of high-risk Annex III systems under Article 6(1) used for credit scoring or for life and health insurance risk assessment and pricing
Scenarios
A municipality deploys high-risk AI for welfare eligibility scoring affecting large populations.
A private clinic deploys Annex III medical device software already covered by an exhaustive GDPR Article 35 DPIA that mirrors FRIA elements.
When a FRIA is required
Article 27 applies to deployers of high-risk AI systems under Article 6(1) that are listed in Annex III when the deployer is a body governed by public law or a private entity providing public services, or when the system is used for creditworthiness assessment or for risk assessment and pricing in life and health insurance, subject to the exceptions and conditions set out in Article 27(1).
Always read the exact legal conditions in the consolidated text; this guide is not a substitute for legal advice.
What the FRIA must contain
At minimum, the assessment should describe:
- the processes in which the system will be used;
- the period and frequency of intended use;
- the categories of natural persons and groups likely to be affected;
- the specific risks of harm to fundamental rights;
- the human oversight measures in place; and
- the measures to be taken if those risks materialise, including governance and complaint arrangements.
Cross-reference GDPR rights and sector-specific safeguards.
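For internal record-keeping, that minimum content could be captured in a simple structure like the sketch below. The field names are our own illustration, not the official template the AI Office is to provide.

```python
from dataclasses import dataclass

@dataclass
class FriaRecord:
    """Minimum FRIA content per Article 27(1); field names are illustrative."""
    deployment_processes: list[str]   # processes in which the system is used
    period_and_frequency: str         # intended period and frequency of use
    affected_groups: list[str]        # categories of persons/groups likely affected
    rights_at_risk: list[str]         # specific risks of harm to fundamental rights
    oversight_measures: list[str]     # human oversight arrangements
    mitigation_measures: list[str]    # measures if risks materialise, incl. complaints

record = FriaRecord(
    deployment_processes=["welfare eligibility scoring"],
    period_and_frequency="continuous, re-scored monthly",
    affected_groups=["benefit applicants"],
    rights_at_risk=["non-discrimination", "good administration"],
    oversight_measures=["caseworker review of negative decisions"],
    mitigation_measures=["appeal channel", "quarterly bias audit"],
)
print(len(record.rights_at_risk))  # 2
```

Keeping the record as structured data makes it easy to diff FRIA updates against model or policy changes, as the checklist below suggests.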
Authority notification and consultation
Deployers must notify the market surveillance authority of the results of the assessment when first use begins, submitting the completed template (Article 27(3)). The FRIA must be kept up to date where any of its elements change during use; verify any consultation or comment procedures against the consolidated text, as national practice may differ.
Compliance checklist
- Gate new Annex III deployments with a FRIA/DPIA decision tree.
- Engage impacted stakeholders and document their input.
- Map each fundamental right risk to concrete controls and owners.
- File notifications in the format your Member State requires.
- Archive FRIA updates alongside model or policy changes.
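The third checklist item, mapping each risk to a concrete control and an owner, can be kept honest with a tiny completeness check. The register entries and field names below are hypothetical, not a mandated format.

```python
# Hypothetical risk register; the field names are our own convention.
risk_register = [
    {"right": "non-discrimination",
     "risk": "proxy bias in eligibility scores",
     "control": "quarterly disparate-impact testing",
     "owner": "model-risk team"},
    {"right": "privacy",
     "risk": "over-collection of applicant data",
     "control": "data-minimisation review",
     "owner": ""},  # missing owner -> flagged below
]

def unmapped(register):
    """Return risks lacking a concrete control or a named owner."""
    return [e["risk"] for e in register if not (e.get("control") and e.get("owner"))]

print(unmapped(risk_register))  # ['over-collection of applicant data']
```

Running a check like this before filing the notification surfaces any risk that was identified but never assigned a mitigation or an accountable owner.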
Related annexes
- Annex III — High-risk use cases
Frequently asked questions
Is FRIA the same as a DPIA?
They overlap but are not identical. Article 27(4) allows reliance on an existing DPIA only to the extent it already fulfils the FRIA elements; otherwise you must add the missing material.
Which deployers must perform a fundamental rights impact assessment?
Article 27 requires FRIAs from deployers that are bodies governed by public law, private entities providing public services, and specific private deployers using high-risk AI for creditworthiness assessment (credit scoring) or risk assessment/pricing in life and health insurance.
How does the FRIA differ from a DPIA under the GDPR?
The FRIA under Article 27 focuses specifically on the impact of the high-risk AI system on fundamental rights (non-discrimination, privacy, freedom of expression, human dignity), whereas the DPIA under GDPR Article 35 assesses risks to personal data protection. Article 27(4) allows deployers to combine both assessments where the AI system processes personal data, but the FRIA scope is broader than data protection.