EU AI Act HR Compliance Matrix & Checklist - Interactive Tool
Posted by: Pesync Team
| Category | Action Item | HR Specifics & Examples |
|---|---|---|
| 1. Strategy | Annex III Classification | Inventory all HR tech. Systems used for recruitment, promotion, or termination decisions fall under Annex III's "high-risk" category. Verify whether your vendors comply with Articles 8-15. |
| 2. Governance | AI Literacy Training | Under Article 4, training is mandatory. Recruiters must understand "automation bias"—the tendency to trust AI scores over human judgment—and how to spot biased outputs. |
| 3. Transparency | Candidate Rights Notice | Inform candidates if AI is evaluating them. Article 86 gives affected persons a legal right to a clear and meaningful explanation of the AI system's role in any decision that produces legal effects or significantly impacts their health, safety, or fundamental rights. |
| 4. Impact | Fundamental Rights Assessment | Article 27 requires a FRIA before deployment. You must document how the tool impacts non-discrimination, worker privacy, and data protection (often paired with a DPIA). |
| 5. Oversight | Human-in-the-Loop (HITL) | Per Article 14, high-risk HR systems must not auto-reject candidates. A human must have the final say and the clear authority to override the AI's recommendation without fear of reprisal. |
| 6. Records | Logging & Retention | Article 12 requires automatic logging capability to be built into high-risk systems. Article 19 requires deployers to keep those logs for at least 6 months. Technical documentation for the system must be kept for 10 years under Article 18. |
| 7. Incidents | 15-Day Reporting | If a system malfunctions or shows systemic bias, you must report the serious incident to your national authority within 15 days of discovery, as per Article 73. |
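The inventory step in row 1 can be automated once each HR tool's declared uses are on record. The sketch below is a simplified illustration, not legal advice: the category names and the `classify` helper are hypothetical, and a real assessment would map each system against the full text of Annex III.

```python
# Hypothetical use categories for illustration; Annex III point 4 covers
# employment, workers management, and access to self-employment.
HIGH_RISK_HR_USES = {"recruitment", "promotion", "termination",
                     "task_allocation", "monitoring"}

def classify(system: dict) -> str:
    """Flag an inventoried HR tool as high-risk if any of its declared
    uses falls under the Annex III employment category (simplified)."""
    uses = set(system.get("uses", []))
    return "high-risk" if uses & HIGH_RISK_HR_USES else "review-needed"

inventory = [
    {"name": "ATS screening module", "uses": ["recruitment"]},
    {"name": "Payroll engine", "uses": ["payroll"]},
]
```

Here the payroll engine still comes back as "review-needed" rather than "compliant": a flag short of high-risk should trigger a human review, not an automatic pass.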
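The Article 14 oversight requirement in row 5 can be enforced in software by making the AI output advisory only and recording the human decision as the one of record. A minimal sketch, assuming hypothetical record types (`AIRecommendation`, `FinalDecision`) rather than any specific vendor API:

```python
from dataclasses import dataclass

@dataclass
class AIRecommendation:
    candidate_id: str
    score: float
    suggested_action: str  # e.g. "reject" or "advance"

@dataclass
class FinalDecision:
    candidate_id: str
    action: str
    reviewer_id: str
    overrode_ai: bool

def finalize(rec: AIRecommendation, reviewer_id: str,
             human_action: str) -> FinalDecision:
    # The human reviewer's action is always the final decision; any
    # divergence from the AI suggestion is flagged for the audit trail.
    return FinalDecision(
        candidate_id=rec.candidate_id,
        action=human_action,
        reviewer_id=reviewer_id,
        overrode_ai=(human_action != rec.suggested_action),
    )
```

Recording `overrode_ai` explicitly also gives you data on automation bias: a reviewer who never overrides the model may be rubber-stamping it.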
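Row 6's logging duties translate into two mechanisms: automatic, timestamped event records (Article 12) and a retention floor of at least six months for deployers (Article 19). A minimal sketch under those assumptions; the field names and the 183-day constant are illustrative choices, not prescribed by the Act:

```python
from datetime import datetime, timedelta, timezone

MIN_RETENTION = timedelta(days=183)  # "at least six months" floor (Art. 19)

def log_event(log: list, system_id: str, event: str, detail: dict) -> None:
    # Append a timestamped, structured record (Art. 12 automatic logging).
    log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "system_id": system_id,
        "event": event,
        "detail": detail,
    })

def purge_expired(log: list, now: datetime) -> list:
    # Keep every record younger than the retention floor; real policies
    # may retain logs longer than six months, never shorter.
    return [r for r in log
            if now - datetime.fromisoformat(r["ts"]) < MIN_RETENTION]
```

Note the asymmetry: the retention period is a minimum, so a purge routine like this defines the earliest point at which deletion is permitted, not a deadline by which logs must go.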
© 2026 Pesync LLC. All rights reserved.