The EU AI Act has extraterritorial reach. If your AI system is deployed in the EU, you're in scope — regardless of where you're based. Here's what to do.
When the EU AI Act came into force in August 2024, most coverage focused on its implications for EU companies. What received less attention was its extraterritorial scope — which means that US and UK companies whose AI systems are used within the EU are subject to the regulation, regardless of where those companies are headquartered.
This guide explains what the EU AI Act requires, which provisions apply to non-EU companies, and what the practical compliance steps are.
The EU AI Act classifies AI systems into four risk tiers:
Unacceptable risk. A small category of AI applications is prohibited entirely: social scoring by governments, real-time biometric surveillance in public spaces (with narrow exceptions), AI that exploits vulnerabilities of specific groups, and AI that manipulates human behaviour subliminally.
High risk. This is where most enterprise AI falls. High-risk systems include: AI used in critical infrastructure, education (including assessment tools), employment decisions (CV screening, performance monitoring), essential services (credit scoring, insurance risk), law enforcement, migration management, and administration of justice.
High-risk AI systems require: a conformity assessment before deployment, technical documentation, a risk management system, human oversight mechanisms, logging and audit trails, and registration in the EU AI database.
Limited risk. AI systems that interact with humans — chatbots, emotion recognition, deep fakes — must inform users they're interacting with AI. This applies to customer service chatbots, AI-generated content, and similar systems.
Minimal risk. AI systems like spam filters, inventory management, and most recommendation engines fall into this category, with no mandatory requirements beyond the general principles.
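The four tiers above can be expressed as a simple triage helper for an AI inventory. This is an illustrative sketch only: the keyword sets below are assumptions standing in for the Act's actual prohibited-practice and Annex III use-case definitions, and real classification requires assessing each system against the legal text, not string matching.

```python
from enum import Enum

class RiskTier(Enum):
    UNACCEPTABLE = "prohibited"
    HIGH = "high-risk"
    LIMITED = "limited-risk (transparency duties)"
    MINIMAL = "minimal-risk"

# Hypothetical keyword maps for illustration; the authoritative lists
# live in the Act itself (prohibited practices and Annex III).
PROHIBITED_USES = {"social scoring", "subliminal manipulation"}
HIGH_RISK_USES = {"credit scoring", "cv screening", "critical infrastructure",
                  "education assessment", "insurance risk"}
LIMITED_RISK_USES = {"chatbot", "deepfake", "emotion recognition"}

def classify(use_case: str) -> RiskTier:
    """Rough first-pass triage of a use case into an EU AI Act risk tier."""
    u = use_case.lower().strip()
    if u in PROHIBITED_USES:
        return RiskTier.UNACCEPTABLE
    if u in HIGH_RISK_USES:
        return RiskTier.HIGH
    if u in LIMITED_RISK_USES:
        return RiskTier.LIMITED
    return RiskTier.MINIMAL

print(classify("credit scoring").value)  # high-risk
```

A helper like this is only useful for flagging systems that need a proper legal assessment; anything landing in the high-risk or prohibited buckets should go straight to counsel.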
The EU AI Act applies to: providers of AI systems placed on the EU market or put into service in the EU, deployers of AI systems established or located within the EU, and providers or deployers of AI systems in a third country where the output of the system is used in the EU.
In practice: if you're a US SaaS company whose product uses AI and you sell to EU customers, you are in scope. If you're a UK company whose AI system processes data about EU residents, you are in scope. There are no small-company exemptions for high-risk systems.
The Act is being applied in phases. Provisions on prohibited AI practices applied from February 2025. Governance and general-purpose AI model provisions apply from August 2025. High-risk AI system requirements apply from August 2026. By August 2027, all provisions are fully in force.
For UK and US companies with EU market exposure, August 2026 is the critical deadline — and compliance projects for high-risk AI systems typically take 6–12 months.
Step 1: AI inventory — catalogue all AI systems your organisation uses or provides, including third-party AI embedded in your products.
Step 2: Risk classification — classify each system against EU AI Act risk tiers.
Step 3: Conformity assessment — for high-risk systems, commission a conformity assessment and technical documentation.
Step 4: Governance framework — build the risk management system, human oversight mechanisms, and logging requirements.
Step 5: EU representative — if you're a non-EU company, you may need to appoint an EU-based authorised representative.
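The inventory from Steps 1–2 can be tracked per system with a small record structure that shows which of the later steps are still outstanding. A minimal sketch with hypothetical field names (`conformity_assessed`, `eu_registered`, and so on) — this is not any official schema, just one way to keep compliance status visible.

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    name: str
    vendor: str                    # "internal" or a third-party supplier
    risk_tier: str                 # result of Step 2 classification
    conformity_assessed: bool = False   # Step 3 (high-risk systems only)
    governance_in_place: bool = False   # Step 4
    eu_registered: bool = False         # entry in the EU AI database

    def outstanding_steps(self) -> list[str]:
        """Remaining compliance actions for a high-risk system."""
        if self.risk_tier != "high":
            return []
        steps = []
        if not self.conformity_assessed:
            steps.append("conformity assessment + technical documentation")
        if not self.governance_in_place:
            steps.append("risk management, human oversight, logging")
        if not self.eu_registered:
            steps.append("registration in EU AI database")
        return steps

inventory = [
    AISystemRecord("CV screening tool", "internal", "high"),
    AISystemRecord("Support chatbot", "ExampleVendor", "limited"),
]
for rec in inventory:
    print(rec.name, "->", rec.outstanding_steps())
```

Keeping third-party systems in the same inventory matters: embedded vendor AI can put you in scope as a deployer even when you didn't build the model.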
Chabil Consulting provides AI strategy and governance advisory for UK and US companies navigating the EU AI Act. Our 8-week AI Governance programme produces a compliant framework for your high-risk AI systems. Contact us at hello@chabilconsulting.com.