Boards are approving AI deployments without the governance frameworks to manage them. Here are the five questions every board should be able to answer.
There is a pattern emerging in boardrooms across the UK, Europe, and the US: leadership is excited about AI's potential, the technology team is deploying AI tools at pace, and the board has no visibility into what's being deployed, what data it's processing, or what happens when something goes wrong.
This isn't just a regulatory risk — though the EU AI Act, the UK's AI framework, and the US executive orders on AI create genuine compliance obligations. It's a governance failure that exposes the organisation to reputational, liability, and strategic risk.
AI systems make or influence decisions at scale. A flawed AI model used in HR screening can discriminate unlawfully against thousands of candidates before anyone notices. A customer service AI that generates incorrect information creates liability. A fraud detection algorithm that's biased by training data creates regulatory exposure. The board, as the body ultimately accountable for risk management, cannot delegate all responsibility for these outcomes to the technology team.
The EU AI Act makes this explicit: the governance and oversight requirements for high-risk AI systems must be implemented at a management level with appropriate authority and accountability.
The first question: what AI systems are we actually using? This includes officially approved tools and shadow IT. The answer frequently surprises boards: employees are using ChatGPT, Copilot, Gemini, and dozens of other AI tools — some processing sensitive company and customer data — that IT and legal have never reviewed. An AI inventory is the starting point for any governance programme.
The second question: what data do these systems process? AI systems that process personal data are subject to data protection law in addition to AI regulation. If your HR AI processes employee data, your customer service AI processes customer interactions, or your fraud detection system processes financial behaviour data, you need to map the data flows and assess the legal basis for processing.
The third question: who is accountable when an AI system causes harm? The board needs to know, by name, who owns that responsibility. This means a named accountable person (often the CISO, CTO, or a designated AI Officer), a clear escalation path, and an incident response procedure.
The fourth question: are we exposed under the EU AI Act? If you have high-risk AI systems used by EU customers or employees, you need conformity assessments, technical documentation, and registration in the EU database for high-risk AI systems. These are not optional: non-compliance carries fines of up to EUR 15 million or 3% of global annual turnover, rising to EUR 35 million or 7% for prohibited AI practices.
The fifth question: do we have a policy for generative AI use? Most organisations have employees using generative AI tools informally. Without a policy, you have no visibility into what data is being input, what outputs are being used, or whether those outputs are creating liability. An acceptable use policy is the minimum baseline.
A functional AI governance framework for board purposes has four components: an AI register (what systems exist and what they do), a risk classification framework (aligned to EU AI Act tiers), accountable ownership at senior level, and a review cadence (at least annually, or whenever significant new AI systems are deployed).
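As a rough sketch, the first two components — the AI register and the risk classification — can be modelled as structured records with a risk tier, a named owner, and a review date. The field names and example systems below are illustrative assumptions, not a format prescribed by the EU AI Act:

```python
from dataclasses import dataclass
from enum import Enum

# Simplified risk tiers broadly aligned to the EU AI Act's categories.
class RiskTier(Enum):
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

@dataclass
class AISystem:
    name: str                    # what the system is
    purpose: str                 # what it does
    owner: str                   # named accountable person
    risk_tier: RiskTier          # classification under the framework
    processes_personal_data: bool
    last_reviewed: str           # ISO date of last governance review

# A minimal AI register is simply a list of such entries
# (hypothetical systems for illustration).
register = [
    AISystem("CV screening model", "HR candidate triage", "CTO",
             RiskTier.HIGH, True, "2024-11-01"),
    AISystem("Support chatbot", "Customer service answers", "CISO",
             RiskTier.LIMITED, True, "2024-09-15"),
]

# Board review: flag high-risk systems that also touch personal data.
flagged = [s.name for s in register
           if s.risk_tier is RiskTier.HIGH and s.processes_personal_data]
print(flagged)  # prints ['CV screening model']
```

Even a spreadsheet with these columns satisfies the same purpose; the point is that every deployed system has an entry, a tier, and an owner before it reaches production.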
Chabil Consulting builds AI governance frameworks for organisations at board and management level. Our 8-week programme produces all four components plus a CPRA and GDPR-aligned data processing assessment for existing AI systems. Contact us at hello@chabilconsulting.com