
DPIA for AI Projects: A Step-by-Step Guide for SMBs

April 15, 2026 · 7 min read · Pixel Management


A DPIA (Data Protection Impact Assessment) is a mandatory analysis when a processing activity creates high risk to data subjects. For AI projects that touch customer data, employee data, or health data, it's almost always required — and under the EU AI Act the bar is now even higher.

This step-by-step guide shows when a DPIA is required, what goes in it, and how to complete one in 2–4 weeks without stalling the project.

When is a DPIA required?

Under GDPR, a DPIA is required when processing creates "high risk" to the rights and freedoms of data subjects. The Dutch DPA publishes a list of processing operations for which this is automatically the case. For AI projects, you'll almost always meet at least two of these criteria:

  • Large-scale processing of personal data — for example, training or running an AI model on a customer database
  • Systematic monitoring — AI tools that analyze behavior or performance of people
  • Automated decision-making with legal effect — credit scoring, hiring selection, price differentiation
  • Processing of special categories — health, religion, criminal record, biometrics
  • Use of innovative technology — and AI explicitly falls under this

If two or more of those apply to your project, a DPIA isn't optional. When it's ambiguous, run one anyway — it's cheaper than a regulator's retrospective. For the broader context on AI regulation in the Netherlands, see our pillar on AI legislation in the Netherlands and the EU AI Act.
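The two-or-more decision rule above can be expressed as a tiny checklist script. This is an illustrative sketch, not legal advice; the criterion names are taken from the list above:

```python
# Illustrative sketch: count which high-risk criteria apply and flag
# whether a DPIA is required (two or more triggers the obligation).

HIGH_RISK_CRITERIA = [
    "large-scale processing of personal data",
    "systematic monitoring",
    "automated decision-making with legal effect",
    "processing of special categories",
    "use of innovative technology (AI)",
]

def dpia_required(applicable: set[str]) -> bool:
    """Return True when two or more of the listed criteria apply."""
    unknown = applicable - set(HIGH_RISK_CRITERIA)
    if unknown:
        raise ValueError(f"Unknown criteria: {unknown}")
    return len(applicable) >= 2

# Example: an AI hiring tool that screens applicants automatically.
project = {
    "use of innovative technology (AI)",
    "automated decision-making with legal effect",
}
print(dpia_required(project))  # True
```

In practice the value of such a checklist is less the boolean and more the paper trail: it forces you to write down which criteria you think apply and why.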

What must a DPIA contain?

Article 35(7) GDPR prescribes four mandatory components; the risk assessment is one of them, not a separate add-on. In practice, the risk assessment and mitigation sections are where most of the real work sits.

1. Systematic description of the processing
What exactly is processed, by which system, for what purpose? For AI this means: which data is used for training, which for inference, where it's stored, who has access, and how long it's retained.

2. Necessity and proportionality
Is this processing truly required to achieve the purpose? Can it be done in a less invasive way? The "data minimization" question: are you using more personal data than strictly necessary?

3. Risk assessment
What risks do data subjects face from this processing? Think discrimination, loss of autonomy, reputational harm, or being categorized incorrectly. AI-specific risks include bias, hallucinations, and wrong decisions based on outdated data.

4. Planned measures
What will you do to reduce those risks? Encryption, pseudonymization, human-in-the-loop controls, audit trails, an option for data subjects to challenge automated decisions.

This doesn't need to be an 80-page document. 15–25 pages is enough for most SMB projects, as long as the content is concrete and choices are well-reasoned.
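As a sketch, the four components map onto a simple document skeleton. The field names and sample entries below are hypothetical, not prescribed by the GDPR:

```python
# Hypothetical DPIA skeleton mirroring the four mandatory Article 35(7)
# components; field names and contents are illustrative examples only.

dpia = {
    "systematic_description": {
        "training_data": "customer support tickets",
        "inference_data": "live chat messages",
        "storage": "EU-based datacenter",
        "access": ["support team", "ML engineers"],
        "retention": "90 days",
    },
    "necessity_and_proportionality": (
        "Classification requires ticket text; names and email addresses "
        "are stripped before training (data minimization)."
    ),
    "risk_assessment": [
        {"risk": "bias against non-native speakers", "severity": "high"},
        {"risk": "hallucinated account details", "severity": "medium"},
    ],
    "planned_measures": [
        "pseudonymization before training",
        "human review of escalation decisions",
        "audit trail of all automated decisions",
    ],
}

# Quick completeness check before sending the DPIA to formal review:
assert all(dpia[component] for component in dpia), "fill in every component"
```

Even a structure this thin makes gaps visible: an empty `planned_measures` list is a lot harder to ignore than a vague paragraph.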

Seven steps to an efficient DPIA

A properly run DPIA for an AI project doesn't have to take months. This approach gets you through in 2–4 weeks.

Step 1: Define the scope (week 1, day 1–2)

Pin down which AI project you're assessing. One system, one use case, one data flow. Don't try to assess three projects at once — it makes the analysis vague and the mitigation unclear.

Step 2: Map the data flow (week 1, day 3–5)

Draw a diagram of the data flow: from source to processing to storage to output. Who are the data subjects? What data do you collect? Which vendors are in the chain (OpenAI, AWS, a RAG platform)? This step connects directly to getting business data ready for AI — without a clear data map you can't run a DPIA.
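A data map can start as something as simple as a list of hops from source to output. The vendors and data items below are hypothetical examples for a RAG-based support bot:

```python
# Illustrative data-flow map: each hop records what data moves between
# systems and which vendor handles it. All names are hypothetical.

data_flow = [
    {"from": "CRM database", "to": "embedding pipeline",
     "data": "customer tickets", "vendor": "internal"},
    {"from": "embedding pipeline", "to": "vector store",
     "data": "pseudonymized embeddings", "vendor": "AWS"},
    {"from": "vector store", "to": "LLM API",
     "data": "retrieved passages + user question", "vendor": "OpenAI"},
    {"from": "LLM API", "to": "support agent UI",
     "data": "draft answer", "vendor": "internal"},
]

# The external vendors in this map are exactly the parties you need
# processing agreements with, and whose subprocessors you must assess.
external_vendors = {hop["vendor"] for hop in data_flow} - {"internal"}
print(sorted(external_vendors))  # ['AWS', 'OpenAI']
```

The list doubles as the input for step 4: every hop that crosses an organizational boundary is a candidate risk.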

Step 3: Assess necessity (week 2, day 1–2)

Ask yourself: is AI really the best solution, or could less intrusive technology work? Are you using more personal data than needed? This step is crucial because it's where many projects get red-flagged by the data protection officer.

Step 4: Identify risks (week 2, day 3–5)

Systematically walk through GDPR risks and AI-specific risks. Bias, hallucination, model drift, adversarial input, bad training data. For a checklist of AI-specific risks, our post on AI risks and liability is a useful starting point.

Step 5: Design mitigation measures (week 3, day 1–3)

For each identified risk: what measure reduces it? Not every risk must be brought to zero, but each risk needs a proportional measure. Common measures:

  • Human-in-the-loop — a person reviews AI decisions that impact individuals
  • Audit logs — every decision can be reconstructed afterwards
  • Bias testing — structurally test for unwanted patterns before deployment
  • Transparency to data subjects — clearly communicate that AI is being used
  • Right to object — a fallback to human review
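The rule that every identified risk needs a proportional measure is easy to enforce mechanically. The register below is a hypothetical sketch, with mitigations drawn from the list above:

```python
# Sketch of a risk register: every identified risk must map to at least
# one proportional measure before the DPIA goes to formal review.
# Risks and measures shown here are illustrative examples.

risk_register = {
    "bias in hiring recommendations": [
        "bias testing before deployment",
        "human-in-the-loop review",
    ],
    "hallucinated customer data": [
        "audit logs",
        "human-in-the-loop review",
    ],
    "opaque automated decisions": [
        "transparency to data subjects",
        "right to object (fallback to human review)",
    ],
}

unmitigated = [risk for risk, measures in risk_register.items()
               if not measures]
assert not unmitigated, f"risks without measures: {unmitigated}"
```

An empty measures list fails loudly, which is exactly the behavior you want before the review in step 6.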

Step 6: Formal review (week 3, day 4–5)

Have the DPIA reviewed by your data protection officer (DPO). If you don't have one, use an external privacy lawyer: typically €500–€1,500, which is cheap insurance. If residual risk remains "high" after mitigation, you must consult the Dutch DPA before starting the processing.

Step 7: Register, communicate, monitor (week 4)

Log the DPIA in your record of processing activities, communicate outcomes to the involved teams, and schedule a review moment. A DPIA isn't a one-off — significant changes to the system or data require an update.


Common mistakes

In practice, we see a handful of recurring mistakes on DPIAs for AI projects:

  • Starting too late — the DPIA is run after the AI is already built. Result: you find structural privacy issues only when fixing them is expensive.
  • Staying generic — "we use encryption" isn't a mitigation. Describe which encryption, where, and against which specific risk.
  • Underestimating vendor risk — if you use an external AI service (OpenAI, Anthropic, Google), you also have to assess their subprocessors, data locations, and retention periods.
  • Forgetting to inform the data subject — transparency is itself a mitigation. If users don't know AI is being used, you can't obtain valid consent.
  • Not integrating with your AI compliance checklist — the DPIA doesn't stand alone; it's part of a broader compliance approach.

DPIA versus Conformity Assessment under the AI Act

From August 2026, the EU AI Act requires a separate conformity assessment for "high-risk" AI systems, in addition to the GDPR DPIA. This isn't a replacement but an addition: the DPIA protects personal data, the conformity assessment proves your AI system is safe, transparent, and non-discriminatory.

For most SMBs this means two documents. The saving grace is that they overlap by roughly half — data flow diagrams, risk analyses, and mitigation measures carry over cleanly.


What does a DPIA cost?

Rough costs for an SMB AI project:

  • Internal DPIA by your own DPO: 40–80 hours of work. No direct cash cost, but opportunity cost.
  • External privacy lawyer as advisor: €1,500–€5,000 for review and advice
  • Fully outsourced to a specialist firm: €5,000–€12,000
  • Tools and templates: €200–€800 for software or a good template

For a first DPIA, external guidance is usually worth it: you learn the process properly, and you can do the second one yourself, faster. Combine it with a broader AI consulting engagement, and you also get a realistic read on which AI projects are actually worth pursuing.

Start before you build

Start the DPIA before you start the AI project, not afterwards as a compliance tickbox. Many risks are cheap to fix in design and expensive to fix in production — a DPIA that begins after code has already shipped is almost always the DPIA that forces painful rework.

The DPIAs that hold up under scrutiny aren't the longest ones, either. They're the ones where someone can point at any risk on the list and immediately show the matching mitigation running in the production system. Write yours so that you don't have to rewrite it the day an incident happens — that's the real test.

Curious how much time you could save?

Request a free automation scan. We'll analyze your processes and show you where the gains are — no strings attached.