Rules for using AI & algorithms
If you process personal data by applying an algorithm, you must comply with the GDPR. Important aspects include lawfulness, transparency and security.
Important rules from the GDPR
Lawfulness
You need a legal basis to be allowed to process personal data. If you have done this check and found no legal basis in your circumstances, you are not allowed to process the personal data.
Transparency
- When processing personal data, you must be transparent towards your customers, patients, citizens or employees. In the GDPR, these persons are called data subjects (Articles 12, 13 and 14 of the GDPR).
- You must also set up a processing register (Article 30 of the GDPR).
- Do you use an algorithmic system in your decision-making? You must then provide information about the underlying logic of that processing and its expected consequences for the data subject.
- For governments: use the algorithm register.
Purpose limitation
You may only process personal data for a purpose that you have determined in advance. You are limited to this purpose: you may not simply process the same personal data for any other purpose.
Data minimisation
If you use personal data for a specific purpose, you must do so with as little personal data as possible. You may not process data that are not demonstrably necessary.
You may also only store the data for a limited period. You must determine the retention periods in advance.
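The rule that retention periods must be determined in advance can be sketched as a small check that flags records once their predefined period has passed. The 365-day period below is an arbitrary example value, not a legal norm:

```python
from datetime import date, timedelta

# Sketch: a retention period fixed in advance (365 days is an arbitrary
# example, not a legal norm) and a check that reports which records
# have exceeded it and are due for deletion.
RETENTION = timedelta(days=365)

def due_for_deletion(collected_on: date, today: date) -> bool:
    """True once a record has exceeded the predefined retention period."""
    return today - collected_on > RETENTION

print(due_for_deletion(date(2023, 1, 1), date(2024, 6, 1)))  # True
print(due_for_deletion(date(2024, 5, 1), date(2024, 6, 1)))  # False
```

In practice the period would differ per category of data and per purpose; the point is only that the period is set before processing starts, not decided afterwards.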
Accuracy
The personal data you process must be accurate (correct). This prevents incorrect or unexpected outcomes and undesirable effects for the data subject.
Security
You must properly secure all personal data that you process. To this end, you must take technical and organisational measures (Article 32 of the GDPR). When doing so, you have to take into account:
- the state of the art, or what is technically possible;
- the nature, scope, context and purposes of the processing;
- the risks for the rights and freedoms of data subjects.
Privacy by design & default
Are you going to develop an algorithmic system yourself? Then you must take into account the GDPR principles of privacy by design and privacy by default. This means that you must develop, design and deploy systems in a privacy-friendly manner. User settings should be privacy-protective by default.
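Privacy by default can be illustrated with a minimal sketch: a hypothetical settings object whose fields all start in their most privacy-protective state, so that any data sharing requires an explicit opt-in by the user. The setting names are assumptions for this example only:

```python
from dataclasses import dataclass

# Hypothetical settings object illustrating privacy by default:
# every data-sharing option starts in its most protective state,
# and the user must explicitly opt in to change it.
@dataclass
class UserSettings:
    share_usage_analytics: bool = False   # off until the user opts in
    personalised_ads: bool = False        # off until the user opts in
    public_profile: bool = False          # private by default
    retain_history_days: int = 0          # keep nothing unless requested

settings = UserSettings()  # a new user gets the protective defaults
print(settings.share_usage_analytics)  # False
```

The design choice is that the safest state requires no action from the data subject; any less protective state is a deliberate, recorded choice.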
Drawing up a DPIA
Do you want to use algorithmic systems and will personal data be processed in the process? In that case, you are obliged to carry out a data protection impact assessment (DPIA). A DPIA is an instrument for identifying the privacy risks of data processing beforehand. You can then take measures to mitigate those risks.
Criteria for DPIA
In general, as an organisation you must carry out a DPIA if there is a high privacy risk for data subjects. To assess whether this is the case, you can use a list of criteria for a DPIA. Do 2 (or more) of the 9 criteria from this list apply to your processing? Then a DPIA is mandatory.
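The threshold above (a DPIA is mandatory when 2 or more of the 9 criteria apply) can be sketched as a simple check. The criterion names below are paraphrased from the EDPB's DPIA guidelines and are illustrative assumptions, not the authoritative wording:

```python
# Illustrative sketch of the "2 or more of 9 criteria" DPIA threshold.
# Criterion names are paraphrased from the EDPB DPIA guidelines and are
# assumptions for this example, not the authoritative list.
DPIA_CRITERIA = {
    "evaluation or scoring",
    "automated decision-making with legal or similar effect",
    "systematic monitoring",
    "sensitive or highly personal data",
    "large-scale processing",
    "matching or combining datasets",
    "vulnerable data subjects",
    "innovative use of technology",
    "processing that prevents exercising a right or using a service",
}

def dpia_mandatory(applicable: set[str]) -> bool:
    """A DPIA is mandatory when 2 or more of the 9 criteria apply."""
    unknown = applicable - DPIA_CRITERIA
    if unknown:
        raise ValueError(f"Unknown criteria: {unknown}")
    return len(applicable) >= 2

# Example: an algorithmic scoring system that also monitors people
# systematically meets two criteria, so a DPIA is mandatory.
print(dpia_mandatory({"evaluation or scoring", "systematic monitoring"}))  # True
print(dpia_mandatory({"large-scale processing"}))  # False
```

Note that this counting rule is only a default: even below the threshold, a DPIA may still be advisable, as the next section explains.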
Please note: this also applies to pilots, tests and trial projects.
Voluntary DPIA
Even if a DPIA is not mandatory in your case, it may still be advisable to carry one out. You must comply with the GDPR in any case when using an algorithm. For example, you must:
- take technical and organisational measures to protect personal data;
- record your processing in a register;
- comply with a number of other transparency obligations;
- ensure that you properly facilitate the privacy rights of data subjects.
A DPIA can be a good guideline for this, allowing you to guide the (development) process in the right direction and involve the right experts.
Points for attention
In addition to the rules for regular DPIAs, you must pay specific attention to:
- technical aspects, such as algorithm choice, bias and NFL;
- transparency and explainability of the algorithm and its use;
- rights of data subjects, such as the right to erasure and the right to rectification.
Format for DPIA algorithms: IAMA
If you are looking for an impact assessment that enables you to assess human rights, you can use the Impact Assessment Human Rights and Algorithms (IAMA). This was developed on behalf of the central government and is recommended for use in the development and deployment of algorithms for governments and public organisations. The IAMA establishes links with relevant rules, instruments and assessment frameworks in the field of algorithms.
Prior consultation
The use of algorithms may well pose a high risk to data subjects that you are unable to (sufficiently) mitigate with measures. In that case, you must consult the Dutch Data Protection Authority (Dutch DPA) before you start processing.
This is called prior consultation. Your Data Protection Officer (DPO) can advise you on how to request prior consultation.
Privacy rights and AI & algorithms
The GDPR grants certain rights to people whose personal data are processed. The aim of this is to ensure that people retain control over their personal data. These rights also apply if their personal data are processed by an algorithmic system. For example, people have the right to information and to access their data.
They also have a right to human intervention in decision-making processes. This means that, as a company or organisation, you may not take an important decision about someone by solely automated means: a human must be involved in the decision.
For more information see: Privacy rights under the GDPR.