On this page: safeguards, measures, technical, organisational, procedure,
design, access control, minimisation, transparency, pseudonymisation,
abstraction, information, accountability, rights
Date of last review: 2022-10-31
To incorporate the concepts of Privacy by Design and Privacy by Default into your project, the approach of privacy design strategies (Hoepman, 2022) offers a way to make the GDPR principles more concrete. Hoepman distinguishes eight strategies that you can apply to protect the personal data in your research: minimise, separate, abstract, hide, inform, control, enforce, and demonstrate. Below, we explain what each strategy means and how you can apply it.
The GDPR does not prescribe which specific measures you should apply in your project, only that they should protect the personal data effectively. Which measures will be effective depends on your specific project, the risks for data subjects, and the current state of technology (i.e., will the data still be protected in the long run?). So make sure that your protective measures are kept up to date as well!
Limit as much as possible the processing of personal data, for example by:
- Collecting as little data as possible to reach your research purpose.
- Collecting personal data from only as many individuals as necessary.
- Preferably not using tools that automatically collect unnecessary personal data. If possible, prevent tools you do use from doing so (Privacy by Default). For example, the survey tool Qualtrics can automatically register location data, which can be turned off by using the “Anonymize Responses” option.
- Removing personal data when you no longer need them. Remove them from repositories, data collection tools, sent emails, back-ups, etc. (see also the Storage chapter). Use directly identifying information only if you legitimately need it, for example to keep in touch with data subjects or to answer your research question.
- Pseudonymising or anonymising personal data as early as possible.
- Using portable storage media only temporarily.
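As a minimal illustration of the first point, data minimisation can be as simple as extracting only the fields your analysis needs and discarding the full records straight away. The field names and records below are made-up examples, not taken from any real tool:

```python
# Hypothetical sketch of data minimisation: keep only the fields needed
# to answer the research question and discard the raw records as early
# as possible. All field names and values here are invented.

raw_responses = [
    {"name": "A. Jansen", "email": "a@example.org", "age": 34, "score": 7},
    {"name": "B. de Vries", "email": "b@example.org", "age": 51, "score": 4},
]

NEEDED_FIELDS = ("age", "score")  # only what the analysis requires

minimised = [
    {field: record[field] for field in NEEDED_FIELDS}
    for record in raw_responses
]

del raw_responses  # drop the full records once the subset is extracted
```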
Separate the processing of different types of personal data as much as possible, for example by:
- Storing directly identifying personal data (e.g., contact information) separately from the research data. Use identification keys to link both datasets, and store these keys separately from the research data as well.
- Separating access to different types of personal data. For example, separate who has access to contact information vs. to the research data.
- Applying secure computation techniques, where the data remain at a central location and do not have to be moved for the analysis.
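The first two points above can be sketched in a few lines: directly identifying data and research data are kept in separate structures, linked only through a random pseudonym key. This is a hypothetical sketch with invented names and fields; in practice, the contact information, the research data, and the key table would each live in a separately access-controlled location:

```python
import secrets

# Hypothetical sketch of separating identifying data from research data.
# All names, fields, and values here are invented.

participants = [
    {"name": "A. Jansen", "email": "a@example.org", "score": 7},
    {"name": "B. de Vries", "email": "b@example.org", "score": 4},
]

contact_info = []   # directly identifying data: store in one restricted location
research_data = []  # research data without direct identifiers: store elsewhere
key_table = {}      # the link between the two: store separately again

for person in participants:
    pseudonym = secrets.token_hex(8)  # random, so not derivable from the data
    key_table[pseudonym] = person["name"]
    contact_info.append({"name": person["name"], "email": person["email"]})
    research_data.append({"pseudonym": pseudonym, "score": person["score"]})
```

Because the pseudonyms are random rather than derived from the identifying data, the research data can only be re-linked to individuals via the key table.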
Limit, as much and as early as possible, the level of detail in which personal data are processed, for example by:
- Pseudonymising or anonymising the data.
- Adding noise to the data, e.g., voice alteration in audio data.
- Summarising the data to describe only general trends rather than individual data points.
- Synthesising the data, e.g., for sharing trends in the data without revealing individual data points.
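A minimal sketch of two of these techniques, applied to made-up age data: perturbing individual values with random noise, and summarising to aggregate statistics so that only the general trend is shared:

```python
import random
import statistics

# Hypothetical sketch of two abstraction techniques on invented age data.

ages = [23, 35, 41, 29, 52]

# 1. Noise: perturb each value so the exact individual ages are not released.
noisy_ages = [age + random.gauss(0, 2) for age in ages]

# 2. Summarising: release only aggregate statistics, not the data points.
summary = {"n": len(ages), "mean": statistics.mean(ages)}
```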
Protect personal data, or make them unlinkable or unobservable, and make sure they do not become public or known. You can do so using a combination of measures, for example:
- Using encryption, hashing or strong passwords to protect data. Consider using a password manager to avoid losing access to the data.
- Using secure internet connections and encrypted transport protocols (such as TLS, SFTP, HTTPS, etc.). Do not connect to public WiFi on devices containing personal data.
- Applying privacy models such as differential privacy, where calibrated noise is added to the data or to query results so that little can be learned about any single individual.
- Only providing access to people who really need it, and only for the necessary amount of time and with the necessary authorisations (e.g., read vs. write access; only the relevant selection of personal data, etc.). Remove authorisations when access is no longer required.
- Encrypting and regularly backing up data on portable storage media.
- Keeping a clear desk policy: lock your screen and store paper behind lock and key when you leave your desk.
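To make the differential privacy point above concrete, here is a minimal sketch of the Laplace mechanism, its basic building block, applied to a counting query. The data and the `dp_count` helper are hypothetical examples, not a vetted implementation; for real projects, use an audited differential privacy library:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon=1.0):
    """Release a count with epsilon-differentially private Laplace noise.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so noise with scale 1/epsilon
    suffices for epsilon-differential privacy.
    """
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count of participants matching some (invented) property.
records = [{"smokes": True}, {"smokes": False}, {"smokes": True}]
noisy = dp_count(records, lambda r: r["smokes"], epsilon=1.0)
```

Smaller values of epsilon add more noise and give stronger privacy; the released count is then less precise, which is exactly the privacy-utility trade-off the model formalises.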
Inform data subjects about the processing of their personal data in a timely and adequate manner, for example by:
- Providing information via an information letter or privacy notice on a project website.
- Providing verbal explanation before an interview.
- Obtaining explicit consent via an informed consent procedure.
Give data subjects adequate control over the processing of their personal data, for example by:
- Specifying a procedure and responsible person in case data subjects want to exercise their data subject rights.
- Providing data subjects with a contact point (e.g., email address) for questions and exercising their data subject rights.
Commit to processing personal data in a privacy-friendly way, and adequately enforce this, for example by:
- Using only Utrecht University-approved tools to collect, store, analyse and share personal data.
- Entering into agreements with third parties if they work with UU-controlled personal data. Such agreements ensure that everyone treats the data according to UU standards.
- Always keeping your software up-to-date and using a virus scanner on your devices.
- Appointing someone responsible for regulating access to the data.
- Always reporting (suspicions of) data breaches. At UU, contact the Computer Emergency Response Team.
- If needed, drawing up a privacy and/or security policy that specifies roles, responsibilities, and best practices for how personal data are handled throughout a project.
- Using a Trusted Third Party when linking individual data from different sources together.
Demonstrate you are processing personal data in a privacy-friendly way, for example by:
- Registering your research project in the UU processing register (once available).
- Performing a Privacy Scan and storing it alongside the personal data.
- Performing a Data Protection Impact Assessment (DPIA) for projects that have a high privacy risk for the data subjects.
- Keeping information for data subjects and (signed) informed consent forms on file. This is not needed if you can fully anonymise the data: then you should delete the (signed) consent forms as well.