Debate and guidance about data protection and the General Data Protection Regulation (GDPR) have focussed on helping businesses achieve compliance. This is clearly valuable. The strengthened rights that individuals enjoy under GDPR have, however, received less attention.
Important questions remain to be explored relating to the public understanding of data protection. How aware is the public of data protection and of the rights it provides? If people are aware of their rights, how well do they understand what those rights entail? Where do these rights fit within people’s everyday lives?
Open Rights Group has carried out research over the last year to investigate these questions. We have also created a website with Projects By IF called Data Rights Finder. On Data Rights Finder, we present analyses of organisations' privacy policies to make it clearer how they use data and make it easier for people to make requests to organisations using their data protection rights. As a starting point, we focussed on providing information about the main banks, insurance providers, comparison websites, and financial services organisations.
This report explores the findings from our research and our experiences of creating Data Rights Finder. We first discuss interviews that we carried out with members of the public to build a better understanding of how aware people are about data protection and their rights. We then use our research findings and our experience of creating Data Rights Finder to make recommendations to regulators and civil society organisations about how to communicate well about data protection, rights, and GDPR. Finally, we look at how organisations who use data can support the data protection rights of their users, members, customers, supporters, subscribers and so on.
Over the course of this project, we have carried out three rounds of qualitative, interview-based research. To the best of our knowledge, we avoided speaking to data protection experts, digital rights activists, or Open Rights Group members. Desk-based research and our experience of creating a website that includes our analysis of privacy policies also inform the findings of this report.
The first round of interview research started in June 2018. To help us understand how people think about data protection and how data about them is used, we interviewed eight people. We spoke to three women and five men with a range of ages and careers. We screened our interviewees to ensure we did not talk to data protection experts. The interviews were carried out over the phone and lasted between 20 and 30 minutes each. We used a semi-structured approach to the interviews which means that we covered the same topics in each interview, but we varied the order and phrasing of the questions. This helped us to keep a natural conversation and elicit useful insights about what our interviewees thought about data protection.
We carried out usability testing of the website Data Rights Finder in September 2018. Data Rights Finder is a website that provides information about how companies use data and helps people contact companies to use their data protection rights. The website is discussed in more detail later in the report. In this research, we wanted to find out how people used the website by observing them while they carried out tasks on the site. We wanted to find the most important things to improve on the website and test assumptions we had about the reasons people would use the site. We carried out six usability tests – three of those were in-person and three were remote over video-conferencing. We spoke to a chef, a retail worker, a housewife, two academics, and a vicar. They lived in three different places in the UK. Four of them were in their 30s, one was in their 50s and one was in their 60s. This is a useful methodology for quickly discovering the easiest-to-find usability issues with a website.
In December 2018, we carried out user research to improve our understanding of people’s experiences of the insurance sector and to test assumptions we had about a potential website. We wanted to find out how people thought about the way insurance companies decide whether to offer coverage or not. We also wanted to see how people feel when automated decisions are made that affect their lives. We carried out four semi-structured interviews that lasted around 25 minutes each. Three of the interviews were carried out over the phone and one was in-person. The people we spoke to were in various careers including academia and publishing. Three people we spoke to were in their 30s and one was in their 60s.
Clearly, further research would be needed to draw fully generalised conclusions about these issues. Due to restrictions of time and resources, we have interviewed a small number of people in this project. The people we spoke to cannot be said to be a representative sample of the British public. We favoured quickly capturing small amounts of relatively rich data through interviews which gave us deeper insights into how people thought about these issues. We decided against quantitative alternatives such as a survey which would have presented challenges in acquiring a deeper understanding of the views of the people involved in our research. The research was carried out within a six month period with three rounds each lasting four to ten days. As a result, this is not longitudinal research; we captured a snapshot of viewpoints. We encourage other researchers to explore the issues we discuss here at a greater scale and over a longer period. There is a basis for future work within this research.
Open Rights Group research indicates that:
The General Data Protection Regulation (GDPR) and the UK’s Data Protection Act 2018 provide people with data protection rights. These are not all new rights. Some of them have existed since the 1995 Data Protection Directive, enacted in the UK as the Data Protection Act 1998. There are caveats to all of the rights, but in summary, they are:
As part of a previous report, Open Rights Group conducted research interviews with eight people in the UK to help us get some insight into whether they were aware of or understood these data protection rights. One of our findings was that awareness of these rights among the people we spoke to was very low. Generally speaking, people did not know that they had legal rights, for example, to get a copy of their data or to have data about them erased. The people we spoke to indicated they would contact an organisation if they thought that organisation had made a mistake relating to their data or if they wanted to complain. They were not necessarily aware, however, of what their rights would be when they complained. This could leave them at a disadvantage in a complaints procedure: the organisation is likely to have greater awareness of data protection law, while the individual may not know what their legal rights afford them.
Open Rights Group has also carried out other research suggesting that awareness of data protection rights is low. We worked with Projects by IF to release a website called Data Rights Finder in June 2018. Data Rights Finder helps people understand how companies use personal data and to make requests using their data protection rights. Our starting focus has been on the main banks, insurance providers, comparison websites, and financial services organisations.
Many organisations are not making it easy enough for people to understand how their data is being used. To help address this problem, we analysed the ‘privacy policies’ of around 40 companies to pull out the details we thought would be important to somebody trying to understand how a company was using data. We also put together the best contact details we could find for each organisation and provided a message template to help people use each of their data protection rights when they contact an organisation.
Open Rights Group carried out usability research to see how people used Data Rights Finder. We showed the website to six people with varied career backgrounds, genders and ages, asked them for their impressions on the homepage and the site in general, and asked them to perform some tasks with the site like ‘Find out which organisations your bank shares data with’ and ‘Ask Paypal for a copy of the data they hold about you’.
All of the people we observed using Data Rights Finder arrived at the section displayed in the image below. The site helps you contact an organisation. Organisations, especially larger ones, often have different contact methods for each data protection right: email@example.com versus firstname.lastname@example.org.
A screenshot from datarightsfinder.org showing the data protection rights a user could use when contacting an organisation – Ed Johnson-Williams
When a user clicks on one of the dropdown sections, it reveals a brief description of the legal right (including a link to a fuller explanation of the right), an example of why you might want to use the right, and the best contact methods we could find for the organisation. All but one of the people who took part in the usability testing clicked on at least one of these dropdown sections.
A screenshot from datarightsfinder.org showing an example of how a user could contact an organisation through the site – Ed Johnson-Williams
The usability testing helped us to understand how easy the site was to use. For the purposes of this report, the most important finding was that people were surprised that they had these kinds of rights over data. After seeing these options, one person said, “Many of these are things I wouldn't have realised I could do.” Sometimes they knew they could do these things, but not that it was a legal right. This revealed a lack of general awareness of the rights contained within GDPR.
After several rounds of research, Open Rights Group is yet to talk to a member of the public who is clear on what their data protection rights are under GDPR. Most people we spoke to had, however, heard of GDPR. In nearly every case, this was through emails sent by organisations in the lead-up to GDPR coming into force in May 2018, either to refresh consent to remain on a mailing list or to announce updated privacy policies. Because of this, nearly all of the people we have spoken to have only really thought about GDPR as a law that requires consent to process data. This understanding is inaccurate, but it is likely to be common; further research would be required to confirm this. We first wrote about this in an earlier research report, ‘Debates, awareness, and projects about GDPR and data protection’.
In general, the people we spoke to believed that organisations should get explicit and informed consent to collect and process data. Below is a good example indicating the sentiment we heard:
“Well I just think it’s explicit agreement to say from our side [companies’] ‘This is what we’re going to do XY and Z with your data.’ and from your side [individuals’] ‘Are you happy with this? If yes, great. If no, then we won’t do XY and Z with your data.’ And I think something as explicit as that would have been a good idea.”
Other research by the Norwegian Consumer Council has shown how Facebook and Google used manipulative user interface design and language to nudge users towards privacy-intrusive options in the lead-up to GDPR coming into force. The research questioned whether “consent given under these circumstances can be said to be explicit, informed and freely given.” There is also the question of whether it is fair that refusing to consent to your data being processed means giving up access to your email account or main social media presence. Putting that aside for the moment, the Norwegian Consumer Council’s findings are not surprising given the low levels of public awareness of data protection law and the rights individuals have. It seems likely that at least part of Facebook and Google’s success in retaining high consent rates in the era of GDPR relied on low levels of awareness of GDPR’s conditions for consent. Those conditions are that an individual makes “a clear affirmative act establishing a freely given, specific, informed and unambiguous indication” that they agree to an organisation processing personal data relating to them.
Returning to our research, some of the people we spoke to made fascinating links between asking for consent and, variously, political correctness, politeness, and transparency. The linking of these values with consent as a data protection concept suggested that people saw a kind of morality in how organisations went about handling their data. To summarise, we understood people as saying that ‘good’ organisations were honest, respectful, and open about how and why they used data. ‘Bad’ organisations were manipulative, secretive, and did not treat people whose data they handled with dignity. We did not find any solid evidence that people do not care about privacy or control of personal data.
These findings are from interviews and analysis carried out by an employee of Open Rights Group with a small group of participants. There would be great benefit in pursuing this avenue of research on a greater scale with independent analysis.
Through our conversations with people in research interviews and in usability testing sessions of the Data Rights Finder website, it became clear that people do not go about their everyday lives thinking about daily events as having relevance to data protection. People are not routinely mapping their data protection rights on to the contexts and situations they experience.
The major exception to that was with the way people talked about the Cambridge Analytica story. We did not bring up Cambridge Analytica in our questions or framing of the conversations. Despite this, many people we spoke to talked about Cambridge Analytica in passing and some of them delved into what they thought the effects of it have been. People understood Cambridge Analytica as being a company that carried out society-wide manipulation of individuals that relied on misuse of data about them. The people who spoke in greater detail about Cambridge Analytica saw it as a story about power imbalances and secretive or unseen manipulation of the public. As an example, one person said,
“People don’t like being confronted with the idea that maybe their actions weren’t entirely what they would have been. They don’t like knowing they’ve been manipulated.”
“Probably not…Unless I had an issue, I probably wouldn't go looking for it.”
We learned a number of things from the usability testing. One, as discussed earlier in this section, was that awareness of data protection rights is low. Another was that when people had a problem with a bank, insurance company, or financial institution relating to data about them, they would be likely to contact that company to resolve the issue. They would not necessarily realise, however, that a) the problem related to data, or b) that data protection law could help them in that situation.
When people experience problems in their life that they want to resolve, they will do what they can to deal with that problem. The route to alleviating an issue might include their data protection rights, but they are unlikely to know that. Importantly, they do not think first about their data protection rights and then think about what problems in their life that they could solve with those rights. Rather, they realise they have a problem they want to deal with and then look for ways of dealing with their problem.
Having discussed what our research has indicated about how people understand their data protection rights and where they fit within their everyday lives, we now look at what that means for a) data protection regulators and organisations that support data protection rights, and b) for organisations that collect and use people’s personal data.
Considering the findings discussed in section one, these are important points to consider when communicating in support of data protection rights:
Our research suggests that it is rare for members of the public to know about and understand their data protection rights. When people find out about the rights, it is not always immediately obvious to them when the rights would be useful in their daily lives.
The implications of this are that when organisations who support data protection rights communicate about those rights, they must provide real-world examples of when those rights could be useful or vital to people. Otherwise, it is likely that people will not see the immediate relevance of them to their everyday lives or be able to remember that they have those rights when a situation arises where they need them.
A good example of communicating the data protection rights themselves while providing added context is the Your Rights section of the Information Commissioner’s Office Your Data Matters pages. While some of the content says little about why someone might want to use these rights, the section goes on to offer specific situations where the rights are relevant.
A screenshot from ico.org.uk/your-data-matters showing links to information about data protection rights – Ed Johnson-Williams
Screenshot 1 from ico.org.uk/your-data-matters showing links to advice about data protection in specific contexts – Ed Johnson-Williams
Screenshot 2 from ico.org.uk/your-data-matters showing continued links to advice about data protection in specific contexts – Ed Johnson-Williams
More work is needed to find the best routes of communicating GDPR rights to people to increase awareness and understanding of these rights. Very few people we spoke to during our research were aware of the Information Commissioner’s Office, so would be unlikely to visit their website. Communicating the rights is likely to be most successful by going to where people are. Social media, advertising, and press campaigns that highlight real-world problems that data protection rights can help with may be more successful than attracting people to generic information on an institutional website.
Another useful area for future research would be to gather evidence of what the most common situations are where data protection rights could be useful or vital. This would help organisations who want to communicate examples of data protection rights in action by allowing them to highlight the kinds of contexts that a large portion of their audiences would encounter. It would also help creators of tools or services that help people use their data protection rights. They could use this information to make tools that a) are specific to a single context, b) allow a user to select their use-case as a first step, or c) provide examples of situations of when a data protection right is useful in a tool that is rights-focussed.
Our understanding from the people we talked to in our various rounds of research is that people do not categorise their life by which data protection rights would be most useful at a given time. Rather, they go about their life and, in some situations, they may look for something that could help them. This appears to be the best opportunity for tools or services that help people use their data protection rights to find their target audience.
Services that deal with specific domains to help people with a particular problem are more likely to align with users’ needs and current knowledge than services that merely present the data protection rights and let the user work out how they can use them. An example of this problem-focussed approach, which Open Rights Group is looking to develop, would be a tool to help people understand why an insurance company has set a high quotation when they apply for an insurance policy. People often need insurance coverage for their car, house, life, health, dental treatment, travel, public liability, gadgets, pets and so on. If the quotation for that coverage is unaffordable, then it can be a serious problem for the applicant. People could contact the insurance company through a website – built using the public data of company contact details from Data Rights Finder – to improve their understanding of how their quotation for insurance coverage had been set.
The models that insurance companies use to set quotations are likely to include many factors outside of an applicant’s knowledge or control. Insurance companies may make their decisions based solely on automated processing, particularly through comparison websites which are a very popular way of applying for insurance. Companies making fully-automated decisions are obliged to provide meaningful information about how they arrive at their decisions and the significance and consequences for the applicant. In this context, that might mean an increased likelihood of having higher quotations in the future.
Understanding the reasons for a quotation being set could help an applicant know that they need to turn to a specialist insurance company or know how to modify their behaviour in the future to improve their chances of being given an affordable quotation. In any event, the data subject should be informed of sources of personal data including public sources such as social media.
This is an approach that helps people with a specific, relatively common, and serious problem. During limited user research, we have found that this is a problem that people recognise and would like help with. There is a good chance that users will quickly understand the value of a service like this and be able to map it onto their everyday lives.
People told us during our research that the data held in Data Rights Finder, such as organisation contact details, how organisations use data, and who they share it with, seemed very useful. People also said, however, that they would be unlikely to see the immediate relevance of the data protection rights presented on the site to their everyday lives.
Our findings highlight the point that helping people access their data protection rights has to be grounded in specific contexts, needs, motivations, and problems. Many organisations and companies are trying to support individuals in engaging data protection rights. We would recommend that time and resources are allocated to carrying out research with potential users – at the beginning and throughout the project – to discover those user needs. We understand that undertaking user-centred research like this might require extra resources, but it can reduce costs and time requirements over the course of a project as well as increasing the likelihood of creating a compelling product.
As discussed in the previous section of this report, Open Rights Group is looking at creating a website that concentrates on the particular problem of the lack of clarity in how insurance coverage quotations are set. To arrive at this problem, we spoke first with a group of experts in finance, debt assistance, and banking regulation to find an area where people experience difficulties relating to finance and data. One person in the group talked about the problem of being rejected for insurance coverage and not being told why. We then spoke with a small number of people in one-to-one interviews to explore their experiences of applying for insurance and of automated decision-making. During that round of research, it became clear that, although the people we spoke to were not rejected for insurance coverage very often, they were confused about how the prices were set and why they were sometimes very high. Although we expect further research to be required to test this idea, this was a better user need on which to focus a digital tool that helped people tackle a real-world problem and that used their data protection rights – to be informed about an automated decision in this case.
This is a short example that shows why much more work is needed to create services and messages about data protection that stand a greater chance of resonating with and making sense to the public.
From our experience of analysing organisations’ privacy policies to create Data Rights Finder, talking to people about data protection issues, and carrying out desk research, Open Rights Group has the following recommendations on how organisations can communicate better about data protection:
Several of the companies whose privacy policies we analysed for Data Rights Finder did not provide an email address or contact form to contact their data protection officer. In these cases, the only way to make a request which uses a data protection right is to send a message by post. This is a dark pattern that appears intended to dissuade people from using their data protection rights.
As part of this research project, Open Rights Group started an application for contents insurance through Direct Line. We applied for the coverage on Direct Line’s website and received a quotation by email. We replied to the email that included the quotation to request an explanation of how the automated process had arrived at the quotation. The logic involved had not been communicated to us during the application process. We also made a phone call to Direct Line’s customer service line to ask for an explanation and for electronic contact details for U K Insurance’s data protection officer. The customer advisor did not know that information. We were put on hold for ten minutes and then disconnected. At the time of this report’s publication, we have been waiting for a reply to our request for over a month. In that time, we have, however, received two (automated) requests for feedback by email.
Ideally, all organisations would provide an email address to help people contact them to make a request using their data protection rights. In the case of companies such as U K Insurance that do their business online, not offering electronic means for their customers to use their data protection rights appears to be a manipulative attempt to dissuade people from using their data protection rights.
Right to erasure. You have the right to request erasure of your personal information that: (a) is no longer necessary in relation to the purposes for which it was collected or otherwise processed; (b) was collected in relation to processing that you previously consented, but later withdraw such consent; or (c) was collected in relation to processing activities to which you object, and there are no overriding legitimate grounds for our processing. If we have made your personal information public and are obliged to erase the personal information, we will, taking account of available technology and the cost of implementation, take reasonable steps, including technical measures, to inform other parties that are processing your personal information that you have requested the erasure of any links to, or copy or replication of your personal information. The above is subject to limitations by relevant data protection laws.
An example from coinbase.com/legal/privacy of data protection rights being communicated
This text appears to be generic. We very quickly found many examples of identical text in the privacy policies of other companies. We expect very few people to be able to find this information, understand it, use it, or know why they would want to.
This is not to make an example of Coinbase. Coinbase is not unusual in this regard. However, we have established in our research that people do not understand their data protection rights without context. Repeating the text of GDPR is not helping people to understand how an organisation uses data. It is important to explain how these rights interact with the particular activities or business that the organisation does and make it clear how this affects the individuals involved.
It is important to be clear with people about what data is collected about them and what it will be used for. Being transparent helps to build trust and accountability about how data is used.
Article 12.1 of GDPR says that organisations that process data should provide information “in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child.” To help organisations do that, the Article 29 Data Protection Working Party (WP29) – a body made up of representatives from the data protection authorities in each EU member state – has published comprehensive guidance on transparency around processing of personal data under GDPR.
The WP29 guidance gives examples of good and bad practice in this area. They are particularly useful to understand what the requirements are so we are reproducing them from WP29’s guidance here with very light editing of line-spacing to improve readability.
Poor Practice Examples
The following phrases are not sufficiently clear as to the purposes of processing:
- “We may use your personal data to develop new services.”
(It is unclear what the “services” are or how the data will help develop them.)
- “We may use your personal data for research purposes.”
(It is unclear what kind of “research” this refers to.)
- “We may use your personal data to offer personalised services.”
(It is unclear what the “personalisation” entails.)
Good Practice Examples
- “We will retain your shopping history and use details of the products you have previously purchased to make suggestions to you for other products which we believe you will also be interested in.”
(It is clear what types of data will be processed, that the data subject will be subject to targeted advertisements for products and that their data will be used to enable this.)
- “We will retain and evaluate information on your recent visits to our website and how you move around different sections of our website for analytics purposes to understand how people use our website so that we can make it more intuitive.”
(It is clear what type of data will be processed and the type of analysis which the controller is going to undertake.)
- “We will keep a record of the articles on our website that you have clicked on and use that information to target advertising on this website to you that is relevant to your interests, which we have identified based on articles you have read.”
(It is clear what the personalisation entails and how the interests attributed to the data subject have been identified.)
Examples of clear and plain language as given by the Article 29 Data Protection Working Group
As well as following the WP29 guidance, it is also a good idea for organisations to carry out usability and readability testing of their privacy-related information. This would look to answer questions including:
These questions could be addressed in usability testing with real users.
Recent research in the United States looked at people’s preferences for how online behavioural advertisers communicate about why an advert is being shown. The researchers found that people preferred explanations which included specific and personalised information about why an advert was presented to them. They also found that “vague and oversimplified language made many existing ad explanations uninterpretable and sometimes untrustworthy.” Many organisations could benefit from carrying out similar research to understand how their users would prefer to be told about how data about them is being used.
Many organisations bundle privacy policies into general Terms and Conditions that users are required to agree to in order to use the service. In many cases, the processing of personal information will not be based on consent, but users could be easily confused between the acceptance of general Terms and consent for data processing. In cases where the processing is not based on consent, people still need to be informed. There might be value in developing a design pattern that separated out collecting consent from collecting confirmation that the user has read the information about processing.
Your email address
WHY WE COLLECT IT AND HOW WE USE IT:
- To create and support your account and provide you with Services.
- To communicate with you, for example, informing you about your account status, security updates and website and mobile application information.
- To contact you about features, products, services and other promotions that can enhance your use of the Services, in accordance with your communications preferences.
- To enforce compliance with our Terms.
HOW WE RETAIN IT:
We keep your email address until you delete your account by sending an email to email@example.com
An example from otter.ai/privacy of a structure that links each category of data closely with the purposes of processing and the retention period
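A structure like this also lends itself to machine-readable representation. As a sketch, the entry above could be captured in a simple record and re-rendered in the same layout; the field names here are our own invention, not a published schema:

```python
# Hypothetical record mirroring the otter.ai-style structure above.
# Field names are illustrative, not part of any standard.
email_entry = {
    "data_category": "Your email address",
    "purposes": [
        "To create and support your account and provide you with Services.",
        "To communicate with you about your account status and security updates.",
        "To contact you about promotions, per your communication preferences.",
        "To enforce compliance with our Terms.",
    ],
    "retention": "Kept until you delete your account.",
}

def render(entry: dict) -> str:
    """Re-render a structured entry in the layout quoted above."""
    lines = [entry["data_category"], "WHY WE COLLECT IT AND HOW WE USE IT:"]
    lines += [f"- {p}" for p in entry["purposes"]]
    lines += ["HOW WE RETAIN IT:", entry["retention"]]
    return "\n".join(lines)

print(render(email_entry))
```

Once the information is held in a structure like this, the same record can feed a privacy policy page, a comparison tool, or an API.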
We have looked at many privacy policies during this project and found that information about how organisations use data is often presented in an inconsistent and unstructured way. This makes it hard for users to find the information and to compare different organisations’ use of data. There is also a community of researchers and service designers who want to examine how organisations use data and to build tools that help individuals understand how their data is processed or exercise their data protection rights.
Cliqz – a company that makes browser extensions to reduce the effectiveness of web trackers – has recommended four machine-readable standards for communicating how an organisation uses data. We do not necessarily think all of these recommendations are the best answer to the problems above, but they are a useful intervention in this area. One suggestion, for example, is a link element in a page’s head pointing to where information about the organisation’s use of data can be found:
<link rel="personal-data-usage" href="https://organisationname.com/wherever-their-privacy-policy-is">
We would like to see further investigation into which standards for structured data about organisations’ use of data would be most useful to individuals, researchers, service designers and others. It would be important to understand what drove adoption of similar standards such as ‘robots.txt’ and ‘ads.txt’, and to build support for any new standard among organisations themselves. In the future, we would like to be able to integrate structured data created by organisations into Data Rights Finder in an automated way.
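A consumer of such a standard could discover the declared location with a few lines of parsing. The sketch below uses Python's standard-library HTML parser and assumes the `rel` value shown in the snippet above; how pages would mark this up in practice is exactly the open question such a standard would need to settle:

```python
from html.parser import HTMLParser

class PolicyLinkFinder(HTMLParser):
    """Find a <link rel="personal-data-usage"> element in a page.
    The rel value follows the proposal quoted above; this is a
    sketch, not an implementation of an adopted standard."""
    def __init__(self):
        super().__init__()
        self.href = None

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "link" and attributes.get("rel") == "personal-data-usage":
            self.href = attributes.get("href")

# A minimal page declaring where its machine-readable policy lives.
page = ('<html><head><link rel="personal-data-usage" '
        'href="https://example.org/privacy.json"></head><body></body></html>')
finder = PolicyLinkFinder()
finder.feed(page)
print(finder.href)  # → https://example.org/privacy.json
```

A crawler built this way could populate a tool like Data Rights Finder automatically, instead of relying on manual analysis of each policy.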
 Open Rights Group, Debates, awareness, and projects about GDPR and data protection. – https://www.openrightsgroup.org/about/reports/debates-awareness-and-projects-about-gdpr-and-data-protection
 Open Rights Group, Debates, awareness, and projects about GDPR and data protection. – https://www.openrightsgroup.org/about/reports/debates-awareness-and-projects-about-gdpr-and-data-protection
 Norwegian Consumer Council, Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy. – https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf
 Ibid, pg 4
 WhoTracks.me, GDPR - What happened? – https://whotracks.me/blog/gdpr-what-happened.html#third-party-services-the-winner-takes-it-all
 Recital 32 of GDPR says, “Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement.”
 In practice, it is often a separate company – an underwriter – that sets the price of insurance. We may have to collect some further contact details for underwriters, linked to each insurance company, to create this tool.
 We had originally thought about making a tool to help people understand why they had been rejected for an insurance policy. In user research to test that idea, it emerged that high, unaffordable prices were a more common problem, and one just as serious as not being offered insurance at all.
 Data Rights Finder API Documentation – https://github.com/datarightsfinder/website/blob/master/docs/api.md
 U.S. Dept. of Health and Human Services, Benefits of User-Centered Design. – https://www.usability.gov/what-and-why/benefits-of-ucd.html
 Dark patterns are deceptive user experience or user interface designs that aim to manipulate or mislead users or to make them do something that they do not want to do. This is a good introduction to dark patterns: Arushi Jaiswal, Dark patterns in UX: how designers should be responsible for their actions. – https://uxdesign.cc/dark-patterns-in-ux-design-7009a83b233c
 There is a space between the ‘U’ and the ‘K’ of U K Insurance.
 This is now replaced by the European Data Protection Board.
 Article 29 Newsroom, Guidelines on Transparency under Regulation 2016/679. – https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227
 See pages 7-10 of the WP29 guidance for further information in this area.
 See page 9 of the WP29 guidance for the original of these examples.
 Jakob Nielsen, Cloze Test for Reading Comprehension. – https://www.nngroup.com/articles/cloze-test-reading-comprehension
 Motahhare Eslami, Sneha R. Krishna Kumaran, Christian Sandvig, Karrie Karahalios, Communicating Algorithmic Process in Online Behavioral Advertising. – https://social.cs.uiuc.edu/papers/eslami-CHI18-ads.pdf
 Projects by IF, Just-in-time consent. – https://catalogue.projectsbyif.com/patterns/just-in-time-consent
 Joseph Turow, Michael Hennessy & Nora Draper, Persistent Misperceptions: Americans’ Misplaced Confidence in Privacy Policies, 2003–2015. – https://www.tandfonline.com/doi/full/10.1080/08838151.2018.1451867
 Accent Group, Your Information. – https://www.accentgroup.org/how-we-use-your-information
 Privacy International, How We Use and Protect Your Data. – https://www.privacyinternational.org/basic-page/618/how-we-use-and-protect-your-data
 WhoTracks.me, GDPR - What happened? – https://whotracks.me/blog/gdpr-what-happened.html#recommendations-for-gdpr-20