Consultation response to a call for evidence on the Data Protection Act.
The Open Rights Group actively campaigns on digital privacy and technology issues. We have over 15,000 active supporters, of whom 1,400 are paying contributors. Our experience of privacy issues includes work on the Data Retention Directive, interception, the technology provided by the company Phorm, the implications of deep packet inspection and the Digital Economy Act.
The EU’s Data Protection regime is now supported by the EU’s Charter of Fundamental Rights. Article 8 provides a fundamental right to protection of personal data.[i] The Directive needs to extend to law enforcement and judicial matters, as a result of the extension of the EU’s remit under the Lisbon Treaty.[ii] The right to privacy is also supported in domestic UK law by the Human Rights Act.[iii]
The Data Protection Directive is widely recognised as a key component of worldwide privacy regulation, providing good standards of protection, especially compared to some other legal systems. However, the UK implementation is widely acknowledged to be faulty. This means we have less protection in the UK than is needed and than should rightly be afforded to us by EU law.
The DPD and DPA were drafted in a very different age, long before the proliferation of computing devices. Underlying the use of personal data today is an exponential trend towards greater capacity for collection, storage, transmission and processing of data.
This has extended to new types of data. Traditionally, data protection has mainly concerned itself with textual records, mainly of facts about ourselves. Now, however, there are relatively routine collections of categories of data such as biometrics and genetic information that can tell much more about us.
There are also categories of data such as photographs, video and audio which can increasingly be interpreted by machines to identify facts about us, and which therefore can fall into the realm of data protection. Cases such as Google’s Streetview, potentially revealing location and other data about individuals, show the type of problems that data protection and privacy laws increasingly need to deal with.
The public has quite rightly become increasingly aware of, and alarmed by, these concerns, as people are frequently under threat from poor security and data protection practices. There are now frequent incidents of ‘data loss’, exposing people to potential fraud and “moral harm” such as stress. Perhaps the most iconic of these events was the loss by HMRC of 25 million people’s data, including individuals’ bank details. [iv]
This concern has been noted by both the EU, in the Eurobarometer, and the ICO, in its annual tracker surveys. In 2009, the ICO’s survey found near-universal concern over the sale and transfer of data, and over security. [v]
The capacity of machines and networks is not entirely bad news for our privacy. While we should expect the creation of more data by and about us, there is nothing to say that we cannot devise technical means to protect our privacy rights. The challenge is to create privacy laws that encourage and develop ever more responsible use of technology, reinforcing the principles of transparency, control and consent. Laws can encourage – or hinder – the development of such technological practices. This is what the next Directive needs to aim to do.
We note that this consultation has not dealt with data protection issues as applied to law enforcement in the UK and EU. The extension of the EU’s competencies to include law enforcement is one of the reasons why the Directive is being reviewed. We therefore fully expect to see a separate consultation on these matters in due course.
The Data Protection Act is weak compared to the Directive. While both are in need of revision, the UK’s implementation has particular problems, some of which were recently highlighted by the threat of legal action by the EU Commission.
Both the Act and the Directive face growing problems, and are increasingly weak in relation to the problems they have to solve. The key problems include:
Enforceability of rights
Rights of access and correction are difficult to exercise and require some determination. There is little that citizens can do if data is incorrectly released. Transparency rules are widely bypassed.
Problems with redress
Rights of redress are limited. There is no concept of ‘class actions’ nor of civil society groups taking complaints to court. The concept of moral damage is limited in the UK’s implementation.
Non-compliance with the EU Directive
Significant problems include lack of powers for the ICO, incorrect definitions of personal data and problems with lack of redress for non-financial harm (moral damage) and the notion of “implied consent”.
Problems with definition of ‘personal data’
The UK’s definition of personal data is narrow and vague – data is judged on its “relevance or proximity”, rather than the clear definition in the Directive as data relating to somebody.
Non-transparent privacy policies
Privacy policies are complicated, hidden, and subject to revision without notice.
Lack of control
Users are forced to relinquish control via privacy policies. Implied consent is accepted in the UK.
Problems with export of data and jurisdiction
Data hosted outside of the UK or EU may be subject to weaker protections. The EU’s process for creating agreements with countries outside the EU is slow. Many key services are located in the US with weaker data protection laws.
Non-compliance of the UK Act with the Directive
The UK implementation does not conform to the Directive – according to the Commission, up to a third of the Directive’s articles are incorrectly transposed. [vi] While full implementation would not deal with the whole range of problems that data protection faces, it would improve the position of UK citizens.
The UK’s problems came to the attention of the EU partly through campaigning work with which ORG was involved, as the result of technology company Phorm’s behavioural advertising systems. These involved interception of users’ data. [vii]
The European Commission has since requested that the UK “strengthen the powers of its data protection authority so that it complies with the EU’s Data Protection Directive.” [viii] Reding noted that “Having a watchdog with insufficient powers is like keeping your guard dog tied up in the basement.”
The Commission’s statement added that in 2007 the Commission had found fault with the UK’s transposition into law of 11 of the Data Protection Directive’s articles, almost a third of the whole Directive.
The UK interpretation of ‘Personal data’
The Directive defines ‘personal data’ as “any information relating to an identified or identifiable natural person”. ‘Identification’ is explained in the recitals to cover “all the means likely reasonably to be used either by the controller or by any other person to identify the said person”.[x]
The UK has taken a narrower view, reinforced by widely-criticised court decisions. Most importantly, the Durant v Financial Services Authority case interpreted ‘personal data’ on the basis of ‘relevance or proximity’ to the data subject. This ‘focus’ test is subjective, controversial and not how personal data is defined elsewhere in the EU.
The practical result is that many forms of what should be classed as ‘personal data’ are not so classed in the UK. This affords UK citizens lower protection and fewer rights over much data that relates to them.
IP addresses are a telling example of the problems with the UK legal view. They are not, by themselves, an indicator of a person, but they can be used to identify someone in conjunction with other information. They are also in extremely common use, are often publicly visible and are very frequently retained. Their ubiquity in records of internet traffic makes them potentially a way to reveal highly sensitive information about someone, including information about their health, sexuality, opinions and beliefs.
Taking this into consideration, the Article 29 Working Group concluded that:
Unless the ISP is in a position to distinguish with absolute certainty that the data correspond to users that cannot be identified it will have to treat all IP information as Personal Data to be on the safe side. These considerations apply equally to search engine operators. [xi]
The UK’s law and the ICO should be making this clear: where IP addresses may identify people, by linking other information with them, they should be treated as personal data. This understanding is frequently under attack, not least from entertainment industries who wish to harvest such information for their own surveillance purposes.
Pseudonymous and “anonymised” data
A further complication is the ability of people to identify individuals from so-called ‘anonymised’ data. Anonymous, or pseudonymous, data can increasingly easily be re-linked to individuals. This has not been sufficiently recognised by data protection authorities, with the result that extremely sensitive data sets have been handed to third parties without due regard for the potential consequences, or for the true need for informed consent.
We strongly recommend that the law should ensure that so-called anonymisation techniques do not allow data controllers to avoid their data protection responsibilities.
Data records that relate to individuals should be treated as personal data even when purportedly anonymised or pseudonymised. Only when data sets have been aggregated sufficiently should they cease to be regarded as personal data.
When data controllers release purportedly anonymised or pseudonymised data but personal data can be extracted from it, the data controller should be prosecuted.
Perhaps the most startling example of re-linking IP data to individuals occurred in the AOL search data case. [xii] In this incident, in 2006, search data for 650,000 AOL users was exposed to the public. It had been intentionally published for research and analysis without the consent of the users involved.
Once made available to the public, journalists and others re-linked the search data and user numbers to real people.
Tools are emerging that can make it very easy to re-link individuals to apparently anonymous data. Possible methods have been discussed since at least 2001. [xiii] In 2006, Netflix released supposedly anonymous data about users’ movie ratings; a year later, researchers demonstrated that they could link this back to real people via named comments on IMDb.
IP addresses are a related case. They are in many instances like a pseudonym, albeit one that may be shared. With other data, they can be linked back to an individual.
There are at least three easy ways to link IP addresses back to individuals. One is via the ISP, which records the IP address against an account holder for a period of time; this link can be obtained, usually via a court order. Another is via records on a website, such as comments, if these log the IP address. A third, practical for individual cases, is via email headers, which display the IP address from which the email was initially sent.
Search engines, advertisers and private surveillance schemes are all being used to collect this data and profile users, despite which it is not being treated as personal data in much UK practice. The ICO has not been giving clear advice. Meanwhile, the government, for instance in the Digital Economy Act debate, received extensive advice from the entertainment industry claiming that IP addresses were not personal data in the case of their private surveillance.
Clear advice is needed from the ICO to ensure that such industries know when IP addresses are personal data.
ORG has repeatedly stated that IP addresses harvested on behalf of entertainment industries need to be treated as personal data. We are also aware of at least one instance where very large volumes of pseudonymised communications data were handed to researchers in the USA without the consent of the individuals.[xiv]
The lack of proper advice from the ICO has resulted in misapprehensions within the entertainment industry, which has been the major voice influencing government policy on this matter:
the methods that BPI has employed to detect and notify infringers – and which we would expect to continue to deploy in the context of the preferred approach – have no implications for data protection provisions...[xv]
Rightholders do not have the ability to link an IP address to an individual ISP customer.... ISPs have the ability to make the connection between an IP address and customer details...[xvi]
Giving evidence to the Joint Committee on Human Rights, the Alliance Against IP Theft said:
The Alliance is clear that the creation and maintenance of such a list does not infringe human rights and is in compliance with all European directives. IP addresses alone are not personal data. It is information publicly available when engaging in file-sharing activity.[xvii]
The Periodical Publishers Association (PPA), which, like the Alliance Against IP Theft, had lobbied for the Digital Economy Act, stated that:
IP addresses alone do not constitute personal data for the purposes of the Data Protection Act 1998
For the protection of users, clear public advice is needed from the ICO about when IP addresses should be treated as personal data, in line with the European Data Protection Supervisor’s views of such schemes, most recently discussed in relation to ACTA:
Directive 95/46/EC is applicable since the three strikes Internet disconnection policies involve the processing of IP addresses which — in any case under the relevant circumstances — should be considered as personal data... If one considers the definition of personal data provided in Article 2 of Directive 95/46/EC, ‘any information relating to an identified or identifiable natural person (data subject); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number’, it is only possible to conclude that IP addresses and the information about the activities linked to such addresses constitute personal data in all cases relevant here. Indeed, an IP address serves as an identification number which allows finding out the name of the subscriber to whom such IP address has been assigned. Furthermore, the information collected about the subscriber who holds such IP address (‘he/she uploaded certain material onto the Web site ZS at 3 p.m. on 1 January 2010’) relates to, i.e. is clearly about, the activities of an identifiable individual (the holder of the IP address), and thus must also be considered personal data.[xviii]
This is certainly the belief of UK citizens, who, according to the ICO, viewed this kind of data as more sensitive than many of the categories currently regarded as ‘sensitive’, such as religious beliefs or trade union membership. [xix]
Biometric data are being collected with increasing frequency. We welcome therefore the fact that the UK and EU are looking at this question seriously.
In the ‘Comparative Study on Different Approaches to New Privacy Challenges in Particular in the Light of Technological Developments’, submitted to the EU Commission by LRDP Kantor and the Centre for Public Reform,[xx] the researchers found that:
Biometric data are increasingly associated with highly sensitive data, such as in passports, or law enforcement databases.
Financial data, such as credit information and bank account details, should also be included. These data can be life-changing in their significance, whether through crime or through their effect on the data subject’s ability to, for instance, obtain credit.
Web browsing histories are another area that should be considered carefully, as they allude to, and could be used to deduce, information that is currently regarded as sensitive, such as political and religious beliefs, health and sexuality.
In our view, the current definitions cause confusion. For instance, in the context of social media, with users increasingly uploading personal data about themselves and others, it becomes difficult to distinguish the controller from the processor and therefore to determine who has responsibility. There is a similar problem with cloud computing.
In each case, there is a further problem when a platform allows third parties to access data. Even assuming that it is clear when a third-party service requests the use of your data, its privacy policies may not be easy to access.
It is possible to envisage that third-party services should be bound by the initial terms and conditions, and that the user of the services should only need to approach the party they initially contracted with. In any case, data protection should aim for simplicity for the user (data subject) rather than providing complicated legal flexibility (loopholes) for companies.
The current subject access obligations are set too broadly. The result is that it can be easy for data controllers to comply with the letter of the law while avoiding its spirit.
In particular, there is no limit to the time allowed to authenticate a Subject Access Request, so authentication can be used as a delaying tactic to buy more time to comply with the request.
Cost can also be a barrier: the £50 fee for requests for health information and education records is prohibitive and unfair.
Question 13. Do businesses have any evidence to suggest that this obligation is too burdensome?
Question 14. Approximately how much does it cost your organisation to comply with these requests?
Question 15. Have you experienced a particularly high number of vexatious or repetitive requests? If so, how have you dealt with this?
Question 16. What evidence is there that technology has assisted in complying with subject access requests within the time limit?
These questions are unfortunately designed to elicit only one side of the problem. The converse needs to be asked: do citizens have evidence to suggest that it is difficult to obtain their data, and that unfair barriers are placed on their requests by data controllers who, on paper, are conforming with their obligations?
Going back to basic data protection principles, transparency and access are vital for fair processing. Unfortunately, these rights are too frequently theoretical rather than practical. Yet with ease of collection and storage also comes, potentially, ease of retrieval and access. Despite this, the same barriers to access remain, including long time lags and charges.
With the right technology and a well-designed data architecture, a map of the data collected should be easy to retrieve, making access to data, even in archives, much easier.
There is the possibility to create incentives for systems that make authentication and data access trivial. Given that ‘data portability’ may need to become an economic right necessary to maintain competition and avoid vendor lock-in, this is worth incentivizing.
Automated data processing and transparency by design could allow data subjects to control data flows in real time, instead of having to exercise a delayed subject access request.
Fees are certainly a barrier to making subject access requests, and there is a strong case for free subject access requests where the business’ model is wholly or largely based on the use and processing of user data.
In these cases, portability of data is an economic as well as a privacy right. For instance, a “cloud” service user might store emails, documents, web search histories, and website traffic analysis data with one web-based provider. The cost of ‘migration’ to another service might be the loss of all or part of that data, or at least inconvenience in moving it. In such a case, there is little reason to place financial barriers in front of subject access requests, whether the reason for data retrieval is privacy-related or service-related.
A distinction could therefore be created between different categories of organisations according to their size and perhaps business function.
Not many people are aware of these rights, especially the right under Section 13 to damages as a result of a data breach, a consequence of the lack of notification.
Secure authentication of data subjects should be made easier and faster.
The main area needing improvement is redress. Currently, the ICO cannot take data protection breaches to court, but must act within its own enforcement powers. Class actions are not possible, and civil society organisations cannot take complaints to court. Only individual actions are possible.
Furthermore, as noted elsewhere, the remedies available under the UK Act are too limited. The main private remedy is compensation for ‘damage’. Section 13 allows compensation to be awarded if “the individual also suffers damage by reason of the contravention”.[xxi] Damage is restricted to financial or other tangible harm. However, it is usually difficult to identify tangible harm resulting from data breaches, though they may cause great distress.
The recent ACS:Law leak[xxii] is a case in point, as most of the harm experienced from public accusations of sharing pornography will be in the form of embarrassment, stress and damaged relationships. None of these may be claimed for under Section 13. Tangible harm that amounts to "damage" within the meaning of Section 13 will be much less frequent and hard to prove. Thus the members of the public whose privacy has been breached by ACS:Law will for the most part be left without any judicial remedy of their own.
Individuals, whether they have suffered distress or have simply had their rights abused, are left without a privately accessible remedy. Yet such a remedy is required by Article 22 of the Directive: “Member States shall provide for the right of every person to a judicial remedy for any breach of the rights guaranteed him by the national law applicable to the processing in question”.[xxiii]
Data breaches can endanger people’s identity or financial status, and in such cases there is absolutely no case for failing to notify individuals.
The current ICO scheme, in which the ICO advises whether a leak is serious enough to merit notification, is entirely flawed. Only the data subject can know whether a particular leak is significant; no third party is able to make that judgement for them.
Former Information Commissioner Richard Thomas recognised the risks from centrally-stored data:
We have already seen examples where data loss or abuse has led to fake credit card transactions, witnesses at risk of physical harm or intimidation, offenders at risk from vigilantes, fake applications for tax credits, falsified Land Registry records and mortgage fraud. Addresses of service personnel, police and prison officers and battered women have also been exposed. Sometimes lives may be at risk[xxiv]
Individuals should have an absolute right to protect themselves.
However, data breach notifications have a second role: as a deterrent and an incentive to make sure systems are properly secured, since public acknowledgement would cause reputational damage.
This principle has already been acknowledged in the recent Telecoms Package for the telecommunications sector.[xxv] There is every reason to extend this principle further.
The cost to a competent organisation that never leaks data is zero, unless they over-invest in protection. All the costs fall on the incompetent and disorganised.
Overall costs for mandatory notification could be low. Communications costs can be very low. They should reduce across industry over time as practices improve, and the savings from improved security should outweigh any initial costs.
The ICO does not have adequate powers to carry out its duties. We detail specific concerns below.
Of particular concern is the lack of a specified right to exercise investigatory powers without prior notice. We reiterate the Commission’s concerns on this point.
We also draw attention to the ‘Surveillance: Citizens and the State’ report of the House of Lords Select Committee on the Constitution.[xxvii] Several of their recommendations related to giving the ICO a greater role in ensuring government and private sector compliance with privacy and data protection concerns. Their recommendations include that:
454. The Government should consider expanding the remit of the Information Commissioner to include responsibility for monitoring the effects of government and private surveillance practices on the rights of the public at large under Article 8 of the European Convention on Human Rights.
455. We regret that the Government have often failed to consult the Information Commissioner at an early stage of policy development with privacy implications. We recommend that the Government instruct departments to consult the Information Commissioner at the earliest stages of policy development and that the Government should set out in the explanatory notes to bills how and when they consulted the Information Commissioner, and with what result. (paragraph 231)
456. We welcome the Government’s decision to provide a statutory basis for the Information Commissioner to carry out inspections without consent of public sector organisations which process personal information systems, but regret the decision not to legislate for a comparable power with respect to private sector organisations.
460. We recommend that the Government amend the provisions of the Data Protection Act 1998 so as to make it mandatory for government departments to produce an independent, publicly available, full and detailed Privacy Impact Assessment (PIA) prior to the adoption of any new surveillance, data collection or processing scheme, including new arrangements for data sharing. The Information Commissioner, or other independent authorities, should have a role in scrutinising and approving these PIAs. We also recommend that the Government—after public consultation—consider introducing a similar system for the private sector. (paragraph 307)
461. We believe that the Information Commissioner should have a greater role in advising Parliament in respect of surveillance and data issues. We therefore recommend that the Government should be required, by statute, to consult the Information Commissioner on bills or statutory instruments which involve surveillance or data processing powers. The Information Commissioner could then report any matters of concern to Parliament. (paragraph 370)
462. We recommend that the Government, in conjunction with the Information Commissioner, undertake a review of the law governing citizens’ consent to use of their personal data. (paragraph 397)
463. We share the Information Commissioner’s disappointment that the Government have not made a specific commitment to working with the Information Commissioner’s Office to raise public awareness. We recommend that the Government reconsider this matter and commit to a plan of action agreed with the Information Commissioner.
The first pecuniary enforcement envisaged is in the case of the ACS:Law data breach. While this case, involving the leak of data concerning copyright infringement allegations, seems very clear-cut to us, we have yet to see whether the ICO does manage to fine them. They have indicated that they wish to, at least.
In the case of Google Street View, many national data protection authorities initiated investigations into Google’s collection of Wi-Fi data,[xxviii] but this did not take place in the UK.[xxix] In fact, Privacy International had to intervene to stop the destruction of evidence, which had been advised by the ICO in the UK and elsewhere.[xxx]
This perhaps indicates a lack of will as well as of powers, at times. The ICO has a long history of being toothless and taking a light-touch approach. If the ICO is to use its powers, it must take a more direct and principles-based approach, using those powers to demonstrate what is right and wrong.
The EU principles have been adopted elsewhere, including by the OECD and by countries throughout Asia and Africa, but have had less impact in the USA. They have acted, in our view, as a global force for higher protection of privacy rights. As the EU’s major independent study of EU-wide data protection legislation concludes:
“The basic European principles should therefore be re-affirmed and if anything strengthened: and efforts to obtain their adoption world-wide should continue”.[xxxi]
The report’s authors state that they have reservations about “the application and enforcement” of these principles, especially regarding more intrusive computing and personal data collection. They highlight new technological developments including profiling, the internationalisation of such processing, and user-generated content, and the effects these have on the Directive in practice, but they believe the underlying principles to be sound. We agree.
The UK implementation of the Directive has distorted the “freely given specific and informed indication” definition of consent,[xxxii] instead accepting ‘implied’ consent as sufficient. We strongly disagree with this approach.
UK, the Commission and Phorm
The lack of proper definitions in the UK was highlighted to the EU by the actions of citizens concerned by Phorm’s technology. The Open Rights Group was among the groups that wrote to the Commission and highlighted the problems that we were encountering.
As a result, the Commission drew the UK government’s attention to their incorrect definition of ‘consent’.[xxxiii] The UK is now being taken to court, in part over our incorrect definition of consent.[xxxiv]
In the Internet age, it is more important than ever to make sure that consent is properly given. However, in the case of behavioural advertising, consent is disregarded. Cookie-based “online behavioural advertising” is but one example. Consent is not sought; instead users are expected to repeatedly opt out and assert their lack of consent. The UK’s IAB guidelines state:[xxxv]
2.1 Each Member shall provide an approved means for consumers to decline OBA from that Member.
2.2 Each Member shall provide information on how to decline OBA with respect to that Member and ensure that this information is prominently displayed and easily accessible on its website.
2.3 Each Member shall provide the IAB with an up-to-date URL to this information so that the IAB can link to this information from its information portal.
Guidance Note 3.
Members shall: … obtain informed consent for the use of personally identifiable information, where required by law;
In our view, none of this is compliant with the idea of freely given, specific and informed consent. The basis for consent is sometimes held to be “browser settings” which allow cookies. But the reality is that most users neither understand settings, nor cookies; nor are cookies primarily used to facilitate consent for behavioural advertising.
Furthermore, the IAB’s guidelines may well be taking an incorrectly narrow view of ‘personally identifiable information’, which may in fact include cookie IDs or IP addresses.
Website terms and conditions
On websites, consent often consists of nothing more than leaving a tick box checked. Privacy statements are complex and go unread. The idea that consent has really been given is therefore a considerable stretch, especially as the rights individuals give up can be very extensive. Click-wrap terms and conditions may in fact frequently be unenforceable in practice.
Online copyright surveillance
If IP addresses are personal data, then consent should be sought before they are harvested. This poses particular problems for online copyright surveillance practices, as has been highlighted by Peter Hustinx[xxxvi] and the Article 29 Working Group[xxxvii]. These are particularly troubling for the implementation of the Digital Economy Act.
Clearly, prior, informed consent is not being sought before IP addresses are harvested by these private surveillance systems, whether the IP addresses are public or not.
While justifications for surveillance may be made, such as the need for law enforcement, these would normally imply that such surveillance should be a law enforcement activity. It is however a private activity.
Finally, such surveillance would need to show that it was proportionate (that real harm could be demonstrated and was being tackled) and necessary (that there is no other, less intrusive alternative). Given that solutions such as market reform, levies and mandatory licensing exist, these should be placed ahead of generalized surveillance that can touch upon anyone who owns a node in the network, whether they infringe copyright or not.
Consent of minors
There are questions about the consent of minors, as well. It is difficult to see how a child can be expected to give informed consent. This is a problem that needs to be looked at closely in this review.
We also draw attention to default settings in applications such as Facebook. In our view, these should be protective of privacy, rather than open. This was the view of the Article 29 Working Group:[xxxviii]
An important element of the privacy settings is the access to personal data published in a profile. If there are no restrictions to such access, third parties may link all kinds of intimate details regarding the users, either as a member of the SNS or via search engines. However, only a minority of users signing up to a service will make any changes to default settings. Therefore, SNS should offer privacy-friendly default settings which allow users to freely and specifically consent to any access to their profile's content that is beyond their self-selected contacts in order to reduce the risk of unlawful processing by third parties.
The clear implication is that social networking sites need to think about the context in which they operate. If users are submitting sensitive information, it should be kept under their control. Yet Facebook recently changed its defaults to publish status information, continuing a long trend towards increasingly open default settings.[xxxix] A site, “Your open book”, was launched to demonstrate the disastrous consequences, offering live searches of open status updates.[xl]
User control of data
A longer term question should be how we reassert control over our own data. This goes to the heart of the principle of consent. Consent is limited if we are in practice unable to control what we agree to.
Currently, services are promoting a very centralized notion of data usage. The same information is submitted to multiple websites, and users lose practical control of their information.
However, technologies are emerging that reassert a strong notion of control and consent. A very basic example is OpenID, where a single service enables log-ins to any service supporting the protocol[xli]. Thus users do not have to submit passwords and email addresses to more than one service. The protocol is widely supported by many well-known services.
Another example is the idea of “Vendor Relationship Management”.[xlii] VRM tools would allow individuals to store more complicated data sets that would help them authenticate themselves and disclose specific information to vendors in a way that they control. While this is privacy friendly, it also potentially makes economic sense for individuals. They can reassert control over their data, and reclaim more of the financial value from it.
The current review of the Directive should identify incentives for privacy-enhancing technologies, and help these approaches by strengthening the types of rights that they would enhance.
Question 33. Should the definition of consent be limited to that in the EU Data Protection Directive, i.e. freely given, specific and informed?
We strongly agree that it should, and find it remarkable that a UK consultation should need to ask this question.
The definition of consent should be the same across EU countries, so that citizens are afforded the same protection in all member states, and it should protect the data subject’s fundamental right to data protection.
Question 35. Do you have evidence to suggest that data subjects do or do not read fair processing notices?
Question 37. Do you have evidence to suggest that the exemptions are not sufficient and need to be amended or improved?
Clarification of the notion of ‘purely personal or household activity’ is needed in the context of user-generated content and social networking.
[v] ICO Annual Track 2009, 4.1 and 4.2
[vi] ‘Data protection: Commission requests UK to strengthen powers of national data protection authority, as required by EU law’ 24/06/2010 http://europa.eu/rapid/pressReleasesAction.do?reference=IP/10/811
[viii] ‘Data protection: Commission requests UK to strengthen powers of national data protection authority, as required by EU law’ 24/06/2010 http://europa.eu/rapid/pressReleasesAction.do?reference=IP/10/811
[x] Recital 26 95/46/EC
[xi] Article 29 Working Group Opinion 4/2007 on personal data, Jun. 20, 2007, p. 17 http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2007/wp136_en.pdf
[xiv] The structure and strength of social ties in a nationwide communication network
[xv] BPI Limited response to BERR Consultation: ‘Legislative Options to Address Illicit Peer-to-Peer (P2P) Filesharing’, The British Recorded Music Industry, October 2008, pg.10
[xvi] BPI Limited response to BERR Consultation: ‘Legislative Options to Address Illicit Peer-to-Peer (P2P) Filesharing’, The British Recorded Music Industry, October 2008, pg.11 http://www.bis.gov.uk/files/file49709.zip
[xvii] Legislative Scrutiny: Digital Economy Bill – Fifth report of session 2009-10, House of Lords & House of Commons, Joint Committee on Human Rights, January 2010, pg.41
[xviii] Opinion of the European Data Protection Supervisor on the current negotiations by the European Union of an Anti-Counterfeiting Trade Agreement (ACTA), 2010/C 147/01, 22 February 2010, paragraph 25 to 27.
[xx] 'Comparative Study on Different Approaches to New Privacy Challenges in Particular in the Light of Technological Developments' submitted by LRDP Kantor and Center for Public Reform
[xxxi] ‘Comparative Study on Different Approaches to New Privacy Challenges in Particular in the Light of Technological Developments’ submitted by LRDP Kantor and Center for Public Reform http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1636706
[xxxii] Article 2(h) Directive 95/46/EC http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML
[xxxv] See the IAB’s guidelines: