Helping IoT developers to assess ethics, privacy, and social impact

The GDPR (General Data Protection Regulation) introduces a mandatory Data Protection Impact Assessment for processing that is likely to pose a high risk to individuals. The assessment helps organisations to identify and minimise the data protection risks a project poses to individuals. But collecting and using personal data has consequences beyond privacy and data protection. We should also be thinking about the ethical and societal outcomes of what we do with data. Open Rights Group (ORG) is exploring these issues as part of the VIRT-EU consortium, alongside the London School of Economics, Uppsala University, the Polytechnic University of Turin, the IT University of Copenhagen, and the Copenhagen Institute of Interaction Design.

The project is researching Internet of Things (IoT) development and development culture, and is creating tools and frameworks to help foster ethical thinking among IoT developers. One of these tools will be the Privacy, Ethical and Social Impact Assessment (PESIA), which augments and interacts with the GDPR's Data Protection Impact Assessment. The PESIA is being developed predominantly by Alessandro Mantelero at the Polytechnic University of Turin, with the help of ORG. It will be a voluntary self-assessment tool to help organisations that collect and process personal data to assess the wide variety of risks and repercussions of how they use data.

While Privacy Impact Assessments and Data Protection Impact Assessments look primarily at issues around privacy, the PESIA extends beyond them to cover the ethical dimensions and societal consequences of using data. Privacy- and data protection-focused assessments seldom address issues such as the possibility of discrimination against individuals and groups resulting from decisions made using big data. This is despite Recital 75 of the GDPR, which highlights “social disadvantage” and “discrimination” among the potential consequences of data processing. These considerations will be integrated into the PESIA.

The PESIA emphasises a public, participatory approach, in which the organisation behind a product or service engages its users and the people it affects in identifying the relevant privacy, ethical, and social impacts. This contrasts with Privacy Impact Assessments and Data Protection Impact Assessments, which are mostly carried out internally and are not necessarily easily accessible to users and customers.

We hope that the PESIA will help developers to integrate ethical and social values into their work and into the devices and services they create. Previous attempts to create a Social Impact Assessment have highlighted the significant investment of time and other resources that assessing the impact of products and services can require. For this reason, the PESIA aims to be easy for people within the organisation itself to carry out, at least in the early stages of the analysis. Expert involvement may sometimes be necessary to assess the social consequences of data processing.

The PESIA will not be a box-ticking exercise in which organisations assess privacy, ethical, and social impact issues without fully engaging with the consequences of their work. Instead, our goal is that the tool will help IoT developers to gain a clearer understanding of how their use of data affects society and what the ethical implications of their work are. As a voluntary self-assessment, the PESIA leaves developers to decide how to address the issues raised in the process of thinking through these implications.

Because ethical and social values can vary significantly between countries and cultures, one of the most challenging aspects of developing the PESIA is defining the values that the assessment uses. This is a significant shift from conventional data protection assessments, where technological considerations such as data security generalise far more easily across countries and cultures.

The VIRT-EU project is addressing this challenge by separating the PESIA into layers. The foundational layer presents the common values of international human rights charters and other law, including the EU Charter of Fundamental Rights. It also draws on decisions regarding data processing made by data protection authorities, the European Data Protection Board (previously the Article 29 Data Protection Working Party), the European Court of Justice, and the European Court of Human Rights. This layer should secure common ground across different contexts. The PESIA will provide short cases and examples to help developers, who usually do not have a legal background, to understand these questions.

The second layer deals with the diversity of ethical and social values across Europe. The project is undertaking a comparative analysis of how different national data protection authorities have decided similar cases, to identify the values that may have informed the differing outcomes. Specific values will not necessarily be obvious in those decisions, and we do not aim to describe or define national values.

The third and most specific layer concerns the values enacted by IoT developer communities. Our colleagues at the London School of Economics and the IT University of Copenhagen are learning from IoT developers through active participation in those communities, hoping to understand developers' work practices and the ethical and social values they enact in the course of technology development.

We will be developing the PESIA over the next year and then trialling it with IoT developers. We hope these trial sessions will provide feedback on the content and process of the assessment. If you are a UK-based IoT developer and are interested in participating in these trial sessions, please contact ORG researchers at ed@openrightsgroup.org and we will get in touch closer to the time.