Changing the tone of the conversation about Data

From fear to understanding

Andrea Gonzalez Paz, Data Protection Lawyer, explores the benefits of data sharing for the development of healthcare technology.

By Andrea Gonzalez Paz

The 21st century comes with many challenges, and technology is a strong contender in the race to address them. Technology needs data: your data, our data. Have you ever wondered where the information you submit online goes? Who is using it, and for what purpose? Has this caused a feeling of slight uneasiness, distrust, or even fear? For most of us, the answer to these questions is ‘yes’. Data needs demystifying.

TECHNOLOGY & HEALTHCARE

By 2020, the global population over the age of 60 will reach one billion, equivalent to 12.3% of the total world population; by 2050, the estimate is two billion, or 22%. The number of patients with chronic health conditions is increasing steeply, and so is the demand for long-term treatments. Technologies are trying to keep up and to support a healthcare system fit for the health challenges of the 21st century. To do so, they need large volumes of data. Used in the right way, digital information can improve precision diagnostics and facilitate personalised treatment – it can bring about the social improvements our generation needs.

Artificial Intelligence technologies relying on data can assist with simple administrative tasks, such as organising, structuring and optimising patient records. This could reduce inefficiencies in accessing the right doctors or clinics at the right time. Imagine what it would be like if every hospital in Europe could, in seconds, access a patient’s entire medical history. The treatment they could provide in an emergency would be more personalised, more accurate and more likely to save a person’s life.

Data-based technology can also significantly aid medical drug research: Berg, for example, is a company that relies on data sets to compare images of cancerous and non-cancerous human cells. Through this analysis it managed to isolate several enzymes, which now form the basis of new medication. Certain methods for improving diagnostic precision also rely on access to data from former and current patients.

So how exactly do entities (mainly technology companies) gain access to this data? Most of the time, through individuals’ consent: individuals must give permission for certain types of personal data to be used. The General Data Protection Regulation (GDPR) has come to standardise and better enforce this procedure. As a lawyer, I recognise the importance of empowering individuals with control and autonomy over their personal data. Yet obtaining consent is a challenge that often hinders the social benefits of ethical data sharing. Where there is no established relationship between an organisation and the individual, trust is absent, which hinders consent. But even where a relationship exists, users tend to say ‘no’ automatically to sharing their health and medical data, out of fear. The problem seems to lie in the lack of meaningful conversation about why organisations require such data, how they intend to use it, and to whose benefit.

Additionally, to digital users, consent usually takes the form of a box ticked mindlessly after swiftly scrolling through another long list of terms and conditions in small print. This form of consent, however, inspires less and less trust. The recent data scandals arising from the interweaving of social networks and politics have made this a particularly sensitive and controversial topic. Private and public organisations need to explore exactly what stops consumers and individuals from consenting to the use of their data: where does this fear and worry come from? What tools or methods can organisations employ to guarantee fair and ethical uses of data? How can they transform individuals’ fear into trust? Work must be done to improve explanations to individuals as to why they should give consent to certain uses of their data, and to ensure that very data is protected and used fairly.

“Work must be done to improve explanations to individuals as to why they should give consent to certain uses of their data, and to ensure that very data is protected and used fairly.”

ETHICS & BOUNDARIES

The 2015 Eurobarometer survey on data protection concluded that, on average, 81% of individuals do not feel in control of the data they provide online. Whilst most citizens are reluctant to share their sensitive personal information with technology companies, they are comfortable sharing it with physicians, health insurers, and research institutions. Big, faceless organisations have a reputation for using data for profit without defining their boundaries, whereas people evidently working for the common good are assumed to handle information ethically.

When big companies use individuals’ data without defining ethical boundaries, i.e., without detailing specifically which lines they can and cannot cross, they perpetuate a feeling of distrust and compromise their future acquisition of data. Defining these boundaries is more than a mere exercise in formalistic compliance with data protection laws and rules, though that in itself is crucial. It is an ongoing effort to build meaningful and sustainable relationships with individuals.

What exactly qualifies as an unethical use of data? Key concerns include the possibility of re-identifying individuals through data-mining, data-linking, data-merging, and the re-use of large data sets. Another serious worry is that identifying particular groups of individuals (by religion, ethnicity, gender, sexual orientation, or age), even when each individual’s data is anonymised, can lead to grave ethical problems such as group discrimination and group-targeted forms of violence. Entities have to open up these conversations, and guarantee that ethical problems like these will be proactively addressed.
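To see why data-linking is such a concern, consider a minimal illustrative sketch (all datasets and names below are invented for the example): records stripped of names can still be re-identified by joining them to a public dataset on shared quasi-identifiers such as postcode and year of birth.

```python
# Toy illustration of re-identification by data-linking.
# All data here is fabricated; no real dataset works exactly like this,
# but the joining logic is the essence of a linkage attack.

# "Anonymised" health records: names removed, quasi-identifiers kept.
health_records = [
    {"postcode": "EH1 2AB", "birth_year": 1954, "diagnosis": "diabetes"},
    {"postcode": "G12 8QQ", "birth_year": 1987, "diagnosis": "asthma"},
]

# A separate public dataset (e.g. an electoral roll) with the same fields.
public_register = [
    {"name": "A. Smith", "postcode": "EH1 2AB", "birth_year": 1954},
    {"name": "B. Jones", "postcode": "G12 8QQ", "birth_year": 1987},
]

def link(records, register):
    """Join the two datasets on their shared quasi-identifiers."""
    matches = []
    for r in records:
        for p in register:
            if (r["postcode"], r["birth_year"]) == (p["postcode"], p["birth_year"]):
                matches.append({"name": p["name"], "diagnosis": r["diagnosis"]})
    return matches

# Wherever a (postcode, birth_year) pair is unique, the join attaches
# a name to a supposedly anonymous diagnosis.
reidentified = link(health_records, public_register)
```

Removing names alone is therefore not enough: the combination of seemingly harmless attributes can single an individual out, which is why ethical boundaries must cover linking and merging, not just direct identifiers.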

“81% of individuals do not feel in control of the data they provide online.”

Certain companies avoid the possibility of unethical treatment of information by engaging in human-centric uses of data. MIDATA – a Swiss digital platform where users can securely store, manage and control access to their personal data – contributes to medical research and clinical trials by providing access to sets of data across cooperatives. Members feel informed and empowered, are active participants in decision-making processes, and may withdraw their data at any point. Another example is the UK-based global innovation foundation Nesta, which supports initiatives like the ALT personal device – a place for users to store and monitor their own data. Both organisations are excellent examples of how vision and leadership translate into effective data practices that allow informed, ethical uses of personal information while assuring individuals that their data will be handled responsibly.

However, all organisations need to mature their data practices to protect human dignity: even when users do not have direct control over what happens to their personal data, they should be able to trust the entities who do to use it fairly. In order for trust to be fostered, there needs to be transparency.

TRANSPARENCY & UNDERSTANDING

It is not enough for organisations to improve their privacy policies and to ensure data is used within ethical boundaries. They must also let individuals know exactly how they are doing it. Transparency is necessary. For organisations, starting meaningful and clear conversations about data with their users, ensuring they feel informed, powerful, and included in how their personal information is being put to use, is of the utmost priority.

It is no news that data and fear go hand in hand in our society. We hear about data breaches and the dangers of sharing information more than we hear about the undeniable benefits that responsible data sharing can bring to the world: how it can advance technology, research, and healthcare, to name but a few. The tone of the conversation around data needs to change from a negative one, surrounded by fear, to one that considers the positive impact data can bring.

However, companies and public institutions are still figuring out exactly how to achieve this. They need to better explain their handling of data to customers – the technical procedures, pros and cons, and purposes of such handling. They also need to consider how to positively engage with users – be it through their political representatives, or through pairing up with hospitals or GPs to advocate for the social benefits of data.

“For organisations, starting meaningful and clear conversations about data with their users, ensuring they feel informed, powerful, and included in how their personal information is being put to use, is of the utmost priority.”

CHANGING THE GAME

How can transparency be achieved? How can we make sure that individual data providers – all of us – know what we are getting into when we share our information? How can we ensure a better understanding of data? These are all pivotal questions surrounding the data debate.

The Eurobarometer survey also concluded that data subjects in the EU are largely unaware of the rights they possess, as well as of the privacy practices and policies of the entities that process their personal data. Individuals feel there is an inequality between the powerful entities who hold this knowledge and themselves as powerless data subjects. This, however, is to some extent illusory: the public could easily instigate major changes in the privacy field. Unfortunately, this ability seems weakened by a lack of privacy awareness and education.

There needs to be a revolution in data education, starting in schools and extending all the way up to big companies and organisations. Preparing future generations to live in this digitalised world requires a change in curricula to include data education. For the current adult generations, now faced with a world they did not imagine when they were growing up, understanding must come from different sources. Governments need to make citizens aware of their rights, and organisations need to come up with innovative ways to communicate and improve understanding; explaining everything in small print is no longer sufficient.

“Organisations need to get individuals emotionally invested in the conversation around data, make them aware of the ways in which a fair use of their personal information can largely benefit society.”

Although technical, fact-based explanations of the workings and uses of data are necessary, they may not be accessible to everyone. Organisations also need to get individuals emotionally invested in the conversation around data, make them aware of the ways in which a fair use of their personal information can largely benefit society. Private and public institutions should channel their efforts to expose, in meaningful and creative ways, true stories that reflect this positive impact, resonate with people’s humanity, and call for empathy.

As an individual, I would like to see myself as a beneficiary or contributor, not as a “usage object”. For this to happen, the conversation around data needs to change. Only then can we fairly maximise the positive impact data can have on individuals and on society as a whole. As a lawyer, I work to make this happen.

Andrea Gonzalez Paz is a Data Protection Lawyer for Philips. Her views and opinions are her own.
