
Photo: Unfold Stories / Simon Davis

50 Shades of Privacy: Consent and the Fallacy that will Prevent Privacy for All

– Information Age

When it comes to data sharing, consent and choice are two different things. Susan Morrow argues that we need to shift towards systems that support genuine choice

The introduction of the GDPR in May 2018 has led to much discussion about how best to protect user privacy, but Susan Morrow suggests that we still need to resolve the conflict between consent and choice.

After the new regulations came into force, a number of services – including Google and Facebook – asked users to consent to sharing their data. In this regard, both organizations were in keeping with the legislative requirements set out in the GDPR. However, users who chose not to consent to their data being shared were locked out of the platforms. It follows that, in many cases, if users want access to a service, they have to consent; there is no choice.

Morrow explores the consequences of this paradox, arguing that choice is an essential element of data privacy.

"Data privacy must be more than an on/off consent switch. Because data privacy lies at the heart of how we transact via our digital persona, it needs to have considerations that go beyond the purely legal or the purely technical. Privacy needs a social prism."

Morrow goes on to consider the risks of providing increased choice without considering this "social prism." She notes that some privacy experts recommend a system whereby users can sell companies access to their personal information, putting individuals in charge of the commodity that is their own data. However, Morrow observes flaws in this concept:

"I pointed out that, whilst in a perfect world this was fine, in a less than perfect one we would be creating a tiered privacy system: the wealthy having the choice to retain data privacy rights whilst those in need having less choice.

"Choice suddenly becomes less black and white and more 50 shades of grey; along the lines of, 'I made the choice to sell my data because my baby needed food'."

Instead, Morrow argues that social considerations must be central to software design, so that informed and genuine choice is deeply engrained within digital systems.

‘"Software designers must understand the subtle nuances of how privacy can be exploited. This happens at the point where the digital world reaches out and touches the real one. Design decisions have to take legal frameworks, like GDPR, into account. But they should also be taking people into account too."

We are all on a journey through life, and our privacy goes with us. My decisions about who takes a digital piece of me should be mine to make, without coercion
