By Marcus Grazette, Europe Policy Lead at Privitar

Meetings, socialising, schooling, exercise classes, shopping. Adjusting to social distancing means shifting everyday activities online, where they are mediated by organisations operating digital platforms. These organisations will collect increasing volumes of data, much of it highly sensitive. This increases the risk of privacy harms for individuals, with corresponding risks of reputational damage and loss of trust for the organisations collecting that data. Building in privacy by design will help to protect individuals and maintain trust.

Covid-19 is pushing us to live digitally
The shift online, prompted by social distancing, means that organisations are collecting more data and using it in new ways. We can group those organisations into three broad categories. First, well-established digital platforms, including social media, whose customers are using their services more intensively. Second, organisations with some pre-existing digital footprint scaling up to meet unprecedented demand. Supermarkets with online delivery services fall into this category. Finally, organisations launching entirely new online services.

For all three, building trust is paramount. But that may be more difficult for organisations without well-established practices for handling customer data. Data is powerful. Used well, it can provide the insights that fuel innovative organisations. But, in some cases, data use can feel ‘creepy’ for consumers, which could lead to a loss of trust and customer churn.

This is often the case when data is used in ways that consumers do not expect, and worse when their information is exposed through breaches or inappropriate sharing. Predictability is crucial. NIST, the US National Institute of Standards and Technology, describes predictability as “core to building trust”.[1] Organisations have to make choices about how they use data, and what controls they apply.

Interacting online allows platforms to collect more data and increases privacy risk
Shoshana Zuboff, author and Harvard professor, argues that we live in an age of surveillance capitalism. The surveillance element of her argument comes from the fact that some of the most valuable data is collected by observing how a user interacts with a platform. She calls this our ‘digital exhaust’. The times of day a user is active, the devices they use or the options they consider before adding an item to their shopping basket all provide rich insights. Zuboff points out that companies use information about us to build ever more detailed profiles and to nudge us towards actions that benefit the company, such as clicking on advertising. Collecting and using data carries two related risks.

First, the risk of privacy harm to the individual. In simple terms, this could be where data is used in a way that the consumer did not expect. It contributes to the sense of ‘creepiness’ and can lead to a loss of trust in the organisation. This can be particularly challenging where consumers feel that they have little choice but to use an online option (e.g. to comply with social distancing measures in place to fight Covid-19). In addition, the risk increases with scale, as more people use online services and as more aspects of a consumer’s life shift online and become linkable data points in their digital profile.

Similarly, organisations responding to demand from users outside of their traditional target market will need to tread carefully. For example, services usually found in corporate settings now host all types of digital interactions, from birthday parties to Parliamentary debates. Are existing privacy policies and protections appropriate for new types of users? The change in context matters and can have significant implications for brand and reputation. This is particularly relevant now as consumers explore their options for shifting online.

Second, the risk to the organisation, including a loss of trust and reputational damage. Techlash, the FT’s ‘year in a word’ in 2018, partly describes public frustration with organisations using data in ways that feel creepy or manipulative.[2] But techlash is not an automatic consequence of data use. The Information Technology and Innovation Foundation (ITIF), a US think tank, draws a parallel with the introduction of the automobile. First, there was excitement at new options for transport, then concern over issues like safety or the environment. In the automobile example, policy responses, including regulation, have helped to address issues of concern.[3] In the data protection space, regulation also plays a key role in protecting privacy and building trust between individuals and organisations.

Platforms can build trust by focussing on privacy
As organisations respond to the shift to digital in unprecedented times, they should focus on building and maintaining customer trust. A clear, comprehensive approach to data privacy is key. Some of the new data being used will be highly sensitive. For example, the government has allowed online supermarkets some access to data on vulnerable people. Using that data to prioritise deliveries to those who need it most is a useful outcome. Even when under pressure to move quickly, organisations should ensure that the core principles of data protection, including transparency and purpose limitation, continue to govern their data use. Building trust, including by protecting privacy, will help to ensure that organisations respond successfully and emerge stronger.

[1] NIST, NISTIR 8062: Privacy Engineering and Risk Management, January 2017
[2] FT, Year in a Word: Techlash, December 2018
[3] ITIF, A Policymaker’s Guide to the “Techlash”—What It Is and Why It’s a Threat to Growth and Progress, October 2019