Apr 06, 2021
Access to data is a core element of efficient markets in the digital age. Digital platforms have unrivalled access to data, and regulators are increasingly considering requiring them to share that data to level the playing field. This blog examines how data sharing might work in practice, and the role of PETs in sharing personal data safely.
Broadly speaking, competition law aims to promote well-functioning markets by banning two main types of activity: anti-competitive agreements and abuse of a dominant market position. Banning these activities enables businesses to compete effectively, which in turn allows them to offer the best possible product or service at the best possible price. Competition regulators focus on activity that harms consumers in a specific market.
Applying traditional competition law ideas to digital platforms can be challenging, partly because digital platforms are ‘multi-sided.’ In other words, they operate in different, but related, markets. Amazon is a good example. Amazon’s Marketplace connects buyers and sellers. But Amazon is also a retailer, as its products compete in the Marketplace it controls. Amazon also offers services to sellers (for example, warehousing, logistics, and payment processing) and collects data about buyers using the Marketplace and other Amazon products and services.
Multi-sided digital platforms have advantages. For consumers, interacting seamlessly with a large platform offering many products and services may be more attractive than interacting with a large number of small firms. Consumers may also benefit from network effects, as platforms become more useful as more people use them.
As always, there are also disadvantages. The Furman Review concludes that concentration in digital markets can generate substantial costs, by raising prices, reducing choice or impacting quality. Even where services are ‘free’ in financial terms, the Review notes that “it might have been that with more competition consumers would have given up less in terms of privacy.”
Digital platforms use the data they collect to measure and improve performance, provide targeted recommendations, support iterative innovation and more. A 2019 European Commission report co-authored by Jacques Crémer, Yves-Alexandre de Montjoye (one of Privitar’s academic advisers) and Heike Schweitzer argues that data is “a crucial input to many online services, production processes, and logistics. Therefore the ability to use data to develop new, innovative services and products is a competitive parameter whose relevance will continue to increase.”
As we saw in the example above, multi-sided digital platforms are particularly well placed to collect and use data because they provide a range of services and benefit from network effects. This could allow a platform to use non-public data (for example, data on sales by third-party sellers using the platform) for its own benefit, such as steering its own sales strategy toward more profitable products. Using data in this way could breach competition law.
Competition regulators have, unsurprisingly, taken a keen interest. We’ve seen several cases against digital platforms accused of using their privileged access to data, or ability to collect it, to distort markets. For example, the European Commission launched antitrust investigations into Amazon’s use of Marketplace data in 2019 and 2020, and the US FTC is also reportedly collecting information from third-party sellers as part of its own investigation into Marketplace.
Google has also been in the spotlight. For example, the UK’s CMA undertook an online platforms market study and has launched an investigation into Google’s approach to third party cookies. There have been similar reviews of digital advertising in Australia. The US Department of Justice brought a case against Google in 2020 and European regulators fined the company in 2019 for breaching competition rules around digital advertising.
The classic competition enforcement actions described above hold companies to account when they are found to be breaking the rules. However, recent interventions suggest that lawmakers want to proactively boost competition by levelling the data playing field. If we accept that data fuels innovation, which produces economic benefits, it makes sense for the legal regime to facilitate, incentivize or maybe even compel organizations to share data, including with potential competitors. That could mean platforms sharing more data with sellers operating on them, so that those sellers can also benefit from insights currently only available to the platform itself.
We’re following two initiatives closely. First, the European Commission’s proposed Digital Markets Act (DMA), which prohibits platforms from using non-public data to compete with sellers on their platform and aims to level the playing field by requiring platforms to share data with the buyers and sellers they connect. Second, the data openness recommendation in the UK’s Furman Review, which frames access to data held by digital platforms as a pro-competition measure.
Both the DMA and the Furman Review stress the need for data sharing to respect privacy and to comply with GDPR. Both suggest sharing aggregate data as an option for achieving those aims. However, aggregates alone are vulnerable to well-known privacy attacks: a differencing attack, for instance, compares two overlapping aggregates to isolate the contribution of a single individual.
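To make the risk concrete, here is a minimal sketch of a differencing attack. The seller names and figures are entirely hypothetical; the point is that two individually "safe-looking" aggregate queries can be subtracted to recover one seller's exact data.

```python
# A differencing attack on two seemingly safe aggregate queries.
# All records and names below are hypothetical, for illustration only.

def total_units(records):
    """An aggregate query: total units sold across the given sellers."""
    return sum(r["units"] for r in records)

records = [
    {"seller": "alice", "units": 120},
    {"seller": "bob", "units": 340},
    {"seller": "carol", "units": 95},
]

# Query 1: aggregate over everyone.
all_sellers = total_units(records)

# Query 2: aggregate over everyone except one seller.
without_alice = total_units([r for r in records if r["seller"] != "alice"])

# Subtracting the two aggregates recovers alice's exact value.
alice_units = all_sellers - without_alice
print(alice_units)  # prints 120
```

No single query here reveals individual data; it is the combination that leaks, which is why releasing raw aggregates at scale is hard to do safely.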
We recommend using differentially private aggregates to overcome these privacy risks. Differential privacy guarantees a limit on how much information specific to any individual an analysis can reveal. It is a property of an algorithm, and there are differentially private algorithms designed specifically to generate aggregate statistics.
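As a sketch of the idea (not Privitar's product implementation), a count query can be released with ε-differential privacy using the Laplace mechanism: add noise calibrated to the query's sensitivity. The records here are hypothetical.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Release a count with epsilon-differential privacy via the
    Laplace mechanism. A count query has sensitivity 1: adding or
    removing one individual changes the result by at most 1, so
    noise drawn from Laplace(0, 1/epsilon) suffices."""
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) by inverse-CDF from a uniform draw.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical marketplace sales records, for illustration only.
sales = [{"product": f"item-{i}", "units": i} for i in range(50)]
noisy = dp_count(sales, lambda r: r["units"] > 10, epsilon=1.0)
```

Smaller ε means stronger privacy but noisier answers. In practice the ε budget must be shared across every statistic released, and sums or averages additionally need their sensitivity bounded (for example by clamping values) before the noise can be calibrated.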
Sharing differentially private statistics would enable platforms to open up the data on which their insights are based, while protecting individual privacy. This could support competition, as other businesses use the insights to develop their own products and services.