Default settings, privacy-enhancing tech, and the difference between transparency and legalese: Watch Privitar’s Kieron Guinamard talk about the key issues around Facebook/Cambridge Analytica.
[0:10] Why are Facebook and Cambridge Analytica on the front pages?
Facebook allowed third-party application developers to access a huge amount of information on your profile. This included your likes, your dislikes, your friends list, and anything you shared on your timeline. It also allowed developers to access this information when your friends used their application. A little-known part of Facebook's settings, "Apps others use", controlled this access. Effectively, Facebook users were, by default, delegating consent to their friends to share their information.
Numerous quizzes and name tests ("Which Pokémon are you?") were developed with the sole aim of harvesting as much personal information as possible. Cambridge Analytica, partnering with academics, used this information to develop detailed psychological profiles of up to 100 million users, mostly without their explicit consent.
[1:10] How could this situation have been avoided?
All companies need to be very careful with how they treat their customer data. It's no longer sufficient to bury your terms and conditions in difficult-to-understand legalese, and privacy controls need to be easy to find and easy to use. Furthermore, if you change your privacy settings, this needs to be communicated to users very clearly. And if a user's friends or contacts on the platform are also sharing their data, that needs to be communicated back to the original user.
Ideally, Facebook would allow as little information as possible to leave its systems: sharing only privacy-preserving insights with third parties, and internally using privacy engineering techniques to restrict the amount of information that an honest-but-curious employee might stumble across.
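One well-known way to share a "privacy-preserving insight" rather than raw data is to release only noisy aggregates, in the style of differential privacy. The sketch below is illustrative, not a description of anything Facebook actually deployed: it adds Laplace noise (generated as the difference of two exponential draws, which is a standard Laplace sampler) to a count before it leaves the system, and the `epsilon` parameter is an assumed privacy budget.

```python
import random

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a count with Laplace(1/epsilon) noise added.

    The difference of two independent exponential draws with mean
    1/epsilon is a Laplace sample centred on zero, so the released
    figure is accurate in aggregate while masking any one individual's
    contribution. Smaller epsilon means more noise and more privacy.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

A third party asking "how many users liked this page?" would then receive `noisy_count(exact_total)` rather than the exact figure, so the presence or absence of any single user cannot be confidently inferred from the answer.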
[2:10] How will it affect the industry moving forwards?
It's important to know that these techniques were not unique to Cambridge Analytica. Scammers use very similar techniques to find vulnerable people and target them with ads for products they don't need, and which are potentially harmful. Facebook itself is engaged in a constant game of whack-a-mole, trying to police its own platform and shut down bad actors.
Going forward, the industry needs to understand that if it wants access to this highly valuable data, that data has to be treated extremely carefully. It should be shared only with reputable third-party companies, the personal information stored has to be minimised, and if disparate datasets need to be joined together for analysis, this should be done in a way that minimises the risk to the individuals in those datasets.
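One common way to join disparate datasets while minimising risk is to replace direct identifiers with keyed pseudonyms before the join, so neither dataset needs to carry raw emails or names. This is a minimal sketch, not a prescribed method: it assumes a hypothetical secret key held by a trusted party, and uses HMAC-SHA256 so the pseudonyms cannot be reversed or recomputed without that key.

```python
import hmac
import hashlib

# Hypothetical secret held by a trusted party; with a plain unkeyed hash,
# anyone could recompute pseudonyms from a list of known emails.
SECRET_KEY = b"example-key-rotate-regularly"

def pseudonym(identifier: str) -> str:
    """Keyed hash (HMAC-SHA256) of a direct identifier.

    Both datasets map the same email to the same opaque token, so they
    can be joined on the token without either side handling the raw value.
    """
    normalised = identifier.strip().lower().encode()
    return hmac.new(SECRET_KEY, normalised, hashlib.sha256).hexdigest()

# Two illustrative datasets keyed by pseudonym instead of raw email.
crm = {pseudonym("alice@example.com"): {"segment": "A"}}
survey = {pseudonym("alice@example.com"): {"score": 7}}

# Join on the shared pseudonyms only; raw identifiers never appear here.
joined = {k: {**crm[k], **survey[k]} for k in crm.keys() & survey.keys()}
```

The joined records carry only the analytic attributes, and re-identifying an individual requires both the secret key and the original identifier, which keeps the linkage step itself from becoming a new source of exposure.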