• On March 16, Facebook announced via a Newsroom blog post that they had suspended Cambridge Analytica and its parent company, Strategic Communication Laboratories (SCL), from Facebook.
• This follows reports that data acquired through an app on Facebook was passed to SCL/Cambridge Analytica and to Chris Wylie of Eunoia Technologies, in violation of Facebook’s platform policies, and that the data still exists. All parties had previously provided Facebook with signed certifications that it had been deleted.
• Facebook has hired independent forensic auditors from Stroz Friedberg to conduct a comprehensive audit of Cambridge Analytica. On the evening of March 19th Stroz Friedberg was on site at Cambridge Analytica’s London office, but at the request of the UK Information Commissioner’s Office, which has announced it is pursuing a warrant to conduct its own on-site investigation, the Stroz Friedberg auditors stood down.
• 2013 – Global Science Research (GSR, Dr. Aleksandr Kogan’s company) develops a quiz app. At the time, Facebook’s platform allowed the app’s c. 270k users to share their own data, plus limited data about their friends (unless those friends had opted out), so the number of people affected was far larger.
• 2014 – Facebook announced changes to the platform designed to give apps access to much less data.
• May 2015 – All apps were migrated to version two of the platform, putting into place the changes announced in 2014.
• Dec 2015 – Facebook learned from journalists about GSR selling data to Cambridge Analytica and took legal action.
• 2018 – Facebook learned from journalists that the data may not have been deleted as certified, and suspended the parties involved.
Facebook’s stance (official comms from Facebook)
• This is a huge violation of trust. We are taking this issue incredibly seriously and we’re beyond disturbed by the reports we’ve seen that data was misused and that our policies were violated.
• Make no mistake – we take the privacy of people’s data and the integrity of our platform very seriously, and we condemn what Kogan, CA/SCL, and Wylie did. They violated our policies; they improperly obtained and shared people’s information.
• Even before learning about Kogan’s activities, we made changes to our product to help prevent this type of access to people’s information. We’re always improving and evolving our products to help ensure people have positive experiences on Facebook.
• We made a product change so that people can’t share detailed information about their friends with developers. That means the extent of information Kogan accessed is no longer possible to access today. But we’re continuing to improve our product and our policies to prevent further abuse.
o Our policies have always restricted developers’ ability to use data they obtain from Facebook.
• In this case, people who installed Kogan’s app chose to share information with the app, which was subject to those policies. We’re deeply upset that Kogan violated our policies by misleading people about how the data they shared with him would be used, and by improperly sharing that information with CA/SCL and with Wylie.
• When we learned that Kogan may have improperly shared data, we required Kogan, Cambridge Analytica, and Wylie to delete data they had obtained through Kogan’s app. All three certified that they had deleted the data, but further reports suggest the data still exists. Considering those reports, we are requiring Kogan, Wylie, and Cambridge Analytica to undertake a full forensic audit and prove that the data was in fact deleted.
Following the news, Mark Zuckerberg appeared on CNN to make a statement, and Facebook has since committed to six key steps (as stated in the Newsroom post which followed Mark’s post):
1. Review our platform. We will investigate all apps that had access to large amounts of information before we changed our platform in 2014 to reduce data access, and we will conduct a full audit of any app with suspicious activity. If we find developers that misused personally identifiable information, we will ban them from our platform.
2. Tell people about data misuse. We will tell people affected by apps that have misused their data. This includes building a way for people to know if their data might have been accessed via “thisisyourdigitallife.” Moving forward, if we remove an app for misusing data, we will tell everyone who used it.
3. Turn off access for unused apps. If someone hasn’t used an app within the last three months, we will turn off the app’s access to their information.
4. Restrict Facebook Login data. We are changing Login so that in the next version, we will reduce the data that an app can request without app review to include only name, profile photo and email address. Requesting any other data will require our approval.
5. Encourage people to manage the apps they use. We already show people which apps their accounts are connected to and let them control what data those apps are permitted to use. Going forward, we’re going to make these choices more prominent and easier to manage.
6. Reward people who find vulnerabilities. We will expand Facebook’s bug bounty program so that people can also report to us if they find misuses of data by app developers.
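As a rough illustration of step 4 above: under the tightened Login review model, an app would restrict itself to the default profile fields (name, profile photo, email) unless it has been approved for more. The sketch below is hypothetical — the Graph API version (`v3.0`), the field names, and the helper function are assumptions for illustration, not Facebook's published implementation:

```python
from urllib.parse import urlencode

# Default fields an app may request without App Review, per step 4
# (name, profile photo, email). Anything else would need approval.
DEFAULT_FIELDS = ["name", "picture", "email"]

def build_profile_request(access_token: str, fields=None) -> str:
    """Build a Graph API /me URL requesting only the given fields.

    Raises ValueError if any requested field falls outside the
    review-free defaults, so a cautious client fails fast.
    """
    fields = fields or DEFAULT_FIELDS
    extra = [f for f in fields if f not in DEFAULT_FIELDS]
    if extra:
        raise ValueError(f"Fields requiring App Review: {extra}")
    query = urlencode({"fields": ",".join(fields),
                       "access_token": access_token})
    return f"https://graph.facebook.com/v3.0/me?{query}"

url = build_profile_request("TEST_TOKEN")
print(url)
```

In this sketch, requesting something like a user's friend list would raise an error rather than silently over-asking, which mirrors the spirit of the announced change.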
All Response Media viewpoint
Due to the severity of the issue, Facebook has taken this to heart and is actively taking steps to protect both its platform and its advertisers, and is revisiting the way it passes data to partners. It has already suspended the relevant parties in this instance, and the event has prompted even closer scrutiny of the rules around data sharing, which can only be a good thing.
From an advertising perspective, we do not collect any sensitive or personal user data – we only use aggregated behaviours or interests to reach an audience. This data is used purely for marketing, and no personally identifiable information is obtained, stored or shared.
Hopefully this gives you the information you need and puts your mind at rest that we are across this and working with Facebook to protect our clients.