#privacy: Facebook purges tens of thousands of apps due to data privacy concerns

Facebook has suspended tens of thousands of apps following investigations prompted by the Cambridge Analytica scandal, reports reveal.

According to the social network, not all of the suspended apps were believed to pose a risk to users, but all of them were linked to around 400 developers.

The purge is the latest in a suite of measures taken by Facebook to try to rebuild its reputation as a responsible data handler, a reputation shattered by revelations about a murky data-sharing relationship with the data intelligence firm Cambridge Analytica.

It is alleged that the UK-based firm harvested huge swathes of Facebook user data without permission to help construct a complex political ad campaign aimed at influencing US voters' decisions in the run-up to the 2016 US presidential election.

In the aftermath, Mark Zuckerberg’s firm was hit with a $5bn fine by the US Federal Trade Commission in July, considered the largest penalty of its kind ever levied against a company for violating consumer privacy rights.

Facebook’s relationship with Cambridge Analytica compromised the personal data of tens of millions of users. The breach, unprecedented in scale, remains hugely embarrassing for the social network, and the story of the scandal’s unravelling was the subject of the recent Netflix documentary The Great Hack.

As part of a major push to improve data privacy standards, to say nothing of its beleaguered image, Facebook launched an investigation in March 2018 to examine the data-handling and safety practices of apps that use the Facebook platform.

In a statement, the company said:

“We promised then that we would review all of the apps that had access to large amounts of information before we changed our platform policies in 2014. It has involved hundreds of people: attorneys, external investigators, data scientists, engineers, policy specialists, platform partners and other teams across the company. Our review helps us to better understand patterns of abuse in order to root out bad actors among developers.

“We initially identified apps for investigation based on how many users they had and how much data they could access. Now, we also identify apps based on signals associated with an app’s potential to abuse our policies. Where we have concerns, we conduct a more intensive examination.

“This includes a background investigation of the developer and a technical analysis of the app’s activity on the platform. Depending on the results, a range of actions could be taken from requiring developers to submit to in-depth questioning, to conducting inspections or banning an app from the platform.”

Little is known as yet about the tens of thousands of suspended apps, or about the developers behind them.

One of the banned apps, myPersonality, shared data with researchers and other organisations while having only limited cyber-security protections in place. Its developers refused to participate in an audit, Facebook maintains.
