Video-sharing app TikTok has been fined $5.7m (£4.3m) in the US for failing to safeguard the privacy of children.
The Federal Trade Commission (FTC) issued the penalty after it was discovered that the TikTok-owned Musical.ly app had knowingly collected content and personal information uploaded by underage users.
The fine, the largest ever issued in the US in a children's data case, has also led TikTok to agree to rethink how the platform deals with younger users.
The company must also erase the data involved in the case, and all US users of the app will in future have to verify their age – though this is not difficult to circumvent.
In a statement, TikTok said:
“We care deeply about the safety and privacy of our users. This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this.”
However, the new age-verification function will not be rolled out to the UK or other countries, as the arrangement applies only to operations in the US.
Boasting a global user-base of around 1bn, TikTok was one of the most downloaded apps of 2018.
The FTC’s concern centres on the age of many of the app's 65 million US-based users.
An FTC statement said:
“For the first three years [of its existence], Musical.ly didn’t ask for the user’s age.
“Since July 2017, the company has asked about age and prevents people who say they’re under 13 from creating accounts. But Musical.ly didn’t go back and request age information for people who already had accounts.”
These worries have been borne out by media reports of adult users on Musical.ly contacting children who, judging from profile data such as images and dates of birth, were clearly of school age.
In September 2016, more than 300 parental complaints to Musical.ly led to the deactivation of the children's profiles, but none of the data was actually deleted.
The FTC said that TikTok failed to meet basic privacy standards under the Children’s Online Privacy Protection Act, which sets out how children’s data may be collected and used, and requires parental consent to do so.
TikTok also failed to respond to parents’ requests to erase the data involved, and retained that data for longer than was needed.
The company said:
“While we’ve always seen TikTok as a place for everyone, we understand the concerns that arise around younger users.
“In working with the FTC and in conjunction with today’s agreement, we’ve now implemented changes to accommodate younger US users in a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience.”
The European Data Protection Summit will take place on June 3rd in Central London and will play host to 800 DPOs, security professionals and senior business decision-makers looking for information, updates, clarity, advice and solutions. For more information, visit the website.
We have been awarded the number 1 GDPR Blog in 2019 by Feedspot.