U.S. Investigating Whether TikTok Violated Rules on Child Privacy

The U.S. Justice Department and Federal Trade Commission (FTC) are investigating whether popular app TikTok failed to comply with a 2019 agreement to protect children’s privacy.

In May, advocacy groups including the Campaign for a Commercial-Free Childhood asked the FTC to examine whether TikTok failed to delete videos and personal data from users aged 13 and under, as it had said it would do in a February 2019 agreement, among other violations.

Recently, two sources told Reuters that they took part in separate conference calls with the FTC and Justice Department to discuss the matter.

“I got the sense from our conversation that they are looking into the assertions that we raised in our complaint,” said David Monahan, campaign manager for one of the groups.

The FTC enforces the 1998 Children’s Online Privacy Protection Act, which requires websites and apps to get parental permission before collecting data on kids under age 13, and calls on online services to prevent such data from reaching third parties. Early last year, TikTok paid a $5.7 million civil penalty for collecting children’s names, phone numbers, email addresses and photos.

A TikTok spokesman said they take “safety seriously for all our users,” and give kids under 13 “a limited app experience that introduces additional safety and privacy protections designed specifically for a younger audience.”

In May, the Netherlands’ privacy regulator also said it was investigating TikTok’s handling of the data of minors in the country.

TikTok racked up 315 million downloads in the first quarter, making it the quarter’s third-most-installed app worldwide, according to research firm Sensor Tower, which puts its total installs at some 2.2 billion globally. Around 60% of its American users are between the ages of 16 and 24.

In addition to scrutiny of its handling of minors’ data, the company also faces backlash overseas due to its ties to China through its Beijing-based parent company, ByteDance.

Last week, Indian authorities banned TikTok and nearly 60 other Chinese mobile apps amid a violent border dispute with China, citing cybersecurity concerns.

In Australia, the chair of a legislative committee investigating foreign interference through social media told a local radio station Monday that TikTok may be one of the platforms examined.

U.S. Secretary of State Mike Pompeo also said Monday that the Trump administration is “certainly looking at” banning TikTok. He called on Americans not to download the app if they don’t want their “private information in the hands of the Chinese Communist Party.”

Meanwhile, TikTok has abruptly exited Hong Kong entirely in the wake of a controversial national security law imposed by Beijing, a move analysts view as part of an attempt to further distance itself from China. ByteDance is also contemplating changes to TikTok’s corporate structure, with senior execs considering moves such as establishing a new TikTok management board or a global headquarters outside of China, the Wall Street Journal said Thursday, citing people familiar with the matter.

ByteDance, one of the world’s most valuable tech unicorns, currently does not have a global headquarters, although its new CEO, former Disney exec Kevin Mayer, works out of Los Angeles.

On Thursday, TikTok issued a transparency report for the second half of 2019. During that period, the app removed more than 49 million videos globally for violating its community guidelines or terms of service, amounting to less than 1% of total videos created. Of those removed, 89.4% were removed before they received any views.

India led the ranking for most videos removed, with 16.5 million deleted, followed by the U.S. (4.6 million removed), Pakistan (3.7 million), the U.K. (2 million) and Russia (1.3 million).

The firm received 100 information requests from U.S. authorities in the second half of last year and complied with 82% of them, it said.

The transparency report does not include data from China or Hong Kong.
