TikTok removed more than seven million accounts of users suspected of being under age 13 in the first three months of 2021, the popular social media operator said Wednesday in a transparency report.
The Chinese-owned app, which is wildly popular with young smartphone users, also said it took down nearly 62 million videos in the first quarter for violating community standards, including rules against “hateful” content, nudity, harassment and threats to minors’ safety.
In its first disclosure on underage users, TikTok said it uses a variety of methods to find them, including a safety moderation team that monitors accounts whose users are suspected of being untruthful about their age.
In the United States, those age 12 or younger are directed to “TikTok for Younger Users.”
The report comes as social media operators face increased pressure to remove abusive and hateful content while remaining open to a variety of viewpoints.
TikTok’s transparency report said that in addition to the suspected underage users, accounts belonging to nearly four million additional users were deleted for violating the app’s guidelines.
“Our TikTok team of policy, operations, safety, and security experts work together to develop equitable policies that can be consistently enforced,” the report said.
TikTok said its automated systems detect and remove the vast majority of offending content: “We identified and removed 91.3 percent before a user reported them, 81.8 percent before they received any views, and 93.1 percent within 24 hours of being posted.”
Overall, fewer than one percent of the videos uploaded to TikTok were taken down for violations.