
TikTok Sued For Violating Child Privacy Law

The Data Privacy Lawyers at The Lyon Firm are investigating allegations of TikTok privacy violations. The company allegedly violated children's privacy by failing to notify parents and obtain their consent before collecting and using personal data from individuals under the age of 13, as the law requires. Contact our legal team if your child has used an online video platform and may have been a victim of a data privacy violation.

Our privacy attorneys are currently involved in numerous online data privacy and data misuse lawsuits, and we represent plaintiffs in all fifty states. We believe strongly in the right to privacy, and our lawyers work with industry experts to build the strongest possible cases against negligent corporations.

Did TikTok Violate Child Privacy Laws?

The U.S. Department of Justice, on behalf of the Federal Trade Commission (FTC), recently sued TikTok, its parent company ByteDance, and affiliated companies for violating the Children’s Online Privacy Protection Act (COPPA).

The 2024 FTC complaint claims that TikTok “knowingly and repeatedly violated kids’ privacy” and used “sophisticated digital tools to surveil kids and profit from their data.”

TikTok and ByteDance deny any wrongdoing. However, the FTC says the company was well aware of its obligations under both COPPA and a separate 2019 consent order, and it alleges “compliance failures” that put children’s data and privacy at risk.

According to the lawsuit, by 2020 TikTok had a policy of maintaining accounts of children it knew were under 13 unless the child made an explicit admission of age. Data privacy advocates argue that TikTok’s age verification screening process did not do enough to keep children under its minimum age of 13 off the platform. The company’s human reviewers allegedly spent an average of about seven seconds reviewing each account to determine whether it belonged to a child.

In 2021, TikTok revamped its privacy settings for accounts belonging to users aged 13 to 15, ostensibly adding more robust privacy protections.

The company, however, allegedly continued to collect personal data from underage users, including data that enabled targeted advertising, without notifying parents or obtaining proper consent as required by law.

The FTC complaint also alleges that TikTok built a loophole into its video-sharing platform that allowed children to bypass the age screening process: individuals could create accounts using credentials from third-party services without providing an age or obtaining parental consent. The company simply classified such accounts as “age unknown.”

The complaint further describes instances in which children using TikTok’s dedicated Kids Mode, meant to be a more privacy-protective version of the app for children, still had their data collected, in violation of COPPA. TikTok then shared this user data with Facebook and other third parties.

The company apparently collected multiple types of data, including information about children’s activities on the app and personal identifiers that can be used to build profiles on individuals, all while failing to notify parents about the extent of its data collection and sharing practices. When some parents submitted deletion requests, TikTok sometimes imposed “unnecessary and duplicative hurdles” and failed to comply with those requests. In summary, the FTC complaint makes the following data privacy allegations against TikTok:

  • TikTok failed to notify parents about the extent of the personal data it was collecting from their children
  • It failed to obtain proper parental consent for the collection and sharing of that data
  • It failed to limit the unlawful collection, use, and disclosure of children’s personal information
  • It failed to delete children’s personal information when requested by parents or when accounts no longer existed

The complaint seeks civil penalties against ByteDance and TikTok of up to $51,744 per violation, per day, as well as a permanent injunction against the company to prevent future COPPA violations. The alleged violations have resulted in millions of children under 13 using the adult version of the TikTok application.

The government says TikTok employed deficient policies that failed to prevent a flood of children’s accounts. The FTC claims that TikTok and ByteDance have the ability to identify and remove children’s accounts but choose not to do so, favoring profitability over privacy.

TikTok’s privacy policy explicitly states, “We may disclose personal information if permitted or required by law,” and claims the company “does not sell information from children to third parties and does not share such information with third parties for the purposes of cross-context behavioral advertising.”

What Is COPPA?

The Children’s Online Privacy Protection Act (COPPA), which took effect in 2000, details what a website operator must include in a privacy policy and when and how to seek verifiable consent from a parent or guardian before collecting personal information from children under 13. Although children under 13 can legally share personal information with their parents’ permission, many social media sites prohibit children under 13 from using their services because complying with children’s online privacy laws is difficult.

In 2019, Google and YouTube agreed to pay a $170 million fine to settle allegations that the companies illegally collected personal data from children without their parents’ consent.

Meta Platforms Inc., the parent company of Facebook and Instagram, was sued for contributing to the youth mental health crisis by deliberately designing addictive features on Instagram and Facebook. Another lawsuit claimed Meta also collects data on children under 13 without their parents’ consent.