TikTok is an extremely popular social media platform and app, with over 1 billion monthly active users worldwide as of 2022. It allows users to create and share short videos set to music and effects. TikTok has seen massive growth since launching internationally in 2017, and is now one of the most downloaded apps globally (TikTok Revenue and Usage Statistics (2024)).
However, TikTok’s popularity has not come without controversy. In December 2022, the state of Indiana sued TikTok and its parent company ByteDance, alleging violations of the Children’s Online Privacy Protection Act (COPPA). Indiana claims that TikTok knows many of its users are under age 13, yet still collects their personal data without parental consent. The lawsuit seeks to stop these alleged COPPA violations and to protect the privacy and safety of children using the platform (TikTok – Statistics & Facts).
TikTok’s Alleged COPPA Violations
TikTok has been accused of violating the Children’s Online Privacy Protection Act (COPPA) by illegally collecting data on users under the age of 13. In 2019, the Federal Trade Commission fined TikTok (then known as Musical.ly) $5.7 million for violating COPPA. The FTC alleged TikTok had actual knowledge that a significant percentage of its users were under 13 and failed to obtain parental consent before collecting their personal information (“Video Social Networking App Musical.ly Agrees to Settle FTC Allegations It Violated Children’s Privacy”).
TikTok defended itself by pledging to comply with COPPA going forward, and it made changes such as introducing an age gate intended to prevent underage signups. However, in 2022 TikTok paid $1.1 million to settle another class action lawsuit alleging it collected underage users’ data without parental consent (“Federal court approves $1.1M TikTok settlement over COPPA allegations”). Critics argue the company still does not do enough to prevent underage use of the platform or to protect children’s privacy.
The Risks of Collecting Children’s Data
Collecting data on children comes with many risks that parents and regulators are rightfully concerned about. Children are especially vulnerable to exploitation and harm from the misuse of their personal data. According to research, collecting data on children raises ethical considerations around consent, privacy, and preventing harm (source).
One major risk is that children’s data could be exploited by predators or bad actors. Marketers may use children’s data to target ads and products in manipulative ways. Criminals could exploit children’s data to stalk, harass, or scam them. There are also risks of children’s sensitive information being leaked or hacked from improperly secured databases.
Beyond exploitation, collecting children’s data can potentially lead to physical and psychological harms. Over-surveillance and lack of privacy during formative years could impair their social development. Services that manipulate children’s behaviors based on their data profiles could negatively impact their self-esteem and mental health. Handling children’s data irresponsibly violates their dignity and right to informed consent over their information (source).
Protecting children must be the top priority when their data is involved. Companies and regulators have an ethical obligation to minimize risks and prevent foreseeable abuses. With children’s wellbeing at stake, responsible data practices are not just recommended – they are a necessity.
Indiana’s Claims Against TikTok
In December 2022, Indiana Attorney General Todd Rokita filed a lawsuit against TikTok alleging that the company has misled users, particularly children, about the mature and inappropriate content on its platform. The lawsuit claims TikTok violates Indiana’s Deceptive Consumer Sales Act by failing to disclose the “true nature” of its content.
Specifically, the lawsuit alleges TikTok does not adequately separate children’s content from mature content. It states that inappropriate, mature, and even criminal content is “thrust upon unsuspecting children.” [1]
The lawsuit also claims TikTok has misrepresented how children’s data is handled and shared. It alleges TikTok knew it collected personal data from children under 13 without parental consent, in violation of the federal Children’s Online Privacy Protection Act (COPPA).
Indiana is seeking monetary damages for harms suffered by Indiana consumers, civil penalties, and an injunction requiring TikTok to stop its alleged deceptive conduct. The state has demanded a jury trial. If successful, the lawsuit could result in millions of dollars in payouts.
TikTok’s Response
TikTok has staunchly denied the allegations in Indiana’s lawsuit. In a statement after the lawsuit was filed, TikTok said the claims were “without merit” and vowed to defend against them in court (Source 1).
TikTok argues that it has taken extensive steps to protect minors on its platform and to give parents control over their children’s accounts. The company points to its “Family Pairing” feature, introduced in 2020 and expanded since, which allows parents to link their TikTok account to their child’s in order to enable parental controls (Source 2).
TikTok also points to its robust privacy and safety policies, use of age verification for accounts belonging to minors, restrictions on messaging and livestreams for younger users, and partnerships with child safety organizations as evidence that it prioritizes youth wellbeing (Source 3). The company has characterized the lawsuit as unfair and without justification.
Other States’ Actions Against TikTok
Indiana is not the only state taking legal action against TikTok. Utah and Arkansas have also filed lawsuits alleging violations of child privacy laws and the negative impacts of social media on children’s mental health.
In October 2023, Utah sued TikTok, claiming the app has a harmful and addictive impact on children. Utah’s lawsuit alleges that TikTok’s algorithm can addict children and cause emotional and physical harm, and it aims to require TikTok to strengthen parental controls and restrict addictive features.
Earlier, in March 2023, Arkansas sued TikTok for allegedly misleading and harming children through its viral video-sharing platform. That suit claims TikTok causes developmental harms, damages mental health, and enables cyberbullying and dangerous viral challenges among minors. Arkansas is seeking to prevent TikTok from operating in the state if it does not address these issues.
With three states now bringing legal action, a national pattern is emerging of states investigating and attempting to regulate TikTok over child safety concerns. The lawsuits highlight growing bipartisan anxiety about social media’s effects on children. It remains to be seen whether more states will follow with their own investigations or bans.
Federal Government Scrutiny
In addition to lawsuits from states, TikTok has faced growing scrutiny from federal agencies and lawmakers over its data collection practices and potential national security risks. In March 2023, it was reported that the Justice Department had opened an investigation into TikTok’s parent company ByteDance over concerns that its employees had accessed data from US users to spy on journalists. The FBI is also investigating TikTok on national security grounds.
The Federal Trade Commission announced an investigation into TikTok in March 2022 centered on how the platform’s data collection and algorithms impacted young users. In December 2022, a group of US Senators called on the FTC to fully investigate how TikTok uses data from underage users. Lawmakers have raised bipartisan concerns about the risks TikTok may pose in terms of exposing user data to the Chinese government and influencing American youth.
Some federal politicians, including Senators Mark Warner and Marco Rubio, have called for TikTok to be banned outright in the US over data privacy and national security issues. While an outright ban remains unlikely, tighter regulations on how TikTok operates and accesses US user data seem probable as government scrutiny ramps up.
Broader Social Media Privacy Concerns
The lawsuit against TikTok comes amid broader criticisms and concerns about data collection practices by major social media platforms. Privacy advocates have raised alarms about the vast troves of personal data accumulated by companies like Facebook, Instagram, YouTube, and TikTok.
Key concerns include the extensive profiling of users, targeted advertising based on sensitive information, lack of transparency around data practices, and vulnerability to hacking or misuse. According to the Electronic Privacy Information Center, “The massive stores of personal data that social media platforms collect and retain are vulnerable to hacking, scraping, and data breaches” (https://epic.org/issues/consumer-privacy/social-media-privacy/).
While social media platforms argue this data enables customized services, privacy experts warn it could expose users to identity theft, manipulation, and other harms. Many believe current laws and policies fail to adequately protect consumer privacy amid rapid technological change.
Indiana’s lawsuit can be seen as part of a broader regulatory push to rein in social media data practices and enforce stronger privacy safeguards, especially for minors. However, meaningful reform remains elusive given the massive revenues at stake in digital advertising and social media’s lobbying clout.
What’s at Stake for TikTok
TikTok faces substantial legal and financial consequences if found liable in the Indiana lawsuit and other cases alleging COPPA violations. The company could face injunctions blocking certain data collection practices, as well as fines and damages. As part of its 2019 FTC settlement, TikTok already paid a then-record $5.7 million penalty for violating COPPA. Further violations could result in even larger fines.
The lawsuits also threaten TikTok’s business model and growth prospects. The company relies heavily on collecting user data to power its personalized recommendation algorithm and targeted advertising. Restrictions on data collection from minors could hamper these core functions. TikTok may need to significantly alter its practices regarding younger users in order to comply with COPPA. This could undermine TikTok’s popularity among teenagers.
The legal scrutiny highlights the risks facing digital platforms dependent on youth engagement. Investigations and lawsuits could result in substantial compliance costs or forced changes for TikTok. However, properly safeguarding children’s privacy remains a priority for regulators. TikTok will need to carefully balance legal obligations, business interests, and ethical data practices.
Conclusion
Indiana’s lawsuit against TikTok alleges that the social media platform knowingly and illegally collected personal information from children under 13 without parental consent. This violates the federal Children’s Online Privacy Protection Act (COPPA), which aims to protect children’s privacy and safety online.
The lawsuit seeks to stop TikTok from further COPPA violations, impose fines, and obtain compensation for affected Indiana residents. The outcome of this case could have broader implications for how tech companies handle children’s data and comply with COPPA regulations.
If Indiana wins, it would likely spur other states and the federal government to pursue similar legal action against TikTok over children’s privacy concerns. This could ultimately force TikTok to change its data collection practices and bolster COPPA enforcement overall. However, TikTok maintains it has worked to protect minors’ privacy and will fight the lawsuit.
Beyond TikTok, this lawsuit highlights growing apprehension around social media companies’ data collection from vulnerable groups like children. It underlines the need for continued vigilance and oversight of tech industry practices that may infringe on privacy rights.