TikTok has seen explosive growth since its launch in 2016, becoming one of the world’s most popular social media platforms. As of the end of 2022, TikTok had over 1 billion monthly active users globally (15 Essential TikTok Statistics for Marketers in 2024). In the US alone, there are over 73.5 million monthly active users (TikTok User Statistics 2024: Everything You Need To Know).
TikTok’s user base skews young, with around 42% of users between the ages of 10 and 19 (Essential TikTok Statistics). The platform is popular among Generation Z, also known as “zoomers,” born between the late 1990s and early 2010s. While the majority of users are under 30, older generations are joining the platform as well. TikTok offers short-form entertaining video content that appeals to a wide range of ages and demographics.
TikTok’s Content Moderation
TikTok utilizes a combination of artificial intelligence and human moderators to review content on its platform. The AI scans videos, captions, hashtags, comments, and other elements to flag potentially inappropriate material. This allows TikTok to proactively find violating content before users see it. The AI looks for nudity, dangerous challenges, hate speech, and other material that violates TikTok’s community guidelines.
When the AI flags content as potentially inappropriate, it is sent to human moderators for review. TikTok has thousands of content moderators working around the clock globally to assess flagged content (Quantanite). These moderators make the final decision on whether to remove or restrict videos and accounts based on TikTok’s criteria. Moderators also review user reports and block or ban accounts engaged in policy violations. According to TikTok, over 99% of removed videos were taken down before a user reported them, highlighting the scale of its automated moderation.
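To make the flag-then-review flow concrete, here is a minimal sketch of an automated classifier feeding a human review queue. The function names, categories, and threshold below are assumptions made for illustration and do not reflect TikTok’s actual implementation.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Video:
    video_id: str
    caption: str
    hashtags: list[str]

# Hypothetical policy categories the automated scan scores against.
POLICY_CATEGORIES = ["nudity", "dangerous_challenge", "hate_speech"]

def ai_scan(video: Video) -> dict[str, float]:
    """Stand-in for the ML classifiers: returns a confidence score per category."""
    # A real system would run trained models over the video, caption, and hashtags.
    return {category: 0.0 for category in POLICY_CATEGORIES}

def moderate(video: Video, review_queue: Queue, flag_threshold: float = 0.8) -> None:
    """Queue a video for human review if any category score crosses the threshold."""
    scores = ai_scan(video)
    if any(score >= flag_threshold for score in scores.values()):
        # Human moderators make the final remove/restrict decision.
        review_queue.put((video, scores))
```

In this sketch, anything the automated scan is confident about goes to the review queue, mirroring the proactive-flagging-plus-human-decision split described above.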
TikTok claims that the combination of AI and human moderators allows it to maintain safety and authenticity across its large user base (Baldwin & Klobuchar). However, critics argue that inappropriate, dangerous, and false content still spreads rapidly on TikTok despite these measures.
Age Rating System
TikTok uses an age rating system to categorize and filter video content according to maturity level. According to the Guardian’s Guide, TikTok rates videos as G, PG, PG-13, R, or X: G is for general audiences, PG suggests parental guidance, PG-13 indicates parental guidance for viewers under 13, R is restricted to ages 17 and up, and X is for adults only.
When uploading videos, creators can self-rate their content or leave it unrated. TikTok’s moderation team will review and add age ratings to unrated videos. Age ratings determine what videos are shown in feeds based on a user’s age. For example, accounts for children under 13 will not see videos rated PG-13 or above.
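As a concrete illustration of how ratings can gate the feed, the sketch below maps each rating to a minimum viewer age and filters a list of videos accordingly. Only the under-13/PG-13 rule comes from the description above; the other cut-offs and the handling of unrated videos are assumptions for the example.

```python
# Assumed minimum viewer age per rating tier; only the PG-13 cut-off is stated
# in the text above, the rest are illustrative guesses rather than TikTok policy.
MIN_AGE_FOR_RATING = {
    "G": 0,
    "PG": 0,       # parental guidance suggested, but not age-gated here
    "PG-13": 13,
    "R": 17,
    "X": 18,
}

def visible_videos(videos: list[dict], viewer_age: int) -> list[dict]:
    """Keep only videos whose rating is allowed for the viewer's age.

    Unrated videos are treated conservatively and shown only to adults.
    """
    allowed = []
    for video in videos:
        rating = video.get("rating")  # None means the creator left it unrated
        if rating is None:
            if viewer_age >= 18:
                allowed.append(video)
        elif viewer_age >= MIN_AGE_FOR_RATING[rating]:
            allowed.append(video)
    return allowed
```

With this logic, an account registered as 12 years old would never be shown a video rated PG-13 or above, matching the example in the text.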
According to NBC News, maturity ratings provide clearer guidance about video content while helping ensure age-appropriate recommendations. However, some critics argue the self-rating system is imperfect and claim inappropriate content can slip through the cracks.
Age Verification
TikTok does not require users to verify their age in order to sign up and use the app. Users only need to enter their date of birth when creating an account, but there is no identity verification to confirm this. The onus is on the user to provide accurate information.
If TikTok suspects a user is underage, it may restrict the account until age is verified. According to TikTok’s support site, “Minimum age appeals on TikTok”, users can appeal age restrictions by providing a government ID or passport to confirm they meet the minimum age requirement. However, this is only required if an account is flagged – not for every user.
In summary, TikTok does not universally require age verification for all users, but it may prompt individual users to verify their age if the date of birth they provided appears to be inaccurate.
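The flow above reduces to a simple account-state check: no universal ID verification, but a flagged account stays restricted until an ID confirms the minimum age. The field and function names below are hypothetical and only restate that policy in code form.

```python
from datetime import date

MINIMUM_AGE = 13  # general minimum age for a standard TikTok account

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Age in whole years based on the self-reported date of birth."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def account_status(self_reported_birthdate: date, flagged_as_underage: bool,
                   id_verified: bool, today: date) -> str:
    """No universal ID check, but flagged accounts must verify via an appeal."""
    if flagged_as_underage and not id_verified:
        return "restricted_pending_age_appeal"
    if age_from_birthdate(self_reported_birthdate, today) < MINIMUM_AGE:
        return "younger_users_experience"
    return "standard"
```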
Age-Appropriate Recommendations
TikTok does use age-based filters and recommendations to tailor content for younger users. According to TikTok’s Guardian’s Guide, the platform’s recommendations system is designed to show age-appropriate content to teens aged 13-15. For even younger users under 13, TikTok has a “Younger Users” experience that limits content and messaging.
Specifically, TikTok states that “the recommendations a younger teen sees are different from those an older teen sees.” The algorithms are programmed to avoid showing mature or adult content to younger age groups. TikTok also allows parents to enable Restricted Mode, which further limits the appearance of inappropriate content.
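A minimal sketch of that idea: recommendation candidates are filtered by the viewer’s age band, and Restricted Mode tightens the filter further. The age bands echo the 13-15 distinction mentioned above, but the numeric maturity levels and ceilings are invented for this example and are not TikTok’s internal thresholds.

```python
# Illustrative maturity ceilings per age band (0 = mildest, 3 = most mature).
MAX_MATURITY_BY_BAND = {
    "under_13": 0,   # "Younger Users" experience: heavily curated content
    "13_15": 1,      # younger teens see a more limited pool than older teens
    "16_17": 2,
    "18_plus": 3,
}

def age_band(age: int) -> str:
    if age < 13:
        return "under_13"
    if age <= 15:
        return "13_15"
    if age <= 17:
        return "16_17"
    return "18_plus"

def recommend(candidates: list[dict], age: int, restricted_mode: bool) -> list[dict]:
    """Filter candidate videos by the viewer's age band and Restricted Mode."""
    ceiling = MAX_MATURITY_BY_BAND[age_band(age)]
    if restricted_mode:
        ceiling = min(ceiling, 1)  # Restricted Mode further limits mature content
    return [video for video in candidates if video["maturity_level"] <= ceiling]
```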
Overall, TikTok does tailor and filter content to provide age-appropriate recommendations and experiences based on the user’s stated age. However, some inappropriate content can still slip through the filters, which is why parental oversight is still encouraged.
Parental Controls
TikTok offers several options for parents to restrict content and monitor their child’s usage on the platform. The main parental control is called Family Pairing, which allows parents to link their own TikTok account to their child’s account. With Family Pairing enabled, parents can:
- Restrict the appearance of content that may be inappropriate for younger audiences
- Limit screen time
- Restrict direct messaging to approved followers only
Family Pairing also gives parents access to a dashboard that shows their child’s activity on TikTok, such as videos liked, shared, or posted. To enable Family Pairing, both the parent’s and child’s accounts must first enable Family Safety Mode in their settings. Then, the parent can send their child an invitation to pair the accounts via text or email.
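The controls in the list above map naturally onto a small settings object that a linked parent account could manage once pairing is accepted. The class and field names below are invented for illustration; the real Family Pairing settings live inside the TikTok app rather than in any public API.

```python
from dataclasses import dataclass

@dataclass
class FamilyPairingSettings:
    """Hypothetical representation of the controls a linked parent can set."""
    restricted_mode: bool = True              # limit potentially inappropriate content
    daily_screen_time_minutes: int = 60       # screen time limit
    dm_approved_followers_only: bool = True   # restrict direct messaging

@dataclass
class ChildAccount:
    username: str
    paired_parent: str | None = None
    settings: FamilyPairingSettings | None = None

def pair_accounts(parent_username: str, child: ChildAccount,
                  settings: FamilyPairingSettings) -> None:
    """Apply the parent-chosen settings once the pairing invitation is accepted."""
    child.paired_parent = parent_username
    child.settings = settings
```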
Parents also have the option to toggle on “Restricted Mode” in the app’s settings. This limits the appearance of content that may be inappropriate for younger audiences. However, it does not block all content, since TikTok aims to show age-appropriate content based on a user’s inferred age. Third-party apps like Bark and Qustodio also allow parents to monitor TikTok usage and restrict inappropriate content.
Third-Party Filters
While TikTok has its own built-in filters, there are also third-party apps that can be used to filter or restrict TikTok content according to age. These include:
- Screen Time on iOS devices – This allows parents to limit the time spent on TikTok per day and restrict content ratings.
- Google Family Link – This app helps parents control their child’s Android device usage, including filtering TikTok based on content ratings.
- FamiSafe – A parental control app that blocks inappropriate TikTok content based on keywords, user profiles, video categories, and more.
- Kaspersky Safe Kids – Another parental control app that has a TikTok monitor to analyze videos and block inappropriate ones.
These third-party options provide parents with additional tools to restrict the TikTok content their children can view beyond the platform’s built-in controls. However, no system is completely foolproof.
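As a rough idea of how keyword-based blocking in tools like these can work, the sketch below checks a video’s caption and hashtags against a parent-defined blocklist. It is a generic illustration, not how Bark, Qustodio, FamiSafe, or Kaspersky Safe Kids are actually implemented.

```python
import re

# Parent-defined blocklist; the terms here are purely illustrative.
BLOCKED_KEYWORDS = {"challenge", "nsfw"}

def is_blocked(caption: str, hashtags: list[str]) -> bool:
    """Return True if any blocked keyword appears in the caption or hashtags."""
    text = " ".join([caption, *hashtags]).lower()
    words = set(re.findall(r"[a-z0-9]+", text))
    return not words.isdisjoint(BLOCKED_KEYWORDS)

# Example: with "challenge" on the blocklist, this video would be hidden.
print(is_blocked("Try this new challenge!", ["#fun", "#trend"]))  # True
```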
Effectiveness of Filters
While TikTok has implemented various filters to limit age-inappropriate content, their effectiveness in practice has been questioned. According to the NSPCC, the maturity score and restricted mode filters are not always accurate, with inappropriate content sometimes slipping through. There are also concerns that children could easily circumvent the age verification requirements. Overall, third-party filtering apps like Bark are considered more robust, though no system is foolproof.
Controversies
There have been controversies surrounding some of the content on TikTok and its effects on young users. One major concern is inappropriate content that slips past TikTok’s filters. An investigation by BBC News found that TikTok was failing to suspend the accounts of users sending sexual messages to teens and children (https://www.outlookindia.com/website/story/world-news-tiktok-failing-to-suspend-accounts-of-people-sending-inappropriate-messages-to-teenagers-report/328251). TikTok’s community guidelines prohibit sexually suggestive content involving minors, but critics argue it has not done enough to enforce these policies.
Some experts argue TikTok has a startling amount of sexual content that is too easily accessed by children (https://m.economictimes.com/tech/technology/tiktok-has-a-startling-amount-of-sexual-content-and-its-way-too-easy-for-children-to-access/articleshow/105347406.cms). Suggestive dancing, clothing, and song lyrics are common sources of concern. Parents worry children may be exposed to content they are not ready for or that could lead to risky behavior. While TikTok has parental controls, they are limited in effectiveness.
Overall, many believe TikTok needs to do more to crack down on inappropriate content and better restrict what children can access. Its massive popularity among youth has led to heightened scrutiny. Addressing these concerns is crucial for TikTok to maintain trust, enforce its guidelines, and protect young users.
Conclusion
Overall, TikTok has implemented numerous measures to restrict inappropriate content for younger users based on age. Key findings include:
- TikTok utilizes an age rating system and assigns users to an age-appropriate experience based on the date of birth they provide at sign-up.
- Users under 13 are limited to a curated version of TikTok with strict content moderation.
- Users aged 13-15 have some restrictions on interactions and viewing certain content.
- Users 16 and over have access to most content, with some graphic material still restricted.
- TikTok personalizes recommendations based on age to show age-appropriate videos.
- Parents can enable additional controls through Family Pairing to set screen time limits and restrict inappropriate content.
- However, filters are not perfect and inappropriate content can still reach underage users at times.
In summary, while not infallible, TikTok does make reasonable efforts to provide age-appropriate experiences tailored to users based on their stated age. Parents are still advised to monitor usage and make use of the available parental control features.