TikTok is a popular social media platform that lets users create and share short videos. With over 1 billion monthly active users, TikTok has implemented Community Guidelines and a reporting system to keep the platform safe.
If a TikTok account is found to repeatedly violate the platform’s Community Guidelines, it may be subject to deletion. The platform has a moderation process in place for reviewing reported accounts and determining if deletion is warranted based on the severity and frequency of violations.
This article provides an overview of how TikTok’s reporting and account deletion process works. It outlines the criteria used for deleting accounts, how users can report concerning accounts, the number of reports needed for deletion to occur, and options available for appealing account deletion decisions.
TikTok’s Community Guidelines
TikTok has established Community Guidelines that all users must follow when posting content. These guidelines outline prohibited content such as:
- Illegal activities
- Violent extremism
- Hate speech
- Harassment and bullying
- Adult nudity and sexual activities
- Dangerous acts and challenges
- Misleading information
- Regulated goods
- Dangerous individuals and organizations
TikTok aims to cultivate an inclusive environment and remove content that promotes harm, danger, or hate. Users are encouraged to report any concerning posts that appear to violate the platform’s guidelines.
How Users Can Report TikTok Accounts
To report a TikTok account, users need to go to the profile page of the account they want to report and tap the three dots icon in the upper right corner. This will open a menu where users can select “Report” and choose from several options like “This account posts nudity or sexual activity”, “This account is pretending to be someone else”, or “This account is spamming me”.
Users can select the option that best fits their reason for reporting. After selecting a reason, users can provide additional details and submit the report to TikTok for review. TikTok states they review all reports and will take action if an account is found to violate their community guidelines.
According to TikTok’s help site, users can also report specific content like videos, comments, messages, or live streams. Reporting directly from the content in question provides helpful context for reviewing the report (TikTok, 2023).
What Happens When an Account is Reported
When a TikTok account is reported, TikTok reviews the report to determine whether the account has violated the platform’s Community Guidelines. TikTok states that it reviews both the reported content and the account itself when deciding whether a violation has occurred (https://bloggingscheme.com/tiktok-what-happens-when-you-report-someones-account/). Content and behavior that violate the guidelines include illegal activities, dangerous acts, bullying, hate speech, adult nudity, and more.
If TikTok finds that the reported account has clearly violated the Community Guidelines, the account may be removed or permanently suspended. However, not every report leads to account deletion. For minor or accidental violations, TikTok may simply remove the offending content, or the account may receive a warning and a temporary suspension from certain TikTok features. The specific action taken depends on the severity and details of the violation.
It’s important to note that abusive and false reporting can lead to penalties on the reporting accounts as well. So TikTok aims to thoroughly investigate reports and determine if violations have actually occurred before taking action against an account.
Criteria for Deleting Accounts
TikTok has strict policies and community guidelines that all users must follow. Accounts may be deleted if they are found to violate these rules repeatedly or egregiously. Some of the main criteria TikTok uses for deleting accounts include:
Illegal Activities – Accounts that promote dangerous or illegal behavior like drug use, violence, or criminal activities may be deleted without warning. This includes any content that facilitates criminal activities.
Harmful Activities – Posts that encourage harm to oneself or others, such as self-harm, eating disorders, suicide, or dangerous stunts and challenges, are grounds for account deletion.
Hateful Ideologies – Accounts spreading hateful content, inciting harm against protected groups, or threatening violence may be permanently banned.
Sexual Exploitation – Accounts with sexually explicit content involving minors will be deleted immediately and reported to authorities. Pornography and nonconsensual intimate media are also prohibited.
Spam/Scams – Repeated spamming, scams, misleading content, or impersonation of others can lead to account deletion.
According to TikTok’s Community Guidelines, accounts may be permanently deleted if they “repeatedly or severely” violate policies. The number of violations needed for deletion varies case-by-case.
Appealing Account Deletion
If you believe your TikTok account was banned or deleted incorrectly, you can appeal the decision by submitting an appeal directly within the app. The appeals process allows you to provide additional context and request a review of the decision.
To start, go to your profile page. If your account was banned, you’ll see an option to “Submit appeal.” Tap that button and follow the instructions to explain why you think your account should be restored.
According to TikTok’s guidelines, you’ll need to provide a detailed explanation for why you believe the decision was a mistake. Factors like your age, location, and the reason for the ban may impact the specific appeal requirements [1].
Once submitted, TikTok will review your appeal and any accompanying information you provided. The process may take several days. If your appeal is approved, your account and content will be restored. However, TikTok warns that any violations of their community guidelines in the future could result in permanent deletion.
How Many Reports Trigger Deletion
According to TikTok’s Community Guidelines, multiple community reports against a TikTok account within a short period of time can lead to that account being reviewed and potentially removed. However, TikTok does not provide an exact number of how many reports it takes to trigger an account deletion.
Some reports indicate it can take as few as 25-50 reports within a 24-hour period to prompt an account review, while others estimate it may take hundreds or even thousands of reports over a longer timeframe. The threshold likely varies based on factors like the severity of the violations, the number of videos flagged, and the account’s history of prior violations.
TikTok’s support articles state that “Multiple community reports filed against an account within a short period of time may indicate violation of our Terms of Service and Community Guidelines.” In other words, the number of reports needed to trigger a review appears to depend on both how quickly the reports are filed and the severity of the reported issues.
The key is consistent reporting from multiple users against concerning content. While an exact number is not public, repeated flags from the community are essential for prompting TikTok to review an account.
Mistaken and Abusive Reporting
While TikTok encourages users to report content that violates its community guidelines, users should be cautious about false reporting. Falsely reporting content can lead to serious consequences for the reporting user.
TikTok issues warnings and may suspend accounts that are found to repeatedly report content that doesn’t actually violate community guidelines. Suspensions may be temporary or permanent, depending on the severity of the false reporting [1].
Users are advised against targeting someone by falsely reporting their content in an attempt to get their account deleted. This practice, known as “mass reporting,” is considered abusive behavior under TikTok’s community guidelines [2].
If you believe your content was wrongly reported, you can appeal the removal or restriction. TikTok reviews these appeals and may remove strikes received from mistaken reports.
Overall, community members should thoughtfully consider if a report is warranted before submitting it. Abusing the reporting system takes resources away from addressing truly problematic content.
Best Practices for Users
While TikTok aims to foster an inclusive environment, users must still be thoughtful and responsible when posting content. Here are some tips to avoid having your account deleted:
- Carefully review content before posting to ensure it does not violate TikTok’s Community Guidelines against dangerous acts, illegal activities, regulated goods, nudity and sexual activities, harassment and bullying, violent extremism, hateful ideologies, and misinformation that causes harm.
- Be respectful of diversity and do not post content that attacks or dehumanizes people based on protected attributes.
- Make sure any content directed at minors is age-appropriate.
- Only share personal information and content with proper consent.
- Do not artificially boost engagement metrics through fraudulent likes, shares, or comments.
- Credit original creators and obtain usage rights for any content that is not your own.
- Moderate your comment section to remove hate speech, misinformation, or spam.
While most users have good intentions, a single serious lapse in judgment can be enough to get an account banned. By keeping the Community Guidelines top of mind, users can grow their audience responsibly while avoiding strikes and account deletion.
Conclusion
In summary, TikTok does not publicly disclose the exact criteria or number of reports needed to delete an account. However, based on anecdotal evidence, it appears that multiple reports against an account for violations of TikTok’s community guidelines may result in account deletion after review by TikTok’s content moderation team.
The number of reports likely depends on the severity of the offense, whether it is a repeated violation, and the overall context. Malicious users may attempt to abuse the reporting system, so TikTok aims to filter out invalid reports while still enforcing policies against harmful content. Users who feel their account was wrongly deleted can submit an appeal for reinstatement.
In the end, the reporting system allows TikTok to maintain a safe community, but also requires good faith participation from users. With a thoughtful approach, TikTok can balance open expression with responsible content standards for all.