In recent years, TikTok has seen explosive growth and popularity, emerging as one of the most widely used social media platforms in the world. As of January 2023, TikTok reached over 1.5 billion monthly active users, representing a 16% increase from 2022 (Charle). Much of this growth has been driven by younger audiences, with over 60% of TikTok’s global users being between the ages of 16 and 24 (Blogging Wizard).
With its short-form video format built for entertainment and viral sharing, TikTok has cultivated a massive community of creators and viewers. Major factors behind TikTok’s success include its powerful algorithm, which quickly learns user preferences and serves up personalized content, as well as its tools that make creating and editing videos simple. As TikTok continues to expand its global reach, the platform’s cultural significance and influence grow.
TikTok’s Content Moderation
In order to maintain a safe and positive environment, TikTok relies heavily on content moderation. Automated systems and human moderators review an enormous volume of videos and other content every day to enforce TikTok’s Community Guidelines.
This enormous scale of moderation has led to controversies. In 2022, former TikTok moderators filed a lawsuit against the company, alleging that constant exposure to disturbing content caused lasting psychological trauma, and TikTok failed to provide adequate mental health support. Moderators described a high-pressure environment where they had to review hundreds of traumatic videos per day with minimal resources.
Critics argue TikTok’s content moderation practices lack transparency compared to platforms like Facebook, making it difficult to evaluate their effectiveness. Concerns have also been raised about inconsistent rule enforcement, especially relating to harmful challenges and disordered eating content.
Types of Explicit Content
TikTok prohibits sexually suggestive content, especially content featuring minors. This includes suggestive dancing, gyrating, and lip syncing. For example, TikTok’s community guidelines state: “Don’t post content depicting, promoting, normalizing or glorifying sexually suggestive themes, including simulated sexual acts and sexualized minors, even if consensual” (https://www.tiktok.com/community-guidelines/en/safety-section/dangerous-acts-challenges/).
Despite these guidelines, many TikTok users, including young teens, participate in provocative dancing and lip syncing. Critics argue this content sexualizes minors and exposes them to adult themes before they are ready (https://findanexpert.unimelb.edu.au/news/72158-tiktok-has-a-startling-amount-of-sexual-content-%E2%80%93-and-it%27s-way-too-easy-for-children-to-access).
In addition to suggestive dancing, TikTok also prohibits drug references and profanity in content. However, critics argue it can be challenging for TikTok to moderate the large volume of content being uploaded (https://theconversation.com/tiktok-has-a-startling-amount-of-sexual-content-and-its-way-too-easy-for-children-to-access-216114). Some inappropriate content slips through automated moderation before being identified and removed.
TikTok’s Community Guidelines
TikTok has clear rules outlined in its Community Guidelines around nudity, sexual content, and harassment. Per the guidelines, “Nudity, pornography, and sexually explicit content are not allowed on TikTok.” The guidelines specify that even partial nudity, including exposed genitalia, buttocks, or nipples, is prohibited. Sexual intercourse and other sexual activities are also not permitted.
In terms of harassment, TikTok states that content or accounts that “attack, threaten, incite violence or harass an individual or group of individuals” are not allowed. This includes severe bullying or harassment, hateful ideologies, behavior promoting eating disorders, and content that is gratuitously shocking or disgusting. TikTok aims to maintain a supportive environment and remove content that could make users feel unsafe, according to the Community Guidelines Enforcement Report.
Impact on Young Users
Although TikTok has been working to improve its content moderation policies, there are still concerns around children’s exposure to inappropriate or explicit content on the platform. Several experts caution that sexualized dancing, revealing outfits, adult themes, and explicit songs are prevalent in many popular TikTok videos and creators’ content (https://www.psychiatrictimes.com/view/concerning-content-tiktoks-effect-on-child-and-adolescent-mental-health). According to research, “the exposure to and consumption of sexualized media puts the physical and mental health of children and youth at risk through a variety of mechanisms” (https://www.apa.org/pi/families/resources/media-guidelines.pdf). This type of content can normalize the oversexualization of minors and impact their self-esteem, body image, and mental health.
Many parents worry about the challenges of monitoring their kids’ TikTok use and exposure. While TikTok does have some parental control features like screen time limits and restricted mode, critics argue these controls are not robust enough to protect minors from concerning content (https://kidmatterscounseling.com/blog/5-things-parents-need-to-know-about-tik-tok/). Parents have an incredibly difficult job keeping up with new social media apps and sites. More work may need to be done to educate parents on tools available and to improve parental controls across platforms.
Artist Expression vs. Responsibility
TikTok creators have to balance their own artistic expression with being responsible content creators on the platform. Many creators enjoy having the freedom to create edgy or provocative content, but they also have a responsibility not to cross the line into harmful or dangerous territory.
According to the TikTok Content Creator job description from Prosperity (https://www.prosperity.ie/job-descriptions/tiktok-content-creator), creators are responsible for driving engagement through entertaining and relevant content. However, they must stay within TikTok’s community guidelines.
Ultimately, the responsibility falls mainly on the creators themselves to use good judgment. TikTok does enforce its policies by removing inappropriate content and suspending accounts. But with so much content being produced, TikTok relies heavily on users to report problematic videos.
Overall, TikTok creators have to walk a fine line between creative expression and responsible content creation. Pushing boundaries can help them gain views and followers, but they risk account suspension if they share truly harmful or dangerous content.
Proactive Moderation vs. Reactive
TikTok currently uses a combination of proactive and reactive moderation to keep the platform safe. Proactive moderation involves TikTok actively scanning for policy-violating content using automated tools and human moderators. Reactive moderation relies on users to report content that violates guidelines.
Experts argue TikTok could improve by moderating more proactively. Rather than waiting for users to flag violating content, TikTok could use more advanced AI to identify risk factors and quickly remove inappropriate videos. However, developing highly accurate AI is challenging. TikTok must balance being proactive with avoiding over-censorship of benign content.
TikTok’s user reporting system enables community monitoring at scale. Users can report videos that seem to violate policies, and TikTok then reviews each report to determine whether action is needed. User reporting is vital for reactive moderation. However, over-reliance on reactive moderation places a burden on users, and TikTok aims to lighten this burden through increased proactive efforts.
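To make the hybrid workflow concrete, the sketch below models how proactive and reactive signals might feed a single human-review queue. This is a hypothetical illustration only: the class names, thresholds, and scoring logic are invented for this example and do not reflect TikTok’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    auto_risk_score: float = 0.0   # proactive signal: set by an automated classifier (hypothetical)
    user_reports: int = 0          # reactive signal: incremented each time a user flags the video

def needs_review(video: Video, auto_threshold: float = 0.8, report_threshold: int = 3) -> bool:
    """Flag a video for human review if either the proactive or the reactive
    signal crosses its (illustrative) threshold."""
    return video.auto_risk_score >= auto_threshold or video.user_reports >= report_threshold

def review_queue(videos: list[Video]) -> list[Video]:
    """Collect flagged videos and sort them so the highest-risk content
    reaches human moderators first."""
    flagged = [v for v in videos if needs_review(v)]
    return sorted(flagged, key=lambda v: (v.auto_risk_score, v.user_reports), reverse=True)

# Example: one video caught proactively, one caught by user reports, one clean.
videos = [Video("a", auto_risk_score=0.95), Video("b", user_reports=5), Video("c", auto_risk_score=0.1)]
queue = review_queue(videos)
print([v.video_id for v in queue])  # "a" (proactive) before "b" (reactive); "c" is never queued
```

The design point the sketch illustrates is that neither signal alone suffices: a low classifier score with many user reports still reaches reviewers, which is exactly the reactive safety net the paragraph above describes.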
Censorship Concerns
One major area of contention with TikTok has been around censorship and removal of certain videos from the platform. Critics have accused TikTok of limiting free speech by removing or downranking videos that are critical of China and its policies. For example, in November 2019, Canadian teenager Feroza Aziz had her account temporarily suspended shortly after posting a video criticizing China’s treatment of Uighur Muslims. As reported by CBC, TikTok claimed the suspension was due to an earlier satirical video she had posted and was unrelated to her criticism of China. However, the incident sparked further concerns about possible political censorship on the platform.
TikTok must walk a fine line between enforcing community guidelines and respecting free expression. According to TikTok’s policies, they do not remove content based on political sensitivities. However, their somewhat vague rules against promoting dangerous organizations or individuals could potentially lead to the suppression of political dissent. There are also worries that the platform’s Chinese origins and oversight lead it to censor topics deemed sensitive by the Chinese government. As TikTok continues to grow globally, the company will need to be more transparent in how it enforces its guidelines to avoid accusations of censorship.
The Future of Moderation
As TikTok continues to grow and evolve, there are several ways the platform can improve its approach to moderation going forward:
First, TikTok could invest more resources into developing its automated moderation systems using the latest AI technology. This would allow the platform to proactively identify and remove harmful content at scale, rather than relying solely on user reporting. For example, TikTok states its current automated systems are focused on finding violations of its Community Guidelines, but enhanced AI could also detect more nuanced problems like misinformation.
In addition, while TikTok has expanded its content moderation workforce, critics argue it should continue to grow these teams to keep pace with its rapid expansion worldwide. According to TikTok, its moderation capacity has grown along with app downloads, but continued investment here could improve response rates further.
Compared to competitors like Instagram and YouTube, TikTok is relatively new to content moderation at its current massive scale. As it solidifies its moderation strategies, TikTok can likely learn from the best practices and missteps of these more established platforms. However, TikTok also faces unique challenges due to factors like its algorithmically driven “For You” feed.
Overall, TikTok will need to be proactive and flexible in order to moderate content effectively moving forward. Striking the right balance between user safety, artistic expression, and public perception will continue to be an evolving challenge as the app expands its global reach.
Conclusion
In summary, while TikTok has faced criticism for allowing explicit content, the platform has put significant effort into improving its content moderation to better protect young users. TikTok updated its Community Guidelines to clarify what content is unacceptable and expanded its moderation team to remove harmful videos more proactively. However, this increased moderation raises concerns about censorship and impacts on artistic expression. There is an ongoing struggle to find the right balance between allowing freedom of speech and protecting users.
When it comes to explicit content, the interests of young users, adult users, artists and creators can sometimes be in conflict. Platforms like TikTok must aim to create an environment that allows for creative expression and meaningful connections, while also keeping vulnerable groups safe from inappropriate content. As TikTok’s moderation systems continue evolving, the goal should be protecting users in a way that is fair, transparent, and supportive of a diversity of voices.