A filter bubble refers to a situation where an algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past clicks, search history, and likes. This puts users inside of their own unique information bubble or “echo chamber”, where they are isolated from contradictory viewpoints and exposed primarily to information that reinforces their own perspectives.
As a short-form video sharing app, TikTok relies heavily on an algorithm to recommend content to users. The app launched in 2016 and quickly became one of the most popular social media platforms, with over 1 billion monthly active users as of September 2021. TikTok’s powerful “For You” recommendation algorithm is a big driver of this success, using machine learning and artificial intelligence to determine which videos to show each user.
However, there are concerns that TikTok’s hyper-personalized recommendations may lead some users into isolated spaces or filter bubbles, where they only see limited perspectives. The algorithm aims to engage users by showing them content similar to what they have previously liked, commented on, shared, or dwelled on, while excluding content it does not believe they will engage with. This could narrow users’ worldviews and amplify polarization if it feeds them too narrow a content diet. Understanding the potential risks of filter bubbles on TikTok is an important issue to explore.
How the TikTok Algorithm Works
The TikTok algorithm uses machine learning and deep learning techniques to recommend videos to users. Some of the key factors that influence what videos are recommended include:
User interactions – The algorithm pays close attention to how users engage with videos. This includes things like which videos you like, share, comment on, watch multiple times, or watch for an extended period. Videos similar to ones you interact with will be recommended more.
Video information – Details about the video itself also influence recommendations. This includes information extracted from the video content like objects, sounds, and text detected in the video. The language spoken and hashtags used also provide signals to the algorithm.
Device and account settings – User account settings like age, gender, location, and device type are factored in. For example, recommendations may be tailored for different geographic regions.
Creators you follow – Videos from creators you actively follow are more likely to be recommended. TikTok studies creators you engage with and shows you more of them.
Overall, machine learning allows TikTok to model user interests based on multiple signals and patterns in data. The system gets smarter over time by continuously learning from new user data and engagement. Recommendations become more personalized as users interact more on the platform.
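TikTok has not disclosed exactly how these signals are weighted, but the general idea of scoring candidate videos from engagement signals can be sketched in a few lines. The Python snippet below is a simplified illustration only; the signal names, weights, and linear scoring are assumptions made for this article, not TikTok’s actual model.

```python
from dataclasses import dataclass

# Hypothetical engagement signals for one candidate video.
# The field names and weights are illustrative assumptions,
# not TikTok's actual features or parameters.
@dataclass
class Signals:
    watch_fraction: float    # share of the video the user watched (0.0-1.0)
    rewatched: bool          # watched more than once
    liked: bool
    shared: bool
    commented: bool
    follows_creator: bool
    topic_similarity: float  # similarity to topics the user engaged with before (0.0-1.0)

def score(s: Signals) -> float:
    """Combine signals into a single ranking score (toy linear model)."""
    return (
        2.0 * s.watch_fraction
        + 1.5 * s.rewatched
        + 1.0 * s.liked
        + 1.5 * s.shared
        + 1.2 * s.commented
        + 0.8 * s.follows_creator
        + 2.5 * s.topic_similarity  # a heavy "more of the same" weight is what drives the bubble
    )

# Rank a few candidate videos for the For You feed.
candidates = {
    "dance_clip": Signals(0.9, True, True, False, False, True, 0.95),
    "news_explainer": Signals(0.4, False, False, False, False, False, 0.20),
}
ranked = sorted(candidates, key=lambda name: score(candidates[name]), reverse=True)
print(ranked)  # videos closest to past behaviour float to the top
```

Even in this toy version, the topic-similarity term shows why an engagement-optimized feed tends to drift toward “more of the same.”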
Impact on Users
One major concern around TikTok’s algorithm is that it can create “filter bubbles” that narrow users’ interests and perspectives over time. As the algorithm learns what content each user engages with, it curates their For You Page to show more of that type of content. This can gradually isolate users into limited interest areas and viewpoints.
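The feedback loop behind that narrowing is simple enough to simulate. The toy Python model below is purely illustrative (it is not how TikTok’s recommender is built, and the squared weighting is an assumption chosen to make the effect visible): topics are shown in proportion to past engagement, each recommendation generates more engagement with the same topic, and the feed gradually collapses toward one interest.

```python
import random
from collections import Counter

random.seed(0)

# Toy feedback-loop model, not TikTok's actual system.
topics = ["comedy", "news", "science", "sports", "cooking"]
engagement = Counter({t: 1 for t in topics})
engagement["comedy"] += 1  # a slight initial lean toward one topic

for _ in range(500):
    # The "algorithm" picks what to show based on past engagement.
    # Squaring the counts makes the reinforcement superlinear, an
    # exaggeration chosen so the snowball effect is easy to see.
    weights = [engagement[t] ** 2 for t in topics]
    shown = random.choices(topics, weights=weights, k=1)[0]
    # The simulated user engages with whatever is shown, feeding the
    # same signal straight back into the recommender.
    engagement[shown] += 1

total = sum(engagement.values())
for t in topics:
    print(f"{t:8s} {engagement[t] / total:6.1%} of the feed")
# One topic ends up dominating the feed even though the user never asked
# for a narrower experience: the bubble emerges from the loop itself.
```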
Some experts worry this filtering effect could lead to increased radicalization, as users see increasingly extreme content within their niches. As Eli Pariser explained when coining the term, “filter bubbles tend to have a militant political edge” (The unique power of TikTok’s algorithm). Users may not even realize how narrowly focused their feeds have become over time.
While TikTok’s recommendations aim to entertain, critics argue the hyper-personalized feeds shape users’ worldviews in concerning ways. As Pariser said, “you’re confined in an echo chamber of your own ideas” (Filter bubbles: a delicate interaction between algorithms and human bias). This isolation can foster radicalization and polarization.
Mitigating Filter Bubbles
TikTok has taken some steps to try to mitigate the filter bubble effect and expose users to more diverse content. One action they’ve taken is to make new accounts public by default when users first join the platform, rather than private. According to TikTok, this encourages “openness and creativity” and allows a user’s content to be more widely discoverable from the start.
TikTok has also made changes to the For You page algorithm to promote “maximum diversity” and reduce repetitive or hyper-targeted recommendations. They tweak the algorithm to balance showing users content they will likely enjoy with some videos outside their normal interests. This introduces more perspectives beyond a user’s filter bubble.
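TikTok has not described the mechanics of this balancing in detail, but a common way to implement it is to reserve a slice of every feed for content outside the user’s inferred interests. The sketch below assumes a simple exploration share; the function name and the 20% figure are illustrative choices, not TikTok’s published values.

```python
import random

def build_feed(personalized, exploratory, feed_size=10, explore_share=0.2):
    """Mix mostly personalized picks with a few out-of-interest videos.

    Generic exploration/exploitation sketch; `explore_share` is an assumed
    parameter, not a documented TikTok setting.
    """
    n_explore = max(1, int(feed_size * explore_share))
    feed = personalized[:feed_size - n_explore] + random.sample(exploratory, n_explore)
    random.shuffle(feed)  # interleave so the diverse picks are not bunched together
    return feed

# Example: top-ranked personalized candidates plus a pool of unrelated topics.
personalized = [f"fashion_video_{i}" for i in range(20)]
exploratory = ["science_explainer", "history_clip", "travel_vlog", "local_news"]
print(build_feed(personalized, exploratory))
```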
Comparisons to Other Platforms
TikTok is not the only social media platform accused of creating filter bubbles through its algorithm. Instagram and Facebook have also faced criticism for promoting certain types of content over others, potentially trapping users in bubbles of repetitive, homogeneous content.
However, TikTok’s aggressive algorithm takes personalization to another level. While Instagram and Facebook tailor feeds based on past engagement, TikTok rapidly learns users’ interests through cues like hashtags and favorited videos. It then fills the “For You” feed with remarkably similar content, possibly more so than competitors. This creates a strong filter bubble effect as users see less diverse perspectives.
Criticisms and Controversies
TikTok has faced its fair share of criticisms and controversies since skyrocketing in popularity. Two of the biggest concerns surrounding the platform relate to data privacy and accusations of censorship.
In terms of privacy, TikTok has come under fire for its data collection practices. As the app is owned by a Chinese company, ByteDance, there are worries that user data could be accessed by the Chinese government. In 2019, TikTok paid a $5.7 million fine in the US over allegations it illegally collected personal information from children (Wikipedia). TikTok claims that data from US users is now stored on servers in the US and Singapore. However, some experts argue vulnerabilities remain.
Accusations of censorship have also plagued TikTok. The platform is not available in China, where ByteDance operates a similar but separate app called Douyin. However, there is evidence TikTok has downweighted or removed content deemed politically sensitive by the Chinese government (NYTimes). For example, posts related to the Hong Kong protests and China’s treatment of Uyghur Muslims have reportedly been affected. TikTok claims it does not remove content based on sensitivities related to China.
TikTok’s Responses
TikTok has faced criticism that its algorithm creates filter bubbles by overly customizing each user’s experience. In response, TikTok has emphasized that the main aim of its algorithm is to customize the experience and recommend content users will enjoy, not to limit their exposure.
In a blog post, TikTok stated that “the system aims to avoid recommending content that is blatantly false, misleading or meant to shame individuals.” The post acknowledges that some customization is inherent in any recommendation algorithm, but claims TikTok is committed to transparency and giving users control.
TikTok has released periodic transparency reports detailing how much content is removed and how many accounts are banned. However, critics argue these reports do not provide enough insight into how the closely guarded recommendation algorithm actually works.
Research on Actual Effects
So far, there have been few academic studies on the actual effects of filter bubbles created by TikTok’s algorithm; most of the evidence to date has been anecdotal.
Kevin Zawacki of the Mozilla Foundation’s TheirTok project has done some early, anecdotal research on TikTok bubbles that raises concerns. He has noticed that when using separate accounts set up to reflect, for example, different genders, races, or religions, the content served by TikTok can be extremely different.
Full-scale academic studies quantifying TikTok’s filter bubble effects are still lacking. However, some computer science researchers have begun preliminary investigations by analyzing dumps of TikTok data or running controlled experiments with bot accounts.
User Suggestions
One way users can mitigate filter bubbles is by following a diverse range of accounts outside of their usual interests. TikTok’s algorithm tends to serve content similar to accounts a user already follows. By actively seeking out and following people of different backgrounds, locations, viewpoints, and interests, users can introduce more variety into their For You feed.
For example, a user interested in fashion can also follow science communicators, travelers, historians, artists, and activists. Even if the user doesn’t engage much with content outside their niche at first, the algorithm will start blending more diversity into recommendations over time as it learns the user’s expanded interests. This “taste developing” approach takes some effort but can have a big impact.
Users can also periodically reset TikTok’s recommendation algorithm by tapping the profile icon, choosing “Settings and privacy,” selecting “Account,” and turning “Personalized recommendations” off for a few days. This clears the algorithm’s impression of the user’s interests, and new content may surface once it is turned back on. Resetting periodically forces the algorithm to recalibrate its suggestions. However, taste developing by following diverse accounts tends to have a longer-lasting impact.
Conclusion
In summary, filter bubbles on TikTok refer to how the platform’s personalized algorithm can limit the diversity of content that users see. The algorithm analyzes user behavior and preferences to serve them more of what they previously engaged with. This creates “filter bubbles” or “echo chambers” around certain topics and viewpoints.
While filter bubbles are not unique to TikTok, some critics argue they may be more prevalent given how advanced and central the algorithm is. However, research on the actual effects remains limited. There are also ways users can proactively mitigate filter bubbles by seeking out new content, accounts, and topics.
Going forward, some key questions are: How can TikTok balance personalization and diversity? How does immersive content affect filter bubbles? And how do bubbles impact different demographics – especially younger users who spend lots of time on the platform? More research is needed to fully understand the implications of filter bubbles on TikTok and similar platforms.