Whenever you have an hour to kill, want to procrastinate, watch educational videos, or get your daily dose of cringe, you’re likely to go to TikTok, which will happily recommend funny videos of cats, dogs, clumsy kids, and much more. TikTok will do everything to make you perceive it as a social network where happiness, laughter, goodness, and other positive emotions reign supreme.
However, there is another side to this Chinese social media platform, and TikTok is doing its best to hide the ugly truth from its users.
Why will you never find out about the Uighurs on TikTok, yet have a very good chance of coming across dangerous challenges and Russian influence operations there? OPORA’s analyst Olha Snopok explains in her blog.
How TikTok decides what you watch
As you already know, the key element of any social network is the algorithm that determines what appears in your feed. It analyzes dozens of your characteristics, including your gender, age, and activity (for example, how long you spend watching funny videos about raccoons). Based on this analysis, TikTok recommends other videos that are likely to be to your liking and to bring you back whenever you have spare time.
However, there is another important point to remember: algorithms not only decide what you see but also determine what becomes popular on TikTok. At the same time, TikTok doesn’t care much about what kind of content gains popularity – the main thing is that users watch, like, and share it over and over again. Because of this, there is a risk of harmful or even dangerous trends appearing in your feed, as the sketch below illustrates.
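To make this mechanism a little more tangible, here is a deliberately simplified sketch of how an engagement-driven feed might rank videos. The field names, weights, and scoring formula are my own illustrative assumptions, not TikTok’s actual (and secret) algorithm; the point is only that a ranking based purely on watching, liking, and sharing is blind to what the video actually shows.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    watch_time_sec: float  # average time users spend watching
    likes: int
    shares: int

def engagement_score(video: Video) -> float:
    """Toy scoring rule: the more a video is watched, liked, and shared,
    the higher it ranks. The weights are arbitrary illustrative values."""
    return 1.0 * video.watch_time_sec + 2.0 * video.likes + 3.0 * video.shares

def build_feed(candidates: list[Video], top_n: int = 3) -> list[Video]:
    """Rank candidate videos purely by engagement, regardless of content."""
    return sorted(candidates, key=engagement_score, reverse=True)[:top_n]

candidates = [
    Video("Funny raccoon", watch_time_sec=25, likes=120, shares=10),
    Video("Dangerous challenge", watch_time_sec=40, likes=300, shares=80),
    Video("News explainer", watch_time_sec=15, likes=40, shares=5),
]

for v in build_feed(candidates):
    print(v.title)
```

In this toy example, the “dangerous challenge” video tops the feed simply because it keeps people watching and sharing the longest.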
One example of a dangerous trend is the so-called Benadryl challenge, which became popular with teenagers in the United States in 2020. It encourages teenagers to film themselves taking 12-14 pills of the allergy drug Benadryl, which is well beyond the recommended dose and causes hallucinations. It wasn’t until 2023, after a second child died of a Benadryl overdose and all the leading American media reported on it, that TikTok began responding to this dangerous trend.
This coincided with the growing popularity of the blackout challenge, whose participants hold their breath or deliberately squeeze their necks until they lose consciousness, recording the whole process on video. According to Bloomberg, the blackout challenge claimed the lives of at least 15 children under the age of 12.
TikTok’s senior managers tried to distance themselves from the blackout challenge in every possible way. They claimed that children had learned about the challenge from sources other than their platform and stressed that the problem is not confined to any single social network but lies in the way all social media work: since every social network in the world relies on similar algorithms, TikTok, they argued, is not to blame for the situation.
Indeed, algorithms can cause overly depressing and radical videos to appear in your feed. If you spend a considerable amount of time consuming sad or disturbing content, TikTok will conclude that you like it and offer you more of the same (see the sketch below). Investigations carried out by several international outlets, including the New York Times and the Washington Post, point to the same conclusion: if you watch depressing content on TikTok long enough, sooner or later it will start showing you videos that encourage you to harm yourself, even if such videos violate the platform’s own rules.
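The feedback loop described above can be sketched in a few lines of code. Again, this is a hypothetical, heavily simplified model of my own: watch time is the only signal, and the recommender simply shifts the feed toward whatever topic was watched longest.

```python
import random

# Hypothetical topic weights the recommender "learns" from watch time alone.
topic_weights = {"funny": 1.0, "sad": 1.0, "disturbing": 1.0}

def recommend() -> str:
    """Pick the next video's topic in proportion to the learned weights."""
    topics, weights = zip(*topic_weights.items())
    return random.choices(topics, weights=weights)[0]

def watch(topic: str, seconds: float) -> None:
    """The only feedback signal: time spent watching reinforces that topic."""
    topic_weights[topic] += seconds / 10

# A user who merely lingers on sad videos, without liking or sharing anything:
for _ in range(50):
    topic = recommend()
    watch(topic, seconds=30 if topic == "sad" else 5)

print(topic_weights)  # the "sad" weight dominates, so the feed keeps narrowing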
Although TikTok denied all accusations that it generates harmful content with tragic consequences, this social media platform has every means of controlling what its users see in their feeds.
Yes, I’m talking about the infamous content moderation, which, on the one hand, is supposed to clean the feed of dangerous content that violates TikTok’s policies and, on the other, allows TikTok to remove or hide anything that does not suit its interests. One such example is the way this platform treats the topic of the Uighurs.
The Uighurs are a Turkic ethnic group primarily residing in the Xinjiang Uighur Autonomous Region of China. According to various estimates, there are 10 to 11 million Uighurs living in China. The Uighurs have their own language and practice Islam. They are closely related to Kazakhs in terms of traditions and culture.
Since 2014, the Uighurs have been severely persecuted by the Chinese authorities under the slogan of “fighting terrorism”. Representatives of this ethnic group are abducted off the streets and sent straight to the so-called “vocational education and training centers”. Presumably, the Chinese government is deeply concerned about the educational level and professionalism of the Uighurs, which is why it sends them to these innovative centers without providing any explanations (I’m just kidding). The people who “studied” in these centers say they were abused and forced to recite communist propaganda every day and to abandon their faith, culture, and language.
TikTok is trying hard to hide this truth from the world. In November 2019, an incident involving one of its users made the international community talk about the violation of Uighur rights with renewed vigor.
Feroza Aziz had been posting makeup tutorial videos on TikTok, in which she also shared personal stories about dating, school life, and her relationship with her parents. But one day she posted a video that drew her followers’ attention to the persecution of Uighurs in China. The video quickly went viral and received over 1.4 million views. In it, Feroza Aziz described the Chinese government’s repressive measures against the Uighurs, including the “vocational education and training centers” that essentially serve as concentration camps.
How did TikTok react to this video?
Feroza’s account was temporarily blocked after she published the video. TikTok tried to justify itself by saying that the account was blocked due to previous violations of company rules and that this had nothing to do with her criticism of the Chinese government’s policies.
However, TikTok blocked Feroza’s new account as soon as she mentioned the Uighurs again. When the story received widespread publicity, TikTok apologized and tried to convince everyone that the video had been deleted due to a “moderation error”, and that both of Feroza’s accounts were blocked because she had first “posted a video depicting Osama bin Laden” and then violated the company’s policy that prohibits users from creating a second account on the same device.
However, this explanation doesn’t sound convincing to us. Feroza’s TikTok account is currently inactive; her last videos were posted back in 2020. According to the information on her page, she is still 19 years old, although she is in fact 22 now. It can therefore be assumed that Feroza has no access to her TikTok account and can no longer post makeup tutorials telling her audience about the problems of the Uighurs in China.
What does all this mean for users from Ukraine?
Representatives of TikTok have declared that their company protects Ukraine’s information space, cooperates with the Ukrainian authorities, and that their platform has always been a bastion of truth and integrity. Reality, however, tells a different story: virtually every Ukrainian who has used this social network has come across Russian disinformation, at the very least.
Texty.org.ua recently found that Russians maintain a strong presence in the Ukrainian segment of TikTok: they have created dozens of sock-puppet farms to promote narratives that benefit them. Russian users leave thousands of comments and post their own videos discrediting the Ukrainian government and the military recruitment centers in an attempt to sow discord within Ukrainian society.
Furthermore, Russians use TikTok to promote their own content. This works especially well in combination with excerpts from Russian movies and comedy shows, which are actively penetrating the Ukrainian segment of TikTok.
Does TikTok moderate digital trash?
Unfortunately, it doesn’t: this digital trash still racks up likes and views. The influence of comedy shows is somewhat less noticeable, but excerpts from Russian news and TV programs have a strong impact on Ukrainian society, especially on young people, who are very active on social media.
At the same time, TikTok occasionally blocks and deletes content published by Ukrainian authors. For example, at the beginning of the full-scale invasion, TikTok blocked Detector Media’s reports about Bucha and did not allow them to publish anything about Azov. Today, TikTok periodically deletes videos by Ukrainian soldiers who share stories about the war; according to TikTok’s administration, these videos are removed because users don’t want to see violent content.