
New research suggests that people searching for certain important words or phrases may be getting an incomplete picture of China's history. Videos that criticize China or starkly portray its human rights abuses are more difficult to find on TikTok than on rival social networks.
U.S. TikTok users who search for terms like "Tiananmen," "Tibet," and "Uyghur" — words commonly targeted in Chinese Communist Party propaganda — see less "anti-China" content than those same searches produce on Instagram and YouTube, according to a new study from the Network Contagion Research Institute at Rutgers University.
Researchers created 24 fresh accounts across ByteDance Ltd.-owned TikTok, Meta Platforms Inc.'s Instagram and Alphabet Inc.'s YouTube to simulate the experience of American teens signing up for social media. When they searched for keywords commonly related to the Chinese government's human rights abuses, TikTok's search engine displayed a higher proportion of pro-China, neutral or irrelevant content than both Instagram and YouTube, the study found.
"What sets TikTok apart is that accurate knowledge about China's human rights abuses is most heavily crowded out on the platform," says Joel Finkelstein, chairman and chief technology officer of NCRI. People who used TikTok for three hours or more per day held significantly more positive views of China's human rights record than non-users, according to a survey conducted alongside the study.
A TikTok spokesperson disputed the NCRI's findings, saying that creating new accounts and conducting searches for these keywords does not accurately reflect the app's user experience. He also noted that some of the historical events in question occurred long before TikTok existed, and that TikTok is a newer service than its rivals.
"This non-peer-reviewed, flawed experiment was clearly engineered to reach a false, predetermined conclusion," a TikTok spokesperson said in a statement. "Creating fake accounts that use the app in a prescribed manner does not reflect the experience of real users, just as this so-called study does not reflect facts or reality."
TikTok, owned by a Beijing-based company, has faced intense scrutiny from U.S. lawmakers and regulators concerned about the Chinese government's influence over the social media app and its potential threat to national security. Earlier this year, President Joe Biden signed a law forcing ByteDance to sell the app by Jan. 19 or face a ban in the U.S.
The idea that TikTok could be used to spread pro-China messaging to American citizens, especially young people, has been a key factor in Congress's efforts to ban the app. During congressional testimony in March, FBI Director Christopher Wray warned of China's ability to "conduct influence operations" on TikTok, saying those efforts would be "extraordinarily difficult" to detect. The researchers at NCRI acknowledged that their study does not provide "definitive proof" that TikTok employees or the Chinese government purposefully altered the algorithm; users themselves add tags to content to make it surface as more relevant, which can also shape search results.
The analysis builds on NCRI's earlier findings, which showed that TikTok either amplifies or devalues content in accordance with the Chinese government's interests. That report was cited heavily by U.S. politicians who see the app as a threat to national security. TikTok Chief Executive Officer Shou Zi Chew called that previous report misleading when questioned about its findings during a Senate hearing earlier this year. TikTok pointed Bloomberg to a critique of that research published by the Cato Institute, a libertarian, free-market think tank. (One of the Cato Institute's key donors and former board members, Jeffrey S. Yass, is also a significant shareholder in TikTok's parent company, ByteDance.)
ByteDance and TikTok executives have repeatedly denied allegations that the Chinese government uses the social media app to disseminate propaganda, but those denials have failed to placate U.S. government officials. TikTok has since sued the U.S. government to overturn the divest-or-ban law, arguing that Congress has not substantiated its claims that the app is a national security threat.
The NCRI is an independent nonprofit organization composed of political scientists, security experts and research analysts. The group receives funding from Rutgers University, the British government and "private philanthropic families," Finkelstein said.
To conduct the study, researchers collected more than 3,400 videos related to the keywords "Uyghur," "Xinjiang," "Tibet" and "Tiananmen," terms researchers consider important to the Chinese government's messaging. On TikTok, Instagram and YouTube, researchers looked up each keyword and viewed the first 300 or so videos that were displayed. From there, each video was classified as either pro-China, anti-China, neutral or irrelevant by up to three human reviewers. Researchers pointed out that classifying content as pro-China or anti-China involved "subjective judgment." They also urged caution, saying that "despite efforts made to minimize bias, there still may be interpretative differences."
Videos that highlighted Uyghurs' plight in China, mentioned Tibetan liberation or contained imagery of the massacre at Tiananmen Square were classified as anti-China content by reviewers. Official CCP promotional messages, messages promoting the narrative that Tibet has been liberated, and patriotic images of Tiananmen Square with no mention of the massacre were considered pro-China content.
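The classification workflow described above amounts to a simple aggregation: each video receives up to three reviewer labels, disagreements are resolved somehow, and per-platform label shares are computed. The sketch below illustrates one plausible way to do that in Python; the function names, the majority-vote tie-breaking rule, and the sample data are assumptions for illustration, not NCRI's actual methodology or code.

```python
from collections import Counter

# Illustrative label set mirroring the study's four categories.
LABELS = {"pro-China", "anti-China", "neutral", "irrelevant"}

def majority_label(reviews):
    """Resolve up to three reviewer labels by majority vote.

    If no label wins a strict majority (a tie), fall back to 'neutral' —
    an assumed tie-breaking rule, not one stated in the study.
    """
    counts = Counter(reviews)
    top, n = counts.most_common(1)[0]
    return top if n > len(reviews) / 2 else "neutral"

def label_proportions(videos):
    """videos: iterable of (platform, [reviewer labels]).

    Returns {platform: {label: share of that platform's videos}}.
    """
    per_platform = {}
    for platform, reviews in videos:
        counts = per_platform.setdefault(platform, Counter())
        counts[majority_label(reviews)] += 1
    return {
        platform: {label: n / sum(counts.values()) for label, n in counts.items()}
        for platform, counts in per_platform.items()
    }

# Hypothetical sample: two labeled videos per platform.
sample = [
    ("TikTok", ["pro-China", "pro-China", "neutral"]),
    ("TikTok", ["irrelevant", "irrelevant"]),
    ("YouTube", ["anti-China", "anti-China", "neutral"]),
    ("YouTube", ["pro-China"]),
]
print(label_proportions(sample))
```

With real data, the per-platform shares produced this way would correspond to the percentages the study reports, such as the share of "Tiananmen" results labeled pro-China on each platform.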
The analysis determined that TikTok had the highest proportion of pro-China content among the three platforms for searches of the terms "Tibet" and "Tiananmen."
More than 25% of search results for "Tiananmen," for example, were considered pro-China, which researchers defined as patriotic songs, travel promotions or scenic depictions that make no mention of the 1989 massacre there. In comparison, only about 16% of search results on Instagram were pro-China, and just about 8% on YouTube. An Instagram spokeswoman declined to comment. YouTube representatives didn't immediately respond to a request for comment.
In some cases, Instagram and YouTube showed higher rates of pro-China content than TikTok. For "Uyghur" and "Xinjiang," about 50% of search results on YouTube returned pro-China content, compared with less than 25% on TikTok. Researchers attributed those findings to a few well-known accounts created by, or connected to, state actors.
___
© 2024 Bloomberg L.P.
Distributed by Tribune Content Agency, LLC.