TikTok removed nearly 350,000 videos related to election misinformation – CNET


TikTok was busy fighting election misinformation during the 2020 campaign.


James Martin/CNET

TikTok removed nearly 350,000 videos in the US for violating rules on election misinformation, disinformation or manipulated media in the second half of 2020, the short-form video app said Wednesday in a transparency report.

The platform removed a total of 347,225 videos and made an additional 441,028 ineligible for recommendation on the For You page, which serves as a home page for users.

TikTok said it began preparing for the election in 2019 and drew on the social media industry's experiences in the 2016 election cycle, when Russian trolls used social media in an effort to sow disinformation. Based on trends it had observed in the spread of misleading information, TikTok also prepared for domestic efforts to influence the 2020 election. It said its partnerships with fact-checkers allowed it to verify the accuracy of videos posted to the platform and to limit the distribution of those containing "unsubstantiated content."

“Our investment in building relationships with a range of experts improved our overall approach to platform integrity, from policies to enforcement strategies to product experiences in our app,” TikTok said.

TikTok became a political pawn in the months leading up to the contentious 2020 presidential election when then-President Donald Trump issued a pair of executive orders designed to force TikTok into a sale. Microsoft and Oracle were among the US companies seeking to buy TikTok’s American operations from ByteDance, a Chinese firm. A deal with Oracle and Walmart is currently on hold.