Video-sharing app TikTok has updated its content policies to curb misinformation on its platform ahead of the 2020 presidential election in the United States, the company said today.
The app, which has come under fire from U.S. lawmakers and the Trump administration over national security concerns due to its Chinese ownership, said it was working with experts from the U.S. Department of Homeland Security to “protect against foreign influence”.
TikTok said it would expand partnerships with PolitiFact and Lead Stories to fact-check potential misinformation about the election. It will also allow users to report vote-related misinformation on the app, the company said in a blog post.
The company, which does not allow political advertising and said in the blog post it was not the “go-to app to follow news or politics,” has increasingly emerged as a platform for political discourse and activism. Users recently said they helped inflate attendance expectations at U.S. President Donald Trump’s June rally in Tulsa, Oklahoma.
The hugely popular app, which allows users to create short videos with special effects and music clips, has also been used to share false claims such as COVID-19 misinformation.
TikTok said it was adding a specific policy to prohibit synthetic or manipulated content that misleads users in a way that could cause harm. In recent days, a doctored video of House Speaker Nancy Pelosi has spread virally across social media platforms, including TikTok.
The changes are the latest moves by TikTok to combat misinformation, an issue that major social media companies including Facebook and Twitter have long struggled to police on their own platforms.
TikTok owner ByteDance is the first Chinese company to achieve global success with a consumer app. However, amid rising U.S.-China tensions, the White House has threatened to ban TikTok and other Chinese-owned apps, citing national security risks.