This is Global Witness’s follow-up investigation into the platform’s content moderation.
Ahead of the Irish general elections this Friday (29 November), an investigation by international NGO Global Witness has found “significant failings” in TikTok’s ability to moderate disinformation and political content.
To test TikTok’s ability to moderate content, the NGO created 28 video ads with disinformation – 14 in English and 14 in Irish – submitting them to the social media platform, giving it 48 hours to review the submissions.
The disinformation the NGO sent for review included statements implying that voters could cast their votes on Facebook, or that voters needed to supply proof of two Covid-19 vaccinations to be allowed to vote.
According to the platform – which moderates videos through automated moderation technology alongside human reviewing – the type of disinformation the NGO submitted to TikTok violates the platform’s community guidelines and stated commitment to protect the integrity of elections.
However, TikTok approved 11 of the 28 ads – three in English and eight in Irish.
Global Witness acknowledged that social media companies claim to carry out further checks once an ad goes live. To investigate this, it submitted two ads, one in English and one in Irish, that referenced the upcoming election's date – information that was correct but against TikTok's guidelines. These ads were approved. "This finding suggests there are also gaps in TikTok's review process at the point of an ad going live," the NGO said.
Overall, Global Witness found that the platform's content moderation in both English and Irish fails to meet standards, with content in both languages revealing "major weaknesses in what should have been a very easy test".
Moreover, according to TikTok, the platform prohibits ads that reference, promote or oppose political candidates. The platform also prohibits any references to an election, including voter registration or turnout among other information.
A TikTok spokesperson confirmed to the NGO that all of the ads it submitted violated the social media platform's advertising policies, and said the company had conducted an investigation into why some of the ads were not rejected.
TikTok said it is “focused on keeping people safe and working to ensure that TikTok is not used to spread harmful misinformation”.
To tackle disinformation, the NGO has asked TikTok to pay content moderators fair wages, “robustly” enforce policies around election disinformation and publish information on the steps the platform has taken.
Global Witness conducted an investigation into TikTok in May, around the time of the EU parliamentary elections. The NGO submitted 16 ads – all of which went through the platform's moderation system. It found the platform's content moderation "so poor" that it submitted a complaint to the EU, requesting that the watchdog probe TikTok for breaches under the Digital Services Act.
At the time, TikTok said it had "instituted new practices for moderating ads that may be political in nature to help prevent this type of error from happening in the future".
Last year, TikTok claimed to have shut down 72 different accounts with a combined following of nearly 100,000 users that were spreading disinformation and divisive content in Ireland.
In 2021, the platform established a safety advisory council in Europe to assist with content moderation, setting up a new European Transparency and Accountability Centre in Ireland.
The EU enforces strict rules requiring very large platforms to moderate the content they host. Earlier this year, Telegram co-founder and CEO Pavel Durov was arrested in France over the messaging app's failure to stop criminal activity on the platform.