Facebook takes a harder line on video clickbait following spam complaints

18 Aug 2017


Facebook feed. Image: weedezign/Shutterstock


In the ongoing fight against the scourge of clickbait, Facebook announces tighter regulations around spam video content.

In January, Facebook users may have noticed a change in the types of videos appearing in their News Feed. The company altered its algorithm in a way that essentially encouraged content creators to post longer videos.

In a blog post from January 26 2017, staff wrote: “As we continue to understand how our community consumes video, we’ve realised that we should therefore weight percent completion more heavily the longer a video is, to avoid penalising longer videos.”

Malicious links

Facebook's promotion of longer videos has led spammers to deceive users in recent months, with the aim of tricking them into clicking through to sites hosting malicious links.

Put simply, many users noticed static images disguised as videos appearing on their feeds, with a fake play button embedded in the image to create the appearance of a video.

To combat this underhanded algorithm manipulation, Facebook engineers announced two updates that would prevent these clickbait videos from spreading on the site.

“To limit this, during the coming weeks, we will begin demoting stories that feature fake video play buttons and static images disguised as videos in News Feed.”

Facebook warned that publishers who rely on these “intentionally deceptive practices” should brace themselves for a dramatic decrease in the distribution of these fake stories. Publishers that adhere to the best practices of Facebook won’t see much change in how their posts are distributed.

Facebook under the magnifying glass

The tougher regulatory approach taken by the company follows a barrage of criticism after the US presidential election.

Last year, The New York Times reported that the company was dealing with accusations that it enabled the spread of misinformation and propaganda news stories, which allegedly influenced how the electorate voted in the election.

As the unrest continues in the US, CEO Mark Zuckerberg recently made a statement doubling down on Facebook's stricter regulation practices. "With the potential for more rallies, we're watching the situation closely and will take down threats of physical harm," he said.


Ellen Tannam is a writer covering all manner of business and tech subjects.

editorial@siliconrepublic.com