Facebook removes former UCD professor’s page over Covid misinformation

14 Dec 2021


Meta said Dolores Cahill’s Facebook page was removed as it takes ‘aggressive steps’ to fight harmful misinformation.

Facebook has removed the page of Dolores Cahill as part of a crackdown on Covid-19 misinformation on its platform.

Cahill is a former University College Dublin (UCD) professor who has become known in recent months for controversial claims about Covid-19 and vaccines.

A spokesperson for Meta, Facebook’s parent company, confirmed to SiliconRepublic.com that Cahill’s page was removed as part of “aggressive steps to fight harmful Covid-19 misinformation” on its platforms. The Irish Times reports that the page had more than 130,000 followers.

Cahill, whose employment with UCD ended earlier this year, has made many claims about Covid-19 and vaccines that were found to be false or misleading.

A study published by the UK-based Institute for Strategic Dialogue in October found that the World Doctors Alliance, a group co-founded by Cahill that spread Covid-19 misinformation and conspiracy theories, nearly doubled the number of interactions it got on Facebook in the first six months of this year.

“Since the pandemic began, we’ve removed over 16m pieces of content from Facebook and Instagram containing harmful Covid-19 misinformation and have taken down groups and pages for repeatedly sharing this material,” the Meta spokesperson added.

“This includes Dolores Cahill’s Facebook page. We’ve also added warning labels to more than 167m pieces of additional Covid-19 content thanks to our network of fact-checking partners.”

Misinformation moves

Throughout the pandemic, social networks have worked to cut down on Covid-19 misinformation and tackle false claims being spread on their platforms.

This has ranged from Facebook and Twitter removing a video last year in which former US president Donald Trump falsely claimed that children are “almost immune” to Covid-19, to YouTube recently banning all anti-vaccination content.

Meta said that under its Covid-19 misinformation policy, it now removes false claims – identified with partners such as the World Health Organization – about the safety, efficacy, ingredients or side effects of vaccines and conspiracy theories about Covid-19 vaccines.

It is also looking to provide authoritative information to Facebook and Instagram users, and notify them when they’ve interacted with content that has been flagged as Covid-19 misinformation.

Earlier this month, Meta revealed that it removed a number of disinformation networks from Facebook and Instagram this year that were engaging in coordinated inauthentic behaviour to mislead other users.

This included accounts linked to an anti-vaccination conspiracy movement, and accounts, pages and groups linked to a Chinese network spreading Covid-19 disinformation. This group was pushing claims that the US was interfering with the search for the origins of the virus and trying to place the blame on China.


Sarah Harford was sub-editor of Silicon Republic

editorial@siliconrepublic.com