Facebook admits anti-discrimination advertising policy isn’t working

22 Nov 2017

Facebook Ad Manager app. Image: dennizn/Shutterstock 

Despite a federal ban on housing discrimination in the US, Facebook allowed advertisements that excluded minorities, according to a new investigation.

Facebook’s difficulties with its advertising policies are by now notorious, and the company has been discussing how to solve the problem of political ad purchases for quite some time.

CEO Mark Zuckerberg is keen to present a model of self-regulation, but it appears that there are still lingering issues when it comes to the company’s advertising strategy.

Continued issues for Facebook ads

Last year, ProPublica investigated the company’s ad system and found that it allowed advertisers to exclude specific ethnic groups, categorised in a pool called ‘Ethnic Affinities’, from seeing housing-related ads.

Since this report, ProPublica has followed up with an investigation, which found that Facebook still allows for racial discrimination within its advertising model. Journalists bought dozens of ads for rental housing but asked that they not be visible to user groups such as Jews, Spanish speakers and African Americans, among others.

These groups and more are ostensibly protected under the federal Fair Housing Act in the US, which makes it illegal to publish ads “with respect to the sale or rental of a dwelling that indicates any preference, limitation or discrimination based on race, color, religion, sex, handicap, familial status or national origin”. Those who violate the act can be subject to heavy fines.

In effect, Facebook provides a workaround for those looking to enact discriminatory housing policies: every ad purchased by ProPublica was approved within a matter of minutes. Most of the ads took around three minutes to be granted approval, while it took 22 minutes for an ad excluding people “interested in Islam, Sunni Islam and Shia Islam” to be published successfully.

According to its own policies, Facebook should have flagged the advertisements and prevented some of them from being uploaded in the first place. ProPublica was told by the US Department of Housing and Urban Development that an inquiry into Facebook’s advertising policies had been closed, “reducing pressure on the company to address the issue”.

A failure on Facebook’s behalf

Facebook spokesperson Ami Vora gave a statement to The Verge on the issue, describing the advertising loopholes as a failure in enforcement. “Earlier this year, we added additional safeguards to protect against the abuse of our multicultural affinity tools to facilitate discrimination in housing, credit and employment.

“The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.”

Vora said that although Facebook’s systems “continue to improve”, the company can do better.

“While we currently require compliance notifications of advertisers that seek to place ads for housing, employment and credit opportunities, we will extend this requirement to all advertisers who choose to exclude some users from seeing their ads on Facebook to also confirm their compliance with our anti-discrimination policies and the law.”

This type of targeting is legal for advertisements touting films, for example, but housing, employment and credit advertisements are ripe for exclusionary practices, as the ProPublica investigation clearly demonstrates.

Although there is certainly a lot to be said for the benefits of algorithmic ad targeting, the potential for these systems to be manipulated towards bigoted ends is something that needs to be explored and remedied.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects
