The company was accused of letting advertisers exclude protected groups from their campaigns. | Illustration by Alex Castro / The Verge
The US government and Facebook parent company Meta have agreed to settle a lawsuit accusing the company of facilitating housing discrimination by letting advertisers specify that ads not be shown to people in certain protected groups, according to a press release from the Department of Justice (DOJ). You can read the full agreement below.
The government first brought a case against Meta over algorithmic housing discrimination in 2019, though accusations about the company’s practices go back years earlier. The company took some steps to address the issue, but they clearly weren’t enough for the feds. The department says this was its first case dealing with algorithmic violations of the Fair Housing Act.
The settlement, which will have to be approved by a judge before it’s truly final, says that Meta will have to stop using a discriminatory algorithm for housing ads and instead develop a system that will “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system.”
Meta says this new system will replace its Special Ad Audiences tool for housing ads, as well as for credit and employment opportunities. According to the DOJ, the tool’s algorithms let advertisers target people who were similar to a pre-selected group. In deciding who would see an ad, the DOJ says, Special Ad Audiences took into account factors like a user’s estimated race, national origin, and sex, meaning it could end up cherry-picking who saw housing ads, in violation of the Fair Housing Act. In the settlement, Meta denies wrongdoing and notes that the agreement doesn’t constitute an admission of guilt or a finding of liability.
In a statement on Tuesday, Meta announced that it plans on tackling this problem with machine learning, building a system that will “ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” In other words, the system is supposed to ensure that the demographic makeup of the people who actually see an ad matches the makeup of the audience eligible to see it. Meta will compare the two along age, gender, and race to measure how far the actual audience drifts from the eligible one.
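Meta hasn’t published the internals of this system, but the basic measurement it describes is straightforward to sketch. Below is a minimal, hypothetical illustration (not Meta’s actual code): it computes each demographic group’s share of two audiences and reports the gap between them as a total variation distance, where 0.0 means the mixes match exactly and 1.0 means they don’t overlap at all. All names and the sample data are invented for this example.

```python
from collections import Counter

def demographic_mix(audience, attribute):
    """Return each group's share of the audience for one attribute
    (e.g. 'age', 'gender', 'race')."""
    counts = Counter(person[attribute] for person in audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def mix_gap(eligible, actual, attribute):
    """Total variation distance between the eligible-audience mix and
    the actual-audience mix: 0.0 = identical shares, 1.0 = no overlap."""
    eligible_mix = demographic_mix(eligible, attribute)
    actual_mix = demographic_mix(actual, attribute)
    groups = set(eligible_mix) | set(actual_mix)
    return 0.5 * sum(abs(eligible_mix.get(g, 0.0) - actual_mix.get(g, 0.0))
                     for g in groups)

# Hypothetical data: the population eligible to see a housing ad is an
# even gender split, but delivery skewed the actual audience toward men.
eligible = [{"gender": "f"}] * 50 + [{"gender": "m"}] * 50
actual = [{"gender": "f"}] * 20 + [{"gender": "m"}] * 80

print(round(mix_gap(eligible, actual, "gender"), 3))  # 0.3
```

A real system would repeat this check per attribute (age, gender, estimated race or ethnicity) and adjust ad delivery whenever the gap exceeds some threshold; the settlement leaves those thresholds to the compliance process described below.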
Per the settlement, the company has until the end of December 2022 to build the system into its platform and prove to the government that it works as intended.
The company promises to share its progress as it builds the new system. If the government approves it and it’s put into place, a third party will “investigate and verify on an ongoing basis” that it’s actually making sure ads are shown in a fair and equitable way.
Meta will also have to pay a $115,054 penalty. While that’s effectively nothing for a company bringing in billions every month, the DOJ notes that it’s the maximum amount allowed for a Fair Housing Act violation.