Meta Settles Claims That Ads Violated U.S. Fair Housing Laws


Meta Platforms Inc. will change its ad delivery system to address concerns that it violates the Fair Housing Act by discriminating against users, as part of a settlement with a federal regulator.

The accord resolves a lawsuit by the US Department of Housing and Urban Development alleging that the algorithms used in Meta’s advertising systems allowed marketers to violate fair housing laws by limiting or blocking certain groups of people from seeing housing ads on the service.

“Because of this ground-breaking lawsuit, Meta will—for the first time—change its ad delivery system to address algorithmic discrimination,” Manhattan US Attorney Damian Williams said in a statement.


Meta said Tuesday that it has built machine learning technology to ensure that ads reach an audience that reflects the overall potential audience for a particular ad, not just a subset of that group.

In a blog post, Meta wrote that it will “work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” The company will also pay a fine of just over $100,000.

Meta said it will use the new technology for employment and credit ads as well as for housing.

Meta’s ad targeting capabilities have come under fire in recent years. In some cases, the company’s very specific targeting options may have enabled marketers to exclude certain groups from seeing ads for things like housing. In other cases, Meta’s targeting options were linked to a person’s protected characteristics, like race or religion.


In the HUD complaint, the US alleged that Meta’s algorithm allowed advertisers to find users who share similarities with groups of other individuals.

Meta hopes to get the new system up and running by the end of the year, said Roy Austin, the company’s vice president of civil rights. Austin added that Meta will also seek feedback on these changes from civil rights groups in the coming months. Many civil rights groups have been critical of the company’s use of personal data for targeting and how it can lead to discrimination.

The case is US v Meta Platforms Inc., 22-cv-5187, US District Court, Southern District of New York.

