Facebook and US Sign Deal to End Discriminatory Housing Ads

The META logo on a laptop screen in Moscow, on Oct. 28, 2021. (Kirill Kudryavtsev/AFP via Getty Images)

NEW YORK—Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will submit to court oversight, under a settlement of a lawsuit the U.S. Department of Justice filed Tuesday.

In a release, U.S. government officials said they had reached an agreement with Meta Platforms Inc., formerly known as Facebook Inc., to settle the lawsuit, which was filed simultaneously in Manhattan federal court.

According to the release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.

U.S. Attorney Damian Williams called the lawsuit “groundbreaking.” Assistant Attorney General Kristen Clarke called it “historic.”

Ashley Settle, a Facebook spokesperson, said in an email that the company was “building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups.”

She said the company would extend its new method for ads related to employment and credit in the U.S.

“We are excited to pioneer this effort,” Settle added in an email.

Williams said Facebook’s technology has in the past violated the Fair Housing Act online “just as when companies engage in discriminatory advertising using more traditional advertising methods.”

Clarke said “companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.”

Under the terms of the settlement, Facebook will stop using a housing-ad targeting tool once called “Lookalike Audience,” which the government said relies on a discriminatory algorithm to locate users who “look like” other users on the basis of race, sex, and other characteristics protected by the Fair Housing Act. Facebook must stop using the tool by Dec. 31, the Justice Department said.

Facebook also will develop a new system over the next half-year to address racial and other disparities caused by its use of personalization algorithms in its delivery system for housing ads, it said.

If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Per the settlement, Meta also must pay a penalty of just over $115,000.

The announcement comes after Facebook already agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit, and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance, and others.

The changes announced then were designed so that advertisers who wanted to run housing, employment, or credit ads would no longer be allowed to target people by age, gender, or ZIP code.

The Justice Department said Tuesday that the 2019 settlement reduced the potentially discriminatory targeting options available to advertisers but failed to resolve other problems, including Facebook’s discriminatory delivery of housing ads through machine-learning algorithms.

By Larry Neumeister
