Facebook will change its algorithms to prevent discriminatory housing advertising and its parent company will subject itself to court oversight to settle a lawsuit brought by the U.S. Department of Justice on Tuesday.
In a release, U.S. government officials said Tuesday that Meta Platforms Inc., formerly known as Facebook Inc., reached an agreement to settle the lawsuit, which was filed the same day in Manhattan federal court.
According to the release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.
U.S. Attorney Damian Williams called the lawsuit “groundbreaking.” Assistant Attorney General Kristen Clarke called it “historic.”
Ashley Settle, a Facebook spokesperson, said in an email that the company was “building a novel machine learning method without our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups.”
She said the company would extend its new method for ads related to employment and credit in the U.S.
“We are excited to pioneer this effort,” Settle added in an email.
Williams said Facebook’s technology has in the past violated the Fair Housing Act online “just as when companies engage in discriminatory advertising using more traditional advertising methods.”
Clarke said “companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.”
According to terms of the settlement, Facebook will stop using an advertising tool for housing ads that the Justice Department said employed a discriminatory algorithm to locate users who “look like” other users based on characteristics protected by the Fair Housing Act. By Dec. 31, Facebook must stop using the tool, once called “Lookalike Audience,” which relies on an algorithm that the U.S. said discriminates on the basis of race, sex and other characteristics.
Facebook also will develop a new system over the next half-year to address racial and other disparities caused by its use of personalization algorithms in its delivery system for housing ads, the department said.
If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Per the settlement, Meta must also pay a penalty of just over $115,000.
The announcement comes after Facebook already agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance and others.
The changes announced then were designed so that advertisers who wanted to run housing, employment or credit ads would no longer be allowed to target people by age, gender or ZIP code.
The Justice Department said Tuesday that the 2019 settlement reduced the potentially discriminatory targeting options available to advertisers but failed to resolve other problems, including Facebook’s discriminatory delivery of housing ads through machine-learning algorithms.