Meta reaches agreement with US government over housing discrimination enabled by ad targeting


The US government and Facebook’s parent company Meta have agreed to a settlement resolving a lawsuit that accused the company of facilitating housing discrimination by allowing advertisers to specify that ads not be shown to people belonging to certain protected groups, according to a press release from the Department of Justice (DOJ). You can read the full agreement below.

The government filed a lawsuit against Meta over algorithmic housing discrimination in 2019, though accusations about the company’s practices date back years before that. The company took some steps to address the issue, but they were clearly not enough for the feds. The department says this was its first case dealing with algorithmic violations of the Fair Housing Act.

The settlement, which will have to be approved by a judge before it is final, says Meta must stop using a discriminatory algorithm for housing ads and instead develop a system that “addresses racial and other disparities caused by the use of personalization algorithms in its ad delivery system.”

Meta says this new system will replace its Special Ad Audiences tool for housing, as well as for credit and employment opportunities. According to the DOJ, the tool and its algorithms allowed advertisers to reach people similar to a pre-selected group. When deciding who to show ads to, the DOJ says, Special Ad Audiences took into account factors such as a user’s estimated race, nationality, and gender, which means those characteristics could end up determining who saw housing ads, a violation of the Fair Housing Act. In the settlement, Meta denies wrongdoing and notes that the agreement does not constitute an admission of guilt or a finding of liability.
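
As a rough illustration of the mechanism the DOJ describes, here is a minimal sketch, not Meta’s actual implementation, of how a lookalike-style audience can inherit the demographic skew of its seed group when protected characteristics are among the matching features. The function names, feature list, and threshold are hypothetical.

def similarity(user, seed_profile, features):
    # Fraction of features on which this user matches the seed audience's
    # most common value.
    return sum(user[f] == seed_profile[f] for f in features) / len(features)

def lookalike_audience(users, seed_profile, features, threshold=0.5):
    # Select users whose overlap with the seed profile exceeds the threshold.
    # If `features` includes estimated race, nationality, or gender, the
    # selected audience inherits the seed group's skew on those attributes.
    return [u for u in users if similarity(u, seed_profile, features) >= threshold]

Because the score rewards matching on every listed feature, a seed audience that skews toward one estimated race or gender pulls the selected audience toward the same skew, which is how such a tool can effectively decide who sees a housing ad.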


In a statement on Tuesday, Meta said it plans to address the issue with machine learning, building a system that “will ensure that the age, gender, and estimated race or ethnicity of the overall audience of a housing ad matches the age, gender, and estimated race or ethnicity mix of the population eligible to see this ad.” In other words, the system must ensure that the people who actually see an ad reflect the audience that is targeted and eligible to see it. Meta will analyze age, gender, and estimated race or ethnicity to measure how far the actual audience drifts from the eligible one.
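
To make the idea of measuring that gap concrete, here is a minimal sketch, again hypothetical rather than Meta’s implementation, of comparing the demographic mix of the population eligible to see an ad with the mix of people who actually saw it. The attribute names and the threshold are assumptions for illustration.

from collections import Counter

def demographic_mix(audience, attribute):
    # Share of each value of `attribute` (e.g. age bucket, gender,
    # estimated race/ethnicity) within an audience.
    counts = Counter(person[attribute] for person in audience)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

def variance(eligible, actual, attribute):
    # Total variation distance between the eligible population's mix and
    # the actual viewers' mix for one attribute: 0.0 means the mixes match
    # exactly, 1.0 means no overlap at all.
    eligible_mix = demographic_mix(eligible, attribute)
    actual_mix = demographic_mix(actual, attribute)
    keys = set(eligible_mix) | set(actual_mix)
    return 0.5 * sum(abs(eligible_mix.get(k, 0.0) - actual_mix.get(k, 0.0)) for k in keys)

MAX_ALLOWED_VARIANCE = 0.1  # illustrative threshold, not a figure from the settlement

def delivery_is_fair(eligible, actual):
    # Flag a housing ad whose delivery drifted too far from the eligible
    # audience on any measured attribute.
    return all(
        variance(eligible, actual, attr) <= MAX_ALLOWED_VARIANCE
        for attr in ("age_bucket", "gender", "estimated_ethnicity")
    )

The variance computed here is the total variation distance between the two distributions: zero means the actual audience mirrors the eligible one exactly, and larger values indicate that delivery drifted toward or away from particular groups.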

Under the agreement, the company must prove to the government that the system works as intended and integrate it into its platform by the end of December 2022.

The company promises to share its progress as it builds the new system. If the government approves the system and it is implemented, a third party will “continuously investigate and verify” that it is actually showing ads in a fair and equitable way.

Meta will also have to pay a fine of $115,054. While this is nothing for a company that makes billions every month, the DOJ notes that it is the maximum amount allowed for a violation of the Fair Housing Act.

