Meta has announced changes to its policy for housing, job, and credit ads, following a recent settlement with the US Department of Housing and Urban Development (HUD) over the potential for discriminatory use of Facebook ads in these categories.
The settlement specifically relates to real estate ads, and how Facebook's ad targeting can be used to direct real estate promotions at specific groups, which could exclude certain audiences from that outreach. The new changes aim to eliminate this possibility, and Meta has also chosen to extend the same protections to job and credit ads.
“To protect against discrimination, advertisers running housing ads on our platforms already have a limited number of targeting options they can choose from when setting up their campaigns, including restrictions on targeting by age, gender, or zip code. Our new method builds on that foundation, and strives to make further progress toward a more equitable distribution of ads through our ad delivery process.”
This updated method includes a new “variance reduction system” in Meta's ad delivery process, which is designed to correct potential skews in how ads are distributed. Meta says it has been working with HUD for more than a year on the new system, which will expand its use of machine learning technology to ensure that the age, gender, and estimated race or ethnicity of a housing ad's overall audience matches the demographic makeup of the population eligible to see that ad.
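Meta hasn't published the system's implementation details, but the core idea described here, comparing the demographic mix of who has actually seen an ad against the mix of everyone eligible to see it, can be sketched in simple terms. All function names, group labels, and thresholds below are illustrative assumptions, not Meta's actual system:

```python
# Hypothetical sketch of a delivery-drift check: measure how far an
# ad's delivered audience deviates from its eligible audience, per
# demographic group, and flag campaigns that need correction.
# Names, groups, and the 5% tolerance are illustrative assumptions.

def demographic_shares(counts: dict[str, int]) -> dict[str, float]:
    """Convert raw counts per demographic group into proportions."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_drift(eligible: dict[str, int],
                   delivered: dict[str, int]) -> dict[str, float]:
    """Per-group gap between delivered share and eligible share.
    Positive means the group is over-served, negative under-served."""
    e = demographic_shares(eligible)
    d = demographic_shares(delivered)
    return {g: d.get(g, 0.0) - e[g] for g in e}

def needs_correction(drift: dict[str, float],
                     tolerance: float = 0.05) -> bool:
    """Flag the campaign if any group's share drifts past the tolerance."""
    return any(abs(gap) > tolerance for gap in drift.values())

# Example: the eligible audience is evenly split, but delivery skews.
eligible = {"group_a": 5000, "group_b": 5000}
delivered = {"group_a": 700, "group_b": 300}
drift = delivery_drift(eligible, delivered)
print(drift)                    # {'group_a': 0.2, 'group_b': -0.2}
print(needs_correction(drift))  # True
```

In a real delivery system the correction step would then adjust which users are shown the ad to pull the drift back toward zero; this sketch only covers the measurement side.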
“We’re making this change in part to address feedback we’ve heard from civil rights groups, policymakers, and regulators about how our ad system delivers certain categories of personalized ads, particularly around fairness. So while HUD raised concerns about personalized housing ads specifically, we also plan to use this method for ads related to employment and credit. Discrimination in these areas is a deep-rooted issue with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others.”
Meta's advanced ad targeting systems have come under scrutiny on this front before, with a 2016 ProPublica investigation finding that Facebook's system allowed advertisers to exclude Black, Hispanic, and other “ethnic” audiences from seeing their ads.
As Meta notes, it has since implemented various changes to limit misuse of its ads, but many have argued that these haven't gone far enough, with advertisers still able to exclude certain audiences using Meta's advanced tools. This new initiative, developed in partnership with key organizations, aims to provide more protections and close those gaps.
Additionally, Meta says that it is ending its Special Ad Audiences tool:
“In 2019, in addition to eliminating certain targeting options for housing, employment, and credit ads, we introduced Special Ad Audiences as an alternative to Lookalike Audiences. But the field of fairness in machine learning is dynamic and evolving, and Special Ad Audiences was an early way to address concerns. Going forward, we'll focus on new approaches to improving fairness, including the method we announced today.”
Given the intent of these changes, the impact should be minimal, as they're designed to eliminate targeting types that shouldn't be used anyway. But there could be ripple effects, so if you run ads in these categories, or through these tools, it's worth taking note of the changes within your process.
Meta says that, given the complexity of the process, these changes “will take time to test and implement.”