
The US government says Facebook’s ad business creates housing discrimination

The Department of Housing and Urban Development has issues with Facebook’s targeted ad business.

Houses for sale in Chicago.
Scott Olson/Getty Images

Facebook has a $55 billion annual advertising business in part because it lets advertisers choose, in precise detail, whom to target with their ads.

That’s great if you’re Coca-Cola, and historically it’s been great for Facebook. But the Department of Housing and Urban Development, commonly referred to as HUD, said Thursday that Facebook’s targeting actually creates some serious problems.

Specifically: HUD claims Facebook’s ad platform is “causing housing discrimination,” and can “exclude people” from seeing certain ads based on traits that are defined by HUD as “protected characteristics,” like race, national origin, and religion. Twitter’s and Google’s ad businesses may be creating the same problem.

“Facebook is discriminating against people based upon who they are and where they live,” HUD Secretary Ben Carson said in a press release posted to the HUD website. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

You can read HUD’s full charge against Facebook here, but the press release outlines the main accusations. The big problem is that Facebook uses its own algorithms to determine who should see which ads, and HUD claims those algorithms can unintentionally exclude groups of people with similar protected characteristics just because Facebook’s systems don’t necessarily deem them a good match for the ad.

Imagine a housing developer wants to promote fancy, new condos in San Francisco on Facebook. The developer sets the targeting parameters in a way that means there are one million Facebook users who could see the ad, but the developer only pays to reach 100,000 of those people. Facebook then determines which 100,000 people to show the ad to based on which people it thinks may find the ad most relevant. That means, though, that Facebook’s algorithms could prioritize certain groups of people over others, and HUD claims those groupings may be created using data about protected characteristics.

“Facebook combines data it collects about user attributes and behavior with data it obtains about user behavior on other websites and in the non-digital world,” the press release reads. “Facebook then allegedly uses machine learning and other prediction techniques to classify and group users to project each user’s likely response to a given ad, and in doing so, may recreate groupings defined by their protected class.”
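The dynamic HUD describes can be illustrated with a small simulation. This is a hypothetical sketch, not Facebook's actual system: the group labels, relevance scores, and pool sizes below are all invented for illustration. The point it demonstrates is that when a delivery algorithm ranks users by a predicted-relevance signal that happens to correlate with a protected class, the delivered audience can skew sharply even though the advertiser never targeted that class.

```python
# Hypothetical sketch of relevance-ranked ad delivery (invented numbers,
# not Facebook's real system or data).
import random

random.seed(0)

POOL_SIZE = 10_000   # users eligible under the advertiser's targeting
BUDGET = 1_000       # users the advertiser actually pays to reach

# Each user belongs to a group (a stand-in for a protected class) that the
# advertiser never sees. A behavioral signal the platform *does* use for
# ranking happens to correlate with group membership.
users = []
for i in range(POOL_SIZE):
    group = "A" if i % 2 == 0 else "B"  # pool is split 50/50
    relevance = random.gauss(0.6 if group == "A" else 0.4, 0.1)
    users.append((group, relevance))

# The platform delivers the ad to the BUDGET highest-"relevance" users.
delivered = sorted(users, key=lambda u: u[1], reverse=True)[:BUDGET]

share_pool = sum(1 for g, _ in users if g == "A") / POOL_SIZE
share_delivered = sum(1 for g, _ in delivered if g == "A") / BUDGET

print(f"Group A share of eligible pool:      {share_pool:.2f}")
print(f"Group A share of delivered audience: {share_delivered:.2f}")
```

Even a modest correlation between the ranking signal and group membership is enough to make the delivered audience look very different from the eligible pool, which is the core of HUD's claim about "recreating groupings defined by their protected class."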

Facebook says it’s been working with HUD to solve the issue, but that the two sides hit a roadblock when HUD asked for data about Facebook’s users and ad targeting that the company refused to hand over. Here’s Facebook’s full statement:

We’re surprised by HUD’s decision, as we’ve been working with them to address their concerns and have taken significant steps to prevent ads discrimination. Last year we eliminated thousands of targeting options that could potentially be misused, and just last week we reached historic agreements with the National Fair Housing Alliance, ACLU, and others that change the way housing, credit, and employment ads can be run on Facebook. While we were eager to find a solution, HUD insisted on access to sensitive information, like user data, without adequate safeguards. We’re disappointed by today’s developments, but we’ll continue working with civil rights experts on these issues.

This is far from the first ad snafu for Facebook in the past 18 months. The company has been called out multiple times for letting advertisers target people based on keywords like “jew haters” and “Joseph Goebbels.” Then, of course, there was the 2016 US presidential election, during which Russia used targeted Facebook ads to try to sway voter opinion.

The bigger concern for Facebook will be if any of its ad practices lead to serious regulatory problems. The company is being investigated by the FTC (and other government agencies) for Cambridge Analytica, a situation in which personal data from millions of Facebook users was collected and later sold by people outside the company without users’ knowledge.

The claims against Facebook by government agencies are piling up, and it’s only a matter of time before a shoe drops — it’s just unclear what shoe it will be, and how much it will hurt.
