
Facebook | Lawsuit

Severity: moderate | Anti-Privacy Lawsuit

Executive Summary

HUD charged Facebook with violating the Fair Housing Act by enabling advertisers to target or exclude housing ads based on race, color, national origin, religion, familial status, sex, and disability. The case settled in June 2022 with Meta paying the maximum FHA civil penalty of $115,054.

What Happened

On March 28, 2019, the U.S. Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by enabling advertisers to discriminate in housing-related advertisements based on protected characteristics including race, color, national origin, religion, familial status, sex, and disability. HUD alleged that Facebook allowed advertisers to exclude specific groups, such as parents, non-American-born individuals, non-Christians, and people interested in accessibility, and to exclude residents of certain neighborhoods by drawing exclusion zones on maps. The agency also claimed that Facebook's algorithms used data collected about users to determine ad delivery in discriminatory ways, even when advertisers did not explicitly request such targeting. The case settled in June 2022, with Meta agreeing to pay the maximum civil penalty available under the Fair Housing Act, $115,054.

Who Is Affected

Facebook users seeking housing who belong to protected classes under the Fair Housing Act were affected by not being able to view relevant housing advertisements. The discrimination particularly impacted racial minorities, religious groups, families with children, people with disabilities, women or men when gender-targeted exclusions were used, and residents of specific neighborhoods that advertisers redlined. ProPublica's prior reporting had identified that the platform allowed exclusions targeting specific groups including Spanish speakers, mothers, and the disabled.

Why It Matters

This case represents one of the first federal enforcement actions addressing algorithmic discrimination in digital advertising platforms, establishing that anti-discrimination laws apply to automated systems and machine learning algorithms. HUD's charge specifically addressed not only advertiser-selected targeting options but also claimed Facebook's own algorithms amplified discrimination in ad delivery beyond advertisers' explicit choices. The case follows years of investigative reporting beginning in 2016 that documented these practices, and occurred alongside multiple other lawsuits from housing advocacy groups and related cases involving employment and age discrimination in Facebook's advertising system.

AI-Assisted

Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.

Source: PrivacyWire