Women truckers claim Facebook’s algorithm is discriminatory in EEOC complaint


An advocacy group representing female truck drivers filed a complaint Thursday with the Equal Employment Opportunity Commission alleging Facebook parent company Meta is steering ads for lucrative jobs away from women and older workers based on the type of role.

Real Women in Trucking alleges in the new complaint that the underlying algorithm powering the social media giant’s ad system is far more likely to show ads promoting jobs in certain blue-collar professions, such as trucking, firefighting, manufacturing and construction, to men. Similarly, Facebook is more likely to show female users ads for roles historically considered women’s work, such as housekeeping, home care and child care, the complaint alleges. Facebook delivers these ads in a biased fashion even when the advertiser has made them eligible to be shown to all adult users, the complaint charges.

“It’s very clear that the algorithm is doing exactly what Facebook intended it to do: That is, it relies on gender and age to decide who should receive ads,” Peter Romer-Friedman, one of the lawyers representing the group, said in an interview. “The problem of algorithmic discrimination is far worse than anyone ever thought it was, or studied, or found through studies.”

Romer-Friedman was previously involved in a discrimination case against Facebook that led to a 2019 settlement in which the social media company agreed to make sweeping changes to its ad platform.

The legal filing draws on publicly available data in the company’s ad library, which reveals a pattern of “algorithmic steering” that causes some job ads to be shown to an audience that is more than 90 percent women or men. For instance, an employer seeking to hire truck drivers in North Carolina set the eligible audience for a job ad to all genders. But of the people Facebook showed the ad to, 94 percent were men and just 11 percent were 55 and older, according to the complaint.
Meta spokesperson Dave Arnold said in a statement that the company is actively building technology “to make additional progress in this area.”

“Addressing fairness in ads is an industry-wide challenge and we’ve been collaborating with civil rights groups, academics and regulators to advance fairness in our ads system,” he said.

The complaint is likely to add to the public and legal scrutiny facing Meta over how the company’s automated ad system, which is known for offering marketers the ability to tailor ads to thin slices of the population, has resulted in discrimination against minorities and other groups in employment, housing and finance.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and Zip codes to market housing, credit and job openings to its users. That change came after a Washington state attorney general probe and a ProPublica report found that Facebook was letting advertisers conceal housing ads from African Americans and other minorities. Earlier this year, Meta agreed to build a new automated advertising system that the company says will help ensure that housing-related ads are delivered to a more equitable mix of the population.