This opinion indicates that Facebook–and by implication, every other ad network–could violate California’s Unruh Act (an anti-discrimination law) by targeting third-party ads based on age, gender, or other protected criteria. The court reaches this shocking conclusion by cutting several analytical corners, including: collapsing any distinctions between content publication (ad targeting) and retail sales; disregarding the implausibility of the plaintiff’s purported search strategy; and disregarding that life insurance policies are routinely price-discriminated based on the buyer’s age and gender. The court’s single-minded determination to find a valid discrimination claim under these conditions casts a long and troubling shadow over the online advertising industry. Who needs new privacy laws if the Unruh Act already bans most ad targeting?
* * *
Liapes claims she wanted to buy life insurance. Like any savvy shopper, she conducted her search by clicking around Facebook with the hope that life insurance ads might pop up in her Facebook newsfeed. She claims that she didn’t see some life insurance ads because she was outside the ads’ targeting criteria. (She also claims that she didn’t see auto insurance ads for the same reason, because I guess those are substitutes for life insurance ads?).
(Incredibly, Liapes says she wanted MORE Facebook ads, a desire shared by almost no one).
Facebook requires users to report their age and gender. Facebook also requires self-service advertisers to select age and gender criteria for their ad targeting. Facebook’s default ad targeting settings include all adults (18+) and all genders, but it encourages advertisers to refine those criteria to improve ROI. Facebook also allows advertisers to select “lookalike” audiences, where advertisers enumerate the specifications of their desired consumers and Facebook attempts to assemble pools of similar consumers. This tool relies on age and gender to target ads. Facebook makes further choices about how to deliver ads when advertisers’ targeting criteria create a prospective customer pool bigger than the advertiser’s ad budget: Facebook relies on age and gender to prioritize the ad exposures, even if the advertiser didn’t ask for that.
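To make these mechanics concrete, here is a minimal Python sketch of the two-stage process described above: the advertiser sets eligibility criteria, and the platform decides which eligible users actually see the ad when the pool exceeds the budget. Everything here (the `User` class, the `relevance` scorer, the function names) is my own illustrative assumption, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    gender: str

def eligible(user, min_age, max_age, genders):
    # Stage 1: advertiser-chosen Audience Selection criteria.
    # The default would be 18+ and all genders, but advertisers
    # can narrow both.
    return min_age <= user.age <= max_age and user.gender in genders

def deliver(users, min_age, max_age, genders, budget, relevance):
    # Stage 2: when more users are eligible than the budget can
    # reach, the platform ranks them. The hypothetical `relevance`
    # function stands in for a delivery algorithm that (per the
    # opinion) may weigh age and gender even when the advertiser
    # didn't ask for that.
    pool = [u for u in users if eligible(u, min_age, max_age, genders)]
    return sorted(pool, key=relevance, reverse=True)[:budget]
```

The point of the sketch is the second stage: even with wide-open advertiser criteria, a relevance function that happens to score on age or gender will still produce demographically skewed deliveries, which is the conduct the court attributes to Facebook itself.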
Liapes describes some ads she believed she didn’t get due to Facebook’s ad targeting:
Liapes identified a life insurance ad that was only sent to males ages 30 to 49 because the advertiser used the Audience Selection tool. In another instance, a life insurance ad was not shown to her because it was only sent to people ages 25 to 45 — based on the advertiser’s use of the Audience Selection tool — and because the advertiser wanted to reach people similar to its customers — based on the advertiser’s use of the Lookalike Audience tool.
* * *
Often, the law allows advertisers to target ads based on age, gender, and other protected classifications. This is easiest to see with “gendered” products, where most of a product’s buyers are of a single gender. Advertisers don’t have to show ads to consumers who are unlikely to be interested in the product.
Despite this, some product categories have restrictions on discriminatory advertising, including employment, housing, and financial services. I’m not clear whether those restrictions apply to insurance ad targeting.
Regardless, insurance companies routinely engage in price discrimination, considering age and gender when setting their prices. With proper actuarial tables, that price discrimination could make all demographic segments equally profitable.
Nevertheless, insurance companies are not agnostic about how they spend their ad dollars. Insurers may find that certain customers are more profitable than others. For example, some customers may be more likely to renew or purchase multiple policies, and insurers fear small demographic pools, which are riskier no matter what price is set. (Due to marketplace inefficiencies, it’s also probable that some consumer segments are in fact more profitable to insurers than others). As a result, insurers have incentives to target ads to maximize their returns on a limited ad budget, even though the resulting sales will be price-discriminated.
* * *
The plaintiffs claimed that Facebook directly violated the Unruh Act by delivering ads on an age- and gender-discriminatory basis. In other words, the regulated activities in question are entirely Facebook’s, not any third-party advertiser’s.
Facebook argued that it lacked the statutorily required “intent to discriminate.” The court responds:
Facebook, not the advertisers, classifies users based on their age and gender. Advertisers using the Audience Selection tool are required to identify the age and gender preferences for their target audience. While the default audience setting is 18 to 65 years of age and older and all genders, Facebook provides advertisers with the option of easily including or excluding entire groups from the target audience by checking categories on a drop-down menu. Moreover, Facebook encourages advertisers to target users based on age and gender…
To the extent Facebook argues it was not responsible for any unequal treatment Liapes experienced because it merely followed the advertisers’ selections, we disagree….Facebook, rather than the advertisers, sends the ads to users within the Audience Selection parameters. Facebook retains the discretion and ability to approve and send an ad that includes age- or gender-based restrictions….
it is Facebook that creates the Lookalike Audience resembling the advertiser’s sample audience. When analyzing the characteristics of the sample audience to determine the larger Lookalike Audience, Facebook directly relies on the users’ age and gender. This occurs regardless of whether the advertiser has created a sample audience with age or gender exclusions. Thus, while an advertiser provides Facebook with the sample audience, “the rest of the work to create the Lookalike Audience is done entirely by Facebook,” and it is that work that ultimately results in ad denial…
Facebook uses age and gender to determine who will receive the ads, regardless of whether the advertiser directs Facebook to limit the age or gender of recipients. Thus, even if advertisers do not limit their audience to a specific gender or age, Facebook makes those distinctions on behalf of advertisers via the ad-delivery algorithm.
This passage requires careful reading/re-reading. The court seems to be saying that the plaintiff can show Facebook had impermissible intent to discriminate simply because it made age- or gender-based deliveries of ads, even if Facebook or advertisers have legitimate business reasons for doing so. Read broadly, this passage implicates the entire ad targeting industry.
(There are so many weird phrases in the above passage. The one I’ll highlight here: “ad denial.” That assumes consumers are “entitled” to receive all ads, which they plainly are not).
* * *
The court next turns to Liapes’ common law aiding-and-abetting theory. Remarkably, the court doesn’t mention the recent Taamneh Supreme Court opinion interpreting common law aiding-and-abetting. That’s a glaring and confidence-rattling omission.
The court says the plaintiff sufficiently alleged Facebook’s aiding-and-abetting by claiming:
Facebook knew insurance advertisers intentionally targeted its ads based on users’ ages and gender — as explained above, a violation of the Unruh Civil Rights Act. According to Liapes, the coding in Facebook’s platform identifies each type of business, including insurance advertisers, that purchases ads. In addition, Facebook is aware of the ad’s subject matter, including insurance ads. Facebook was aware ads contained age- or gender-based restrictions because it alone approved and sent the ads to the target audience.
The court credulously accepted Liapes’ allegation that Facebook “approved” the ads, but I wonder if that will survive further review. If the ads were self-service, Facebook “approved” the ads only in the sense that its system was designed not to require human review or approval of ads. I think the Taamneh ruling would characterize this differently.
With respect to substantial encouragement or assistance:
Each time an advertiser used the Audience Selection tool and made a discriminatory targeting decision based on age or gender, Facebook followed the selected audience parameters. Indeed, this occurred despite Facebook retaining the discretion to reject ads that include age- or gender-based restrictions. Facebook further maintained the age and gender Audience Selection criteria despite its awareness advertisers were making discriminatory advertising choices.
* * *
Facebook requires users to disclose their age and gender before they can use its services. It designed and created an advertising system, including the Audience Selection tool, that allowed insurance companies to target their ads based on certain characteristics, such as gender and age. [Cite to Vargas v. Facebook, a non-precedential opinion] Although there are thousands of characteristics advertisers may choose to identify their target audiences, Facebook requires advertisers to select age and gender parameters. Each category includes “simple drop-down menus and toggle buttons to allow” insurance advertisers “to exclude protected categories of persons.”…It creates, shapes, or develops content “by materially contributing” to the content’s alleged unlawfulness….
Facebook’s Lookalike Audience tool and ad-delivery algorithm underscore its role as a content developer. According to the complaint, Facebook uses its internal data and analysis to determine what specific people will receive ads. The algorithm relies heavily on age and gender to determine which users will actually receive any given ad. This occurs even if an advertiser did not expressly exclude certain genders or older people. The algorithm then sends or excludes users from viewing ads based on protected characteristics such as age and gender. Because the algorithm ascertains data about a user and then targets ads based on the users’ characteristics, the algorithm renders Facebook more akin to a content developer.
What does content “shaping” mean?
[Reminder: The Roommates.com opinion held that Section 230 applied to housing discrimination claims in some circumstances. The rest of the opinion was undercut by the subsequent Ninth Circuit holding that Roommates.com had complied with the housing discrimination laws all along, so Roommates.com had qualified for Section 230 at the time the court denied it.]
Facebook argued that it supplied “neutral” tools to advertisers (I hate that phrase because tools are never neutral). Citing Kimzey, the court says “the system must do absolutely nothing to enhance the unlawful message at issue beyond the words offered by the user” (cleaned up). In this case, Liapes alleged “Facebook, after requiring users to disclose protected characteristics of age and gender, relied on ‘unlawful criteria’ and developed an ad targeting and delivery system ‘directly related to the alleged illegality’ — a system that makes it more difficult for individuals with certain protected characteristics to find or access insurance ads on Facebook.”
The court distinguishes Prager v. YouTube because “There were no allegations, as here, that the defendant created a system that actively shaped the audience based on protected characteristics.” What does it mean to “actively shape an audience,” how does that relate to “content shaping,” and how does audience shaping undermine Section 230’s publisher immunity?
* * *
The court’s conclusions aren’t specific to insurance policies. The court implies that any gender- or age-based ad targeting for any product or service (and targeting based on any other protected characteristics) could violate the Unruh Act.
This conclusion would have devastating effects on the entire Internet ecosystem. Furthermore, it almost certainly raises First Amendment concerns–because it restricts the publication of advertising–that may not yet be at issue in this case but would need to be resolved at some point.
I’m not sure what further proceedings in this case will look like. Facebook may think it can win on remand. Nevertheless, the stakes of this case are so high that Facebook probably must escalate it. For now, this ruling will be on my list of top 10 Internet Law developments of 2023.
* * *
Case citation: Liapes v. Facebook Inc., 2022 WL 20680402 (Cal. App. Ct. Sept. 21, 2023) [Westlaw’s 2022 classification is wrong because the opinion is misdated as 2022–presumably a typo. If the court can’t even get the date of its own opinion right, maybe it’s unsurprising they can’t get the substance right either.]
The post Does California’s Anti-Discrimination Law Ban Ad Targeting?–Liapes v. Facebook appeared first on Technology & Marketing Law Blog.