
NetChoice, LLC v. Bonta

Status: Ongoing
Last Update: February 14, 2024

What's at Stake

This amicus brief highlights the constitutional defects of a California law that purports to protect consumer privacy but is actually an impermissible, content-based regulation of online speech. The brief urges the court to strike down the law while emphasizing that it should rule narrowly and leave paths open for other consumer privacy laws to withstand First Amendment challenge.

The California legislature largely framed the California Age-Appropriate Design Code Act (“CAADCA”) as a consumer privacy law that would offer stronger privacy protections by default. This is an important goal that lawmakers can and should achieve without violating the Constitution. But the CAADCA’s text makes clear that it is in fact an impermissible, content-based regulation of online speech, and it fails strict scrutiny. The First Amendment precludes such a law. At the same time, the CAADCA contains several privacy concepts that, in another law, could be subject to and satisfy a lower level of First Amendment scrutiny.

On February 14, 2024, the ACLU and ACLU of Northern California filed an amicus brief in the U.S. Court of Appeals for the Ninth Circuit explaining the CAADCA’s constitutional defects, but urging the court to rule narrowly and recognize that a different consumer privacy law could pass constitutional muster. The brief canvasses the many harms that can flow from private actors’ collection, aggregation, use, and sharing of people’s sensitive information, and the vital role consumer privacy laws could play in preventing or remediating those harms. It then discusses avenues that legislators can take to protect consumer privacy without infringing free speech. Finally, it explains why the CAADCA in particular does trench on free speech and should remain enjoined.

Electronic devices and services are essential to participating in modern life. We use them to communicate with each other, record our thoughts, and access everything from transportation to health care to education. But the businesses behind these technologies often collect, share, and use our personal information in ways we never agree to, potentially leading to discrimination, security breaches, loss of privacy, and other harms.

Legislators have tools at their disposal to pass consumer privacy laws that effectively address these harms without running afoul of the First Amendment. For example, they can regulate the use and sharing of personal data obtained solely in exchange for a good or service. Such legislation would likely trigger—and pass—a lower level of constitutional scrutiny than does the CAADCA. Similarly, laws mandating disclosure of when a business is tracking consumers, such as those requiring notification when a user’s GPS location is collected, are likely to survive First Amendment challenge.

However, the CAADCA is not such a law. It purports to regulate privacy harms, but instead prohibits and burdens online speech based on its content. It bans businesses from “[u]sing [any child’s] personal information” in a way that is “materially detrimental” to them, reaching online speech both to and about children based on its potential to cause harm, anguish, or distress to any child. And the CAADCA burdens online communication by requiring businesses to evaluate the risk that any new service or feature will expose children to “harmful, or potentially harmful, content,” “contacts,” “conduct,” or “algorithms.” Online businesses must also create a “timed plan to mitigate or eliminate th[at] risk before” children access the online service or feature. Because these assessments and plans must be completed before any new service or feature is implemented or made publicly available, the CAADCA in effect delays user access to new platforms or ways of speaking online, as well as platforms’ implementation of new content moderation policies. Further, in part because the terms “material detriment,” “harmful,” and “potentially harmful” are all left undefined, the CAADCA could lead platforms to enact more speech-restrictive content moderation policies or ban certain content entirely in an effort to comply with the law.

Well-settled Supreme Court precedent establishes that the government may not ban or burden speech just to shield young people from ideas or images that the government might think unsuitable for them. While speech confronting, discussing, and working through painful daily realities, such as parental abuse or depression, can pose difficulties for children, this speech is constitutionally protected and often valuable. The Court has also made clear that First Amendment protection generally does not turn on whether speech has obvious value, and that the government may not limit adults’ access to speech in the name of protecting children. The CAADCA falls short of what the First Amendment requires on each of these grounds.

The ACLU has long advocated that strong protections for online speech are necessary to ensure the First Amendment’s vitality in the digital age. This brief adds to that work by urging the Ninth Circuit to enforce the First Amendment’s bar on content-based speech regulations without unnecessarily foreclosing other existing and future privacy laws that enhance civil rights and civil liberties.

