A major courtroom clash over children’s online safety laws has taken a new turn, with a federal appeals court allowing large portions of California’s landmark child-protection statute to move forward while keeping some of its most controversial provisions blocked.
The ruling came from the U.S. Court of Appeals for the Ninth Circuit, which reviewed an earlier injunction that had prevented California from enforcing the California Age-Appropriate Design Code Act. The court concluded that a sweeping constitutional attack on the law was unlikely to succeed, meaning the statute cannot be blocked entirely at this stage.
At the heart of the dispute is a lawsuit brought by NetChoice, a technology industry trade group representing companies including Amazon, Google, Meta Platforms, Netflix and the social platform X. The group argues that California’s law forces digital platforms to police speech in ways that infringe on protections under the First Amendment to the United States Constitution.
The appeals panel, however, was not persuaded by the broad constitutional challenge. Judges noted that companies operating online services can reasonably anticipate that children will access their platforms. Because the law applies to services “likely to be accessed by children,” the court said the statute treats businesses evenly rather than targeting specific viewpoints or speakers.
The panel also rejected arguments that the law’s requirement for companies to estimate the ages of younger users was unconstitutional on its face.
Still, the judges drew a sharp line around another part of the legislation. They concluded that restrictions preventing companies from using personal data in ways that could harm a child’s mental or physical well-being were written too vaguely to be enforced. The same problem, the court said, applied to provisions targeting so-called “dark patterns” — interface designs that manipulate users into sharing data or making choices they might not otherwise make.
Because those sections lacked clear boundaries, the court allowed them to remain blocked for now.
The case has been closely watched since a lower-court injunction halted the law before it could take effect. That order came from a federal district judge in California, prompting state officials to appeal.
California officials view the statute as a major effort to curb digital dangers facing minors, including harassment, exploitation and other harms that can arise on social media platforms. The law also requires companies to evaluate risks their services may pose to children and implement safeguards before launching new features.
Violations could carry significant penalties: civil fines reaching $2,500 per child for negligent breaches and as much as $7,500 per child for intentional violations.
Despite the mixed outcome, both sides claimed momentum. Industry groups say they will continue fighting the law in court, while state officials argue the ruling preserves the core of California’s push to make online spaces safer for younger users.