A sharply divided New York state appeals court has ruled that major social media platforms cannot be held liable for their role in radicalizing Payton Gendron, the white supremacist who carried out the deadly 2022 mass shooting at a Buffalo grocery store in which ten Black people were killed. By a 3-2 vote, the court held that the platforms that may have fueled his ideology are legally off the hook.
In a decision that rewrites the bounds of digital responsibility, the Appellate Division in Rochester overturned a lower court’s finding. Companies including Facebook, Instagram, YouTube, Reddit, Twitch, Discord, Snap, Amazon, 4chan, and Alphabet were all named in the lawsuit, which was filed by families of the victims, grocery store staff, and customers who witnessed the horror.
Their claim? That these platforms aren’t just passive bulletin boards, but intentionally addictive environments that radicalize users through design—feeding people like Gendron a steady diet of violent, racist propaganda.
The court didn’t argue with that narrative. But legally? It didn’t matter.
Justice Stephen Lindley, writing for the majority, acknowledged the “vile content” that inspired Gendron’s killing spree, but said Section 230 of the Communications Decency Act shields tech companies from liability over what their users post. Blame the algorithm or not, the law treats them all the same. Trying to hold them responsible, Lindley warned, would bring about “the end of the Internet as we know it.”
The dissenting judges, however, weren’t convinced. They pointed to the algorithmic design that pushes content—not passively hosts it. Whether it’s cooking tutorials or white nationalist bile, the system is built to keep users glued. That, they argued, isn’t neutrality. That’s complicity.
Gendron, who livestreamed part of his attack on Twitch, is already serving a life sentence without parole after pleading guilty to multiple state charges, including murder and domestic terrorism motivated by hate. A separate federal case, in which he could face the death penalty, is due to begin jury selection in August 2026.
Lawyers representing the victims’ families have yet to respond, but the ruling marks a turning point in how courts are willing—or unwilling—to confront the power of digital platforms in real-world violence.
For now, the court has spoken: build an addictive system, push content to the edge, and if you're a platform, you're protected.