Legal Special: The Week of March 23, 2026 Was a Humdinger For Online Life
Three seismic rulings were handed down last week that will shape the Internet as we know it for some time to come. They cover two very different topics that coincidentally came to conclusions at about the same time. I believe all three cases are absolute wins for us individuals and the Internet public in general, though Sony, Google, and Meta will be left licking their wounds, and many other companies will be closely impacted by these decisions.
Let's start with the older case, but before we do that allow me to state for the record:
I am not a lawyer. I do not play one on TV. I did not stay at a Holiday Inn Express last night.
With that said, please consider what follows the analysis of a cybersecurity/internet/IT veteran who has had enough life and job experience to draw some conclusions here. None of this is legal advice, and none of it comes from anybody but yours truly.
Cox Communications v. Sony Music Entertainment
Back in 2019 Cox - a significant ISP in the US - was found liable to the tune of $1,000,000,000 (that's $1B!) because some of their subscribers pirated music. While I'm not quite sure what legal argument won that battle, it seems to me something like suing the Illinois Tollway System because people speed on it - and, because the tollway charges people to use the road, arguing that it is clearly profiting off of speeders.
Well, it took seven years, but the Supreme Court agreed that Cox did not "induce subscribers to pirate music" - meaning that just because people were speeding on the Illinois Tollway didn't mean the Tollway authority was enticing them to do it, and therefore it can't be held liable for their speeding. The court was unanimous on this point, though there were two slightly different opinions on where the line for liability begins. Either way, ISPs no longer have to act as copyright holders' police forces, enforcing access bans because the RIAA or other industry groups "believe" end users are pirating their materials.
The impact of this is pretty significant. ISPs don't want to be put in a position to police their users - for anything but excessive bandwidth usage, at least - because it's never a good business move to have to police your customers; they tend to stop wanting to be your customers.
OK, maybe a better analogy would be trying to hold VCR manufacturers (sorry, Gen Z and Alpha, you may need to look that up) liable for people pirating TV and movies - and the funny thing is that Sony was in Cox's position decades ago in this exact scenario.
Did the DMCA Lose Its Teeth?
The Digital Millennium Copyright Act of 1998 (DMCA) was a heavy-handed reminder that, in the digital age, the barriers that had made wide-scale copyright infringement cost prohibitive were quickly becoming outdated. In an era of CD-ROMs, portable hard drives, and eventually high-bandwidth access to cloud storage, copying and distributing media became trivial. (Seriously, think about the process of copying a vinyl record at home vs. a music CD; the difference in equipment cost, effort, time, etc. is just staggering.) Instead of adjusting business models to recognize that distribution networks (record companies, video tape companies, cable companies, etc.) were going to lose control and need to reorganize, countries around the world decided to try to enforce copyright in a world where they would quickly find themselves outpaced by the power that John Q. Public had in the family room of most homes in the US and many around the world. One of the principal enforcement methods in the DMCA was to hold service providers liable if they didn't police their own customers on behalf of copyright holders.
This turns out to be the major point on which the Supreme Court split, with one faction believing that the ruling didn't impact the DMCA's incentives for ISPs and others to self-police, and the other believing the ruling effectively nullified them. Time will tell who is correct, but I tend to believe that the DMCA's teeth have been pulled.
Meta and Google's Algorithms Make Them Liable
The second major case from last week that has a significant impact on our daily lives comes out of California, along with a third case from New Mexico.
K.G.M. Wins in California
The California case saw a jury award the plaintiff, known as K.G.M., $6,000,000 (that's $6M!) in damages, finding the companies liable for creating products that led to harmful and addictive behavior in her as a minor. Put differently, the algorithms used by Meta in products like Facebook and Instagram, as well as by Google in YouTube, caused addictive behavior that led to other mental health issues for K.G.M. The constant rush of "likes" and "views" tracking, as well as the predictive algorithms choosing what to show you based on what you interact with, were called out as harmful to children (and, let's face it, adults aren't necessarily doing so well with these either). Other social media companies settled with K.G.M. before trial.
The argument against this liability has always been that social media platforms aren't (generally) liable for the content posted on them, and that they're therefore protected by the infamous "Section 230." What the jury seems to be saying is that it wasn't the content that was the issue; it was how the platforms used the content to addict people like K.G.M. into continual interaction.
Expect this to be appealed by both companies.
New Mexico Wins in New Mexico
The state brought a suit against Meta specifically, alleging that the company had misled the state's citizens about the dangers of its platforms for minors, and at a jury trial won a $375,000,000 payout from the company. In this case New Mexico created fake child accounts on Facebook and Instagram, claiming to be 14 years of age, and these accounts were inundated with sexually explicit material and solicited for sex by men whom the state subsequently arrested and charged. At trial, New Mexico was able to show damning internal evidence that employee-backed efforts to improve safety for children were ignored and blocked by leadership.
“The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls.” - Arturo Béjar, former Meta product lead
This one is also likely to be appealed by Meta, but the dam seems to have burst open here.
Big Tobacco Parallels
People are drawing parallels to the 1998 settlement agreement that the tobacco industry reached with state attorneys general. The industry as a whole had buried, as best it could, evidence that smoking was a significant cause of cancer for both smokers and those subjected to secondhand smoke, claiming that the science was unsettled and promoting any study - or even pseudoscience - that said smoking was safe.
Mark Zuckerberg himself, in depositions, has described the evidence that Meta's platforms are addictive as "inconclusive," straight out of the Big Tobacco playbook. (see also, the Petroleum Industry)
The second major parallel is that these are just the first few cases to conclude; they are not outliers in any way. More states, and even more individuals and class-action suits, are in the court systems right now alleging similar things and lining up similar arguments.
The third major parallel is that, like with tobacco, the companies aren't being accused of breaking specific laws, instead they're being accused of harming people and resisting evidence that they're doing so. The tobacco settlement was based on the evidence that smoking was harmful, not that selling smoking products was illegal.
Like the Big Tobacco settlement, I don't expect any of the major players to simply fold up shop. But I do hope that there will be impactful changes to the platforms including:
- Algorithms that choose what you see need to be "opt in"; otherwise, your feed would be based on your own explicit settings and choices.
- Stronger age verification. I don't like this, I believe it is a privacy issue, but I can absolutely see this as one of the settlement options going forward.
- Better parental access to minors' accounts.
- Stronger internal safety measures to remove predatory accounts from the platform.
- Money to/for states to fund support for those who have been harmed by these platforms.
So we're in a situation where ISPs have just been told they're not nearly as liable for copyright infringement as copyright holders would like, and social media companies have now been told that they are liable for the damage they're causing to (at least) the minors they allow on their platforms. I think there are changes for the better to come from both of these statements.