This Challenge To Section 230 Could Greatly Increase The Liability Of Content-Hosting Websites

Accountability? In this economy?!

You can find some wild stuff on YouTube. Half the time you don’t even have to go out of your way to find it: the algorithm does things like recommend conspiracy theory content to people just for visiting the site. As it turns out, conspiracies, divisiveness, radicalization, and terrorism work wonders for YouTube’s profit model. You would think a website peddling that kind of content would be getting sued left and right over it, yeah? It is, but the suits don’t really tend to get very far. That is, for now. From Reuters:

In 2021, a California state court threw out a feminist blogger’s lawsuit accusing Twitter Inc (TWTR.MX) of unlawfully barring as “hateful conduct” posts criticizing transgender people. In 2022, a federal court in California tossed a lawsuit by LGBT plaintiffs accusing YouTube, part of Alphabet Inc (GOOGL.O), of restricting content posted by gay and transgender people.

These lawsuits were among many scuttled by a powerful form of immunity enshrined in U.S. law that covers internet companies. Section 230 of the Communications Decency Act of 1996 frees platforms from legal responsibility for content posted online by their users.

In a major case to be argued at the U.S. Supreme Court on Tuesday, the nine justices will address the scope of Section 230 for the first time. A ruling weakening it could expose internet companies to litigation from every direction, legal experts said.

This is huge. Thinking through the implications of a weakened Section 230, Eric Goldman of Santa Clara University School of Law’s High Tech Law Institute guesstimates that “There’s going to be more lawsuits than there are atoms in the universe.” And he’s probably not too far off. For example, YouTube and Twitch have been hotbeds for white supremacist recruiting. The casual racism has been casual racism-ing so hard that people have begun to speedrun being called slurs.

Yeah, there’s no way TikTok, Twitter, YouTube, and Twitch, among others, wouldn’t get hammered under a nerfed Section 230. Can you imagine how much the budgets these sites currently dedicate to content moderation would have to grow? I’m not even being facetious here: just think about the next time some mass shooter uses Twitch to livestream carnage. The legal liability will be blinding.

A ruling against [Google] could create a “litigation minefield,” Google told the justices in a brief. Such a decision could alter how the internet works, making it less useful, undermining free speech and hurting the economy, according to the company and its supporters.

It could threaten services as varied as search engines, job listings, product reviews and displays of relevant news, songs or entertainment, they added.

As U.S. Supreme Court Weighs YouTube’s Algorithms, ‘Litigation Minefield’ Looms [Reuters]

Chris Williams became a social media manager and assistant editor for Above the Law in June 2021. Prior to joining the staff, he moonlighted as a minor Memelord™ in the Facebook group Law School Memes for Edgy T14s.  He endured Missouri long enough to graduate from Washington University in St. Louis School of Law. He is a former boatbuilder who cannot swim, a published author on critical race theory, philosophy, and humor, and has a love for cycling that occasionally annoys his peers. You can reach him by email at cwilliams@abovethelaw.com and by tweet at @WritesForRent.
