The Supreme Court is examining a short but potent law this week that, if altered, could rearrange the modern internet.
Section 230 of the Communications Decency Act shields internet companies from liability for the user-generated content they host, and it has become an unlikely nexus of controversy in recent years.
On Tuesday, the Supreme Court heard oral arguments in Gonzalez v. Google. That case, brought by the family of Nohemi Gonzalez, a victim of the 2015 Islamic State terrorist attacks in Paris, argues that Google should be liable for terrorist content promoted on YouTube that preceded the attack.
On Wednesday, the court will hear a parallel case that faults Twitter for another deadly terrorist attack — in this case, one that resulted in the death of Nawras Alassaf, who was killed after an Islamic State gunman opened fire in an Istanbul nightclub in 2017.
Plaintiffs in both cases argue that the tech platforms in question should face legal liability for the Islamic State content that they hosted or promoted in the lead-up to attacks that together claimed more than 150 lives.
Supreme Court justices grappled with the petitioner’s argument that when YouTube serves users content through its recommendation algorithm, that actually constitutes a different kind of activity than merely hosting that content — one that isn’t protected by Section 230.
“We’re focusing on the recommendation function, that they are affirmatively recommending or suggesting ISIS content, and it’s not mere inaction,” said lawyer Eric Schnapper, who represented Gonzalez’s family in Tuesday’s oral arguments.
The idea that Section 230 might have exceptions isn’t new, but it is controversial. In 2018, a bill known as FOSTA created a carve-out to Section 230 that was ostensibly designed to reduce sex trafficking but has since faced criticism for making sex work more dangerous.
The Supreme Court isn’t the only government entity evaluating Section 230, though efforts to dismantle the law or make its protections come with strings attached have largely stalled out in Congress in recent years.
On Tuesday, some justices expressed doubt that the highest court in the land was the right body to reevaluate the internet law at all.
“We’re a court, we really don’t know about these things,” Justice Elena Kagan said. “These are not the nine greatest experts on the internet.”
As Schnapper pressed on, the justices expressed some confusion at his argument and both sides sought to clarify it. Schnapper’s core argument focused on establishing a distinction between failing to take dangerous content down — a statistical inevitability given how much content online platforms host — vs. actually promoting that content and extending its reach:
“Our view is, if the only wrong alleged is the failure to block or remove, that would be protected by 230(c)(1). But — but that’s — the 230(c)(1) protection doesn’t go beyond that. And the theory of protecting the — the website from that was that the wrong is essentially done by the person who makes the post, the website at most allows the harm to continue. And what we’re talking about when we’re talking about the — the website’s own choices are affirmative acts by the website, not simply allowing third-party material to stay on the platform.”
Ultimately, the justices tried to define the boundaries of what Section 230 should and shouldn’t reasonably protect by exploring hypothetical extremes: at one end, platforms that use algorithms would remain shielded even if they deliberately promoted illegal content; at the other, they wouldn’t be permitted to make any algorithmic recommendations at all.
“Let’s assume we’re looking for a line, because it’s clear from our questions that we are,” Justice Sotomayor said.
To make matters more confusing, Schnapper repeatedly referred to the platform’s algorithmic recommendations as “thumbnails” — a term more commonly understood to mean the preview images attached to YouTube videos.
Some justices took Schnapper’s argument to another logical extreme, cautioning that a carve-out stripping Section 230 protections from algorithmic recommendations would instantly subject search engines, which rank results algorithmically, to the same treatment.
“So even all the way to the straight search engine, that they could be liable for their prioritization system?” Kagan asked.
The justices repeatedly expressed concern about the potential wide-reaching second-order effects of tinkering with Section 230.
“You’re asking us right now to make a very precise predictive judgment that, don’t worry about it, it’s really not going to be that bad,” Justice Brett Kavanaugh said. “I don’t know that that’s at all the case, and I don’t know how we can assess that in any meaningful way.”
Those reservations were nearly universal among the justices, who did not appear eager to shake up the status quo — a perspective we can expect to see surface again during Wednesday’s oral arguments, which will again stream live.
“We’re talking about the prospect of significant liability in litigation and up to this point, people have focused on the [Anti-terrorism Act] because that’s the one point that’s at issue here,” Chief Justice John Roberts said.
“But I suspect there would be many, many times more defamation suits, discrimination suits… It would seem to me that the terrorism support thing would be just a tiny bit of all the other stuff. And why shouldn’t we be concerned about that?”