
Supreme Court questions Twitter’s liability for terrorist attack


The Supreme Court spent more than five hours over two days considering the responsibilities and failures of Big Tech, but in the end seemed reluctant to substantially change how social media platforms can be held liable for contentious or even dangerous content on their sites.

In a case involving Google on Tuesday, the justices seemed hesitant to narrow, on their own, a law that protects social media platforms from lawsuits over content posted by their users, even when the platform’s algorithms promote videos that laud terrorist groups.

On Wednesday, it was Twitter’s turn. And a majority of the court questioned whether the online messaging platform could be sued for aiding and abetting a 2017 terrorist attack just because the militants involved had access to the site for propaganda and recruiting purposes. The justices were hearing an appeal of a lower-court ruling that allowed a lawsuit filed by the family of a man killed in the attack to proceed because Twitter had not done enough to prevent the Islamic State’s use of the platform.

“We all appreciate how horrible the attack was, but there’s very little linking the defendants in this complaint to those persons” who committed the attack, said Justice Neil M. Gorsuch.

Justice Clarence Thomas seemed to agree. “If we’re not pinpointing cause and effect or proximate cause for specific things … then it would seem that every terrorist act that uses this platform would also mean that Twitter is an aider and abettor in those instances,” Thomas said.


American relatives of Nawras Alassaf say Twitter failed to properly police its platform for Islamic State-related accounts in advance of a Jan. 1, 2017, attack in Turkey that killed Alassaf and 38 others.

They based their lawsuit on the Anti-Terrorism Act, which imposes civil liability for assisting a terrorist attack. At issue was whether the company provided substantial assistance to the terrorist group. University of Washington law professor Eric Schnapper, representing the plaintiffs, said they did not have to show that Twitter’s actions led to a specific attack, only that those actions aided the “terrorist enterprise.”

But Washington lawyer Seth Waxman, representing Twitter, said the company has a policy against hosting content that promotes or supports terrorist actions, and regularly removes accounts when it finds them. Just because Twitter is aware that “among their billions of users were ISIS adherents who violated their policies” does not make the company liable for “aiding and abetting an act of international terrorism,” he said.

In Tuesday’s argument, the Biden administration sided mostly with the family of a different terror victim — also represented by Schnapper — which sued Google’s YouTube over its algorithms’ recommendations of ISIS-related videos. The government said the broad protections in Section 230 of the Communications Decency Act of 1996 — shielding platforms from liability over content from third parties — did not automatically shield companies that prioritize and recommend such content.

But Section 230 was not at issue in Wednesday’s case, and Deputy Solicitor General Edwin S. Kneedler sided with Twitter, saying the platform should not be sued under the anti-terrorism law.

“The United States condemns in the strongest terms the terrorist act that caused Mr. Alassaf’s death and sympathizes with the profound loss that the plaintiffs in this case have experienced,” Kneedler said. But the company’s actions do not show “a culpable role in the commission of that murder.”


Not all the justices seemed convinced Twitter should be cleared. Justice Elena Kagan took exception to Waxman’s assertion that the alleged failure on Twitter’s part was that it did not “better ferret out violations of” company policy against terrorist content.

“The conduct is the provision of a platform by which to communicate with each other and other members of ISIS and by which to recruit,” Kagan said. “So you can, you know, say it’s the failure to better police the platform, but it’s the provision of a platform.”

Justice Amy Coney Barrett added: “If you know ISIS is using it, you know ISIS is going to be doing bad things. You know ISIS is going to be committing acts of terrorism.”

But Barrett also pressed Schnapper for any specific link to the attack in Turkey.

Over the two days of arguments in the Big Tech cases, the justices were critical of the laws they were asked to interpret. Chief Justice John G. Roberts Jr. complained that the anti-terrorism statute is vague and laden with factors a court must weigh when deciding liability. Kagan said Section 230 is outdated but should be fixed by Congress, not the court.

Legal scholars said that while some justices in the Gonzalez v. Google arguments seemed inclined to limit the liability protections afforded under Section 230, there was little consensus on how to do so.

“I believe that the court has almost no appetite for touching Section 230,” said Chamber of Progress legal advocacy counsel Jessica Miers, whose left-leaning trade group receives funding from tech companies including Google, Apple and Amazon. (Amazon founder Jeff Bezos owns The Washington Post.)

Evelyn Douek, a Stanford law professor and research fellow at the Knight First Amendment Institute, said it “looks much more unlikely that the court is going to answer the Section 230 question.”

Douek said the justices appeared to be searching for lines to draw about who should receive the immunity, but did not appear “satisfied with any of the answers that they got.” The court appears poised to say, “We’re going to leave this for another day,” she said.

The dynamic mirrors the debate on Capitol Hill, where there has been substantial bipartisan agreement on the need to overhaul the 1996 law through legislation, yet little to no progress by lawmakers on finding a framework that can garner broad support.

In 2018, lawmakers overwhelmingly passed a measure allowing digital services to be held liable for knowingly facilitating sex trafficking. But a 2021 federal report found that the law had hardly ever been used by federal prosecutors to secure restitution for sex-trafficking victims, and critics say it has pushed platforms to shutter sex-education resources.


Members of Congress have since introduced dozens of other proposals aimed at paring back the tech industry’s liability protections. While many have focused on partisan criticisms that the platforms remove either too much or too little “lawful but awful” content, others have sought to widen liability depending on how companies handle illicit drug sales or child abuse material.

None of those measures have gained significant traction, even as congressional leaders including House Speaker Kevin McCarthy (R-Calif.) and Rep. Nancy Pelosi (D-Calif.) have openly expressed concern about the broad scope of Section 230.

It’s a quagmire that has extended to the executive branch.

Both President Biden and former president Donald Trump took aim at the liability shield, to no avail. Trump in 2020 signed an executive order aimed at punishing companies over allegations that they disproportionately “censor” conservative users, but the federal agency tasked with overseeing the push declined to act on it before he left office.

As a candidate, Biden called for Section 230 to be “revoked” entirely. Since entering the White House, he has moderated that stance, with the administration instead calling for “reforms” to the law. But to date, the White House has outlined no concrete plans for how to do so.
