Twitter HQ. (Jim Wilson/The New York Times)
San Francisco-based Twitter is facing a lawsuit related to a demand to take down a porn video that shows underage boys.
Child-safety advocates are blasting Twitter and lining up to support a lawsuit that alleges the social network declined to remove videos depicting the sexual exploitation of minors despite pleas from a victim and his family.

"It's child sexual abuse material. He was 13 years old and being extorted. What the hell is Twitter doing?" asked Hany Farid, a professor at the School of Information at UC Berkeley.

Farid, who has testified before Congress five times on issues of online safety and regulation, and others filed paperwork with a federal appellate court in San Francisco last week supporting the lawsuit.

A respected nonprofit that works closely with the federal government to fight the sexual exploitation of children also supports the lawsuit because of Twitter's alleged refusal to take down the videos despite the family's pleas.

"The facts in this case are especially egregious because the electronic service provider was aware of the child victims' graphic sexual images and refused to remove the videos from the platform," the National Center for Missing and Exploited Children (NCMEC) told The Examiner in an email.

NCMEC will receive nearly $37 million in taxpayer funding this fiscal year to, in part, "provide online users and electronic service providers a means to report internet-related child sexual exploitation," according to the Department of Justice.

The lawsuit alleges that two users discovered that sexual abuse videos taken of them a few years earlier were circulating on Twitter. The plaintiffs, who were 13 years old in the videos, say they were blackmailed into making the videos and that the posting of them led to bullying at school and extreme anxiety.

The lawsuit says the plaintiffs asked for the material to be removed, and at Twitter's request proved the videos were of them and that they were underage when the videos were taken. The mother of one of the plaintiffs asked Twitter to remove the videos as well. But Twitter's safety team deemed the videos acceptable under its terms of service and declined to remove them, the lawsuit alleges.

"We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time," Twitter told the plaintiff in 2020, the lawsuit alleges.

"What do you mean you don't see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down," the plaintiff argued, according to the lawsuit.

When an acquaintance of the family who worked for the Department of Homeland Security contacted Twitter, the company removed the videos, the lawsuit alleges.

"This case is particularly important," NCMEC said, because courts will use it "to decide whether victims of sextortion — a crime where children are blackmailed for explicit images — can seek justice against online platforms that enable child sexual abuse material to be shared."

It is highly unusual for the nonprofit to take such a strong stance regarding a company it must work closely with. But Farid and NCMEC are just two of several child-safety organizations and experts that filed amicus briefs, legal filings supporting the lawsuit, last week with the U.S. Court of Appeals for the Ninth Circuit in San Francisco. Both sides are scheduled to file briefs in the case next month.

The Canadian Centre for Child Protection; the Rape, Abuse & Incest National Network; Child USA; and professor Brian Levine, director of the Cybersecurity Institute at the University of Massachusetts Amherst, are all voicing support for the lawsuit, John Doe #1 and John Doe #2 v. Twitter Inc.

Twitter declined to comment on the case.

According to transcripts, a lawyer for Twitter argued in trial court that the company's moderators "move heaven and earth to try to get this kind of content off of it. What you can say here is, at best, someone may have made a human error, and it was up for nine days."

During that time, the video of the minors was viewed 167,000 times, according to the National Center on Sexual Exploitation Law Center, which is representing the plaintiffs.

Unlike many social networks, Twitter does not ban pornography on its platform. The company has a "zero-tolerance policy" for child sexual exploitation content and has invested in technology and tools to enforce the policy. Twitter also bans explicit images or videos that were shared without the consent of the people involved.

The company works with NCMEC and the International Association of Internet Hotlines (INHOPE) to stop child sexual abuse material on its platform.

As part of the case, Twitter entered an exhibit showing it banned accounts involved in the posting of the videos of the plaintiffs. But The Examiner found the banned accounts are still mentioned in dozens of tweets referencing sex and boys, some with obscene photos. Twitter did not immediately respond to a question about the tweets mentioning a key figure in the lawsuit.

The experts say it is the company's refusal to remove the material that sets this case apart.

"How bad does it have to get?" asks Farid, the Berkeley professor. "There have been many cases like this where there's been clear failures, but this one seems particularly clear cut, and the harm seems particularly clear cut."

With Twitter reeling from whistleblower Peiter "Mudge" Zatko's allegations of "egregious deficiencies, negligence, willful ignorance," the lawsuit may represent another significant blow to the company's already damaged reputation for security.