Former Chaturbate Moderator Sues Site Over 'Psychological Trauma'

A former content moderator for the porn site Chaturbate has sued the platform and its affiliates, claiming that he was psychologically harmed by his ongoing exposure to the sexual material on the site.

Neal Barber, who was hired as a moderator in 2020, claims in a class action lawsuit that his employers knowingly and intentionally failed to “provide their content moderators with industry-standard mental health protections, such as content filters, wellness breaks, trauma-informed counseling, or peer support systems.” 404 Media first reported on the litigation.

The suit, which names as defendants Chaturbate, its parent company, Multi Media LLC, and a customer support contractor, Bayside Support Services, was filed earlier this month in California.

The lawsuit claims that Barber “developed post-traumatic stress disorder (PTSD) and other severe emotional injuries” from his work, which required him to view and interact with “sexually explicit, violent, obscene and psychologically disturbing live-streamed content for extended periods of time.” Barber now claims to suffer from “vivid nightmares, emotional detachment, panic attacks, and other symptoms consistent with PTSD.” This alleged emotional trauma requires “ongoing medical treatment and therapy,” the suit says. 

“These injuries were not only foreseeable, but preventable,” the litigation continues. “Had Defendants taken even the minimal precautions adopted by companies in Defendants’ industry, Plaintiff would not have suffered these injuries.”

The lawsuit also notes the importance of moderators to the porn industry’s business model. “Because platforms like Chaturbate host vast amounts of live, unfiltered, and sexually explicit content, content moderators are essential to maintain compliance with legal standards, enforce platform rules, and prevent the dissemination of illegal or abusive material,” the lawsuit says. “They serve as the first line of defense against child exploitation, non-consensual content, violent content, obscene content, self-harm, and other violations.”

Gizmodo reached out to Chaturbate, Multi Media LLC, and Bayside Support Services for comment.

The plight of the content moderator has become one of the most confounding dilemmas of the modern age. The internet is overflowing with repellent material, and it’s almost always somebody’s job to try to clean it up (even Elon Musk’s “free speech” platform X has a moderation staff). Usually, the job falls to precarious low-wage workers—many of whom end up claiming that the sites that employ them do next to nothing to ease the psychological pain of having to watch awful stuff all day.

As an example, Meta has been sued multiple times over the company’s alleged treatment of African contractors who were tasked with moderating the deluge of disturbing and illegal content on the company’s websites. Last year, it was reported that 140 moderators who had previously done work for Facebook had been diagnosed with PTSD after viewing social media content involving murders, suicides, and child sexual abuse.

As legal troubles involving moderators have become more common, some companies are increasingly turning to automated, AI-driven systems to do the work of cleaning up their sites. However, human reviewers are often still needed to oversee those automated systems.

Chaturbate has had a difficult few years, as it and other porn sites continue to adjust to the wave of age-verification regulations that have taken root, mostly in conservative states. Last year, the platform was fined over half a million dollars by the state of Texas for failing to institute age-verification mechanisms for the users of its site. A conservative political movement has also increasingly lobbied to make the entire porn industry illegal.
