
If you or your business has been affected by content moderation work for Twitch, please contact us.


STREAMING TRAUMA

Video games have been around since 1958, and today's games are leaps and bounds beyond a simple dot bouncing around a screen. People are making careers, and millions of dollars, by streaming their gameplay and their lives.

Founded in 2011 as a spin-off of Justin.tv, streaming goliath Twitch is the platform of choice for most streamers. It has evolved from a small community of video game lovers into a streaming platform with over 140 million monthly active users. People now use Twitch to stream their daily lives and host events, and the platform rivals YouTube for content and engagement.

THE TRUTH ABOUT MODS

As streams and viewership have grown, Twitch has contracted some content moderators, but unlike social giants Facebook, Instagram, and Twitter, Twitch relies primarily on volunteer moderators to enforce its community guidelines. Because streamers do not have time to moderate their own chats while streaming, they usually pick from among their fans to help with moderation (these volunteer moderators are referred to as “Mods” and have a sword-in-a-green-shield icon next to their usernames). Much like other social media services, Twitch has come under scrutiny for failing to moderate violent, discriminatory, and graphic content, both video and written.

Mods, for the most part, are unpaid volunteers in charge of moderating a streamer’s chat and banning users who violate community guidelines. Because of this, Mods are often on the receiving end of vitriolic, violent, and hateful language. Mods and streamers are frequent victims of “hate raids,” which typically target marginalized communities, including women, BIPOC, and LGBTQ+ people. These attacks have become so prevalent that in September 2021, Twitch streamers coordinated a boycott, #ADayOffTwitch, to call attention to these hateful acts.

Self-harm is another common theme Mods are exposed to. On November 15, 2017, a Twitch Mod came across a user who was expressing suicidal thoughts and intentions. The Mod reached out to Twitch in an attempt to help the individual, but Twitch proceeded to cancel the individual’s account mid-crisis; the Mod believes a bot deleted the account. Some Mods feel they carry the responsibility of performing the emotional labor needed to help individuals who are suicidal or threatening self-harm, along with the burden of sifting through hateful messages and comments posted in chats by bots and online trolls.

This barrage of insults, violent language, and harassment can result in lifelong PTSD, emotional trauma, and other psychological harm. And while Twitch touts its new “layered” approach to safety, the question remains whether it is enough.



WHAT'S NEXT

Steve Williams at The Joseph Saveri Law Firm has a groundbreaking history of litigating cases on behalf of content moderators. In Scola v. Facebook, Inc., the firm won a historic $52 million settlement and obtained substantive workplace changes designed to mitigate the psychological harm caused by routinely viewing objectionable content. On July 12, 2022, in Jane Doe v. YouTube, the Class filed a motion for preliminary Court approval of a settlement. Under the $4.3 million proposed settlement, an estimated 1,300 Class members would each receive $2,079. YouTube also agreed to provide content moderators with onsite and virtual counseling by licensed clinicians in individual biweekly sessions, as well as access to telephonic counseling and peer support groups that meet monthly.

If you are a current or former content moderator for Twitch, we are here to help. Please contact our firm by completing the form below if the situation described above, or a similar one, applies to you, or if you would like to learn more about our investigation.

Any information you provide to us will be kept strictly confidential as provided by law.


SHARE YOUR EXPERIENCE