Case
If you or your business has been impacted by content moderation work for TikTok, please contact us.
TikTok videos are ubiquitous and account for a large share of today's content consumption. Before a video is allowed on the platform, a content moderator must view it to ensure that it complies with legal standards and does not violate company standards. TikTok has an estimated 10,000 content moderators, who are collectively responsible for vetting the millions of videos uploaded each day.
According to past and current TikTok moderators, they are required to work at a frenetic pace, watching hundreds of videos per 12-hour shift. The high volume means they are allowed no more than 25 seconds per video while viewing three to ten videos simultaneously. Worse still, the content is not the cute animal and baby videos often seen on the platform.
TikTok moderators allege they are subjected to a regular diet of horrific videos, ranging from pornography and animal mutilation to suicides, rapes, and other traumatic events. This near-constant exposure can easily lead to psychological trauma, including PTSD.
The Joseph Saveri Law Firm, LLP litigated a similar case, Scola v. Facebook, Inc., in which it won a historic $52 million settlement and obtained substantive workplace changes designed to mitigate the psychological harm caused by routinely viewing objectionable content.
On March 24, 2022, The Joseph Saveri Law Firm, LLP filed a complaint on behalf of two named plaintiffs and a proposed class of content moderators against defendants ByteDance Ltd., ByteDance Technology Co. Ltd., and TikTok, Inc. (collectively “ByteDance”). The suit, filed in federal court in California, alleges ByteDance failed to provide a safe workplace for thousands of content moderator contractors who were exposed to the dangers of psychological trauma resulting from exposure to graphic and objectionable content on TikTok’s application.
On November 9, 2022, plaintiffs filed a second amended complaint.
Plaintiffs bring this action on behalf of themselves and similarly situated content moderators to seek redress for these harms and to secure a safer workplace.
“By screening social media posts for objectionable content, content moderators are our frontline soldiers in a war against depravity: a war we all have a stake in winning,” said firm partner Steven Williams. “The psychological trauma and cognitive and social disorders these workers face are serious. But they are being ignored, and the problems will only grow worse—for the company and for these individuals. It’s our hope and goal that ByteDance recognizes its obligations to these workers and creates a safer workplace for them.”
If you are a current or past content moderator for ByteDance or a contracted third party, we are here to help. Please contact our firm by completing the form below if this or a similar situation applies to you, or if you would like to learn more about our investigation.
Any information you provide to us will be kept strictly confidential as provided by law.
If you wish to inform us of any unfair business practice, antitrust or competition issue, or comment on one of our cases, please use the form below. There is no cost or obligation for our review of your case. We agree to protect your name and all confidential information you submit against disclosure, publication, or unauthorized use to the full extent under the law. Please note that completion of this form does not contractually obligate our firm to represent you. We can only represent you if both you and our firm agree, in writing, that we will serve as your attorney. Please read our disclaimer.