If you or your business has been impacted by being a content moderator for TikTok, please contact us.


TikTok videos are ubiquitous and a large part of today's content consumption. Before a video is allowed on the platform, a content moderator must view it to ensure that it complies with legal standards and does not violate company policies. TikTok has an estimated 10,000 content moderators, who are collectively responsible for vetting the millions of videos uploaded each day.

According to past and current TikTok moderators, they are required to work at a frenetic pace, watching hundreds of videos per 12-hour shift. The high volume means they are allowed no more than 25 seconds per video and must view three to ten videos simultaneously. And if that were not bad enough, the content is not the cute animal and baby videos often seen on the platform.

TikTok moderators allege they are subjected to a regular diet of horrific videos, including pornography, animal mutilation, suicide, rape, and other traumatic material. This near-constant exposure can easily lead to psychological trauma, including PTSD.

The Joseph Saveri Law Firm has litigated a similar case, Scola v. Facebook, Inc., in which it won a historic $52 million settlement and obtained substantive workplace changes designed to mitigate the psychological harm that can be caused by routinely viewing objectionable content.



On December 23, 2021, The Joseph Saveri Law Firm filed a class action complaint on behalf of Ms. Candie Frazier and a proposed class of content moderators against defendants Byte-Dance Ltd., Byte-Dance Technology Co. Ltd., and TikTok, Inc. (collectively “Byte-Dance”). The suit, filed in federal court in California, alleges Byte-Dance failed to provide a safe workplace for thousands of content moderator contractors who were exposed to the dangers of psychological trauma resulting from exposure to graphic and objectionable content on TikTok’s application.

On December 24, Ms. Frazier was abruptly and unjustly placed on leave for speaking up, with no clear path forward for regaining her position.

“In retaliation for bringing these important issues to the public, Ms. Frazier was advised on Christmas Eve that she was being placed on leave and had to give up her work equipment. As a result, she can no longer do the job she relies on to support her family. Put yourself in her shoes and imagine the fear and uncertainty she and her family experienced because of this unnecessarily cruel act. TikTok’s behavior—straight out of a Dickens novel or a modern sweatshop—violates the law, and we call upon TikTok to change course and restore our client to her former position and responsibilities without delay,” said Steven Williams of the Joseph Saveri Law Firm. “By screening social media posts for objectionable content, content moderators are our frontline soldiers in a war against depravity: a war we all have a stake in winning. The psychological trauma and cognitive and social disorders these workers face are serious. But they are being ignored, and the problems will only grow worse—for the company and for these individuals. It’s our hope and goal that Byte-Dance recognizes its obligations to these workers and creates a safer workplace for them.”

If you are a current or past content moderator for Byte-Dance or a contracted third party, we are here to help. Please contact our firm by completing the form below if the above or a similar situation applies to you, or if you would like to learn more about our investigation.

Any information you provide to us will be kept strictly confidential as provided by law.