What you need to know
- YouTube's moderators have been forced to sign a new document that acknowledges the job could lead to the development of PTSD and other mental health conditions.
- Failure to adhere to the mental health stipulations is classed as "serious misconduct."
- The document has been criticized as coercive and borderline illegal by labor lawyers.
YouTube's moderation practices have come under scrutiny again. A new report from The Verge has uncovered a dubious practice now employed by Accenture, the firm that provides moderation services to YouTube. According to the report, employees are expected to fill in a form acknowledging the risks faced by content moderators.
Excerpts from the document read:
I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD). I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Adviser if I believe that the work is negatively affecting my mental health.
It also includes further clauses like:
I understand how important it is to monitor my own mental health, particularly since my psychological symptoms are primarily only apparent to me. If I believe I may need any type of healthcare services beyond those provided by [Accenture], or if I am advised by a counselor to do so, I will seek them.
Online moderation is a difficult job at the best of times, but Accenture's practices aren't the best way to go about it. Legal professionals have criticized the document for shifting responsibility for mental health deterioration onto employees, reducing the company's liability, and even coercing employees into disclosing mental health conditions, which may be illegal.
While filling in the form is nominally "voluntary", The Verge reports that employees have been threatened with firing for refusing to do so. In other words, not 100% voluntary. Failure to adhere to the stipulations in the document is classed as "serious misconduct" and can serve as grounds for termination.
In response to the report, Google was quoted as giving the following statement:
Moderators do vital and necessary work to keep digital platforms safer for everyone. We choose the companies we partner with carefully and require them to provide comprehensive resources to support moderators' wellbeing and mental health.
Previous attempts by Google to protect its moderators included capping the time spent viewing difficult content at four hours a day. "This is a real issue, and I myself have spent a lot of time looking at this content over the past year. It is really hard," said YouTube's Susan Wojcicki.