What you need to know
- YouTube's moderators have been forced to sign a new document that acknowledges the job could lead to the development of PTSD and other mental health conditions.
- Failure to adhere to the mental health stipulations is classed as "serious misconduct."
- The document has been criticized as coercive and borderline illegal by labor lawyers.
YouTube's moderation practices have come under scrutiny again. A new report from The Verge has uncovered a dubious practice now employed by Accenture, the firm that provides moderation services to YouTube. According to the report, employees are required to fill in a form acknowledging the risks the job poses to content moderators.
Excerpts from the document read:
I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD). I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Adviser if I believe that the work is negatively affecting my mental health.
It also includes further clauses like:
I understand how important it is to monitor my own mental health, particularly since my psychological symptoms are primarily only apparent to me. If I believe I may need any type of healthcare services beyond those provided by [Accenture], or if I am advised by a counselor to do so, I will seek them.
Online moderation is a difficult job at the best of times, but Accenture's approach isn't the best way to handle it. Legal professionals have criticized the practice for shifting responsibility for mental health deterioration onto employees, for reducing the company's liability, and even for coercing employees into disclosing mental health conditions, which may be illegal.
While filling in the form is "voluntary," The Verge reports that employees have been threatened with firing for refusing to do so. In other words, not entirely voluntary. A failure to adhere to the stipulations in the document is classed as "serious misconduct" and can serve as grounds for termination.
In response to the report, Google was quoted as giving the following statement:
Moderators do vital and necessary work to keep digital platforms safer for everyone. We choose the companies we partner with carefully and require them to provide comprehensive resources to support moderators' wellbeing and mental health.
Previous attempts by Google to protect its moderators included limiting the viewing of difficult content to four hours a day. "This is a real issue, and I myself have spent a lot of time looking at this content over the past year. It is really hard," said YouTube CEO Susan Wojcicki.