Child abuse. Pornography. Bestiality. Murder.
As part of their job, moderators for social websites have to view some of the most disturbing videos and photos on the internet. Once the employees have determined that the images violate the company’s community standards and the law, they delete the accounts of the people who posted them and report the incidents to the National Center for Missing & Exploited Children, per federal law.
Unsurprisingly, having to watch upsetting content like that every day takes a toll on moderators. But two Microsoft employees say their company, one of the largest in the world, failed to provide them with proper support as their mental health deteriorated and they began showing symptoms of Post-Traumatic Stress Disorder, or PTSD.
In a lawsuit filed in King County, Washington, Henry Soto and Greg Blauert allege that they were part of Microsoft’s Online Safety Team, formed in 2007 in response to new federal legislation and complaints from customers about disturbing imagery being stored and distributed using the company’s products.
Soto claims he was assigned to the team and didn’t have any choice in the matter. Both Soto and Blauert say Microsoft did not fully prepare them for what the Online Safety Team would be doing — going through users’ accounts and communications and viewing haunting images of children being abused, sexually exploited and even killed.
According to CourthouseNews.com, Soto said he and his wife had always dreamed of working for Microsoft and moved to Washington specifically to work for the company. However, once he and Blauert were assigned to the Online Safety Team, they were forced to remain there for 18 months before they could request a transfer.
And in those 18 months, both men claim they suffered from continually reviewing disturbing content.
“He had trouble with sleep disturbance, nightmares,” Soto’s lawyers alleged, per the Daily Beast. “He suffered from an internal video screen in his head and could see disturbing images. He suffered from irritability, increased startle, anticipatory anxiety, and was easily distractible.”
After watching a video of a young girl being abused and murdered, Soto says he began to experience “auditory hallucinations.”
Meanwhile, “Mr. Blauert became noticeably withdrawn in the workplace and at home,” court documents posted by CourthouseLaw.com say. “He became listless and avoidant in the workplace. Supervisors authorized him and others to leave work early when they broke down or became overwhelmed by the trauma associated with viewing the depictions. Leaving early on occasions of breaking down was part of the ‘Wellness Plan.’ However, Mr. Blauert was criticized in employment reviews for following his wellness plan.”
Microsoft told McClatchy that employees are not permitted to view the offensive imagery at home or on a personal device and are not allowed to spend their entire workday viewing it.
Still, Blauert suffered from nightmares and outbursts of anger, he claims in the lawsuit.
Microsoft provided counseling sessions for members of the team suffering from “compassion fatigue,” a condition not formally recognized as a mental illness that is characterized by anxiety, tension and apathy toward violence, per the Compassion Fatigue Awareness Project. In those sessions, however, the counselor suggested Blauert take more breaks to walk, smoke and play video games during the day, according to the lawsuit.
It didn’t work, he says. Blauert suffered a “physical and mental breakdown,” with symptoms of insomnia, anxiety, uncontrollable crying and PTSD, per the lawsuit.
Both men say their work has damaged their relationship with their families, especially their children, made it difficult for them to go out in public and severely limited their ability to even go onto the internet.
But when both men applied for workers’ compensation while taking medical leave due to their PTSD, their claims were denied, per the Daily Beast.
“The worker’s condition is not an occupational disease,” their rejection letters each read.
Soto and Blauert also allege that while Microsoft provided dedicated counseling and funding for the Digital Crimes Unit, a different department dealing with many of the same issues, the company violated state law by failing to provide a safe workplace for them, and that it should have realized how harmful viewing such images would be for its employees.
The men’s lawyers say they provided Microsoft with the results of their investigation before filing suit, but the company did not provide any “material corrections.”
In a statement to McClatchy, a Microsoft spokesperson denied that the company was negligent in regards to its employees’ mental health.
“We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work,” the statement read.
“This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.”
In addition to seeking damages for their medical bills and ongoing PTSD, the men requested that Microsoft implement new policies to help other employees in similar situations, including mandatory meetings with psychologists and spousal support programs.
However, Microsoft told McClatchy that the company already provides mandatory monthly meetings with a psychologist for employees, as well as group meetings with a psychologist. In addition, the company said it uses certain techniques, such as blurring, to prevent employees from being exposed to the full realism of the images.