Facebook’s growing ranks of human moderators spend their workdays immersed in the most violent, obscene, and offensive dregs of the internet – emerging with PTSD, addictions, and sometimes the very beliefs they’re hired to suppress.
Overworked and underpaid, Facebook’s third-party content moderators are frequently traumatized, chewed up, and spit out by the company they work for, according to a report by The Verge’s Casey Newton that paints a picture of a scene far removed from the stereotypical Silicon Valley tech utopia.
“I don’t think it’s possible to do the job and not come out of it with some acute stress disorder or PTSD,” a former employee, now diagnosed with both, told the outlet.
Moderators work for contractors like Cognizant, where they’re trained to filter out the torrents of violence, pornography, and offensive speech that flow through the platform daily. In return, they get a $15 hourly wage – slightly over 10 percent of the average Facebook worker’s paycheck – and psychological damage they drown in “sex, drugs, and offensive jokes,” plus the occasional violent outburst.
Employees reportedly copulate like rabbits wherever they can find the space – in stairwells, in lactation rooms, in parking garages, even in bathrooms, occupying stalls already so scarce that workers spend most of their break time waiting in line to use them. One employee referred to the compulsive coupling as “trauma bonding” – a reaction to both the enforced secrecy (employees must sign non-disclosure agreements and are forbidden to discuss their work for Facebook, or even acknowledge that Cognizant works for Facebook), and the constant onslaught of disturbing imagery dancing across their screens. Others drink or smoke marijuana compulsively, or compete to send each other the vilest memes they come across during their content moderation.
The threat of termination looms large – employees can be fired for racking up just a few errors per week – and the firings haunt those who remain, some of whom “live in fear of former colleagues returning to seek vengeance.” One employee even brought a gun to work for protection after finding a gang of fired mods waiting for him outside the office. Facebook’s target of 95 percent “accuracy” – meaning moderators’ decisions match those of a company auditor evaluating random posts – seems to hover just out of reach, further compounding the stress of a workplace where employees lack even a permanent workstation to call their own.
Constant exposure to “fringe” views wears away at moderators’ convictions, in an ironic twist for a platform that has repeatedly clamped down on political content that deviates from the mainstream. Former employees confessed to The Verge that they no longer believed the official 9/11 story, that peers had begun questioning “certain aspects” of the Holocaust, and that some colleagues were actively seeking converts to flat-earth theory.
“Cognizant makes a counselor available to employees, but only for part of the day,” The Verge explains, and traumatized content mods are out of luck as soon as they leave the company. According to one employee, this is extremely common: “a lot of people…go through those four weeks [of training] and then they get fired,” she said, with “absolutely no access to counselors after that.” One counselor, asked by The Verge about the risks of such immersion in the dregs of the internet, even tried to spin employees’ PTSD diagnoses as a positive, claiming “some people can experience ‘post-traumatic growth,’ where trauma victims become stronger.”