Posted on 02/25/2019 12:02:15 PM PST by yesthatjallen
Facebook in a blog post on Monday affirmed its "commitment" to the workers who moderate the platform's content after a report found many of them suffer from post-traumatic stress disorder (PTSD), drug abuse and anxiety.
The post began by acknowledging "questions, misunderstandings and accusations around Facebook's content review processes," a likely reference to an investigation by The Verge published earlier Monday.
"We want to continue to hear from our content reviewers, our partners and even the media who hold us accountable and give us the opportunity to improve," the company wrote.
The Verge interviewed a dozen current and former Facebook content reviewers in Phoenix, all of whom are third-party contractors with a vendor called Cognizant. Cognizant runs a content moderation site for Facebook that employs around 1,000 people in Arizona.
The contractors described oppressive working conditions, with limited breaks and heavy scrutiny, and a job with an intense emotional toll. Content reviewers are asked to look through hundreds of posts per day, including images and videos of graphic violence, sexual exploitation, hate speech and harassment, in order to flag and take down posts that violate Facebook's complex guidelines.
Facebook has brought on thousands of reviewers in the past several years amid criticism that it has not done enough to remove exploitative and harmful content.
"Given the size at which we operate and how quickly weve grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis," Facebook wrote in the post.
The company wrote that it partnered with Cognizant, as well as Accenture and Genpact, in recent years to scale "our investment in safety and security, including rapidly growing our content review teams."
According to the Verge investigation, content reviewers with Cognizant make less than $30,000 per year, while the average Facebook employee makes around $240,000.
Current and former employees told The Verge that mental health resources for traumatized content reviewers are inadequate or downright unhelpful.
The report also tracked instances in which content reviewers were radicalized by what they were seeing online, as "conspiracy videos and memes that [moderators] see each day gradually lead them to embrace fringe views."
"People really started to believe these posts they were supposed to be moderating," one employee told The Verge. "They were saying, 'Oh gosh, they weren't really there. Look at this CNN video of David Hogg, he's too old to be in school.' People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, 'Guys, no, this is the crazy stuff we're supposed to be moderating. What are you doing?'"
One moderator began to express theories that the earth is flat, while another began to deny that the Holocaust happened, The Verge reported.
Facebook in the post wrote that it is instituting a "rigorous and regular compliance and audit process" to check in on third-party contractors and that it will increase "requirements and expectations" laid out in contracts.
"We encourage all partner employees to raise any concerns with their employers HR teams," Facebook wrote.
A Cognizant spokesperson said the company has looked into some of the complaints raised in The Verge investigation.
"[We have] previously taken action where necessary and have steps in place to continue to address these concerns and any others raised by our employees," the Cognizant spokesperson said, according to CNBC. "Cognizant is committed to providing a healthy, safe and positive work environment for all of our associates."
Many of the employees who spoke to The Verge described psychological damage from the job, including suicidal thoughts and heightened anxiety.
"I'm f---ed up, man," one worker said. "My mental health, it's just so up and down. One day I can be really happy, and doing really good. The next day, I'm more or less of a zombie. It's not that I'm depressed. I'm just stuck."
"I don't think it's possible to do the job and not come out of it with some acute stress disorder or PTSD," he said.
while the average Facebook employee makes around $240,000.
I just sent that to a couple of in-laws addicted to Facebook and claiming to be conservatives.
Except for the very crappy pay, it actually sounds kinda fun.
YouTube outsources to the Philippines. The place Facebook uses, Cognizant, is based in Phoenix.
Very true. Hospital workers, EMTs, cops. Lots of people see horrible stuff at work every day.
LOL! Independent fact-checking not permitted! Use Snopes!
Cognizant is based in Teaneck, NJ (I interviewed there), but more than half their employees are based in India.
Thank you. I was confusing YouTube with Facebook.
FB would do itself a real big favor if it hired real grownups with open minds and terrific historical and cultural knowledge to do the moderating. And then put them in some isolated town, far away from the influence of their loony left FB coworkers.
It seems like they have been pushing AI to perform creative work in order to take jobs that humans are currently doing. Maybe a better thing for AI to be focused on is cleaning up the messes that the digital revolution has created. Sitting through 8 hours a day of beheading videos is something human beings cannot do without becoming damaged as a result.
If, for example, AI was written/trained to recognize genre and to some extent correctness, it could be used to take the vast array of creative material that is already out there, languishing and unsorted, and machine sort it into a usable and profitable form.
Recognizing the genre of terrorist propaganda video or writing could be an early testing ground since you know darned well that if a video isn’t criminal or terroristic in nature, some user is going to be there to ask “why do you always delete my videos?!”
It’s just a thought. This is bothering me.
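For what it's worth, here is a minimal sketch of the kind of screening the comment above describes: a classifier that makes the confident calls automatically and only routes borderline posts to a human reviewer. It uses scikit-learn, and the example texts, labels, and thresholds are made up purely for illustration; nothing here reflects how Facebook or Cognizant actually do moderation.

```python
# Toy "genre" classifier sketch: auto-handle confident calls,
# send uncertain posts to a human reviewer.
# Training examples and labels are placeholders, not real moderation data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = likely violating, 0 = likely benign.
texts = [
    "graphic violence clip with explicit threats against a named person",
    "recruitment message praising a banned terrorist organization",
    "family vacation photos from our trip to the lake",
    "recipe thread: how to make sourdough starter at home",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def route(post: str, lower: float = 0.2, upper: float = 0.8) -> str:
    """Auto-remove or auto-allow confident calls; everything in between goes to a human."""
    p = model.predict_proba([post])[0][1]  # estimated probability of "violating"
    if p >= upper:
        return "auto-remove"
    if p <= lower:
        return "auto-allow"
    return "send to human reviewer"

print(route("explicit threats of violence against a named person"))
print(route("photos from our trip to the lake"))
```

Even in this toy form, the point the commenter makes still stands: the middle band is exactly the material a machine can't settle, so some human still has to look at it.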
I think their instincts are good in getting people to make judgment calls; it's just that they are also being used for far more clear-cut cases that are flagrant violations.
More accurately, they are compensated with a package valued at $240,000, whatever this means.
I assumed Phoenix because that's where the Verge article said they interviewed people at.
Could it tell a real beheading video from a clip from Game of Thrones?
Or they could just leave them alone like Russia's Facebook-clone VKontakte does.
It could yank beheading videos, and if anyone wants to complain they can account for their relationship with the video. In the case of criminal activity it is likely that they won't, and you can just yank them.
As long as they have to deal with these things they might as well put them to good use.
True, and Facebook hired an Indian company to do its moderation. How wrong is that?
It's an Indian company, with a branch in New Jersey. So we are letting foreigners control our social media...
Yeah, but do they have to deal with Laz?
I still have trauma from some of the stuff I saw in government mailboxes from the time I did email admin for a federal department. There are some things that it is just no good for a man to see.