Facebook had more than 30,000 employees working on safety and security, about half of whom were content moderators.
Facebook assesses potential moderators' ability to deal with violent imagery, screening them for their coping skills.
American moderators are more likely to have the cultural context necessary to evaluate U.S. content that may involve bullying and hate speech, which often hinge on country-specific slang, he says.
Here is a racist joke. Here is a man having sex with a farm animal. Here is a graphic video of murder recorded by a drug cartel.
Miguel is very good at his job. He will take the correct action on each of these posts, striving to purge Facebook of its worst content while protecting the maximum amount of legitimate (if uncomfortable) speech. He will spend less than 30 seconds on each item, and he will do this up to 400 times a day.
Not a position I would wish on anyone.
This is like working in the nine circles of hell.
That's more like the numbers I was thinking of.
I thought there were only seven circles of Hell.
Now you're telling me there are nine? Oy!