Content warning: This story contains references to violence, suicide, child abuse and self-harm.

A suicide attempt, depression, substance abuse, insomnia, surveillance, threats. These are just some of the experiences reported by the low-paid moderators tasked with sifting through Facebook and Instagram's most disturbing images. The tech giant Meta, which owns both platforms, has kept the whereabouts of this operation a closely guarded secret since moving it from Kenya, where the company is facing lawsuits over working conditions and human rights. For months, it has also refused to name the company that won the lucrative contract to provide the content moderators...