
Facebook content moderators urge safety precautions amid pandemic

WION Web Team
San Francisco, United StatesUpdated: Nov 19, 2020, 09:26 AM IST

Story highlights

The content moderators are responsible for keeping the platform free of hate speech and false information, which has played a very important role during the pandemic and the recently concluded US elections

As the world was hit by the coronavirus pandemic at the beginning of this year, the majority of companies shifted to working from home to safeguard their employees from the deadly virus. This included social media giant Facebook.

However, more than 200 Facebook content moderators have now urged the social media company to provide better health and safety precautions for workers who have been called back to the office.

"After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office," said the open letter released by the UK-based legal activist firm Foxglove.

The content moderators are responsible for keeping the platform free of hate speech and false information, a role that has been especially important during the pandemic and the recently concluded US elections. While the moderators had been working from home until now, they have been called back to the office for more "effective working".

Moderators have signed a petition asking the Zuckerberg-led company to "keep moderators and their families safe" rather than exposing them to the deadly virus. The petitioners have asked that remote working be allowed to the greatest extent possible, or at least that "hazard pay" be offered to moderators who come into the office during the pandemic.

"We appreciate the valuable work content reviewers do and we prioritize their health and safety," a Facebook spokesperson said in a statement to AFP. 

"The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic," the spokesperson said. 

The petitioners have said the current environment still calls for human moderators, as artificial intelligence has failed to match the required standards. According to the letter, the company discovered limits on what employees could manage while working remotely, turned to AI as a substitute, and the results fell short.

"The AI wasn't up to the job. Important speech got swept into the maw of the Facebook filter -- and risky content, like self-harm, stayed up," the letter said.

"The lesson is clear. Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there."

The moderators have also urged Facebook to make them full-time employees.