
In a first, Facebook offers estimate of hate speech prevalence on its platform

WION Web Team
California, United States | Updated: Nov 20, 2020, 04:39 PM IST
Photograph: (Reuters)

Story highlights

Facebook's rivals Twitter and YouTube, owned by Alphabet Inc's Google, do not disclose comparable prevalence metrics

Social media giant Facebook on Friday offered the first-ever estimate of hate speech prevalence on its platform.

Facebook's rivals Twitter and YouTube, owned by Alphabet Inc's Google, do not disclose comparable prevalence metrics.

The world's largest social media company said out of every 10,000 content views in the third quarter, 10-11 were hate speech.

Facebook, under scrutiny over its policing of abuses, particularly around November's US presidential election, released the estimate in its quarterly content moderation report.

The tech giant, which has 1.82 billion daily users globally, has drawn flak in the past for its handling of hate speech on the platform in India, which is among its biggest markets.

In its Community Standards Enforcement Report for the September 2020 quarter, Facebook said it is including the prevalence of hate speech on its platform globally "for the first time".

Facebook said it took action on 22.1 million pieces of hate speech content in the third quarter, about 95 per cent of which was proactively identified, compared to 22.5 million in the previous quarter.

The company defines 'taking action' as removing content, covering it with a warning, disabling accounts, or escalating it to external agencies.

On Instagram, the company took action on 6.5 million pieces of hate speech content (up from 3.2 million in the June quarter), about 95 per cent of which was proactively identified (up from about 85 per cent in the previous quarter), it added.

The latest Community Standards Enforcement Report provides metrics on how Facebook enforced its policies from July to September, and includes metrics across 12 policies on Facebook and 10 policies on Instagram.

Facebook Vice President (Integrity) Guy Rosen said the company is also updating its Community Standards website to include additional policies that require more context and can't always be applied at scale.

These policies often require specialised teams to gather more information on a given issue in order to make decisions, he added.

The Community Standards Enforcement Report is published in conjunction with Facebook's biannual Transparency Report.

The Transparency Report shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.

During the first six months of 2020, government requests for user data increased by 23 per cent from 140,875 to 173,592, it said.

This summer, civil rights groups organized a widespread advertising boycott to try to pressure Facebook to act against hate speech.

The company agreed to disclose the hate speech metric, calculated by examining a representative sample of content seen on Facebook, and submit itself to an independent audit of its enforcement record.

On a call with reporters, Rosen said the audit would be completed "over the course of 2021."

The Anti-Defamation League, one of the groups behind the boycott, said Facebook's new metric still lacked sufficient context for a full assessment of its performance.

"We still don't know from this report exactly how many pieces of content users are flagging to Facebook, whether or not the action was taken," said ADL spokesman Todd Gutnick. That data matters, he said, as "there are many forms of hate speech that are not being removed, even after they're flagged."