Facebook Discovers That Determining Hate Speech Is A Complex Task And Reinstates Group Hours Later
On Wednesday (9 May), a certain Mr Roy Tan shared an intriguing notification he received from Facebook.
The social media giant informed him that a report he previously made against a certain page had resulted in the group being removed from Facebook.
The reason for reporting the page? Hate speech.
The page? We are against Pinkdot in Singapore (unofficially, WAAPD).
But the removal didn’t last long.
A few hours after Mr Tan’s post, the “public advocacy group” returned to cyberspace.
Its triumphant resurgence was marked by this post from page admin Azman Ivan Tan:
Mr Azman also urged the group’s members to post with more discretion. He reminded them to follow group rules that were set out years ago and to discuss matters based on facts rather than emotions.
The lesson here, apparently, is that those who oppose homosexuality need to be “the bigger ones”.
But it turns out that WAAPD didn’t get off completely scot-free. Mr Azman also shared the warning that came with the page’s restoration:
Facebook acknowledged that while certain material in the group violated its Community Standards, that should not have led to the entire group being dissolved.
Mr Mark Zuckerberg’s company then apologised for the error and restored the group’s status.
Deciding what is hate speech
The warning proved that hate speech, as Facebook defines it, had once spread through the group.
The offensive content has since been removed, but how does Facebook draw the line between free speech and hate speech?
According to Facebook’s Community Standards, for content to be classified as hate speech, it has to be posed as a “direct attack” on people based on “protected characteristics”, which include:
Race, ethnicity and national origin
Religious affiliation
Sexual orientation
Sex, gender and gender identity
Serious disability or disease
Discourse that satisfies these requirements is sorted into a three-tiered system.
It is unclear which tier of hate speech the offending content on WAAPD’s page fell under.
Still, it was apparently enough to warrant the removal, albeit a brief one, of the entire group.
Mr Richard Allan, former British MP and current VP for Public Policy in EMEA (Europe, the Middle East and Africa), penned an introspective blog post about the factors and circumstances Facebook takes into account when determining hate speech.
Tolerance for speech, he writes, differs from person to person. One man’s joke is another man’s blasphemy.
Some of the things the company takes into consideration are:
Context: Context can differ heavily between countries. Case in point: the word “fags” — a slur for gay men in some places, but a term for cigarettes in Britain.
Intent: Determining intent can be the hardest task of all. Case in point: using racial or homophobic slurs in an attempt to reclaim the word and strip it of its negative power.
Mistakes: Facebook has occasionally removed content it deemed offensive that actually had the opposite intention. Case in point: Shaun King, a Black Lives Matter activist whose account was blocked in error.
We only hope that members of WAAPD approach the matter in a more nuanced way going forward, with less of this: