What was Facebook thinking when it blocked the account of Kerala journalist VP Rajeena?

Surprisingly, those who hurled misogynistic abuse at her for speaking up against child sexual abuse have got off scot-free.

By Arunabh Saikia

On October 24, The News Minute reported that the Facebook account of Kerala-based journalist VP Rajeena had been blocked, following a post she wrote about her classmates at a madrassa being sexually harassed by some of their teachers. Rajeena, a sub-editor with the Malayalam news daily Madhyamam, received much intimidation and vile abuse for her post on Facebook, as is apparent from even a cursory glance through her profile.

Ironically, though, most of the people who have been hurling the choicest misogynistic abuse at Rajeena continue to have access to their Facebook accounts. (We visited at least five such profiles, and from their timelines, it doesn't appear Facebook blocked them even temporarily.)

So, what gives? Why is Facebook suspending the profile of a woman speaking up against sexual exploitation while doing nothing at all about the bigoted men heaping sexist insults on her?

According to Facebook's recently updated Community Guidelines, it doesn't allow "bullying and harassment" on the platform. "We don't tolerate bullying or harassment. We allow you to speak freely on matters and people of public interest, but remove content that appears to purposefully target private individuals with the intention of degrading or shaming them," the guidelines state.

Well, if it isn't clear enough already, this is exactly what happened in this case. A private individual was targeted in the most despicable fashion and her integrity questioned, but Facebook chose to suspend her account and completely ignore the perpetrators.

The organisation also details how it fights sexual violence and exploitation on the platform: "We remove content that threatens or promotes sexual violence or exploitation. This includes the sexual exploitation of minors, and sexual assault."

Rajeena, in her post, was pointing out the prevalence of sexual exploitation of minors in madrassas. She was speaking out against something that is often swept under the carpet owing to societal pressures. Her account was based on personal experience. In effect, her post was entirely in keeping with the spirit of Facebook's community guidelines: it spoke out against the coercive sexual exploitation of children. Yet she was the one censored.

What then explains the glaring discrepancy between what Facebook preaches and what it practises? Why is content that clearly does not violate any of Facebook's guidelines being purged? A spokesperson for the corporation told me in an email that a technical error led to Rajeena's account being blocked. "We're sorry for the trouble it caused. We have restored the account," the spokesperson added, while choosing to ignore my specific enquiries about how the company decides what is objectionable and what is not.

While it is gracious of Facebook to acknowledge its mistakes, the frequency of these so-called "technical errors" suggests something is flawed in Facebook's mechanism for flagging unsavoury content. (Only last week, Facebook temporarily pulled an article from The Wire off people's timelines, only to apologise later.)

Even as Facebook remains tight-lipped, it is not unreasonable to deduce that Rajeena's account was blocked owing to mass reporting by people who had taken umbrage at her post. In short, Facebook bowed to the whims of a bunch of chauvinists simply because they were greater in number. Is it then essentially a case of a half-baked computer algorithm that works purely on the basis of the number of complaints, without any human intervention?

Ciara Lyden, who was part of Facebook's Community Operations team, which is responsible for flagging content that does not conform to its policies, insisted in a rare interview with The Independent in September that this is not the case. She said the company had staff with "good cultural knowledge" based across four time zones, in California, Texas, Dublin and Hyderabad, monitoring the process.

According to Julie de Bailliencourt, Facebook's safety policy manager for Europe, the Middle East and Africa, once a user reports abuse (as Rajeena's detractors did), the report is rated for its severity and directed to the right team.

Lyden, however, admitted that the company could do with more of a human touch, while suggesting that this was no easy task, given that the site serves more than a billion people who are online 24 hours a day.

In spite of Lyden's assurances that Facebook is striving to make itself more humane (a real person, she claims, goes through every single report of abuse), recent incidents inspire very little confidence. Findings of the Electronic Frontier Foundation's crowd-sourced venture, the Online Censorship Project, which documents instances of censorship on social media, reveal many recent cases of expurgation that belie Lyden's claims.

An article in The Drum, ABC News' online platform, argues that even with human intervention, the community guidelines set and imposed by Facebook do not account for the diversity of cultures the platform caters to.

Rajeena's profile has now been restored; another "mistake" has been rectified. But mistakes only get rectified in the true sense when you know what went wrong. In this case, no one except Facebook seems to know. And Facebook, it seems, has vowed never to let us know.

Newslaundry
www.newslaundry.com