IT ministry notifies creation of Grievance Appellate Committee that will oversee content moderation decisions

The government-appointed committee’s decisions will be binding on the intermediary.

Written by: Aditi Agrawal

Update at 7.47 pm, Oct 28: The IT ministry notified the amendments to the IT Rules 2021 on October 28. The headline to this report was changed to reflect that. No changes were made to the notified rules, so the text of this report remains the same.

The Indian government will have the ability to rule on the validity of content moderation decisions taken by all intermediaries, including social media platforms, once the IT ministry notifies amendments to its IT Rules 2021. Two IT ministry sources told Newslaundry the amended rules will be notified in the next few days.

Despite significant pushback from industry and civil society, the IT ministry is pushing ahead with setting up a Grievance Appellate Committee – a government-appointed body that can be approached by users dissatisfied with decisions made by the intermediary’s grievance officer. 

Newslaundry has learned that the department of legislative affairs, which is responsible for vetting all new legislation, had orally expressed concerns to IT ministry officials about the legality of creating an appellate body through subordinate legislation. But these concerns were not put in writing, and it is not clear how they were subsequently resolved.

Here’s a run-down of what the amendments contain, according to the latest version of the document that Newslaundry has seen.

How will GACs deal with grievances?

Each GAC will comprise three government-appointed, full-time members, including a chairperson. One must be a former government officer while the other two will be independent members. The GAC is empowered to approach anyone with “requisite qualification, experience and expertise in the subject matter” for assistance. 

A user with a grievance will first approach the intermediary’s grievance officer, who must acknowledge the complaint within 24 hours and resolve the grievance within 72 hours for some kinds of content, and within 15 days for others. Content with non-consensual nudity must be taken down within 24 hours of receiving the complaint. 

The grievance officer will then inform the user about the resolution of their complaint. If the user is dissatisfied, they can approach the GAC within 30 days. The GAC must then resolve the complaint within 30 days, and the intermediary must comply with the GAC’s order and upload a report to that effect on its website. 

The GAC must carry out the entire grievance appeal process through an online dispute resolution mechanism. 

What grievances need to be resolved in 72 hours?

According to the latest version of the rules, within 72 hours, grievance officers must redress complaints about content that is “obscene, pornographic, paedophilic”; violates another person’s privacy; is objectionable on the basis of gender, race or ethnicity; relates to or encourages money laundering or gambling; or promotes enmity or incites violence with respect to religion or caste. 

Complaints must also be resolved within 72 hours for content that is harmful to a child; a threat to national security, sovereignty, diplomatic relations or public order; contains any virus or malware; impersonates another person; is deceptive about who sent it; or contains information that is “patently false and untrue or misleading in nature”.

This doesn’t mean such content must be removed in 72 hours, just that the grievance officers must redress the complaint within 72 hours of the complaint being filed. 

Unlike the amendments that were put out for public consultation in June 2022, grievances that need to be resolved within 72 hours no longer include content that may belong to another user, content that violates intellectual property rights, and content that violates current laws.

What about self-regulation by intermediaries?

In June, during an open consultation on the amendments (which this journalist had attended), minister of state for electronics and IT Rajeev Chandrasekhar had said the government was open to the creation of a self-regulatory body to address user complaints on content moderation. He added that the government was not interested in acting as the moderator for platforms as long as the platforms obeyed Indian laws. 

The Internet and Mobile Association of India, an industry body, had proposed creating a self-regulatory body for social media platforms as an appellate platform. But social media platforms were unable to reach a consensus – Meta and Twitter supported the creation of the body that would exist above their respective grievance officers, while Google vehemently opposed it.

Meanwhile, streaming platforms and digital news publishers had been forced to create self-regulatory bodies through the IT Rules 2021. But both of them have an intermediate layer between their own grievance officers and the government in the form of self-regulatory bodies. Intermediaries do not. 

What obligations might be diluted compared to the proposed amendments?

The latest amendments make it obligatory for intermediaries to ensure their users cannot “host, display, upload, modify, publish, transmit, store, update or share” any information that violates certain standards – infringing on intellectual property rights, for instance, or being harmful to a child. 

During the June consultation meeting between stakeholders and Chandrasekhar, multiple concerns had been raised about this. A Nasscom representative had said that placing such an obligation means an intermediary would have to ensure users do not even type objectionable content on their platforms. In such a case, intermediaries would have to carry out proactive censorship. 

Let’s break this down. Suppose a user tries to sell a stolen good on OLX. The proposed amendment would have made it mandatory for OLX to prevent the ad from going up in the first place. Or if an Airtel user shared images of sexual acts without the consent of the subjects, it would have been Airtel’s responsibility to ensure it didn’t happen.

This has been diluted in the latest version of the amendments. According to these, intermediaries only have to make “reasonable efforts”. They’re also no longer required to ensure that their users don’t share information that is “patently false and untrue” or published/written “with the intent to mislead or harass”, as had been proposed earlier. 

Which obligations might have been tightened?

Intermediaries are expected to be required to ensure their services are accessible to users along with a “reasonable expectation of due diligence, privacy and transparency”. This proposed amendment has been retained in its entirety in the latest version. During the consultation period, stakeholders had sought clarification about whether this “accessibility” only referred to access to services or also to access to services for the differently abled.

The latest version says that intermediaries also have to ensure their users comply with rules and regulations, privacy policies and user agreements – all three of which must be made accessible to users in English or any of the 22 languages specified in the eighth schedule to the constitution. 

Is there anything that’s unclear?

As with most legislation that has emerged from the IT ministry in the last few years, yes, there is a fair bit of ambiguity about certain aspects of the latest version of the amendments. 

For instance, the government can notify one or more GACs within three months of the notification of the rules. Can new GACs be constituted after this three-month period has lapsed? How will the GAC be held accountable, and by whom? 

Additionally, “intermediaries” covers an entire gamut of internet services – telecom service providers, cyber cafés, VPNs, domain registrars like GoDaddy, content delivery networks like Cloudflare, web hosts, cloud service providers, content management systems, payment processors, e-commerce platforms, and social media platforms. Will each kind of intermediary have its own GAC? This question was repeatedly raised during the consultation period too. 

If multiple GACs exist, will an intermediary that performs multiple functions – like Facebook being a social media platform and running Marketplace – be part of multiple GACs? And will it be a complainant’s responsibility to figure out which is the right GAC to approach?

So far, there’s no clarity. 

Update on Oct 28, 8.47 am: A previous version of the story said the latest amendments make it obligatory for intermediaries to ensure users cannot host information that is defamatory. This is incorrect and has been removed.



