India has the second-highest number of internet users in the world and among the highest levels of digital consumption, making it a lucrative market and a breeding ground for social media platforms and content providers alike. In view thereof, the need for and importance of a robust digital content-regulating legislation cannot be overstated. Enacted with the intention of providing the necessary legal framework, the Information Technology Act, 2000 (“IT Act”) and the corresponding rules have had to walk a tightrope. Between protecting the delicate rights of the “digital nagriks”, such as their freedom of speech, right to reputation and copyright in their content, on one hand, and conditionally safeguarding the digital platforms from liability for “questionable” content posted/uploaded by third-party “nagriks”, on the other, this legislation has been put to the test on several occasions. In fact, right after being notified by the Ministry of Electronics and Information Technology (“MeitY”) in May 2021, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules”) have been mired in litigation and controversy. Recently, on June 6, 2022, the MeitY re-published draft amendments to the IT Rules for public consultation and stakeholders’ comments, after initially publishing and then withdrawing them on account of an ‘editorial’ glitch. The public note accompanying the draft amendments, which evidently sets its sights on the “Big Tech Platforms” by questioning their accountability, is particularly interesting to note:
Putting the Interests of Digital Indians First
Proposed amended IT rules to provide additional avenues for grievance redressal apart from Courts and also ensure that the Constitutional rights of Indian citizens are not contravened by any Big-tech Platform by ensuring new accountability standards for SSMIs. …
as the digital eco-system and connected Internet users in India expand, so do the challenges and problems faced by them, as well as some of the infirmities and gaps that exist in the current rule vis-a-vis Big Tech platform. Therefore, new amendments have been proposed to the IT Rules 2021, to address these challenges and gaps.
While the amendments proposed in the draft do not, on their face, appear to be an attack on the Big Tech platforms (such as Twitter, Meta’s Facebook, Instagram and Google’s YouTube, to name a few), their actual effect poses a challenge to the platforms’ existing autonomy, and the compliance requirements now expected of them are nothing short of onerous, making ‘due diligence’ a behemoth task. We have analyzed the four crucial amendments proposed by the MeitY and set out our comments on the same hereinbelow:
1. Requiring intermediaries to “ensure” that users comply with requirements in rule 3(1)(a) and rule 3(1)(b) of the IT Rules 2021
This amendment, which requires intermediaries to “ensure” users’ compliance with the platform’s rules/policies and to “cause” users not to act in contravention of the provisions of the IT Act and Rules, is vague and worrisome for the following reasons:
ii. ABOVE AND BEYOND THE EXISTING DUE DILIGENCE REQUIREMENTS: The various decisions on the ‘due diligence’ requirement of intermediaries have settled the manner and extent to which platforms/service providers have to act in order to be safeguarded from liability for user (third-party) generated content. The due diligence requirement is limited to publishing policies, terms and conditions, user agreements etc., informing users that they are required to comply with rule 3(1)(b). It has been repeatedly held that intermediaries cannot be asked to screen or regulate content, as that would fall outside the ambit of the “intermediary status” laid down in the IT Act.4 Further, it is well established that infringing/objectionable content can only be removed upon the intermediary “receiving actual knowledge” of such content, by way of an order or direction from a court or a government agency, as laid down in the Shreya Singhal case5, and/or knowledge based on specific information (such as the specific URL/s) given by the person whose work is infringed by the uploaded content, as established in the MySpace case. If an intermediary fails to remove or disable access to unlawful content despite such actual knowledge, it shall be in breach of the due diligence threshold and would fall outside the ambit of the ‘safe harbour’ provided to an intermediary. While general mechanisms for identifying infringing content, such as the Rights Management Tool, the Notice and Take Down provision, the Take Down Stay Down tool and the Hash Block Filter, have been found to be effective for complying with the due diligence requirement of social media platforms (MySpace case), none of these existing tools appears technologically sufficient to “ensure” compliance with the platforms’ policies/terms/rules in the manner proposed in the amendment.
This ‘due diligence +’ requirement is not only onerous but almost impossible to meet, considering that the Big Tech platforms handle a humongous amount of data/content on a daily basis.
2. Addition of rule 3(1)(m) and 3(1)(n) to respect the principles of the Constitution of India
With the laudable intent of protecting and safeguarding the rights guaranteed under the Constitution, the MeitY has proposed an addition to the ‘due diligence’ requirements whereby:
First, the intermediary shall take all reasonable measures to ensure accessibility of its services to users, along with a reasonable expectation of due diligence, privacy and transparency;
Second, the intermediary shall respect the rights accorded to citizens under the Constitution of India.
It would not be an exaggeration to say that the devil lies in the lack of details. The proposed amendment neither defines key terms such as “accessibility”, “privacy” and “transparency”, nor elucidates what taking ‘reasonable measures’ or ‘respecting the rights’ would practically entail, leaving this addition open to interpretation, confusion and inevitable litigation (should the amendment come into force as is). It is again important to flag that any regulation, sorting or control of content/information by the platforms would strip them of the intermediary status that offers safeguards under the IT Act.
3. Changes in the grievance redressal mechanism of the intermediary under rule 3(2)
The existing mechanism entails acknowledgement of a complaint by the Grievance Officer within 24 hours, followed by its disposal within 15 days of receipt. The amendment clarifies that such complaints include requests for action where a user or user account is to be suspended, removed or blocked, or where an information or communication link is sought to be suspended.
The amendment further proposes that where the nature of the complaint requires an information or communication link to be removed, it shall be acted upon and redressed within 72 hours of reporting, with ‘appropriate safeguards’ ‘to avoid any misuse by users’. While the objective of the timeline appears to be preventing the proliferation (i.e. going viral) of objectionable/unlawful content, there are several practical issues that the Big Tech platforms may face:
For starters, it has been acknowledged time and again that Big Tech platforms deal with massive amounts of data daily and, consequently, receive several complaints each day. A redressal timeline of 72 hours may therefore be quite onerous. A further question arises: would a delay in meeting the said timeline (by, say, a few hours) strip the platform of its intermediary status?
Secondly, requiring take-down of content in such a short time may push platforms to err on the side of caution and remove more content, thereby stifling the very free speech that the amendments were proposed to safeguard in the first place.
Lastly, the “appropriate safeguards” that intermediaries are expected to put in place to avoid misuse of the proposed mechanism by users need further clarification. Most intermediaries would not have the wherewithal to set up such complex technological measures/tools in the first place. This, coupled with the lack of clarity on what is considered “appropriate” or “sufficient”, makes the entire task an uphill one for the platforms/service providers.
4. Creating a new Grievance Appellate Committee to provide an appeal mechanism to users:
The constitution of the Grievance Appellate Committee – which allows persons aggrieved by a decision of the intermediary’s Grievance Redressal Officer to approach it in appeal instead of moving a court of law – is not problematic per se. It provides an alternative forum for appeals without taking away the right to approach the courts.
What is disconcerting, however, is that the (draft) rules provide no clarity regarding the constitution of the Committee, the scope of its jurisdiction and powers, the nature of its proceedings, the procedure it would follow, the binding nature of its orders/directions, etc. Most importantly, it is unclear whether the intermediary shall also be given an opportunity to present its case in an appeal before the Committee. If not, how shall the intermediary be made to comply with the Committee’s orders/directions? How would the principles of natural justice be upheld without giving a necessary party an opportunity to be heard? And would this not result in a waste of time and resources, invariably leading to appeals before the courts and putting the very purpose of the Committee’s constitution into question?
To sum up, the draft amendments, as they stand today, have understandably caused an upheaval among intermediaries and relevant stakeholders, and are bound to face resistance from the industry. The shape in which the IT Rules, 2021 will finally be notified as amended, after due consultations, remains to be seen; in their present form, however, the proposed amendments are set to upturn all the developments that intermediary law has seen thus far and are bound to result in a multitude of litigations over their validity.