What Due Diligence Must Intermediaries Follow?

Adv Rishabh Kumar

Published on: December 15, 2023 at 14:15 IST

Intermediaries, including Significant Social Media Intermediaries (SSMIs), were for a long time largely unregulated in India, and this regulatory gap harmed the Indian public through social-media-triggered hate crimes and other criminal offences across the country. While such regulation is needed to protect Indians' data from invasive social media companies, the same powers can also be misused by central authorities to unjustly curb freedom of speech.

For intermediaries, due diligence is crucial to ensure compliance with legal requirements and to mitigate potential risks. Intermediaries routinely facilitate communication, transactions, and the hosting of content online.

Intermediaries refer to entities or individuals that facilitate the transmission, storage, or hosting of content generated or provided by third parties. These intermediaries play a crucial role in the digital ecosystem by providing platforms, services, or infrastructure that enable users to create, share, and access information online. The term “intermediary” is often associated with legal concepts such as intermediary liability and safe harbor provisions. Common types of intermediaries include:

  • Social Media Intermediaries: Platforms or services that enable users to create, upload, share, and disseminate information in various forms, including text, images, and videos.
  • Internet Service Providers (ISPs): ISPs provide access to the internet and may offer services such as broadband, dial-up, or wireless connectivity. They act as intermediaries in transmitting data between users and the broader internet.
  • Hosting Service Providers: These providers offer storage space and server facilities to individuals or organizations for hosting websites, applications, or content. Examples include web hosting companies and cloud service providers.
  • Online Marketplaces: E-commerce platforms that connect buyers and sellers, facilitating transactions and the exchange of goods or services. Examples include Amazon, eBay, and Etsy.
  • Search Engines: Platforms that enable users to search for and access information on the internet. Search engines index and display search results based on user queries. Google, Bing, and Yahoo are examples of search engines.
  • Communication Platforms: Services that facilitate communication between users, such as email providers, messaging apps, and video conferencing platforms.
  • Payment Processors: Intermediaries that handle financial transactions between parties, often in the context of online commerce. Examples include PayPal, Stripe, and Square.
  • Domain Registrars: Entities that facilitate the registration and management of domain names on the internet.
  • Content Delivery Networks (CDNs): CDNs optimize the delivery of web content by distributing it across multiple servers geographically. They help improve website performance and reliability.

Intermediaries often benefit from legal protections known as safe harbor provisions or limitations of liability. These provisions shield intermediaries from being held strictly liable for the actions of their users, provided they meet certain conditions, such as promptly responding to infringement notices or taking down objectionable content.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 were issued by the Ministry of Electronics and Information Technology (MeitY). These rules regulate digital platforms, including social media intermediaries, digital news organisations, and over-the-top (OTT) streaming services. Key provisions include:

  • Appointment of Key Officers: Significant social media intermediaries are required to appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, all of whom must be residents of India. These officers are responsible for addressing user grievances and ensuring compliance with the rules.
  • Traceability of Messages: Significant social media intermediaries (platforms with more than 5 million registered users in India) providing messaging services are required to enable the identification of the first originator of information for the purpose of investigating and preventing certain serious offences.
  • Content Removal Timelines: Social media intermediaries are expected to remove or disable access to unlawful content within 36 hours of receiving a court order or government directive.
  • Self-Regulation and Oversight Mechanism: Social media intermediaries are encouraged to develop self-regulatory mechanisms to address concerns related to content moderation and user grievances.
  • User Consent: Platforms are required to provide users with the option to voluntarily verify their accounts, and they should inform users about the consequences of not doing so.

The due diligence requirements for intermediaries under the Information Technology (IT) Act in India, particularly as outlined in the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, include several key aspects.

These rules lay down specific obligations for intermediaries to follow in order to avail themselves of safe harbor protections and manage their operations responsibly. Here are some of the due diligence measures that intermediaries are expected to take:

Rules 3 and 4 of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 outline essential due diligence measures and establish a robust grievance redressal mechanism for social media intermediaries.

1. Publication of Rules and Regulations:

Social media intermediaries are mandated to publish their rules, regulations, privacy policies, and user agreements on their websites. This transparency ensures that users are informed about the platform’s guidelines, fostering responsible online behavior.

2. Prohibition of Harmful Content:

Intermediaries must inform users not to host, display, upload, transmit, or share content that is harmful to children, infringes intellectual property rights (IPR), is defamatory, obscene, violates any law, threatens national security, or contains malicious software. This proactive stance helps create a safer online space.

3. Online Safety and Dignity:

Ensuring the online safety and dignity of users, especially women, is a priority. Intermediaries are expected to implement measures that protect users from harassment, cyberbullying, and any form of online abuse.

4. Timely Complaint Resolution:

In response to user complaints about certain kinds of objectionable content, such as material that exposes a person's private areas or impersonates them, intermediaries are required to remove or disable access to the content within 24 hours. This swift action is vital in addressing concerns and maintaining a secure digital environment.

5. Compliance with Legal Orders:

Upon receiving court orders or notifications from the government, intermediaries must refrain from hosting or publishing any information deemed unlawful. This underscores the importance of aligning with legal requirements to uphold national interests.

6. Information Retention:

Intermediaries must retain user information for 180 days after a user cancels or withdraws their registration. This aids investigations and ensures compliance with data retention requirements.

7. Collaboration with Government Agencies:

Providing information to government agencies for identity verification and cybersecurity incident prevention is a key responsibility. Intermediaries serve as partners in maintaining national cybersecurity.

8. Adherence to Security Practices:

Following the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, intermediaries must implement reasonable security practices and procedures. This helps safeguard user data and privacy.

Recognizing the heightened responsibility of significant social media intermediaries, additional measures are prescribed:

a. Appointment of Key Officers:

Significant intermediaries must appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, all residing in India. These officers ensure compliance, coordinate with law enforcement, and address user grievances.

b. Monthly Compliance Report:

Publishing a monthly compliance report, including details of complaints received and actions taken, enhances transparency and accountability. Users gain insights into the platform’s commitment to addressing concerns.

c. Identification of First Originator:

For messaging services, significant social media intermediaries must be able to identify the first originator of information when lawfully required. This facilitates accountability, while the intermediary is not required to disclose the contents of any message in the process.

d. Voluntary Account Verification:

Intermediaries offering messaging services are encouraged to allow users to voluntarily verify their accounts. Verified users receive a visible mark, fostering trust and authenticity in online interactions.

The due diligence measures outlined in Rules 3 and 4 of the Intermediary Guidelines are pivotal in shaping a responsible and secure digital ecosystem. By adhering to these guidelines, intermediaries contribute to a safer online space, protect user rights, and support the broader goals of national cybersecurity. It is incumbent upon intermediaries to embrace these responsibilities, fostering a digital environment that promotes trust, transparency, and accountability.

In conclusion, the due diligence required by intermediaries, as outlined in The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, plays a pivotal role in shaping a responsible and accountable online ecosystem. By adhering to these guidelines, the platforms contribute to the promotion of a safer digital space while addressing the dynamic challenges posed by user-generated content.

Edited by: Bharti Verma, Associate editor at Law Insider
