Policy Briefing — India Issues Intermediary and Digital Media Rules
India’s 2021 Intermediary Guidelines and Digital Media Ethics Code overhauled due diligence, traceability, and grievance processes for social platforms, messaging services, and digital publishers, forcing sweeping governance, staffing, and compliance reporting changes within three months.
On 25 February 2021, India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, replacing the 2011 regime and reshaping how social media intermediaries, messaging services, news publishers, and over-the-top (OTT) streaming platforms operate. Issued under section 87 of the Information Technology Act, 2000, the rules tie safe-harbour protections to extensive due diligence obligations, introduce traceability mandates for encrypted messaging, and establish a three-tier compliance structure for digital media. Non-compliance exposes companies to loss of immunity and potential criminal liability under the IT Act and the Indian Penal Code.
Scope spans intermediaries and digital publishers
The rules classify intermediaries into two groups: general intermediaries (Rule 3) and significant social media intermediaries (SSMIs) under Rule 4, defined by user thresholds notified by the Ministry of Electronics and Information Technology (MeitY; currently five million registered users in India). They also cover publishers of news and current affairs content and publishers of online curated content (OTT platforms), each subject to dedicated code-of-practice obligations. All entities must publish rules, privacy policies, and user agreements that prohibit unlawful content, including material that threatens sovereignty and integrity, public order, or decency, or that infringes intellectual property rights.
Baseline due diligence for all intermediaries
Rule 3 requires intermediaries to inform users of prohibited content, remove or disable access within 36 hours upon receiving a court order or government notice, and preserve relevant records for at least 180 days. Intermediaries must acknowledge user grievances within 24 hours, resolve them within 15 days, and provide mechanisms to flag content. On receipt of actual knowledge about non-consensual intimate imagery, intermediaries must act within 24 hours. They must also, upon lawful request, furnish information for identity verification or investigation within 72 hours, while maintaining reasonable security practices to protect user data.
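Translated into operational terms, these timelines behave like hard SLAs. The sketch below is a minimal Python deadline calculator built only from the windows stated above; the obligation labels and the structure of the lookup table are our own, not taken from the rules.

```python
from datetime import datetime, timedelta

# Illustrative SLA windows drawn from Rule 3; the obligation keys are our own labels.
SLA_HOURS = {
    "takedown_on_lawful_order": 36,   # court order or authorised government notice
    "grievance_acknowledgement": 24,
    "grievance_resolution": 15 * 24,  # 15 days
    "ncii_removal": 24,               # non-consensual intimate imagery complaints
    "info_request_response": 72,      # identity verification / investigation requests
}
RECORD_RETENTION_DAYS = 180

def compliance_deadline(obligation: str, received_at: datetime) -> datetime:
    """Latest time by which the obligation must be discharged."""
    return received_at + timedelta(hours=SLA_HOURS[obligation])

def retention_expiry(removed_at: datetime) -> datetime:
    """Earliest date removed-content records may be purged, absent a longer lawful hold."""
    return removed_at + timedelta(days=RECORD_RETENTION_DAYS)

# Example: a notice received at 10:00 on 1 March must be actioned by 22:00 on 2 March.
print(compliance_deadline("takedown_on_lawful_order", datetime(2021, 3, 1, 10, 0)))
```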
Additional obligations for significant social media intermediaries
SSMIs must appoint three India-based officers: a Chief Compliance Officer (CCO) responsible for ensuring adherence to the Act and the rules, a nodal contact person available 24/7 for law-enforcement coordination, and a resident Grievance Officer who handles user complaints. SSMIs must publish the officers' contact details and monthly compliance reports detailing takedown actions, proactive monitoring, and grievance-redressal metrics. They must also deploy automated tools to identify child sexual abuse material, rape or rape-related content, and content already subject to takedown orders, while preserving audit trails to evidence proactive moderation.
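As an illustration of the reporting obligation, the sketch below models a monthly compliance report as a simple data structure. The field names are hypothetical, though the metric categories mirror what Rule 4(1)(d) expects: complaints received and actioned, and content removed through proactive monitoring.

```python
from dataclasses import dataclass, field

# Hypothetical schema for an SSMI's monthly compliance report; field names are ours.
@dataclass
class MonthlyComplianceReport:
    month: str                         # e.g. "2021-06"
    complaints_received: int
    complaints_actioned: int
    proactive_removals: int            # items disabled by automated tooling
    takedown_orders_received: int      # court or government directions
    takedown_orders_actioned: int
    grievance_categories: dict = field(default_factory=dict)  # e.g. {"impersonation": 12}

    def action_rate(self) -> float:
        """Share of user complaints actioned during the reporting month."""
        if self.complaints_received == 0:
            return 1.0
        return self.complaints_actioned / self.complaints_received
```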
Traceability and messaging platform requirements
Rule 4(2) compels significant social media intermediaries providing messaging services to enable identification of the first originator of information upon receipt of a lawful order related to specified offences (such as national security, sexual violence, or public order). Providers must retain metadata necessary for traceability while minimising data retention. The rule preserves message content privacy but obligates platforms to engineer system changes—potentially undermining end-to-end encryption—within three months of the rules taking effect. Providers must also retain the content of removed messages and associated records for 180 days or longer if required by authorities.
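One approach debated publicly is indexing a content fingerprint against the first account observed sending it, so that a lawful order can be answered without storing plaintext. The sketch below illustrates that idea only; it is not how any messaging platform actually implements Rule 4(2), the salt and identifiers are placeholders, and whether such a scheme can coexist with end-to-end encryption remains contested.

```python
import hashlib
from typing import Dict, Optional

# Hypothetical first-originator index: a salted hash of message content mapped to the
# first sender's internal account ID. Illustrative only; NOT any platform's real design.
PLATFORM_SALT = b"example-deployment-salt"
_first_originator: Dict[str, str] = {}

def message_fingerprint(plaintext: bytes) -> str:
    """Content fingerprint; the platform need not retain the plaintext itself."""
    return hashlib.sha256(PLATFORM_SALT + plaintext).hexdigest()

def record_if_first(plaintext: bytes, sender_id: str) -> None:
    """Remember only the first account observed sending a given piece of content."""
    _first_originator.setdefault(message_fingerprint(plaintext), sender_id)

def originator_for(plaintext: bytes) -> Optional[str]:
    """Lookup intended only on receipt of a lawful order for a specified offence."""
    return _first_originator.get(message_fingerprint(plaintext))
```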
Notice, takedown, and reinstatement workflow
SSMIs must notify users when removing or disabling access to their content, citing the specific rule violated, outlining the review mechanism, and providing appeal options. Interim removal is permitted for urgent risks, but final decisions require review by the Grievance Officer within 48 hours. Intermediaries must ensure transparency by publishing the number of proactive removals, government requests, and user complaints addressed each month. Records of removal decisions must be preserved for audits and potential litigation.
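The user notification can be modelled as a small record capturing the elements described above: the provision cited, the removal time, the appeal route, and the 48-hour review deadline. The sketch below is a hypothetical structure, not a format prescribed by the rules.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum

class TakedownState(Enum):
    REMOVED_INTERIM = "removed_interim"   # emergency disablement pending review
    UPHELD = "upheld"
    REINSTATED = "reinstated"

# Hypothetical user-notification record; field names are ours.
@dataclass
class TakedownNotice:
    content_id: str
    rule_cited: str                       # the specific provision relied on
    removed_at: datetime
    appeal_channel: str                   # how the user can seek review
    state: TakedownState = TakedownState.REMOVED_INTERIM

    @property
    def review_due_by(self) -> datetime:
        """Deadline for the Grievance Officer's review of an interim removal (48 hours)."""
        return self.removed_at + timedelta(hours=48)
```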
News and OTT publishers face a three-tier regulatory structure
Part III introduces a tiered self-regulation framework for digital news publishers and OTT services. Level I requires publishers to establish in-house grievance officers who resolve complaints within 15 days and issue apologies, content edits, or removal as needed. Level II creates self-regulating bodies led by retired judges or eminent persons that hear appeals, issue advisories, and ensure compliance with a detailed Code of Ethics (covering classification, age ratings, parental controls, and journalistic standards). Level III is an oversight mechanism under which the Ministry of Information and Broadcasting constitutes an Inter-Departmental Committee empowered to order apologies, content modification, age-gating, or blocking under the IT Act in cases of persistent non-compliance.
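The escalation path can be pictured as a three-step ladder. The sketch below is a simplified, illustrative model under our own shorthand; the tier names and the single escalation trigger (unresolved past the Level I deadline, or appealed) are deliberate simplifications of the actual procedure.

```python
from datetime import datetime, timedelta
from enum import Enum

# Illustrative model of the Part III escalation ladder; tier names are ours.
class Tier(Enum):
    PUBLISHER_GRIEVANCE_OFFICER = 1   # Level I: in-house redressal within 15 days
    SELF_REGULATING_BODY = 2          # Level II: appellate self-regulation
    OVERSIGHT_MECHANISM = 3           # Level III: Inter-Departmental Committee

LEVEL_ONE_WINDOW = timedelta(days=15)

def level_one_overdue(filed_at: datetime, now: datetime) -> bool:
    """True once the in-house 15-day resolution window has lapsed."""
    return now - filed_at > LEVEL_ONE_WINDOW

def escalate(current: Tier, unresolved_or_appealed: bool) -> Tier:
    """Move a complaint one level up when it is unresolved past deadline or appealed."""
    if unresolved_or_appealed and current is not Tier.OVERSIGHT_MECHANISM:
        return Tier(current.value + 1)
    return current
```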
Parental controls, content classification, and access restrictions
OTT platforms must classify content using age-based categories (U, U/A 7+, U/A 13+, U/A 16+, and A), apply content descriptors (e.g., language, violence, nudity), and offer parental locks for content rated U/A 13+ and above. Publishers must deploy reliable age-verification mechanisms for 'A'-rated titles and display the rating and descriptors prominently. They must also exercise due caution when depicting sexual violence or religious themes, aligning with India's Programme Code and other statutory norms.
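A minimal sketch of the resulting access checks, assuming a simple age-verification flag and a guardian-controlled parental lock (both hypothetical inputs rather than prescribed mechanisms), might look like this:

```python
from enum import IntEnum

# Age-rating ladder from the Code of Ethics; the gating logic is an illustrative sketch,
# not any platform's production access-control implementation.
class Rating(IntEnum):
    U = 0
    UA_7 = 7
    UA_13 = 13
    UA_16 = 16
    A = 18

def may_stream(rating: Rating, viewer_age: int, parental_lock_on: bool, age_verified: bool) -> bool:
    """Gate playback on viewer age, the parental lock, and age verification for 'A' titles."""
    if rating is Rating.A:
        # Adult titles require an age-verified adult viewer and no active parental lock.
        return viewer_age >= 18 and age_verified and not parental_lock_on
    if parental_lock_on:
        # With the parental lock engaged, enforce the U/A age threshold strictly.
        return viewer_age >= int(rating)
    return True
```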
Timeline and transition expectations
Most obligations took effect immediately upon notification, while SSMIs received a three-month grace period—until 25 May 2021—to appoint officers, enable traceability, and implement automated monitoring. Publishers had 30 days to constitute grievance redressal cells and 60 days to form self-regulatory bodies. Companies needed to update privacy policies, user terms, and community guidelines promptly to preserve safe-harbour protection.
Penalties and loss of safe harbour
Failure to observe due diligence strips intermediaries of section 79 safe-harbour immunity, exposing them to prosecution for user-generated content. Authorities may pursue criminal charges or monetary penalties, and courts can compel platform executives to appear in person. Persistent non-compliance could invite blocking orders under section 69A of the IT Act. Publishers face takedown directives, apology requirements, or blocking, while self-regulating bodies risk deregistration.
Governance actions for platforms and publishers
Platforms must establish cross-functional compliance offices, codify standard operating procedures for takedowns, track escalation timelines, and run incident-response drills for law-enforcement requests. Messaging providers should conduct encryption impact assessments, evaluate metadata-retention architectures, and document how traceability requests will be honoured without compromising user privacy. Publishers need to build compliance calendars for grievance deadlines, invest in age-rating workflows, and maintain audit logs that demonstrate adherence to the Code of Ethics.
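For the audit-log point in particular, a minimal append-only event logger such as the sketch below is often enough to evidence who acted, when, and under which reference; the file path, event types, and field names are placeholders, not a format prescribed by the rules.

```python
import json
from datetime import datetime, timezone

# Minimal append-only audit-trail sketch for takedown and law-enforcement events.
AUDIT_LOG_PATH = "compliance_audit.log"

def log_event(event_type: str, reference: str, actor: str, detail: str) -> None:
    """Append a timestamped JSON record that can later evidence adherence to the SOP."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,   # e.g. "takedown", "law_enforcement_request"
        "reference": reference,     # internal ticket or order number
        "actor": actor,             # officer or automated system that acted
        "detail": detail,
    }
    with open(AUDIT_LOG_PATH, "a", encoding="utf-8") as handle:
        handle.write(json.dumps(entry) + "\n")

# Example: record that the resident Grievance Officer upheld an interim removal.
log_event("takedown_review", "GRV-2021-0042", "grievance_officer", "interim removal upheld")
```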
Engagement with regulators and documentation
Entities should maintain ongoing dialogue with MeitY and the Ministry of Information and Broadcasting, participate in industry associations, and prepare periodic compliance reports. Boards should receive dashboards on grievance metrics, takedown timelines, law-enforcement queries, and self-regulation outcomes. Because judicial challenges to the rules continue, legal teams must track court developments while ensuring current obligations are met unless a stay is granted.
Next steps
Within the three-month compliance window, organisations should finalise officer appointments, train moderation teams, deploy user-notification templates, and update incident management systems. After implementation, they must schedule quarterly reviews to verify SLA adherence, refresh transparency reports, and audit data retention. Preparation for potential parliamentary review or future amendments will help sustain compliance as India refines its digital governance landscape.