Safe harbour provisions under scrutiny as fake news surges on social media platforms

Safe Harbour provisions under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 currently shield social media intermediaries from liability for user-generated content, provided they exercise due diligence and follow a grievance redressal mechanism.

By Imran Fazal | Sep 12, 2025 4:05 PM

The Standing Committee on Communications and Information Technology has flagged serious concerns over the Safe Harbour provisions applicable to social media intermediaries in its draft report on fake news. The report provides a detailed analysis of how the existing regulatory framework has failed to keep pace with the growing menace of fake news, particularly on platforms like Facebook, Twitter (now X), Instagram, YouTube, and WhatsApp.

However, the report notes that the Safe Harbour framework has largely produced an accountability vacuum: intermediaries have been neither adequately proactive in preventing the spread of fake and misleading content nor sufficiently held responsible when they fail to do so.

The Ministry of Electronics and Information Technology (MeitY) emphasized that intermediaries are mandated to make reasonable efforts to ensure users do not post information that deceives or misleads about the origin of the message, or communicates false information, particularly in relation to government business. Yet, the efficacy of these provisions has been under intense scrutiny.

A landmark judgment by the Bombay High Court in September 2024 struck down Rule 3(1)(b)(v) of the IT Rules, 2021 (amended in 2023), which empowered a government-appointed authority to direct social media platforms to remove or block content deemed fake, false, or misleading. The High Court ruled that this violated constitutional guarantees under Articles 14, 19(1)(a), and 19(1)(g) of the Indian Constitution, making the rule ultra vires the IT Act, 2000.

In response, MeitY is in the process of filing a Special Leave Petition (SLP) in the Supreme Court, arguing that the establishment of a statutory Fact Check Unit (FCU) under the Press Information Bureau (PIB) is critical to prevent the widespread circulation of fake news concerning government policies and programs.

Since its inception in November 2019, the FCU under the Press Information Bureau has followed a four-step FACT model—Find, Assess, Create, Target—to counter misinformation related to government affairs. The FCU posts verified information and debunks false claims on its official social media handles, which have substantial follower counts across platforms (e.g., over 320,000 followers on X).

Despite these efforts, the FCU lacks statutory enforcement powers. It can raise awareness by identifying and flagging fake news, but it cannot mandate takedowns or penalize violators. The Ministry itself acknowledged that the current mechanism is reactive, relying largely on public complaints and internal monitoring rather than proactive removal of misleading content.

Gaps in Safe Harbour Regulations

Industry stakeholders and experts highlighted several loopholes in the Safe Harbour regime:

Absence of Designated Nodal Officers: There is no statutory requirement for intermediaries to appoint accountable officers responsible for action against fake news.

Arbitrary Time Frames: The current guidelines do not prescribe strict deadlines for the removal of content, leaving a wide scope for delay.

Foreign Origin of Content: Many fake news sources operate from jurisdictions such as Guatemala and Nigeria, making it almost impossible for Indian authorities to enforce action under the existing laws.

AI-Generated Content: Sophisticated deepfake videos and AI-generated misinformation complicate the task of distinguishing fact from fabrication.

The Editors Guild of India (EGI) recommended creating a mandatory system of Designated Nodal Officers within platforms to ensure that complaints and flagged content are addressed within six hours. They also stressed the need for licensing AI content generators under strict terms and conditions to prevent misuse.

A Call for a Collaborative, Multi-Stakeholder Approach

The draft report underlines that while the government has taken significant steps by establishing the FCU and issuing the IT Rules, these measures are insufficient. To be truly effective, the Safe Harbour provisions need to be reformed to include stringent accountability measures, clear time-bound obligations, and statutory powers for fact-checking units.

The Committee’s detailed observations signal an urgent call to action for both policymakers and social media intermediaries to collaboratively address this evolving threat. Without robust reforms, the unchecked spread of misinformation threatens not only public discourse but also the very fabric of India’s democracy.

The Committee strongly urged that curbing fake news should not fall solely on government agencies. It requires a broader roadmap involving:

Improved digital literacy as part of the formal education curriculum.

Active cooperation between MeitY, the Ministry of Information and Broadcasting (MIB), Department of Telecommunications (DoT), civil society, technology companies, and fact-checking organizations.

Greater public awareness campaigns to sensitize citizens about misinformation tactics.

Additionally, the Committee called for rethinking the definition of "Fake News" and integrating it clearly into regulatory frameworks for print, electronic, and digital media. They pointed out that the polysemic and vague use of the term “fake news” is prone to abuse, especially by those wishing to delegitimize the media or suppress dissent.

First Published on Sep 12, 2025 3:35 PM
