YouTube loosens content moderation rules, citing 'public interest'

YouTube now instructs moderators not to remove content unless more than half of it violates the platform’s policies.

By Storyboard18 | Jun 12, 2025 10:20 AM

YouTube has relaxed its content moderation rules, allowing videos that partially violate its policies to remain online when they are deemed to be in the public interest, The New York Times reported. The change, introduced internally in December 2024, marks a shift in how the platform balances harm reduction against freedom of expression, especially in sensitive areas such as politics, health, and social issues.

According to training materials reviewed by the Times, YouTube now instructs moderators not to remove content unless more than half of it violates the platform’s policies. Previously, the threshold was set at 25%. The new approach is said to apply to videos addressing topics such as elections, ideologies, gender, sexuality, immigration, and race.

YouTube is also encouraging moderators to consider whether a video's "freedom of expression value may outweigh harm risk." If so, they are told to escalate the decision rather than delete the video outright. The company says this guidance is part of its longstanding exception for educational, documentary, scientific, and artistic content, known internally as the EDSA framework.

“We regularly update our Community Guidelines to adapt to the content we see on YouTube,” spokesperson Nicole Bell told The Verge. She emphasized that these exceptions only apply to a small percentage of videos and help prevent overly broad enforcement. “This practice allows us to prevent, for example, an hours-long news podcast from being removed for showing one short clip of violence,” she said.

The shift builds on YouTube's earlier decision, made ahead of the 2024 U.S. elections, to allow content from political candidates to stay up even when it technically violates platform rules, as long as it is considered valuable for public understanding.

The move follows a wider trend of major social media platforms easing moderation. Meta, for instance, has rolled back some of its own policies on hate speech and misinformation this year, ending third-party fact-checking in favor of community-driven corrections similar to Community Notes on X (formerly Twitter).

YouTube had previously cracked down hard on misinformation during Donald Trump's presidency and the Covid-19 pandemic, removing false claims about vaccines and election fraud. But the latest shift suggests a new phase of hands-off moderation, especially for politically sensitive content, as platforms face increased scrutiny over censorship and bias.

First Published on Jun 12, 2025 8:50 AM
