YouTube relaxed moderation rules without public announcement

YouTube changed its approach to content moderation in December 2024 without making an official announcement. This was reported by The New York Times, citing internal documents and training materials from the platform.
Under the new guidelines, YouTube moderators may leave up videos that violate the platform’s rules – such as prohibitions on profanity, graphic violence, hate speech and misinformation – if the video is deemed socially relevant. Previously, a video could stay up only if no more than 25 percent of it consisted of violating content; that threshold has been raised to 50 percent, allowing more videos to remain on the platform.
Social relevance as a basis for exceptions
According to The New York Times, YouTube treats content related to elections, ideologies, movements, race, gender, sexuality, abortion, immigration, censorship and other similar topics as socially relevant. The changes were rolled out through new training materials distributed to moderators in mid-December.
One example is a 43-minute video about Donald Trump’s cabinet appointments, which was not removed because it falls under the new interpretation of relevance. Also left up was a South Korean video that mentioned executing a politician by guillotine in a hypothetical scenario; YouTube explained that “execution by guillotine is not possible” and therefore the content did not violate the rules.
Decreased transparency amid political pressure
The new rules came into effect shortly before Donald Trump’s return to the US presidency. At the same time, other tech giants have also loosened controls: X (formerly Twitter) scrapped traditional moderation after being bought by Elon Musk, replacing it with the Community Notes system.
In contrast to X, YouTube has not publicly announced the changes. However, a YouTube spokesperson confirmed that 192,856 videos were removed from the platform in the first quarter of 2025 – up 22% from a year earlier.
YouTube clarifies position
“We regularly update our Community Guidelines to adapt to the content that appears on the platform. For example, this year we removed the remaining COVID-19 restrictions and strengthened protections against gambling-related content. The New York Times story touches on a separate aspect of our work – our long-standing practice of making exceptions for content that has public-interest value or an educational, documentary, scientific or artistic (EDSA) context. These exceptions apply to a small number of videos, but they allow us to preserve important content – for example, not removing an hour-long news program because of one short violent segment.”
YouTube recommends consulting its Community Guidelines Enforcement Report for a full overview of the updated moderation approach.