YouTube will stop removing content that falsely claims the 2020 US presidential election was plagued by “fraud, errors or glitches,” the platform said Friday, a decision quickly criticised by anti-misinformation advocates.
The announcement by the Google-owned video website is a marked departure from its policy initiated in December 2020, which attempted to curb false claims – pushed by then-president Donald Trump – that his re-election loss to Joe Biden was due to the vote being “stolen”.
“The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society – especially in the midst of election season,” YouTube said in a blog post.
“We will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US presidential elections.”
The new policy goes into effect immediately.
YouTube’s misinformation policy, adopted after the 2020 election, led to the deletion of Mr Trump’s video of January 6, 2021, in which he repeated false claims of election fraud while telling protesters to leave the US Capitol.
YouTube’s new policy comes as tech platforms grapple with how to combat misinformation without impinging on free speech.
YouTube appeared to acknowledge that policing misinformation comes with downsides.
“Two years, tens of thousands of video removals, and one election cycle later, we recognised it was time to reevaluate the effects of this policy in today’s changed landscape,” the video-sharing giant said.
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”