YouTube is pushing back against claims its platform is helping promote and spread misinformation surrounding the 2020 US election, saying its most popular videos related to the election are from “authoritative” sources. YouTube also claims it takes measures to stop the spread of videos containing false or misleading claims by not surfacing them in search results or through its recommendation engine.
“Like other companies, we’re allowing these videos because discussion of election results & the process of counting votes is allowed on YT. These videos are not being surfaced or recommended in any prominent way,” YouTube wrote from its YouTubeInsider account in response to a tweet from Bloomberg journalist Mark Bergen, who criticized the company’s slow and inconsistent moderation of election content. “The most popular videos about the election are from authoritative news organizations. On average, 88 percent of the videos in top-10 results in the U.S. come from high-auth sources when people search for election-related content.”
YouTube did not disclose what it considers authoritative, nor did it break down what percentage of views of election content come from users typing phrases into the search box rather than following certain channels, seeking those channels out directly, or finding them via Facebook, Reddit, or other social networks. Even if its top-10 search results for election content are dominated by mainstream media sources, YouTube does not appear to be acknowledging how often users seek out videos from untrustworthy sources or arrive at them through other means.
YouTube has come under fire in the run-up to and after Election Day for allowing videos from organizations like One America News Network that falsely say President Donald Trump won reelection and that mass voter fraud is responsible for his loss to President-elect Joe Biden.
Unlike Facebook and Twitter, which have been aggressively labeling and removing links and posts that spread false information surrounding the election, YouTube says it allows people to discuss the outcome of the election and processes like vote counting, even if they do so in ways that spread unproven conspiracies or peddle false or misleading claims. YouTube claims it counteracts the spread of such content by limiting how discoverable these videos are using search and its recommendation engine.
YouTube permits videos of people repeating false claims about the election
However, YouTube appears to be struggling to contain the spread of its videos on other social networks like Facebook, where they often go viral before either company can slow them down.
In one example of how YouTube helps amplify misinformation, Vice reported on a false claim alleging that RealClearPolitics had rescinded its Pennsylvania call in favor of Biden. The claim was circulated by Trump lawyer Rudy Giuliani, and then the right-wing YouTube channel The Next News Network published a video repeating it. That video spread on Facebook mainly through links posted to private groups, which makes it hard for Facebook moderators to clamp down on its reach. All the while, The Next News Network is racking up views and even selling merchandise beneath the video, with YouTube taking a cut of the revenue, Vice reports.
YouTube says it pins an election information panel to the top of election-related searches, pointing users to a Google webpage with verified election results. It is also removing advertising from certain videos that undermine “confidence in elections with demonstrably false information,” according to The New York Times, but it is not removing the videos outright.