YouTube: No ‘deepfakes’ or ‘birther’ videos in 2020 election

The Google-owned service said it will take down videos that are technically altered to mislead people.

YouTube logo reflected in a person's eye [File: Dado Ruvic/Reuters]

YouTube is making clear there will be no “birtherism” on its platform during this year’s United States presidential election – a belated response to a type of conspiracy theory that rose to prominence more than eight years ago.

The Google-owned video service is also reiterating that it will not allow election-related “deepfake” videos and anything that aims to mislead viewers about voting procedures and how to participate in the 2020 US Census.

YouTube clarified its rules ahead of the Iowa caucuses Monday. The company is mostly reiterating content guidelines that it has been putting in place since the last presidential election in 2016.

YouTube’s ban on technically manipulated videos of political figures was made apparent last year when it became the first major platform to remove a doctored video of US House Speaker Nancy Pelosi. But the announcement on Monday further clarifies that it will take down any election-related videos that are technically altered to mislead people in a way that goes beyond simply taking clips of speech out of context.

The company also said it would remove doctored videos that could cause “serious risk of egregious harm” – such as to make it appear that a government official is dead.

Facebook, which last year had resisted early calls to yank the Pelosi video, said in January that it was banning “deepfake” videos, the false but realistic clips created with artificial intelligence and sophisticated tools. Such videos are still fairly rare compared with simpler “cheap fake” manipulations such as those used in the video that altered Pelosi’s speech to make it seem like she was slurring her words.


Google also said Monday that it will remove any videos that advance false claims about whether political candidates and elected officials are eligible to serve in office. That had been policy before, but was not made explicit.

The company’s announcement comes about nine years after celebrity businessman Donald Trump began drawing attention for claiming that Barack Obama, the nation’s first African American president, was not born in the US.

Trump repeatedly voiced citizenship doubts even after Obama produced his long-form birth certificate.

Trump only fully backed off from the idea in the final stages of his 2016 presidential campaign.

YouTube said it will also crack down on any attempts to artificially inflate the number of views, likes and comments on videos. Last year, the company changed its video recommendation systems in a push to curb harmful misinformation. Twitter and Pinterest also outlined their efforts last week to reduce election misinformation on their platforms.

Source: AP