YouTube Details How It Will Tackle Misleading Election Content

YouTube said it would remove any technically manipulated or doctored content that may pose a "serious risk of egregious harm."



Highlights
  • YouTube detailed how it will tackle false election-related content
  • It would remove any content that has been technically manipulated
  • It does not allow content that aims to mislead people about voting

On the day of the Iowa caucuses, the first nominating contest of the US presidential election, Alphabet's YouTube detailed how it will tackle false or misleading election-related content.

The video-streaming service said in a blog post on Monday that it would remove any content that has been technically manipulated or doctored and may pose a "serious risk of egregious harm."

It also said it does not allow content that aims to mislead people about voting, for instance telling viewers an incorrect voting date, or content that makes false claims related to a candidate's eligibility to run for office.

The blog post also said YouTube would terminate channels which impersonate another person or channel, misrepresent their country of origin or conceal their links with a "government actor."

Social media companies are under pressure to police misinformation on their platforms ahead of the November election.

In January, Facebook said it would remove "deepfakes" and other manipulated videos from its platform, although it told Reuters that a doctored video of US House Speaker Nancy Pelosi, which went viral last year, would not meet the policy requirements to be taken down.

Major online platforms have also been scrutinized over their political ad policies. In November, Google, which is also owned by Alphabet, announced it would stop giving advertisers the ability to target election ads using data such as public voter records and general political affiliations.

It now limits audience targeting for election ads to age, gender, and general location at a postal-code level. Political advertisers can still use contextual targeting, such as serving ads to people reading about a certain topic.

Google and YouTube also have policies prohibiting certain types of misrepresentation in ads. However, when former Vice President Joe Biden's campaign asked Google to take down a Trump campaign ad that it said contained false claims, a company spokeswoman told Reuters it did not violate the site's policies.

Twitter has banned political ads, including those that reference a political candidate, party, election, or legislation, in a push to ensure transparency, while Facebook has announced only limited changes to its political ad policy.

Facebook, which has drawn criticism for exempting politicians' ads from fact-checking, said it does not want to stifle political speech.

© Thomson Reuters 2020





© Copyright Red Pixels Ventures Limited 2020. All rights reserved.