YouTube Says Viewers Are Spending Less Time Watching Conspiracy Videos but Many Still Do

YouTube said it was pushing users toward videos from more-reliable news sources, pointing to Fox News and Brazilian radio outfit Jovem Pan as examples.

Highlights
  • The announcement follows a change to YouTube's recommendation algorithm
  • However, YouTube didn't release the underlying figures
  • It also didn't say whether clicks on such videos have fallen

YouTube said Tuesday that its policies and enforcement have helped reduce the time viewers spend watching videos that advance conspiracies and other debunked theories, as the leading video site responded to criticism that it has failed to police such content. The Google-owned company said it had pared by 70 percent the average time US viewers spend watching videos it deems "borderline" content, such as those peddling miracle medical cures or flat-earth conspiracy theories. The announcement follows a change to YouTube's recommendation algorithm, announced in January, intended to limit how often its software recommends videos espousing fringe views.

But in a blog post Tuesday, the company didn't release the underlying figures, such as how much time viewers still spend watching the videos. It also didn't say whether it had reduced how often the videos are clicked on in the first place, or provide global figures.

Many viewers of such content subscribe to channels that regularly peddle it. Ivy Choi, a Google spokeswoman, declined to comment beyond the blog post.

"There will always be content on YouTube that brushes up against our policies, but doesn't quite cross the line," YouTube said in the blog post. "We've been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation."

As part of those efforts, YouTube said it was pushing users toward videos from more-reliable news sources, pointing to Fox News and the Brazilian radio outfit Jovem Pan as examples. The company said that for searches on ongoing news events such as Brexit, 93 percent of the top 10 recommended videos come from creators YouTube deems "high-authority." It didn't disclose the sample size behind that figure, how many people click on the top videos for a given search, or in what order they do so.

YouTube has historically given wide latitude to creators in the name of free speech, although it is legally permitted to prohibit whatever content it wishes. It does not permit hate speech but defines that narrowly as content that promotes violence or hatred of vulnerable groups.

In an interview on "60 Minutes" aired Sunday, YouTube chief executive Susan Wojcicki cited videos suggesting people should discriminate in hiring on the basis of race as an example of content warranting removal, but said videos that merely espouse racial superiority would be allowed.

Silicon Valley firms have been struggling with how to police their sites, particularly as the US presidential election heats up. Twitter, for example, has banned all political advertisements, while Facebook allows political content that may be false or misleading. YouTube hasn't taken a clear position on the issue, but Wojcicki said the company had removed some ads related to President Trump.

Congress could help untangle the thicket of varying policies by passing laws that require transparency from YouTube and other tech firms, said Jeffrey Chester, executive director of the Center for Digital Democracy, an electronic rights organisation. "You can't put the future of democracy into the hands of companies that are dependent on advertisers for their business," he said.

To help direct viewers to more-reliable information, YouTube said it has been showing users snippets of text news articles that it verifies as accurate, particularly following breaking news events, or displaying "information panels" that provide additional context. That type of information will appear to viewers watching videos pushing people to eschew vaccines, according to the blog post.

YouTube said it relies on a number of factors to determine reliability, including the amount of time a given video is watched, how many times it is clicked on, and its likes and dislikes. It also turns to about 10,000 contract workers around the world who review content, which helps train its software to automate the process.

Some of those workers have complained the company doesn't always listen to them when they flag content, saying YouTube applies a double standard.

© The Washington Post 2019
