Facebook Investigating Why Search Box Showed Sexual Video Recommendations

Highlights
  • Some users said they received sexually graphic search recommendations
  • Facebook's search predictions are designed to reflect popular searches
  • Facebook has apologised and said it is looking into the incidents
Facebook users said Thursday night that they were confronted with disturbing and sexually graphic search recommendations on the social network, leading the company to apologise without offering a full explanation.

After typing "video of" into the Facebook search bar, some users said the tool suggested obscene terms that included sex acts and child abuse. Facebook's search predictions are designed to reflect popular searches, but it's unclear why the offensive terms appeared.

"We're very sorry this happened." Facebook said in statement Friday. "As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform."

The company does not permit sexually explicit imagery on its platform. Facebook indicated that it doesn't know why those search terms were suggested to users. The company said it's looking into the matter and working to improve the search feature.

Just last week, Facebook faced an outcry after the Guardian reported that it ran a survey asking users whether they thought men should be allowed to ask children for sexual pictures through the site. Facebook said the survey was a mistake.

The video search episode is the latest example of autocomplete features drawing criticism. Tech companies, including Facebook and Google, designed them to provide a shortcut for people wanting to look something up. They are based in part on previous searches by users. While tech companies use search predictions to help their users, the features can also elevate repugnant behaviour, highlighting the sometimes toxic consequences of automation and reflexive decision-making.

In 2016 Google, the world's dominant search engine, had to change its autocomplete search tool when users received offensive results after looking up information about women and Jewish people. And last year, Google surfaced unsubstantiated claims in suggested search terms about the shooter behind the mass killing in Sutherland Springs, Texas.

Facebook did not immediately respond to questions about its search prediction feature and what information the suggestions are based on.

© The Washington Post 2018
