
Microsoft’s Bing Search Engine Caught Serving Child Pornography, Assists in Finding More: Report


Bing brings up objectionable content in image search when ‘Safe Search’ is deactivated

Highlights

  • Bing even suggests keywords to help find more disturbing images
  • Microsoft claims to have removed the objectionable photos
  • Bing still brings up such content despite Microsoft’s claim

The distribution of objectionable content depicting child pornography and sexual abuse has become a hotly debated topic, with the likes of WhatsApp, Tumblr, and Telegram having recently attracted controversy over the proliferation of such material on their platforms. The latest name to join the despicable league is Microsoft's Bing search engine. Bing was caught serving images portraying child pornography in its search results, and it even suggested keywords that could be used to find more disturbing content whose possession or distribution is deemed a criminal offence.

The findings were reported by AntiToxin, an organisation that builds solutions to combat online abuse, after an anonymous tip received by TechCrunch highlighted how easy it was to find such images - which are banned across multiple platforms - on Bing. As per the report, Bing not only returned results for keywords as simple as 'porn kids' in its image search, but it also provided auto-complete search suggestions that in turn brought up a plethora of content depicting underage individuals in alarming scenarios. All such photos appeared in Bing's image search section when the 'Safe Search' filter was turned off.

And when an image is selected, Microsoft's search engine also presents recommendations in the 'similar images' section whose content and keywords fall in line with the original search query and the selected image's description. In its investigation, AntiToxin found that keywords like 'porn kids', 'nude family kids', and 'porn cp' surfaced an extensive gallery of images depicting underage nudity and minors engaged in sexual acts. But that's not all. When researchers searched for terms such as 'Omegle Kids' and 'Omegle for 12 years old', Bing's auto-complete suggested 'Omegle Kids Girls 13' and 'Kids On Omegle Showing' respectively, both of which again opened a sea of such content.

"Clearly these results were unacceptable under our standards and policies and we appreciate TechCrunch making us aware. We acted immediately to remove them, but we also want to prevent any other similar violations in the future," Jordi Ribas, Microsoft's corporate vice president of Bing and AI products, told TechCrunch after being informed about the findings. He added that Microsoft's team is now working to block such queries and curb the suggestions. At the time of writing, however, we found that Bing still brings up a tonne of images depicting child pornography for some of the keywords mentioned above.

The latest discovery indicates that Microsoft's search engine not only makes it a cakewalk for pedophiles to find contemptible content, but also helps them discover more of it through its recommendation and suggestion algorithms. Above all, it shows that Microsoft needs to put in additional effort and remain more vigilant to ensure that such disturbing content does not proliferate.

Nadeem Sarwar