Warning: the following post contains information relating to offensive subjects, including anti-Semitism, racism and sexualisation of minors. The post will not contain any imagery or descriptions of offensive content. All references will be linked for further reading.
Microsoft’s search engine Bing has been suggesting offensive content when users search for ordinary terms. The related topics section has been found to suggest searches containing racist terms, the sexualisation of minors and other offensive content.
This discovery was made by How-To Geek, with the SafeSearch option enabled. Many terms that would be deemed safe and appropriate suggest “related topics”, but with an underlying theme of pornography, bestiality and racism.
The problem also appears on Yahoo, which is powered by Bing. The Verge replicated some of the search queries and found that autocomplete continued to suggest offensive terms.
This isn’t the first time Microsoft has come under fire for AI technology producing offensive content. Back in 2016, Microsoft launched a Twitter account (which has since been deleted) for a chatbot named Tay, developed with the premise that “The more you chat with Tay, the smarter it gets, learning to engage people through casual and playful conversation.”
It took less than 24 hours for the internet to corrupt the bot. Eventually, it was tweeting racist and misogynistic comments that reflected the worst corners of the internet. Most of the information from Microsoft about Tay was on a website dedicated to the feature, which has since been taken down.
What does Microsoft need to do about Bing?
It needs to clean up its search engine. Bing is a major search engine, and with a recent Windows 10 update making it the default (and harder to change), more people are going to begin noticing and reporting offensive terms they did not want in their search suggestions.