Google tweaks search results to squash fake news


Google will also allow users to make complaints about its "featured snippets" - Google's name for the boxed-out answers that appear at the top of searches for common queries - which have been accused of spreading fake news, such as a claim that Barack Obama was planning a coup in 2016.

User feedback also refines the process: users can now flag offensive or misleading autocomplete suggestions, which are, of course, shaped by real human search patterns. And that perception of spreading misinformation is a problem for Google.

Human evaluators use the guidelines to rate the quality of search results. Among other things, Google's search engine had pointed to websites that incorrectly reported that then President-elect Donald Trump had won the popular vote in the US election, that President Barack Obama was planning a coup, and that the Holocaust never occurred during World War II. Now Google is changing the way search works, empowering users with new feedback tools and adjusting its search ranking to surface more relevant and accurate information.

As part of this effort, Google is also getting more transparent about its products.

"Today, in a world where tens of thousands of pages are coming online every minute of every day, there are new ways that people try to game the system." In that environment, Google's challenge is to guard against abuse of the new feedback buttons.

"The raters don't rank results," said Mr Sullivan. "We don't expect the problem will completely disappear."


"That feedback is then used to reshape the algorithms - the recipes, if you will - that Google uses." The guidelines will now include options for identifying misleading information, unexpectedly offensive results, hoaxes, and "unsupported conspiracy theories".


Users can report suggestions for being hateful, explicit or violent.

In particular, he said, many groups and organisations were using "fake news" to help spread "blatantly misleading, low quality, offensive or downright false information".

As Mr. Gomes points out in his blog post, it's both a strength and a limitation of its service that Google doesn't create its own content. "The content that appears in these features is generated algorithmically and is a reflection of what people are searching for and what's available on the web", Gomes said.

Those results can trigger a public-relations backlash, as they did in December when a white supremacist site was featured prominently in search results about the Holocaust.
