Making sure such content doesn't bubble up in response to search queries is a long-term effort, Google notes, but it is making some basic changes to how Search works today that should help expedite that result. "Last month, we updated our Search Quality Rater Guidelines to provide more detailed examples of low-quality webpages for raters to appropriately flag, which can include misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories," the company said. The feedback menus will let users notify Google if anything that appears in the search bar via Autocomplete is inappropriate.
Additionally, Google is launching direct feedback tools that will allow users to flag erroneous Featured Snippets and Autocomplete predictions. Just as editors at traditional media outlets have to curate content and separate fact from fiction, Google has to do the same on a massive scale for all the stuff published to the web.
The firm has been criticised in recent months over "low-quality content" on its platform, including a high-profile incident in which a page denying the events of the Holocaust appeared at the top of search results on the subject. "As is often the case when Google announces changes, this couldn't be more vague", said search engine expert Joost de Valk of consultancy firm Yoast.
Google was criticised last year for giving prominence to groups seeking to deny that the Holocaust took place. The new "This is misleading or inaccurate" feedback option seems apt for the John-Hanson-as-first-Black-president snippet.
Google announced on Tuesday that it is tweaking its search engine to weed out misleading or false content, a major move for the company and the world, considering Google's dominance in search. The changes will, the company said, "surface more authoritative pages and demote low-quality content".
"The raters don't rank results", said Mr Sullivan.
Both Google and Facebook have taken steps in recent months to curb the spread of hoaxes and misinformation amid concerns these may have influenced voters in the 2016 U.S. election.
Gomes said: "Today we're taking the next step toward continuing to surface more high-quality content from the web". Now, quality raters can use more detailed tags meant to target pages that intentionally produce misleading information.
To address the problem, Google began revising the closely guarded algorithms that generate its search results, with the help of 10,000 people who rate the quality and reliability of the recommendations during tests.