Internet searches may be based on algorithmic rigour -- but that doesn't necessarily guarantee the quality of a suggested page. Now, a team of Google researchers has developed a method that sorts results by factual content rather than how well linked a page is.
While Google's search algorithms are complex, they still use the number of incoming links to a web page as the main arbiter of quality. The more a page is linked to, the better it's assumed to be and the higher it appears in your results. It's a technique pleasing in its (seeming) simplicity. But if a lot of people link to an awful page, it can find itself at the top of the pile -- even though it's not really useful.
A team of Google researchers has developed a new way to rank pages, called the Knowledge-Based Trust score. The system's not live -- or, indeed, likely to be, at least for a little while -- but it is intriguing. Essentially, it ranks results by counting the number of incorrect facts within a page, reports New Scientist.
It does that by looking up a page's content and comparing it to Google's Knowledge Vault, a store of facts pulled from the internet. The Vault is a pool of facts that are broadly agreed on online, which the researchers treat as a proxy for truth. If web pages contain information that contradicts the Vault, they slide down the rankings.
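To make the idea concrete, here's a toy sketch of that kind of scoring -- purely illustrative, not Google's actual algorithm. It assumes a page's content has already been extracted into (subject, predicate, object) triples, and scores the page by the fraction of its checkable facts that agree with a reference store (the `FACT_STORE` dictionary standing in for the Knowledge Vault):

```python
# Stand-in for the Knowledge Vault: (subject, predicate) -> accepted object.
FACT_STORE = {
    ("Barack Obama", "born_in"): "Hawaii",
    ("Eiffel Tower", "located_in"): "Paris",
    ("Water", "chemical_formula"): "H2O",
}

def trust_score(page_facts):
    """Fraction of a page's checkable facts that match the fact store.

    Facts the store knows nothing about are ignored; returns None if
    the page contains no checkable facts at all.
    """
    checkable = [(s, p, o) for (s, p, o) in page_facts
                 if (s, p) in FACT_STORE]
    if not checkable:
        return None  # nothing to judge the page on
    correct = sum(1 for (s, p, o) in checkable
                  if FACT_STORE[(s, p)] == o)
    return correct / len(checkable)

page = [
    ("Barack Obama", "born_in", "Hawaii"),    # agrees with the store
    ("Eiffel Tower", "located_in", "Rome"),   # contradicts the store
    ("Water", "chemical_formula", "H2O"),     # agrees with the store
]
print(trust_score(page))  # 2 of 3 checkable facts agree -> 0.666...
```

A real system would extract those triples from free text and weight facts by how confident the extractor is, but the core idea -- demote pages whose facts contradict the reference store -- is the same.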
It's not the first algorithm to try to judge truth online -- there's a browser extension called LazyTruth that tries to identify hoax emails, for instance -- but applying the idea to search is a new and interesting concept. Perhaps the biggest question mark, though, is the accuracy of the Vault used to look up what's right and what's not. How do you feel about having results served to you based on the Truth According to Google? [arXiv via New Scientist]