How does a search engine like Google quantify, analyse, and rank information? What factors does it take into account, and how are they weighted? The algorithms that handle queries may be opaque, but the end results are clearly visible.
That’s the idea behind Search Atlas, a new tool developed by academics that aims to show how Google would display search results if a query were entered in different locales around the world. It’s an experimental interface for Google Search that returns three columns of results, rather than one, drawn from the more than 100 geographically localised versions of the search engine. So, for example, a search for Tiananmen Square may prioritise the infamous 1989 massacre of protesters there or directions for tourists; in the U.S., certain results may be removed due to Digital Millennium Copyright Act complaints; and in France and Germany, certain Holocaust denial sites may be blocked from results.
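The localisation the tool probes can be approximated even without Search Atlas itself: Google Search accepts documented `gl` (country) and `hl` (interface language) URL parameters that bias results toward a locale. A minimal sketch of building such locale-tuned queries, assuming only those two public parameters (actual ranking also depends on IP geolocation, the country-specific domain, and search history):

```python
# Sketch: constructing locale-biased Google Search URLs using the public
# `gl` (country code) and `hl` (interface language) query parameters.
# This only hints at localisation; Search Atlas itself is a separate,
# private-beta interface.
from urllib.parse import urlencode

def localized_search_url(query: str, country: str, language: str) -> str:
    """Return a Google Search URL nudged toward a given country/language."""
    params = {"q": query, "gl": country, "hl": language}
    return "https://www.google.com/search?" + urlencode(params)

# Compare how the same query might be framed in three locales.
for country, language in [("us", "en"), ("de", "de"), ("ke", "sw")]:
    print(localized_search_url("how to combat climate change", country, language))
```

Opening the resulting URLs side by side is a rough, manual version of the three-column comparison Search Atlas automates.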
Wired reports that the creators of Search Atlas first presented their results at the Designing Interactive Systems conference in June. The tool remains in private beta, but they have released a paper and other preview materials on the project’s website. It is already turning up interesting results: using Search Atlas to look for images of “God” turns up Christian imagery in the U.S. and Europe, images of the Buddha in Asia, and Arabic script in the Persian Gulf and North Africa.
In the UK and Singapore, a search for Tiananmen Square turned up images related to the massacre, while a search tuned to China (where Google has been blocked since 2010) turned up “recent, sunny images of the square, smattered with tourists,” according to Wired. Results for “how to combat climate change” emphasised policy solutions in Germany, while island nations like Mauritius and the Philippines received results emphasising the immediate, dire nature of the threat, such as the sea-level rise that stands to impact them disproportionately and much sooner.
Similarly, Wired wrote that queries on the war in Ethiopia’s Tigray region set to within the country turned up “Facebook pages and blogs that criticised Western diplomatic pressure to deescalate the conflict, suggesting that the US and others were trying to weaken Ethiopia,” whereas searches set to Kenya or the U.S. “more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.”
Search engines lock you into a "filter bubble" made of your search history, location, and language.
But what if you can see what you wouldn't otherwise see—by Googling the same thing around the world?
— Katherine Ye (@hypotext) July 2, 2021
Search engine results come from complex struggles between the state, corporations, and academia.
— Katherine Ye (@hypotext) July 2, 2021
MIT science, technology, and society Ph.D. student and Search Atlas creator Rodrigo Ochigame told Wired that the project aims to dispel the persistent notion that search engines like Google are neutral arbiters of information: “Any attempt to quantify relevance necessarily encodes moral and political priorities.”
Project co-creator Katherine Ye, a computer science Ph.D. student at Carnegie Mellon University and research fellow at the Centre for Arts, Design, and Social Research nonprofit, told Wired that “People ask search engines things they would never ask a person, and the things they happen to see in Google’s results can change their lives. It could be ‘How do I get an abortion?’, restaurants near you, or how you vote, or get a vaccine.”
For example, Ye tweeted that Google results for “Crimean annexation” were framed in Russia around the impact on the Russian Federation, in Ukraine around “occupation,” and in the Netherlands around European Union sanctions on Russia.
These disparate results aren’t necessarily evidence of any intent to suppress information; they stem from factors like Google’s efforts to localise results for people in specific geographic regions, commercial interests, local laws, and what Ochigame and Ye told Wired are “information borders” that create “partial perspectives.” These supposedly apolitical adjustments nonetheless inevitably bleed over into politics. While the difference in results for Tiananmen Square appears to reflect the Chinese government’s desire to cover up the incident, a Google spokesperson told Wired that the search engine turns up the tourist-friendly images when it infers an intent to travel. The differences in searches for “God,” the spokesperson told the site, were due to the way the term is translated into different languages.
The end result is a partial slice of reality predicated upon Google’s assumptions about the world and influenced by a desire to maximise revenue, according to the researchers.
“Even the earliest studies, based on anecdotal observations, already suggested that search engines systematically suppress some sites in favour of others, in line with financial interests,” the researchers wrote in the paper. “More recent studies have argued that commercial search engines deploy algorithms that reinforce existing social structures, particularly racist and sexist patterns of exposure, invisibility, and marginalization. Thus, it is vital to expose the partial perspective of search engines.”