Google says a very small percentage of its search results are personalized, a claim that has helped insulate the company from scrutiny over filter bubbles, especially compared with Facebook and YouTube, a Google subsidiary. But a new study from DuckDuckGo, a Google rival, found that users saw very different results when searching for terms such as “gun control,” “immigration,” and “vaccinations,” even after controlling for time and location. One participant saw a National Rifle Association video at the top of the results page for “gun control”; another saw Wikipedia at the top; a third got the NRA video but no Wikipedia result anywhere in the first 10 links.
The study also found that individual users saw roughly the same results whether they were logged in to Google, logged out, or searching in private browsing, also known as incognito mode. If private browsing on Google were truly anonymous, the study’s authors contend, everyone searching in that mode should have seen the same results; instead, the variation between users persisted.
DuckDuckGo’s conclusions are far from scientific. Only 87 individuals participated in the test. Each responded to a tweet from DuckDuckGo and sent screenshots of their results. Regardless of why results differed, variation in search results for political topics—particularly during an election year—underscores how users have little visibility into Google’s algorithms and don’t know whether, or how, the information they see is being filtered.
DuckDuckGo CEO Gabe Weinberg told WIRED that the study took place before President Trump and other Republicans criticized Google for alleged anti-conservative bias, an unsubstantiated and self-serving claim. Weinberg does not believe that Google is altering search results because of political bias. Rather, he says the goal of the study is to draw attention to Google’s overall political influence, whether it is intentional or not. “I think search results are politically biased just by the nature of tailoring them to your past history,” Weinberg says.
Google says that if a user is logged out or searching in incognito mode, it does not personalize results based on a user’s signed-in search history and does not use personal data. In those two modes, however, the results may be contextualized based on the session in that browser window. Google also shared a number of reasons that individuals who perform the same search query may see different results, including timing (for rapidly evolving news topics, it can vary by the second), the location of Google’s data centers, and localization of query results.
In September, the company told WIRED that only 2 to 2.5 percent of searches typed into the search box return meaningfully personalized results; Google says this happens most often when a search is ambiguous, such as searching for Barcelona as a city or as a soccer team. The implication was that users should not worry that it is creating filter bubbles, which can entrench partisan divisiveness and skew access to information. For instance, if a search algorithm reflects personal preferences, a liberal user might see more results about gun reform while a conservative user might see more results about gun rights. (In a similar vein, YouTube’s recommendations algorithm, which rewards engagement, has been known to serve increasingly extreme content to keep people watching, unintentionally radicalizing users in the process.)
Not long after Trump tweeted in August that Google’s results were “rigged” against conservatives, the company briefed reporters on changes to its search algorithm that began in December 2016—around the time Google received bad press over misinformation, hate speech, and other problematic content in its search results. A visual aid showed the before and after results for a search on “Did the Holocaust happen.” Prior to the changes, the first result came from stormfront.org, a hub for neo-Nazis. In a search on Monday, the top result was from the US Holocaust Memorial Museum.
Tuesday’s study is a follow-up to a similar test that DuckDuckGo conducted in 2012 looking at Google search results for Obama and Romney. The Wall Street Journal performed its own independent version of the study and found that Google often customized results for users who had recently searched for “Obama,” but not for users who had recently searched for “Romney.” At the time, Google told the paper the discrepancy arose simply because more people followed a search for Obama’s name with searches on related topics, such as Iran, than did so after searching for Romney’s name. “The findings are among the latest examples of how mathematical formulas, rather than human judgments, influence more of the information that people encounter online,” the Journal wrote.