By Seth Lewis and Efrat Nechushtai for The Conversation
Google News does not deliver different news to users based on their position on the political spectrum, despite accusations from conservative commentators and even President Donald Trump. Rather than contributing to the sort of "echo chamber" problem that critics fear has plagued Facebook and other social media networks, our research has found that Google News algorithms recommended virtually identical news sources to both liberals and conservatives. That's an important point to keep in mind when evaluating accusations that Google News is biased.
Our findings are part of a substantial and growing body of research on this question. Online services - including Google's regular search function - may provide intensely personalised information. But media scholars like us have found that when it comes to news, search engines and social media tend to lead people not to a narrower set of sources, but to a broader range of information. In fact, we found, Google News is designed to avoid personalised search results, intentionally constructing a shared public conversation based on traditional criteria of journalistic values.
There is, however, one aspect of this lack of personalisation that may strike conservatives the wrong way: Established mainstream news outlets strongly dominate the results, regardless of what a user is searching for. Of all the Google News recommendations we collected, a full 49 percent - nearly half - were to just five national news organisations: The New York Times, CNN, Politico, The Washington Post and HuffPost. And those five, much like other mainstream news organisations, tend to be seen as centre-left.
In addition, Google News favours sites with original reporting - as well as ones that produce large numbers of articles, respond quickly to events and have larger staffs. Those criteria, which have nothing directly to do with a news organisation's political bent, do appear to disadvantage explicitly partisan right-wing commentary sites, which tend to be small, low-volume and do little of their own on-the-ground reporting. It is also true that users don't know exactly how Google News works. The company, like many of its ilk, is tight-lipped about how its news and other algorithms function - at least in part to prevent media companies from gaming the system to favour their own material.
How we tested for echo chambers
Shortly before the 2016 election, we studied what would happen when people searched for news about Donald Trump and Hillary Clinton on Google News. Specifically, we used Amazon Mechanical Turk to recruit a diverse set of 168 people in California, Florida, New York, North Carolina, Ohio and Texas. Participants were of different ages, education levels and political views: 41 percent identified as liberals and 26 percent identified as conservatives. The remaining 33 percent did not declare a political affiliation.
We asked them to search Google News for news about Hillary Clinton and Donald Trump while logged in to their personal Google accounts, and report the first five stories they were recommended on each candidate. We repeated this on two separate occasions, once after a presidential debate and later during a slow news period. Then we compared the stories that people were recommended.
The fact that they were logged in to their Google accounts was important: Google, of course, collects huge amounts of data about each of its users, and could leverage that information when returning search results. We therefore expected participants to get different article recommendations based on their prior search history and online activity, as recorded by Google and reflected in the results they got from Google News.
That's not what we found at all. Instead, liberals and conservatives were recommended virtually identical news sources.
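As a rough illustration of the kind of comparison involved, the short Python sketch below measures how much the sets of news sources recommended to two groups of users overlap. The sample data and the Jaccard overlap measure are illustrative assumptions for this sketch, not the exact data or analysis from our study.

```python
# Illustrative sketch (hypothetical data): quantify how similar the news-source
# recommendations shown to two groups of users are, using Jaccard overlap.

from itertools import product

def jaccard(a, b):
    """Share of sources that two recommendation sets have in common."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical recommendations: each inner list holds the outlets one participant saw.
liberal_results = [
    ["nytimes.com", "cnn.com", "politico.com", "washingtonpost.com", "huffpost.com"],
    ["cnn.com", "nytimes.com", "politico.com", "usatoday.com", "washingtonpost.com"],
]
conservative_results = [
    ["nytimes.com", "cnn.com", "politico.com", "washingtonpost.com", "foxnews.com"],
    ["cnn.com", "politico.com", "nytimes.com", "huffpost.com", "washingtonpost.com"],
]

# Average pairwise overlap between the two groups: values near 1.0 mean both
# groups were shown essentially the same sources.
pairs = list(product(liberal_results, conservative_results))
avg_overlap = sum(jaccard(lib, con) for lib, con in pairs) / len(pairs)
print(f"Average cross-group overlap: {avg_overlap:.2f}")
```

In our data, this kind of cross-group comparison showed near-total overlap: the outlets recommended to self-identified liberals and conservatives were essentially the same.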
No collusion against conservatives
We found, as have others, no evidence that major technology companies collude against conservatives or tweak their algorithms to return politically slanted search results.
In fact, some have suggested that the opposite may be true. In the run-up to the 2016 election, Facebook responded to charges of bias by moving to accommodate right-wing views, letting leading conservatives investigate its internal biases. Twitter was similarly accused of protecting InfoWars in 2018.
Further, as tech journalist Kara Swisher has argued, "Mr Trump himself is the most voluble politician ever to use digital media, and his entire existence has been amplified, echoed and re-echoed over and over again by the tools that Silicon Valley has let loose on the world over the past two decades."
Who determines what gets prioritised online?
However, there is reason to understand - even if not to agree with - claims of bias. First, Google News search results do favour legacy news organisations, ones with a long history. In our study, of the 14 news sites that ranked highly on at least one search, only three were newer "digital-first" news organisations. The rest were legacy newspapers, national TV stations and magazines.
Whether this is a problem - and if so, how much of one - is largely up to individual interpretation. For people who care that public discourse is based on a shared set of facts, it's good news to learn that most people get the same results when they search Google News. And for people who believe that long-standing news producers with proven track records are best equipped to report on current events, our research is reassuring.
Yet across the political spectrum, Americans have far more trust in their local media than in the national media organisations that dominate online - including the results of Google News. It's especially difficult to trust search engines and social media sites whose algorithms are secret, complex and constantly changing.
Ultimately, the concerns about algorithms and technology boil down to the principles that guide recommendation engines in shaping which reports get the most attention. Should Google News prioritise stories that adhere to traditional journalistic norms? Or should it reflect some other, as-yet-undetermined standard? Mr Trump's rhetoric resonates with his supporters because, to them and others, the answer is not so clear-cut.
People have different visions of how societies should narrate their shared life. That's perhaps why concepts of news judgment and balanced coverage largely assume that human editors will be involved. Algorithms can't solve these quandaries - but they can help bring sharper focus to the public debate of the role news should play in a democratic society. Mr Trump's latest attacks may forestall that debate, though, by doing to technology companies what he did to the press: convincing many people they are "fake" and thus not to be trusted at all.
Seth Lewis holds a chair in emerging media in the School of Journalism and Communication at the University of Oregon. Efrat Nechushtai is a PhD candidate in communications at Columbia University.