Nick Diakopoulos focuses on it:
Even robots have biases. Any decision process, whether human or algorithm, about what to include, exclude, or emphasize — processes of which Google News has many — has the potential to introduce bias. What’s interesting in terms of algorithms though is that the decision criteria available to the algorithm may appear innocuous while at the same time resulting in output that is perceived as biased.
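Diakopoulos's point can be made concrete with a toy sketch. The criterion below — weight sources by how often they publish, as a proxy for "freshness" — is entirely hypothetical and not how Google News actually ranks, but it shows how a neutral-sounding rule can systematically favor high-volume outlets over a smaller source with more relevant coverage:

```python
# Hypothetical articles: a high-volume wire service vs. a small local outlet.
# Field names and values are illustrative, not drawn from any real system.
articles = [
    {"source": "BigWire",   "posts_per_day": 200, "relevance": 0.70},
    {"source": "LocalDesk", "posts_per_day": 5,   "relevance": 0.90},
]

def score(article):
    # Innocuous-looking criterion: outlets that publish more often are
    # assumed "fresher", so publication volume boosts the score.
    return article["relevance"] * (article["posts_per_day"] ** 0.5)

ranked = sorted(articles, key=score, reverse=True)
print([a["source"] for a in ranked])  # → ['BigWire', 'LocalDesk']
```

Here the less relevant but prolific source wins: no one decided to prefer big outlets, yet the output is biased toward them — exactly the gap Diakopoulos describes between innocuous criteria and perceived bias.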
Madrigal considers the ways news organizations try to game Google News:
I don’t think Google News has ever taken enough responsibility for the cybernetics of the system it created. What is important is not just how the software works, but the ways it structures humans’ thoughts and actions in new ways. To my eye, it created feedback loops with mostly deleterious effects not because the algorithm itself was bad, but because the service did not take the human repercussions seriously. That’s not what Krishna Bharat, the product’s creator, set out to do. But it’s happened.