Facebook: The extent of the human touch at issue

The company strongly denies that its staff “news curators” manipulate its “trending” feature, deleting or adding stories thrown up by algorithms according to whether they approve of them

Accusations in the US from the tech news site Gizmodo of systematic left-wing political bias at Facebook have been gleefully taken up by, among others, the Donald Trump campaign. The claims play to a longstanding conservative critique of the Facebook News Feed, which is used by a billion-plus people every day and is critical in shaping the worldwide news agenda (some 20 per cent of Irish Times online traffic, for example, comes from Facebook).

The company has strongly denied that its staff “news curators” manipulate its “trending” feature, deleting or adding stories thrown up by algorithms according to whether they approve of them.

Facebook has long described its trending feature as largely automatic, what one writer has called its “veneer of empiricism”. “The topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location,” according to a description on Facebook’s site. It is curated by a team of contract employees whose task, the company says, is simply to summarise the trending items and weed out the less credible material. But how automatic is automatic? And is automatic necessarily “objective”?

For one thing, Facebook news algorithms, which are regularly adjusted, are no less human constructs, as infused with assumptions and bias as any other human editorial decision.

But the claims by the former Facebook contractor (a man, we are told, of “conservative” views), whether exaggerated or not, may begin to change perceptions of what Facebook is, and particularly of how “neutral” it is. A 2015 Pew poll in the US found that only 17 per cent of those surveyed believed technology companies had a negative influence on the country. For the news media, that figure was 65 per cent, and rising. That gap is likely to close.

Perceptions are growing that news values and the trending news agenda are “manipulated” through human interference, and that Facebook functions as a mediator much like a newspaper newsroom, inevitably reflecting the biases of its workers and its institutional imperatives. Such perceptions are likely to blur popular distinctions between the two forms of media.