
THE neoliberal poststructuralist headline "Facebook enlists humans to decide what’s informative" on a story in Business Day (lifted from The Wall Street Journal) would have been reminiscent of an HG Wells scientific romance, replete with a Fabian critique of laissez-faire capitalism, had it not been for the substitution of the word "humans" for "machines".

The headline is perfectly contextualised, however. It is a deadpan statement of fact, carrying irony of the kind only James Thurber or Pieter-Dirk Uys would dare make funny, but it obscures the dark danger lurking in the consequences. The story is innocent too. Facebook says it is changing its news-feed algorithm to arrive at what is "most informative" to its users by polling people throughout the world. The resulting data are fed into the algorithm that ranks posts and delivers news to members, filtered by relevance.

But don’t be fooled. It is imperative not only to grasp the text of the statement, but also to analyse the ideology that produced it. Next to what Facebook is planning, the decision-making machine is a pussycat that Facebookers should defend to the death. Actual humans of the type who, say, put Donald Trump and Hillary Clinton in the US presidential race, who voted for Brexit or the ANC, or who believe rhino horn makes you libidinous, will be creating data that will determine, on a scale of one to five, what is interesting.

No, you say, radical effects will be diluted or marginalised, but you would be wrong. In the first place, to be on Facebook, you have to be relatively wealthy, basically educated and have more bandwidth than brains. That shrinks the world. In the second quarter of 2016, Facebook had about 1.71-billion monthly active users, diminishing the worldwide sample by more than 5-billion people, thus concentrating the bias. Second, think for a moment about the type of person who would fill out a Facebook questionnaire without pay. Isis recruiters on furlough? Sundry nut jobs? Who knows? But know this: their number is likely to be alarmingly large, and what they put into the survey is not data but information.

Facebook is not the egalitarian free-for-all social device its users would like to think it is. It is a business, and a pretty good one thus far, judging by its $1.51bn profit for the first quarter of 2016, or just less than $1 per user. Its success is ascribable chiefly to the fantastically low cost of what web producers like to call user-generated content. It is cheap because, to its algorithm, user inputs are just numbers.

But the moment Facebook edits its content, when it begins to act as a supposedly responsible publisher with its users’ best interests at heart, its raw material becomes expensive information. Facebook is now taking the first step towards adding value: moderating, filtering and editing. Any newspaper publisher can tell Mark Zuckerberg that it is a mug’s game.

Worse is that Facebook’s plan constitutes a Fabian intervention of the kind that is the undoing of neoliberal business: overregulation. Under the guise of service or social conscience or any such moral agenda, it is introducing intolerable control over what its users should or should not find interesting, and it is doing it with user-generated content for which it has not paid one cent.

It is the way monopolies operate, and fail. It was exactly in this way that moral hazard — the absolute certainty that it was unnecessary to guard against risk — and the collusion between governments and banks broke the world economy in 2008. Humans are not best served by what other humans decide on their behalf. Amoral algorithms are safer, better and much more egalitarian.

• Blom is a fly-fisherman who likes to write