In a blog post yesterday, Facebook announced more changes to how its News Feed algorithm ranks posts from Pages.
The update appears to be specifically geared toward the ongoing conversation about the spread of “fake news” on Facebook and other social platforms. To that end, the company says it has identified Pages whose updates are frequently hidden or muted, and it will use those kinds of signals to determine when something is likely to be untrue and diminish its ranking in the News Feed. Conversely, it will look at whether Page posts are getting what’s deemed to be “positive” engagement and adjust their ranking in the other direction.
All this is interesting because it continues to show how Facebook, with zero accountability or transparency, keeps exercising its own prerogative to influence what people see. I don’t have a problem with it fighting the spread of factually incorrect and inaccurate news on the platform, but as always this seems like a fix that doesn’t actually address the problem. Changing some minor points in a much larger algorithm doesn’t get to the core of the issue, which is that these liars – there’s no other word for it – will continue to use the platform for their own financial and ideological gains.
I’m not pro-censorship. I believe that if Facebook wants to play dumb when it’s convenient (e.g. when it comes to who can use the tool and for what purpose), it can, even if it then exercises absolute – and absolutely opaque – judgment about the end result of that usage. But there are changes it could make that would more substantively show it’s taking an active role in halting the spread of purposeful disinformation.
First and foremost among those potential actions is drawing more distinct lines around who sees what. If Facebook is going to make the call that X is a source of inaccurate and misleading information, it shouldn’t just drop X’s News Feed ranking. That may still result in a piece of “news” that’s a full-on fabrication appearing in the feeds of those who didn’t sign up for it but happen to be friends with someone who did. A better course of action would be to restrict the spread of updates from that Page to *only* those who have Liked it and are following it. That way the recipient must have actually taken a positive action on their own in order to receive it. Everyone else would be spared, and the ignorance could remain confined to a self-selected few.
As always, Facebook ends with a note that this is unlikely to impact the reach of any given Page, despite the fact that it’s designed to do just that. Ask any content program manager and they’ll tell you the sum effect of all these updates that Facebook says aren’t that consequential has been significantly consequential to reach and other engagement metrics. But it’s Facebook’s sandbox, and as long as it remains the biggest dealer on the block there’s little choice but to play by its constantly shifting and self-serving rules.