
Did you just throw a sheep at me?

Renewing focus on shared content among friends may do little to crack echo chambers and anxiety woes

Facebook has said it expects users to spend less time on the platform - is it really just distancing itself from its responsibility (which is still fuzzy anyway)?

FACEBOOK is taking us back to the days - make that ancient 2007 - of throwing sheep at one another. The social media giant said this month it would prioritise posts that spark interactions among friends over content from businesses, including traditional media companies.

The move comes amid mounting criticism that Facebook has enjoyed the rich perks of a media company without shouldering the responsibilities that come with distributing information - or fake news - and without answering for the psychological impact it has on users.

But is Facebook really warming to human relations at the risk of profit?

The opposite seems likelier. Facebook may be wising up to business realities that would hurt profit further if it did not try to put people in greater control of what they reveal and consume online.

And rather than bringing "friends" closer, that algorithmic change may generate new social risks of louder echo chambers, and deeper social anxiety.

That Facebook is a domain easily dominated by interested parties has become perilously clear in the last few years. While earning advertising dollars from its hold on two billion monthly users, Facebook also became a useful platform for Russian operatives seeking to influence the 2016 US elections. They used virulent content to drive a larger wedge into the ideological divide. The severity prompted a warning from no less than former US president Barack Obama.

What's more, the US now has a "stable genius" for president who is also unrelenting on another social media platform, and ironically made the term "fake news" a catchphrase for the ages.

One running prediction is that 2018 may be the year of greater Big Tech regulation, as authorities turn wary of how tech giants can engender strife while taking an agnostic position on spreading misinformation. That agnostic view makes sense once their business model is taken into account: they make money by distributing as much content as possible - content they do not own - powered by users' data, taken in exchange for a service that is, in many cases, "free".

Facebook has said it expects users to spend less time on the platform. Claiming that it has less influence reads like a form of self-regulation that keeps regulators off its back. Its decision may also not help the larger challenge of promoting dialogue across ideological divides.

Bridgewater, the world's largest hedge fund, suggested that the proportion of the vote captured by populist candidates had risen from about 7 per cent in 2010 to 35 per cent in 2017, a Financial Times report said this month.

The last time populist support was this high was in the 1930s, possibly reflecting the tensions just before World War II.

The problem with Russian, or online, propaganda in general is not fake news in itself. The greater trouble comes from one-sided articles and videos, lapped up by online users already predisposed to clicking on those headlines. Facebook's distribution model sets off reverberations in echo chambers.

Granted, Facebook did not create such divides in the United States. But it will have a hand in entrenching such views: among like-minded friends, highly commented posts will now take centre stage.

Facebook also expects greater interaction among friends to be more valuable. It has acknowledged research suggesting that mindless scrolling affects mental health.

The status anxiety that comes with constant comparison through social media seems obvious. Does interaction improve this? Facebook certainly thinks so. It has also put in place a compassion team that debates, among other things, the merits of a "dislike" button.

But Facebook's fondness for tinkering with emotional appeal could lull users into false fulfilment. It brings to mind an episode of the sci-fi TV series Black Mirror, in which an office worker tries desperately to raise her star rating and reach the upper echelons of society. The character turns into a Pollyanna in seeking validation from highly ranked users, but discriminates against those of the one-star variety.

The show, in its extreme way, critiqued the manufacturing of likes-for-likes - our modern-day cottage industry - that has seeped into other social media such as Facebook's Instagram.

It also shares eerie similarities with the social rating system now being explored in China.

Author Alain de Botton wrote that envy festers not from comparisons with inaccessible elites - say, the Queen - but from comparisons with peers. Fittingly, de Botton's book on status anxiety was published in 2004, the year Facebook was born.

Facebook users struggling to cope should regard Facebook as a frenemy that sidles up when it suits its interests.

There is always the option to disconnect, and news agencies are weighing whether a time-out is worthwhile.

Though, if columns suggesting we stop using Facebook were written but nobody saw them, did journalism make a sound?
