
Algorithms of Social Media

In Myanmar, social media means one thing and one thing only: Facebook. Its algorithms are major contributors to the unrest and infighting within the country and, if we are not careful, to the eventual destruction of our beloved motherland.

What is meant by "algorithm"?

At the most basic level, an algorithm is just a set of rules that determines the order of things or defines cause-and-effect relationships. In computing terms, it is simply the programming that puts those rules into an application (app).
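As a concrete, purely illustrative example, here is a tiny algorithm in Python: one rule ("newer posts first") applied to a list to determine its order. The post data is invented for the example.

```python
# A tiny illustrative algorithm: given posts with an age in hours,
# apply one rule ("newer first") to determine their order.
posts = [
    {"id": "a", "hours_old": 5},
    {"id": "b", "hours_old": 1},
    {"id": "c", "hours_old": 12},
]

# The rule: sort by age, smallest (newest) first.
ordered = sorted(posts, key=lambda p: p["hours_old"])

print([p["id"] for p in ordered])  # ['b', 'a', 'c']
```

A news feed ranker is the same idea, only with many rules combined into one score instead of a single rule.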

In social media terms, the algorithm decides a post's position on the news feed, based on predictions about each user's preferences and tendencies. The details of Facebook's algorithm design determine what sorts of content, news and stories thrive on its platform, and what types languish.

How does Facebook rank news and stories?

Typically, your posts, and the posts that people share, are ranked based on popularity, content type, relationship and recency. Naturally, posters want their posts to be popular and to sit at the top of everyone's feed, ahead of others; they want their posts to reach the largest possible number of Facebook accounts. Yet for Facebook, everything is about dollars and generating revenue through advertisements.
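The four factors above can be combined into a single ranking score. The following is a minimal sketch under invented assumptions: Facebook's real weights are proprietary, so every number, field name and formula here is hypothetical, chosen only to show how popularity, content type, relationship and recency could feed one score.

```python
import math

# Hypothetical boosts per content type (invented for illustration).
CONTENT_TYPE_BOOST = {"video": 1.5, "photo": 1.2, "link": 0.9, "text": 1.0}

def feed_score(post):
    # Popularity: reactions, comments and shares (weights are invented).
    popularity = post["likes"] + 2 * post["comments"] + 3 * post["shares"]
    # Relationship: how often this user engages with the author.
    relationship = post["interactions_with_author"]
    # Recency: score decays smoothly over roughly a day.
    recency = math.exp(-post["hours_old"] / 24)
    return (popularity + 5 * relationship) * CONTENT_TYPE_BOOST[post["type"]] * recency

posts = [
    {"id": "wedding", "type": "photo", "likes": 40, "comments": 5, "shares": 1,
     "interactions_with_author": 20, "hours_old": 6},
    {"id": "news_link", "type": "link", "likes": 200, "comments": 10, "shares": 4,
     "interactions_with_author": 0, "hours_old": 2},
]

# The highest-scoring post appears first in the feed.
feed = sorted(posts, key=feed_score, reverse=True)
```

Even in this toy version you can see the tension the article describes: a very "popular" post from a stranger can outrank a close friend's post, and small changes to the weights change what everyone sees.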

That is why, over the years, organic reach (i.e., the number of people who see your post without you spending on an advertisement) has declined significantly. The key benefit of organic reach is that you can publish your posts for free. Nowadays, organic reach is just about 5% of a page's followers. Engagement, i.e., getting reactions and comments, is worse, at around 0.25%, and worse still, around 0.08%, if the page or account has more than 100,000 followers.
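Those percentages translate into sobering numbers. A rough calculation, using only the figures quoted above, for a large page:

```python
# Rough arithmetic using the figures quoted above: 5% organic reach,
# and 0.08% engagement for a page with more than 100,000 followers.
followers = 100_000

organic_reach = followers * 0.05    # people who see a post without paid ads
engagement = followers * 0.0008     # people who react or comment

print(f"Of {followers:,} followers, about {organic_reach:,.0f} see a post,")
print(f"and only about {engagement:,.0f} react or comment.")
```

In other words, a page with 100,000 followers can expect roughly 5,000 people to see a free post and fewer than a hundred to engage with it, which is exactly the pressure that pushes page owners towards paid advertising.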

What are the issues with Facebook nowadays?

Since 2018, the algorithm has elevated posts that encourage interaction, such as posts popular with friends. This broadly prioritises posts by friends and family and viral memes, but also divisive content. For a subset of extremely partisan users, such as NNCP terrorists and their sympathisers, today's algorithm can turn their feeds into echo chambers of divisive content and hateful news, of varying reputability, that support their outlook.

Starting in 2009, a relatively straightforward ranking algorithm determined the order of stories for each user, making sure that the juicy stuff, like the news that a friend was "no longer in a relationship", appeared near the top. Think about this: post 1 is your friend happily marrying her boyfriend of five years, whereas post 2 is about your female friend marrying two men at the same time. Which one do you truly think Facebook's algorithm would place in the better position for more organic reach, by having it appear more often and prioritising its placement at the top of individual feeds?

In 2016, however, Facebook executives grew worried about a decline in “original sharing.” Users were spending so much time passively watching and reading that they weren’t interacting with each other as much. Young people in particular shifted their personal conversations to rivals such as Snapchat that offered more intimacy.

Once again, Facebook found its answer in the algorithm: It developed a new set of goal metrics that it called “meaningful social interactions,” designed to show users more posts from friends and family, and fewer from big publishers and brands. In particular, the algorithm began to give outsize weight to posts that sparked lots of comments and replies.

The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them. Facebook became an angrier, more polarizing place. It did not help that, starting in 2017, the algorithm assigned reaction emoji, including the angry emoji, five times the weight of a simple "like", according to company documents. In Myanmar's case, which of these two posts would ignite comments, angry emoji, swearing, hatred and shares: the Myanmar military helping the poor in flood-hit areas, or a fake news post about the military bombing a school with children inside? Based on Facebook's algorithms, which one do you think would appear more in users' news feeds? Mind you, Facebook has no filter to validate fake news, especially in a non-English-language post.

Internal documents show Facebook researchers found that, for the most politically oriented 1 million American users, nearly 90% of the content Facebook shows them is about politics and social issues. Those users also received the most misinformation, especially a set of users associated with mostly right-leaning content, who were shown one misinformation post out of every 40, according to a document from June 2020 (Washington Post, 2021).

Facebook engineers gave extra value to emoji reactions, including ‘angry,’ pushing more emotional and provocative content into users’ news feeds.
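The weighting described above can be sketched in code. Only the five-to-one ratio between a reaction emoji and a plain like comes from the reporting quoted in this article; the function, field names and the comment weight are invented placeholders for illustration.

```python
# Sketch of the reported reaction weighting: any reaction emoji
# (love, haha, wow, sad, angry) counted five times a plain like.
# Only the 5x ratio is from the reporting; everything else is invented.
LIKE_WEIGHT = 1
REACTION_WEIGHT = 5

def engagement_score(likes, reactions, comments, comment_weight=15):
    # Comments were also reportedly weighted heavily; the exact value
    # here is a placeholder, not a confirmed figure.
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT + comments * comment_weight

calm_post = engagement_score(likes=100, reactions=5, comments=2)    # 100 + 25 + 30 = 155
angry_post = engagement_score(likes=20, reactions=60, comments=25)  # 20 + 300 + 375 = 695

# The provocative post wins the ranking despite far fewer likes.
assert angry_post > calm_post
```

Under any scheme shaped like this, a post that provokes sixty angry reactions and a pile of heated comments beats a post that a hundred people quietly liked, which is precisely how outrage gets amplified.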

What can we learn from knowing this?

You are giving up your privacy in exchange for attention, in the form of likes, emoji and comments. Widespread online harassment and cyberbullying mean that the theft of private words, actions, conversations or photos, and their use without consent, context or compassion, all fuel the trafficking of shame and the culture of humiliation.

That price of shame and humiliation, extracted through fake news and half-truths, funnels into the profits of those who prey upon the victims, including Facebook. Invasion of others' privacy is the raw material, efficiently and ruthlessly mined, packaged and sold at a profit, on a marketplace platform where fake news and public humiliation are commodities and shaming is now an industry.

Dollars are generated through clicks, emoji and comments. More shaming means more of all three, which means more eyeballs, more emoji, more comments, more engagement, and more advertising dollars and profits for Facebook.

Based on what has happened in Myanmar, many individuals and entities even remotely related to the Tatmadaw have been targeted with fake news and half-truths for shaming and public humiliation. Some of those caught committing these crimes are now in jail; others are still overseas, still trying to repeat the same offences. Yet who is making money on the back of our fellow countrymen's humiliation and our nation's suffering? Only Facebook!

What else do we need to know?

We live in an avatar world of Facebook, where a majority of users have a compassion deficit and suffer from an empathy crisis. Its algorithms are not going to be altered in the near future. This is their business model; it is how Facebook makes money.

The only way to counter that is to have empathy and compassion towards those who have been targeted unfairly. Algorithms that promote hatred and anger cannot survive empathy. Even one encouraging comment can make the difference between a victim following the route of Tyler Clementi and standing tall. We can also be upstanders, instead of mere bystanders, to fake news.

Know that the right to freedom of speech must come together with responsibility in freedom of expression. Are we speaking up with intention, or are we speaking out for attention?

Between 80% and 100% of what is on such platforms is b*llsh*t (crap), according to John Oliver of the Last Week Tonight show. Don't fall victim to their algorithms and aid in the destruction of our own country.


On a final note, one need not lose too much sleep over this. Facebook is facing its own set of major issues now. Apple has shut down apps' ability to track users on iPhones, starting from iOS 14.5, a huge blow to Facebook, which needed that data for targeted ads. More and more people all over the world are experiencing and recognising Facebook's unethical algorithms. Facebook's new service, Horizon Worlds, debuted so badly that the company even had to push its own staff to install and use it. All of this is reflected in the fall in the company's value of nearly $700 billion since it rebranded as Meta in late 2021.

As Obama quoted Martin Luther King's words: "The arc of the moral universe is long, but it bends towards justice."

(Reference: Washington Post, October 2021)