Facebook: Brazen Censorship or a Tactical Solution to a Sophisticated Social Ask?

Written by Pavel Cherkashin | Published 2021/01/16


By Pavel Cherkashin, managing partner at Mindrock Capital
It seems that this year Facebook has been using its power to censor content and delete user posts more often and more aggressively than ever before. Grounds are easy to find: pandemic conspiracy theories and fake news about the virus alone provide plenty.
In October, Facebook announced an awareness campaign on coronavirus and influenza vaccination and threatened to ban posts encouraging people to refuse vaccination. Does this mean Facebook has introduced outright censorship, either because it has taken sides with some group or because it is trying to impose its own decisions on users?
At the Congressional hearing in 2018, answering the question “Do you subjectively manipulate your algorithms to prioritize or censor speech?”, Mark Zuckerberg said: “We don’t think about what we are doing as censoring speech.” And that is where I am going to take his side.
Facebook State Constitution
At the same Congressional hearing, Zuckerberg said: “When people ask us if we are a media company, what I hear is, do we have a responsibility for the content that people share on Facebook? And I believe the answer to that question is yes.” Making a profit is no longer management’s sole priority. The company is now accountable to humankind for what happens on the platform.
Comparing a social network to mass media is clearly inapt: a social network has no editorial policy or plan, and no editorial staff approving content and competing for readers’ attention. It is society itself communicating and acting in a relatively new digital space.
Comparing Facebook to a new type of state seems far more logical. In a sense, Facebook assumes the role of a state by committing to establish a social contract with its users. Just as states regulate their citizens’ lives by setting rules and boundaries, a platform uniting three billion people needs to regulate people’s lives in the digital space.
Digital communities operate under the case-based model of “everything which is not forbidden is allowed,” which means the rules primarily restrict people’s options. Since everything on a social network happens in the content domain, the rules can only limit our freedom to write, share, and comment on whatever we want, however we want. For instance, we must restrict a person’s right to manipulate public opinion with lies, and ban violent advertising and hate of any kind. Incidentally, the concept of “hate” (or “inciting hatred”) as we know it today was born and is currently thriving in the digital space; it is an entirely new phenomenon in our lives.
If a social network affects the minds of almost half of the world’s population and knows more about them than they know about themselves, then its content policy is no longer a matter of its owners’ individual preferences but a global social and political problem. It cannot be solved behind closed doors, because by defining what may be posted publicly and what may not, we are in effect forming a unified public consciousness for civilization, a Constitution common to all humankind. That is an interesting task for humanity. Its solution, the creation of a new supra-state social norm, is not political but scientific and technological.
In this new reality, humankind gains the right to demand two things from major technology companies like Facebook.
The first is a Constitution that is comprehensible, open, and treats everyone equally, just as a state’s constitution treats its citizens. Meanwhile, the citizens, or in this case the users, must contribute to its development. This task cannot be left to the company alone.
The second is that the Constitution be implemented and enforced impartially, which means its enforcement should be embedded in program code. As long as a person with a subjective opinion mediates between the content and the user, there is always room for prejudice and mistakes.
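To make the idea concrete, here is a minimal, purely hypothetical sketch of what rules-as-code could look like; the rule names and toy predicates are my own illustrative assumptions, not Facebook's actual system. The point is that the rules are declared as data and applied by one deterministic function, so identical posts always receive identical decisions, no matter who posted them or who is on shift reviewing:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Rule:
    name: str                        # the "constitutional" clause this rule encodes
    violates: Callable[[str], bool]  # True when a post breaks the clause

# Toy predicates standing in for real classifiers (hypothetical examples).
RULES = [
    Rule("no-calls-to-violence", lambda text: "attack them" in text.lower()),
    Rule("no-medical-misinformation", lambda text: "refuse the vaccine" in text.lower()),
]

def evaluate(post_text: str) -> list[str]:
    """Return the clauses a post violates; an empty list means it is allowed."""
    return [rule.name for rule in RULES if rule.violates(post_text)]

print(evaluate("Everyone should refuse the vaccine!"))  # ['no-medical-misinformation']
print(evaluate("Happy birthday, grandma!"))             # []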
Step 1. Developing the Constitution
So where is the line between what may be posted and what may not? A very good example is discussed on my favorite podcast, Radiolab.
In the early days, a couple of pages were enough to hold the whole of Facebook’s content policy. It was all very simple: nudity, insults, and calls to violence were banned. Among many other things, this restriction covered any images related to breastfeeding.
In 2008, women from a Facebook breastfeeding community whose content had been repeatedly blocked staged a protest at the company’s door. That was the beginning of a rethinking of freedom of speech in the modern digitalized world, a process that is still going on and is sure to last for many more years.
By 2018, the content policy had grown to 50 pages and had learned to distinguish a breastfeeding tutorial from erotic content. It may sound as trivial as the hot-dog-identifying app built by Jian-Yang in the TV series Silicon Valley, but it is not.
Here is an incomplete list of the questions that had to be addressed: What part of the breast may be shown? One breast or both? What counts as “breastfeeding”? If a baby is lying on their mom’s belly and we can see the mom’s breasts, is that feeding? Up to what age of the child is breastfeeding appropriate? What if it is not a child but a grown man? Or a dwarf? Or a baby goat? (It may seem crazy to the Western world, but it is a norm in parts of Africa.) Should we consider the context of the photo or just the image itself? What if it is a work of art? And what do we call art? And so on, and so forth, an endless rabbit hole of questions about this one small but universal aspect of human life.
We should understand that Facebook is a gigantic dataset to which a third of the world’s population adds millions of new entries every day. It is impossible to anticipate users’ imagination, the world’s cultural norms, or the events happening in it. It is impossible to foresee life in all its variety.
How can anybody set common and simple enough rules for a third of the world’s population?
The comparison of the company with mass media raises other ethical issues for Facebook: must it rely on the mass media to define what content should be banned and what permitted? And if the mass media gloss over a problem that the social network’s users want to expose, what should the company do?
It is difficult to answer all these questions in a way that will suit everyone. However, Facebook tries to design not an ideal policy but one that works. No policy is formed by the logic of “we are going to ban this and forbid that, and this part is obscure, so we will ban it just to be safe.” The rules keep changing, as do the exceptions to them, and the changes are not only about banning: new issues lead to new amendments. News about the harmful effects of vaccination was banned because it was not scientifically proven and because it ran against the interests of humankind. While some content is moderated less successfully, terrorist content, according to Zuckerberg, is identified with almost 100% accuracy.
Step 2. Technical Implementation
Next, Facebook needs to find a way to implement this policy technically. It will take at least a decade for these policies to settle in and for the system to learn to identify violations.
There is still no algorithm that can sort content unambiguously and without bias, identifying humor, sarcasm, fakes, and gaps in knowledge in a discussion the way a human brain does. That is why all content passes through a human filter. According to Zuckerberg, more than 35,000 people are currently working on the implementation of content safety policies.
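As a rough illustration of that human-in-the-loop arrangement, here is a minimal sketch; the threshold, labels, and stand-in “model” are assumptions of mine, not Facebook’s real pipeline. Clear-cut cases are decided automatically, and anything the classifier is unsure about is escalated to a human reviewer:

```python
AUTO_THRESHOLD = 0.95  # assumed confidence required to act without a human

def classify(post_text: str) -> tuple[str, float]:
    """Stand-in for an ML classifier: returns a (label, confidence) pair."""
    if "terrorist propaganda" in post_text.lower():
        return "violates-policy", 0.99  # the near-certain case automation handles well
    return "uncertain", 0.40            # humor, sarcasm, context: the model cannot tell

def moderate(post_text: str) -> str:
    label, confidence = classify(post_text)
    if confidence >= AUTO_THRESHOLD:
        return f"auto decision: {label}"
    return "escalate to a human reviewer"  # one of the ~35,000 people mentioned above

print(moderate("a link to terrorist propaganda"))                   # auto decision
print(moderate("My grandma's borscht is a crime against cuisine"))  # escalated
```

Everything below the confidence threshold lands on a human's desk, which is exactly where the next problem begins.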
The people who review content for Facebook and decide whether to block users or posts work in regions with cheap labor, like the Philippines, where conservative Christian beliefs are prevalent.
Why is this significant? Because these people turn out to have their own cultural norms and opinions, as well as psychological traumas caused by their work, and all of these factors directly affect their decisions. Moving the review centers to a different region will not help: wherever you go, there will always be local particularities.
Admittedly, the reviewers’ powers amount to pressing a “yes” or “no” button, but that, combined with the company’s extensive policies, is enough for users to constantly accuse Facebook of bias and of restricting freedom of speech where no restriction belongs.
The tension is not building around Facebook censorship as such; Facebook content has always been censored. The problem is that Facebook censorship is not the same for all its users. It is still governed by the subjective opinion of a human, be it Zuckerberg himself or a religious Filipino reviewer paid $10 a day.
A group of people sitting in or near the head office holds rights that no other people on Earth possess. Some ten years ago this situation was acceptable for a private commercial company, but it is absolutely unacceptable in the present-day world.
As a civilization, we are on the brink of creating a Digital Space Constitution, or a Universal Moral Code. I am almost certain this code will contradict the laws and rules of some states and communities, but it will stem from the laws of reason that are common to all humankind.
We are entering a completely new, supra-state level of regulation. This function, which has suddenly fallen to Facebook as a state, has every chance of contributing to the development of democracy around the globe. If Facebook meets this challenge, it will gain even more influence in the political arena.
But chances are it will make no such contribution if its censorship policy turns into an instrument of political pressure for a state or a political alliance.
