One of the most difficult decisions to make when looking at social media is where the line lies: which opinions should be protected as free speech, and which fall into the offensive category.
Some around the world think opinions should be sacred and never touched, some try to police what can or cannot be said, while others believe opinions which offer any form of opposition should be crushed into the ground. But the complication is personal opinion; some things which are offensive to one person are not to another. When creating a platform for expression, the social media giants have the dreaded task of deciding what should and should not be allowed. Facebook thinks it has struck the right balance.
“The core concept here is whether a particular restriction of speech is necessary to prevent harm,” said Facebook’s Richard Allan, the VP of Policy. “Short of that, the ICCPR [International Covenant on Civil and Political Rights] holds that speech should be allowed.
“This is the same test we use to draw the line on Facebook. After all, giving everyone a voice is a positive force in the world, increasing the diversity of ideas shared in public discourse. Whether it’s a peaceful protest in the streets, an op-ed in a newspaper or a post on social media, free expression is key to a thriving society.”
Article 19 of the International Covenant on Civil and Political Rights (ICCPR) is just one of the sources used by the social media giant, but it is an important one. It sets the standards for when it is appropriate to place restrictions on freedom of expression, but this again raises an issue: who is to say this is right or wrong? Why is person A right, when person B holds a different view? The breadth of documentation putting arguments together is breathtaking, as is the number of academics who have an opinion.
That said, instead of looking at what is allowed, Facebook has approached the challenge by looking at what shouldn't be. First are posts that contain a credible threat of violence. This is a simple one for Facebook, as it is all about preventing harm. Disagreements are a healthy aspect of social interaction, but when the conversation descends into violent threats, Facebook feels this is the right time to step in. The difficulty here is understanding which threats should be taken seriously.
Hate speech is the second no-no, as it creates an environment of intimidation and exclusion. This is a more difficult area to judge, as some instances are less clear-cut than others. Of course, at the extreme end of the scale hate speech is very easy to spot, but further down the scale it becomes less clear.
The final area, another tricky one, is fake news. On one hand it is almost impossible to fact-check the sheer breadth of fake news, but on the other, the team does not want to take down everything which might be considered fake news. People who believe the world is flat are entitled to post, and some might argue that, as there is no concrete proof of God, religious content could be deemed misinformation. Should all religious posts be banned from Facebook? The key here is when misinformation has real-world, detrimental impacts, such as inciting violence or tricking people for profit.
Every rule or policy Facebook introduces will of course be heavily debated, and we suspect the social media giant will cause offence to someone, somewhere, regardless of what it does. But at least these rules bring some form of clarity to a contentious debate. How much clarity, we will leave you to decide.