Is free speech under attack in the digital age?

On Friday 30th June, the German parliament approved a bill aimed at cracking down on hateful speech and criminal material on social networks. Under the Netzwerkdurchsetzungsgesetz (NetzDG) law, social media companies such as Facebook and YouTube must take down ‘obviously illegal’ hate speech or criminal content within 24 hours. Content which is not obviously unlawful must be assessed within seven days. Companies in Germany which fail to comply with these new regulations face fines of up to 50m euros. What does this mean for free speech online?

One of the key figures behind the new legislation is German Justice Minister Heiko Maas, who has argued that “freedom of speech ends where the criminal law begins.” Maas has claimed that official figures show the number of hate crimes in Germany has increased by 300% in the past two years, and that a number of high-profile cases of ‘fake news’ and illegal content spread on Facebook and YouTube in Germany demonstrate the need for such a law. The news also comes just a few months after YouTube’s own ‘Ad-pocalypse’, in which a number of large US companies halted advertising spending on the platform after it was claimed that their advertisements were appearing over ‘extremist’ content. This caused huge controversy for creators on the website who make a living from the ad revenue their content generates; many found their videos demonetised for being merely satirical or factual in their coverage of controversial subjects.

These stricter guidelines on what is deemed hateful or illegal speech are perhaps to be expected as laws surrounding criminal content and hate speech catch up with how information is now shared and disseminated in the digital age. In the United States, for instance, laws surrounding hate speech have constantly evolved over the last century. In 1941 a Jehovah’s Witness called Walter Chaplinsky was arrested under a New Hampshire offensive conduct law after his public preaching and demonstrations incited aggression from bystanders. In Chaplinsky v. New Hampshire (1942), the Supreme Court unanimously upheld his conviction. Justice Frank Murphy, writing the opinion of the court, explained that “the right of free speech is not absolute at all times and under all circumstances.” This became known as the ‘fighting words’ doctrine, which excluded from protection “the lewd and obscene, the profane, the libellous, and the insulting or ‘fighting’ words, those which by their very utterance inflict injury or tend to incite an immediate breach of the peace.”

The ‘fighting words’ doctrine did not remain within US judicial precedent for long and was soon criticised, crucially over the difficulty of defining ‘an immediate breach of the peace.’ In recent decades, hate speech provisions that seek to restrict speech have typically been declared invalid by the Supreme Court because they are content-based. Cases such as R.A.V. v. City of St. Paul (1992) exemplified this problem, when the Supreme Court struck down an ordinance under which a teenager had been charged after burning a makeshift cross on the lawn of an African-American family. The St. Paul Bias-Motivated Crime Ordinance, under which R.A.V. was charged, made it a crime to place a symbol, such as a burning cross or Nazi swastika, to arouse “anger, alarm or resentment in others on the basis of race, color, creed, religion or gender…” The Supreme Court overturned the ordinance because it discriminated against certain types of fighting words based on their content. In essence, the ordinance prohibited racist or sexist speech while allowing other kinds, such as political fighting words. As the Supreme Court explained, “the government may not regulate use” of speech or fighting words “based on hostility – or favouritism – towards the underlying message expressed.” Even as recently as 2011, controversial groups such as the Westboro Baptist Church have had their speech protected under law. After the church picketed the funeral of US Marine Lance Corporal Matthew Snyder, the Supreme Court upheld the decision that Westboro’s speech related to a public issue and was therefore protected, despite some of its signs and placards reading “Thank God for dead soldiers.”

Laws surrounding hate speech and illegal expression have changed constantly in recent history. In general, it has been recognised that both context and content are important, and American courts have moved away from categorical or definitional approaches to hate speech, instead considering potential instances on a case-by-case basis. Is this about to change for the internet age? With sweeping new policies and legislation being enacted in the last few months, there is a concern that online content on platforms such as Twitter, Facebook, and YouTube will wrongly fall under the category of illegal or hateful content. Facebook have already expressed their concern regarding NetzDG, saying “the draft law provides an incentive to delete content that is not clearly illegal when social networks face such a disproportionate threat of fines.” Stephen Deadman, Facebook’s global deputy chief privacy officer, said the following when discussing the issue of illegal content online: “We want everybody to be safe. We also want an open and free internet with a variety of content. We also don’t want companies to become the censors of the internet, or governments for that matter.”

Could we be seeing a return to broader, more sweeping legislation like that seen in the case of Walter Chaplinsky? Context is important, and it should matter to the German government, to advertisers, and to the social media platforms themselves. A person’s right to speak should be protected not because we believe in what they are saying, but because we lose the ability to learn why they think the way they do if we shut them down. While hate speech and illegal content do exist online, social media should ideally be a platform where people of differing views can challenge and debate one another. This is a right which some are beginning to fear will be curtailed by the introduction of NetzDG and other similar legislation.
