Corpus linguistics and the problem of non-violent extremism

Will Baldet Blogs 24/04/2018

During the 2016-17 Home Affairs Select Committee inquiry into radicalisation, Google claimed that, in 2014, it had removed over 14 million videos relating to different forms of abuse, including content that promoted violent extremism. It has since announced that it will employ an additional 10,000 staff as human moderators to take down videos and comments that violate its policies. It therefore seems fair to conclude that the problem of abusive online content is growing.

While we can agree that content which promotes or incites violence and abuse has no place online, there is a grey zone that sits between freedom of speech and hate speech that currently has limited opposition but is becoming an increasingly urgent problem.

Until their bans from Twitter and Facebook (in 2017 and 2018 respectively), Britain First personified this grey zone. Through a steady trickle of anti-Muslim and anti-immigration content, they poisoned the air of cyberspace with a toxic cloud that seeped from the virtual world into the physical world, creating a hostile environment for minority groups in the UK.

To my knowledge, Britain First have never called for violent attacks against Muslims, nor have they openly demanded violence or vandalism against Muslim institutions (despite their contemptible ‘Mosque invasions’, where they thrust copies of the Bible into the hands of bewildered and frightened worshippers). As my colleague at the Centre for the Analysis of the Radical Right, Dr Craig McCann, recently noted, Facebook eventually removed Britain First from its platform for “repeatedly posting content designed to incite animosity and hatred against minority groups”. In other words, they created the mood music to which violent extremists can dance.

We can commend Facebook and Twitter for taking the bold step of banning Britain First, but I cannot help but wonder if the very public arrests and convictions of its leaders, Paul Golding and Jayda Fransen, were the impetus for the ban. Would social media companies have taken note, for instance, had the President of the United States not raised the group’s profile to his 50 million followers and the world’s media by retweeting some of Fransen’s anti-Muslim messages?

And herein lies a bigger problem; Britain First are far from the only game in town. There are countless websites, Facebook pages and Twitter accounts that promote the same animosity towards Muslim communities, both in Britain and abroad. Some, such as Jihad Watch and Atlas Shrugs, cultivate an air of authority and do not shy away from self-promotion. Others like Bare Naked Islam (which carries the tagline “It isn’t Islamophobia if they really are trying to kill you”) are similarly brazen in their contempt for Muslims, but operate below the radar of mainstream media and thus the glare of public scrutiny and outrage.

By framing every conversation about Islam and Muslims through a negative prism, these online platforms are endeavouring to change the meaning of those words to represent all that is wrong in society. As a simple test, try to recall just three recent news stories about Muslims that make no reference to terrorism, grooming gangs or forced marriage. It is a tragic indictment of today’s society (and our mainstream media) that this should be difficult. Those who study corpus linguistics will recognise the pattern: when a word is repeatedly placed alongside negative language, the association gradually comes to feel normal, and public perception shifts with it.
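The corpus-linguistics idea at work here can be illustrated with a small sketch. The snippet below counts which words most often appear near a target term across a set of headlines, a crude version of the collocation analysis corpus linguists use to measure a word’s “semantic prosody”. The headlines are invented examples for illustration, not real data, and the window size is an arbitrary choice.

```python
from collections import Counter

# Toy corpus of invented headlines (illustrative only, not real data).
headlines = [
    "muslim community condemns terrorism after attack",
    "muslim charity raises funds for flood victims",
    "police investigate terrorism plot linked to muslim convert",
]

TARGET = "muslim"
WINDOW = 4  # words counted on either side of the target

collocates = Counter()
for line in headlines:
    words = line.split()
    for i, w in enumerate(words):
        if w == TARGET:
            lo, hi = max(0, i - WINDOW), i + WINDOW + 1
            # Count neighbours within the window, excluding the target itself.
            collocates.update(words[lo:i] + words[i + 1:hi])

print(collocates.most_common(3))  # "terrorism" tops even this tiny sample
```

Even in a three-line corpus, “terrorism” emerges as the strongest collocate of the target word. Scaled up to years of media coverage, the same counting exercise is how researchers document the normalisation the paragraph above describes.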

That is not to downplay these issues or to deny that they can exist within Muslim communities in non-Muslim majority nations. Yet we must also recognise that all of the above issues exist elsewhere in society and are not the preserve of any one community. Surely society can recognise that Islamist terrorism is currently the most prevalent terrorist threat in most European countries without denigrating all Muslims in the process. However, these anti-Muslim platforms pump out such a relentless and persistent torrent of negative stories that, in the words of Baroness Sayeeda Warsi, anti-Muslim prejudice has become “Britain’s bigotry blind spot”.

So how do we define these websites, groups and individuals who stay on the right side of our hate crime laws, but advance the rhetoric necessary for violent extremism, and what can be done about them?

The UK’s counter-terrorism strategy defines them as ‘non-violent extremists’ because they are sympathetic to the aims of violent extremists without engaging in or promoting acts of violence themselves. The term has been contentious when applied to Islamist extremists, but that is mostly because critics of counter-terrorism policies wrongly ascribe religious conservatism to its remit. Few, however, would deny it aptly describes the bile of the radical right.

In the United States, the Dangerous Speech Project is creating a framework for tackling this issue. It defines ‘dangerous speech’ as any form of expression (speech, text or images) that can increase the risk that its audience will condone or participate in violence against members of another group. To reduce its impact while still preserving our right to freedom of speech, the Project suggests two approaches: the first is education about what constitutes dangerous speech and why it can be so harmful (effectively inoculating society by teaching it to recognise and resist such speech); the second is to counter dangerous speech directly, by responding to it in a way that undermines it. Likewise, the US Holocaust Memorial Museum has produced a guide on counteracting dangerous speech.

Here in the UK, much of this work already forms part of the Prevent Strategy, although that is more explicitly focused on terrorist propaganda. However, the Counter-Extremism Strategy, led by its new Commissioner, Sara Khan, will have dangerous speech and non-violent extremism directly in its sights.

Whilst the appointment of a Counter Extremism Commissioner has received some predictably bloviated responses, it is evident that simply removing content from online platforms can only ever be a game of whack-a-mole. To that end, new approaches to tackling this problem are welcome, and we should be mature enough to assess Ms Khan and the Commission on their results, not on personal grudges and hyperbole. Put simply, unless we find a way to curtail the Danse Macabre of non-violent extremists, the band will play on.
