Researchers at the University of Bath have identified signals in social media posts that can predict when someone posting on far-right forums is likely to go on to commit a terrorist act.
Posts relating specifically to logistics, operational planning (including knowledge about weapons and evading law enforcement) and violent action marked out individuals who would go on to perpetrate terror offenses. These signals were evident up to four years before criminal action.
In the first study of its kind, published in Personality and Social Psychology Bulletin, the team compared the posts of convicted far-right terrorists with posts from people holding far-right extremist views who had not gone on to commit violence offline.
The majority of offenders were convicted in the United States (75%), with 20% convicted in the United Kingdom and the remaining 5% in Australia, Canada, and New Zealand.
Discussing far-right ideology and expressing hateful views actually decreased the probability that a user would mobilize to action.
“Online Signals of Extremist Mobilization” is published as senior US and UK police officers warn, in an interview with the BBC, that an increasing number of those turning to terrorism are driven by ‘a fascination for violence, rather than ideological fanaticism’.
“Our research shows that we can identify people on social media who go on to commit extremist action by picking up on posts that are about acquiring know-how and developing capability to commit terrorism,” said Dr. Olivia Brown, Associate Professor in Digital Futures at the University’s School of Management and Deputy Director of the Bath Institute for Digital Security and Behavior.
“This method can help to identify people who are genuinely dangerous and likely to cause physical harm, as opposed to those who are likely to contain their extremism to radical views and hate speech online.”
The researchers spent a year compiling a unique database of over 200,000 social media posts from 2011–2019. These posts came from 26 individuals convicted of terrorism-related offenses (mostly in the US, with some in the UK) and from 48 people who shared extremist content on far-right forums on Iron March, Gab, and Discord but had not been convicted.
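As a rough illustration of the kind of group comparison the article describes (not the authors' actual method), the sketch below tags posts for hypothetical signal categories and compares average signal rates between convicted and non-convicted users; the keyword lists, data format, and user identifiers are all assumptions for demonstration only.

```python
# Minimal sketch of comparing signal rates between groups of posters.
# NOTE: keyword lists, field names, and data are illustrative assumptions,
# not the study's actual coding scheme or dataset.
from collections import defaultdict

# Hypothetical signal categories loosely inspired by the article's description.
SIGNAL_TERMS = {
    "logistics": {"travel", "funds", "supplies"},
    "operational_planning": {"rifle", "ammunition", "surveillance"},
    "violent_action": {"attack", "kill"},
}

def signal_rate(posts, terms):
    """Fraction of a user's posts containing at least one term from a category."""
    if not posts:
        return 0.0
    hits = sum(any(term in post.lower() for term in terms) for post in posts)
    return hits / len(posts)

def compare_groups(posts_by_user, convicted_ids):
    """Average per-user signal rates for convicted vs. non-convicted users."""
    rates = defaultdict(lambda: defaultdict(list))
    for user, posts in posts_by_user.items():
        group = "convicted" if user in convicted_ids else "non_convicted"
        for category, terms in SIGNAL_TERMS.items():
            rates[group][category].append(signal_rate(posts, terms))
    return {
        group: {cat: sum(vals) / len(vals) for cat, vals in cats.items()}
        for group, cats in rates.items()
    }

if __name__ == "__main__":
    # Toy data standing in for the 200,000-post database.
    posts_by_user = {
        "user_a": ["need ammunition and a rifle before the attack"],
        "user_b": ["another rant about ideology and politics"],
    }
    print(compare_groups(posts_by_user, convicted_ids={"user_a"}))
```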
“Unfortunately, the sheer volume of extremist content online means that identifying people most likely to cause harm is like finding a needle in a haystack,” said Dr. Brown.
“We have pinpointed signals of risk to make the haystack smaller and the needle bigger, which can be used to prioritize monitoring resources on a smaller pool of people who we think are more likely to act.
“Of course, ideological content will still be a major concern to security services, but this is an additional technological tool, alongside existing resources, to differentiate between individuals who are likely to engage in terrorist action and those who are not.”
Dr. Brown is seeking funding to apply the methods of this research to the January 6 Capitol riot in the US, to better understand the mechanisms of mobilization.
She is also working with law enforcement to look at social media posts in the context of online forums—examining group interactions and creating a tool to analyze risk within social networks.
More information: Olivia Brown et al., Online Signals of Extremist Mobilization, Personality and Social Psychology Bulletin (2024). DOI: 10.1177/01461672241266866
Provided by University of Bath