TASM 2024 - Panels and Abstracts

Access the full schedule for TASM 2024, including all sessions, speakers, panels and abstracts.

TASM Conference 2024: Panels and Abstracts
18th-19th June, Great Hall, Swansea University Bay Campus

Schedule

Breakout Session 1
  Panel 1A: Websites and the far-right
  Panel 1B: The Swansea Model: Terrorist and Violent Extremist Researcher Safety and Security
  Panel 1C: Bringing Old and New Together: Understanding The Evolution of Violent Extremist Strategic Communication Online
  Panel 1D: Video

Breakout Session 2
  Panel 2A: Disinformation, misinformation and conspiracy
  Panel 2B: A Civil (Society) Discussion: How to Better Integrate Civil Society into Multistakeholder Projects
  Panel 2C: Combatting the adversarial shift: Emerging challenges in tackling terrorist use of the internet
  Panel 2D: Masculinities

Breakout Session 3
  Panel 3A: Incels
  Panel 3B: TikTok Research API Workshop
  Panel 3C: Through the Looking Glass: The Methodological and Ethical Implications of Using Visual Material Within Research
  Panel 3D: Telegram

Breakout Session 4
  Panel 4A: Gender
  Panel 4B: Pathways
  Panel 4C: Shaheed: Accounting for linguistic and cultural context at scale while addressing terrorist and illegal content online
  Panel 4D: State and non-state propaganda

Breakout Session 5
  Panel 5A: Understanding The Base: Analyses of online interactions, recruiting, networks, and motivations
  Panel 5B: Assessing the Complexities of Multistakeholder Work in Responding to Extremism and Terrorism Online
  Panel 5C: Ecosystems
  Panel 5D: Red Teaming: Emerging Threats in Online Extremism

Breakout Session 6
  Panel 6A: The far-right
  Panel 6B: How can researchers best support online safety regulation of terrorist and violent extremist content online?
  Panel 6C: An uneasy relationship: the ethics of online data integrity in violent extremism research
  Panel 6D: Understanding violent extremism, non-violent extremism and non-radicalisation

Breakout Session 7
  Panel 7A: Building Qualitative and Quantitative Assessments of Accelerationist Activity
  Panel 7B: Psychological and psychiatric drivers
  Panel 7C: GIFCT’s Hash Sharing Database - Research Collaboration Opportunities
  Panel 7D: Innovation and emergent issues

Breakout Session 8
  Panel 8A: Regulation
  Panel 8B: Professional practice
  Panel 8C: Preventing and Countering Violent Extremism Online
  Panel 8D: Identity


SCHEDULE

Day One

8:20-8:50: Registration and refreshments
8:50-9:00: Welcome
9:00-10:00: Plenary 1: Brian Fishman keynote
10:00-10:10: Room change
10:10-11:25: Breakout session 1
11:25-11:50: Refreshments
11:50-13:05: Breakout session 2
13:05-14:30: Lunch
14:30-15:45: Breakout session 3
15:45-16:15: Refreshments
16:15-17:30: Breakout session 4
20:00: Drinks reception at The Village hotel

Day Two

8:20-8:45: Refreshments
8:45-10:00: Breakout session 5
10:00-10:10: Room change
10:10-11:25: Breakout session 6
11:25-11:50: Refreshments
11:50-13:05: Breakout session 7
13:05-14:30: Lunch
14:30-15:45: Breakout session 8
15:45-16:15: Refreshments
16:15-17:15: Plenary session 2: Panel discussion with Anjum Rahman, Dia Kayyali and Anne Craanen
17:15-17:30: Closing comments


BREAKOUT SESSION 1

Panel 1A: Websites and the far-right

Chair: Connor Rees (Swansea University)

Evaluating the Affordances and Popularity of Extremist Website Infrastructure Dr Seán Looney (University of Plymouth)

Abstract: Researchers have paid considerable attention to social media platforms, especially the ‘big companies’, and increasingly also to messaging applications, and to how effectively they moderate extremist and terrorist content on their services. Much less attention has been paid to whether and how infrastructure and service providers further down ‘the tech stack’ deal with extremism and terrorism. Prior research has shown that extremist and terrorist websites make use of Content Delivery Networks (CDNs), but that research was limited in scope to relatively few websites. This presentation broadens the scope of the author’s previous work to a wider range of websites and to forms of website infrastructure beyond CDNs, such as domain registrars, cloud services and website designers, in order to establish which providers are most popular with extremist groups and which particular affordances of these providers might explain their popularity.

Digital Frontiers of Hate: Netnography’s Role in Studying Online Extremist Movements Mgr Jonathan Collins (Charles University) Mgr Kristián Földes (Charles University)

Abstract: The growing popularity of the internet among extremists necessitates different methodologies to study their behaviour, interactions, and content. While previous research has employed a range of qualitative and quantitative approaches, a substantial empirical gap exists concerning in-depth, immersive methodologies for examining these digital communities. This paper proposes a potential solution by highlighting the utility of netnography as an effective tool for gaining insights into the virtual cultures and user experiences of extremist online communities. Netnography, as an approach, is geared towards unravelling the cultural practices embedded in and reflected through the traces, rituals, and systems of online communication platforms. The presentation outlines the method’s benefits, including its easy-to-follow methodological guidelines, the bridge it offers between qualitative and quantitative data, its applicability across different content types, platforms, and multimethod designs, the nuanced socio-cultural findings it yields, and more. Two distinct case studies illustrate these qualities: (1) Slovenskí Branci, a violent paramilitary group in Slovakia, on Facebook, and (2) Neo-Nazi extremists on Gab Social. This two-tiered approach provides a valuable preview and starting point for scholars looking to engage in immersive online research on extremist communities.


Interpolations of gender: Similarities and differences in the role of gender grievances in right wing extremist, far right, and mainstream right-wing discourse Ninian Frenguelli (Swansea University)

Abstract: This paper presents the results of a PhD project mapping right wing extremist websites on the surface web. Using hyperlink network analysis, extremists were found to link to non-extremist and mainstream websites. Discourse analysis of these extreme, non-extreme, and mainstream websites was then performed to understand what different groups believe about gender and gendered issues. At the beginning of the project, the intention was to study attitudes towards the roles that men and women should play in movements, and beliefs about gender roles in wider society. By the time of data collection in 2023, however, “gender” had become synonymous with transgender issues and the LGBT+ community in general, and the findings of this study largely reflect this shift. Very little discussion of the rights, roles, and responsibilities of women or children was seen that was not in reference to trans people. Preliminary findings show that for the most extreme actors in the dataset, these gendered passages were subsumed into antisemitic conspiracy theories, whereas, as the actors became less extreme, this antisemitism was less overt. These findings suggest that antisemitism is still the defining element of right wing extremism and that gender performs different functions in discourse depending on a group’s agenda.

Digital Reconstruction: An Interdisciplinary Analysis of Ku Klux Klan Websites Over Time Dr Ashton Kingdon (University of Southampton) Dr Aaron Winter (Lancaster University)

Abstract: In response to the data revolution, academic research and media attention have increasingly focused on the technological adaptation and sophistication displayed by the far right. The greatest attention is paid to Web 2.0, and particularly to how groups and organisations are utilising technological advancements and the growth of virtual networks to increase recruitment and advance radicalisation on a global scale. This presentation will argue that although the Web 2.0 platforms on which the far right operate can be considered “gateways” into the promotion of more extreme ideologies, these platforms are the tip of an iceberg, and what is needed is a longer-term historical view and analysis of the wider and more diverse far-right online ecosystem. Taking this into consideration, this presentation examines the less-well-studied traditional and official white supremacist websites and their role and function as incubators for past, present, and future far-right recruitment, organisation, mobilisation, and violence. The case study is the Ku Klux Klan, the most established and iconic of American far-right organisations, and the evolution of its websites from their emergence in the early 1990s to the present day. We examine the ways in which traditional printed communications and other ephemera have progressed with advances in technology, focusing on the following central elements of Klan political activism and community formation: Klan identity, organisational history, aims and objectives; technology and outreach, including online merchandise and event organisation; and the constructions of whiteness and racism.


Panel 1B: The Swansea Model: Terrorist and Violent Extremist Researcher Safety and Security

Chair: Dr Michael Loadenthal (University of Cincinnati)

Panellists: Dr Marc-André Argentino (Accelerationism Research Consortium) Prof Maura Conway (Dublin City University & Swansea University) Dr Sara Correia-Hopkins (Swansea University)

Abstract: How can scholar-practitioners engage with online extremism without endangering themselves, their families, their respondents, or their institutions? This question has growing relevance as threat actors become increasingly aware of researchers’ efforts and routinely target academics, journalists, and activists with threats, harassment, and violence. How do safety values, principles, and practices relate to engagement with the public, news media, and academia? How and where does safety intersect with ethical reporting and informing the public without amplifying extremist content? Rather than claiming ownership or authority over ‘best practices’ in the realm of operational security, this workshop seeks to facilitate an exchange between scholar-practitioners with the aim of collectively authoring a document on best safety practices to help guide our field. Building on the group effort that authored the Threat Modeling Manifesto (see: https://www.threatmodelingmanifesto.org/), participants will share experiences and knowledge to identify, elevate, and record best practices and guiding principles for safe(r) and ethical engagement in violent online spaces. The hope is that this session can draw on our collective knowledge to record and recommend standards for scholar-practitioners tailored specifically to our field, and that these standards can be revisited on an ongoing basis as the threat landscape and actors’ abilities change.


Panel 1C: Bringing Old and New Together: Understanding The Evolution of Violent Extremist Strategic Communication Online

Chair: Dr Moign Khawaja (Dublin City University)

Panellists: Prof Miron Lakomy (University of Silesia) Dr Ali Fisher (Human Cognition & Università Cattolica del Sacro Cuore) Federico Borgonovo (Università Cattolica del Sacro Cuore) Giulia Porrino (Università Cattolica del Sacro Cuore) Silvano Rizieri Lucini (Università Cattolica del Sacro Cuore)

Abstract: This panel discusses the latest tactics, strategies and tools employed by violent extremist organisations (VEOs) in their online campaigns. It explores the methods used to spread extremist ideologies, recruit new members, and incite violence across a broad spectrum of mainstream and new online platforms. The panel focuses on several critical aspects of this phenomenon among emerging (PMC Wagner and Whitejihad) and existing actors (Salafi-jihadist groups): the latest tactics and strategies employed in disseminating propaganda online, new trends in the broader milieu of digital extremist culture, including the fragmentation of ideology, and VEOs’ use of emerging technologies. In sum, this panel highlights the latest trends in contemporary digital extremism and discusses how to mitigate its impact more efficiently.


Panel 1D: Video

Accelerating Towards the End of the World: Exploring the Narratives in Prepper Video Content and User Response on TikTok Kate Tomkins (University of Southampton)

Abstract: The need to understand accelerationism is becoming increasingly critical for countering the extremist threat in the United Kingdom, as a growing number of extreme-right actors and agencies have channelled its narratives following the emergence of the Coronavirus pandemic. While the connection between social media platforms and radicalisation is well documented, there is a notable lack of literature on how the fringe subculture of doomsday preppers intersects with far-right accelerationist discourse. The present paper attempts to bridge this gap by mapping the narratives of prepper content and user responses on TikTok. Using a mixed-methods design, a critical-realist-informed thematic analysis of prepper video content and a textual network analysis of aggregate user responses suggest that accelerationist-inspired narratives influence content and response on TikTok both explicitly and implicitly. The analysis revealed that content creators emphasise current societal tensions through imagery and language, signifying an immediate and dynamic danger. Conspiracy theories of nuclear threats, The Great Reset, a New World Order, and societal collapse were prominent and influenced by morphogenetic and morphostatic factors, including economic uncertainty and perceptions of persecution. Narratives were further developed through user responses, potentially fostering echo chambers and the dissemination of extremist content, and contributing to their prevalence within the prepper community.
Right-wing HateTok as a portal: cross-platform recruitment and propaganda strategies of German right-wing extremist ecosystems Erik Hacker (SCENOR) Daniela Pisoiu (SCENOR)

Abstract: In the context of the ever-changing environment of social media platforms, our research suggests that German-speaking far-right players are building ecosystems across social media sites not only to ensure a sustainable presence in the face of deplatforming, but also to exploit differences in content moderation policies. For recruitment purposes, right-wing digital ecosystems consciously spread implicit propaganda and hate speech on mainstream platforms to reach a larger pool of people, and attract vulnerable youth to niche platforms via linking. The project “Right-wing Extremist Eco-Systems Driving Hate Speech: Dissemination and Recruitment Strategies” (RECO_DAR) contributes to the comprehension of how right-wing extremist hate speech evolves and spreads online, using an innovative mixed-method approach combining computational techniques and qualitative frame analysis. The paper analyses 40 prominent German-speaking far-right users and their wider ecosystems and clusters on TikTok, and compares their TikTok strategies to their behaviour on niche platforms by following the external links they post in bios, captions, and comments. First insights show a fragmented far-right scene with isolated clusters, a growing emphasis on implicit visual hate speech on TikTok, an uptick in far-right players posing as alternative news outlets, and the dominance of anti-LGBTQ+ sentiments.

Recommendation Systems and the Amplification of Online Harms Dr Joe Whittaker (Swansea University) Ellie Rogers (Swansea University) [Co-authors: Dr Sara Correia-Hopkins (Swansea University) & Dr Nicholas Micallef (Swansea University)]

Abstract: There is considerable policy concern over the role of recommendation systems in the potential amplification of problematic content online. This study assesses the empirical research on this phenomenon. Drawing on a scoping review of 43 studies, it asks four key questions: i) Is illegal content being amplified?; ii) Is “legal but harmful” content being amplified?; iii) What are the experiences of users in so-called “filter bubbles”?; and iv) What methods and data collection are being employed to investigate this phenomenon?


The Role of YouTube’s Recommendation Algorithm in the Radicalisation of Portuguese Users towards the Far-Right Vanessa Montinho (Swansea University)

Abstract: The Portuguese far-right has been expanding its political influence through the use of social media, which has led to increasing concerns regarding the risk of online radicalisation and extreme right-wing terrorism in the Portuguese context. Furthermore, social media recommendation algorithms have been highly criticised due to evidence that they progressively recommend more extremist material to users who interact with it, which might contribute to users’ radicalisation. This paper therefore empirically analyses whether YouTube’s recommendation algorithms contribute to the radicalisation of Portuguese users towards the far-right. It aims to answer the following research questions: RQ1. Do YouTube’s algorithms recommend increasingly extremist videos to Portuguese users?; and RQ2. Does extremist content rank higher in recommendations after users interact with it? To do this, the study recreated the conditions in which Portuguese users might find themselves when using the platform and analysed the personalised recommendations YouTube offers them.


BREAKOUT SESSION 2

Panel 2A: Disinformation, misinformation and conspiracy

Chair: Dr Sara Correia-Hopkins (Swansea University)

Mapping and Mining Extreme Anti-West Social Media in West Africa: Between Legitimate Grievances and Foreign-produced Disinformation Prof Stephane Baele (Université Catholique de Louvain) Dr Lewys Brace (University of Exeter)

Abstract: Over the past couple of years, West Africa has undergone significant political change, including coups in Gabon, Mali, Burkina Faso, Guinea, and Niger, serious social unrest in relatively stable states such as Senegal, and the spread of Islamist terrorism to traditionally peaceful regions such as the north of Benin. While this unravelling has its roots in the systemic shift of the international system, it is also fuelled by information operations on the internet that encourage anti-French – and more generally anti-Western – sentiment and promote a new version of pan-Africanism imbued with pro-Russian sympathy. With no scientific research published yet on this critical issue, this paper aims to better understand these social media activities, mapping their multiple dimensions in a rigorous empirical way in order to gain a solid analysis of both their structure (via network analysis) and their content (via Natural Language Processing). Snowballing from a seed list of influencers and news channels on Facebook, Twitter, and YouTube with documented links to Russia (this list is compiled from a series of investigative journalism reports, and qualitatively checked and consolidated by the authors), the paper seeks to offer a nuanced account of the sources and expressions of radical ideas mixing legitimate postcolonial grievances, aspirational pan-Africanist ideals, and foreign-produced Manichean disinformation.
Seeing The Light: Tracing the Evolution of UK Conspiracy Narratives Darja Wischerath (University of Bath) Emily Godwin (University of Bath) Desislava Bocheva (University of Bath) [Co-authors: Alberto Arletti (University of Padua), Dr Brittany Davidson (University of Bath) & Dr Olivia Brown (University of Bath)]

Abstract: The mainstreaming of conspiracy theories in recent years poses a new threat of harm to both individuals and society. In particular, conspiracy narratives can act as radicalisation multipliers in extremist environments and incite violence through their unique rhetorical structure. Prior research has explored the allure of conspiracy theories, yet there remains a significant gap in understanding the evolution of narratives, as well as group mobilization, within online-offline ecosystems. The Light is a self-published British “truthpaper” that stands as a pivotal node in the UK conspiracy theory movement. Initially championing anti-vaccine and anti-lockdown positions during the COVID-19 pandemic, it continues to amplify anti-mainstream, polarising rhetoric. Online dissemination of this content is widespread, alongside volunteer-led distribution of physical copies across the UK. We showcase a novel dataset comprising 37 issues of The Light and associated Telegram conversations from September 2020 to September 2023. Using computational methods and qualitative insights, we aim to 1) illuminate how narratives in The Light parallel real-world events and 2) explore how interaction between the paper and the Telegram channel shapes conspiracy narratives and extreme rhetoric. We will make our dataset available for further research into the dynamics of UK conspiratorial discourse.


Mobilization strategies, threat narratives and historical parallels in conspiracy theories Dr Janina Pawelz (University of Hamburg)

Abstract: There is increasing recognition of the link between conspiracy theories and violent extremist intentions, and of their function in bridging extremist ideas, narratives, and scenes. Conspiracy theories, commonly driven by (perceived) grievances and fear, are articulated through narratives of threat that resonate with perceptions of injustice. More recently, conspiracy theories have increasingly surfaced in post-ideological, post-organizational extremist and non-extremist scenes, and they are often used to delegitimize governments, politicians, ‘the elite’, and democratic institutions. Conspiracy theories are not a new phenomenon: they have endured for centuries, transcending time and permeating across generations, and have historically served as catalysts for the persecution of marginalized groups and minorities. This paper presents empirical findings from a frame analysis of historical defamation campaigns against Freemasons, alleged witches, and Jews (the blood libel). The frame analysis reveals five key ingredients of successful conspiracy mobilizations, which can also be found in the (online) mobilization strategies of contemporary conspiracy actors: two kinds of resonating threat narratives, times of crisis, new means of communication, malicious single actors, and renegades’ knowledge.

Online extremism and Islamophobic language and sentiment when discussing the COVID-19 pandemic and misinformation on Twitter Hollie Sutch (Birmingham City University) [Co-authors: Prof Imran Awan (Birmingham City University) & Dr Pelham Carter (Birmingham City University)]

Abstract: This paper examines the profiles of those engaged in Islamophobic language and extremist behaviour on Twitter during the COVID-19 pandemic. The two-part analysis considers the effects of factors such as anonymity, membership length and posting frequency on language use, and the differences in sentiment expressed between pro-social and anti-social tweets. The analysis includes comparisons between low, moderate and high levels of anonymity, posting frequency and membership length, allowing differences in keyword use to be explored. Our findings suggest that increased anonymity is not associated with an increase in Islamophobic language and misinformation. The sentiment analysis indicated that emotions such as anger, disgust, fear, sadness and trust were significantly more associated with pro-social Twitter users, whereas sentiments such as anticipation, joy and surprise were significantly more associated with anti-social Twitter users. In some cases, evidence of joy at the suffering of others as a result of the pandemic was expressed. This presentation will use these insights to explore contemporary trigger events that have since witnessed large-scale extremism, misinformation and disinformation online.


Panel 2B: A Civil (Society) Discussion: How to Better Integrate Civil Society into Multistakeholder Projects

Chair: Dr Katy Vaughan (Swansea University) & Dr Ashley A. Mattheis (Dublin City University)

Panellists: Anjum Rahman (Inclusive Aotearoa Collective Tāhono) Dia Kayyali (Christchurch Call Advisory Network) Tonei Glavinic (Dangerous Speech Project) Niklas Brinkmöller (Violence Prevention Network) Dr Farzaneh Badiei (Digital Medusa)

Abstract: Multistakeholder research in preventing and countering violent extremism (on- and offline) primarily incorporates academia, industry, policy makers, think tanks, and law enforcement. When civil society partners are included, they are often only included as listeners or audiences. Civil Society, however, is both the foremost stakeholder and beneficiary of this work. Moreover, civil society organizations do much of the work in “on-the-ground” interventions and response to the effects of extremist violence. This results in an impact and knowledge transfer gap between existing multistakeholder work in this area and civil society that reduces our capability to respond to the problem and that leaves out a crucial perspective. This workshop will involve a facilitated, interactive discussion with a cohort of seven members of civil society organizations focused on combatting extremism and terrorism on- and offline. The session is aimed at developing new connections for and with civil society partners and organizations through engagement with other TASM attendees. The goal is to discuss ways to potentially improve engagement with, and integration of, civil society actors and organizations in online (and offline) CT work.

This session is part of an ESRC-funded Impact Accelerator Award, entitled: Co-creating Impact via Integrating Civil Society: Building Inclusive Multistakeholder Networks (CIvICS).


Panel 2C: Combatting the adversarial shift: Emerging challenges in tackling terrorist use of the internet

Chair: Archie Macfarlane (Tech Against Terrorism)

Panellists: Grace Rollison (Tech Against Terrorism) Rory Donovan (Tech Against Terrorism)

Abstract: Tech Against Terrorism is a public-private partnership that disrupts terrorist and violent extremist use of the internet. Our panel will closely align with the theme of the conference, covering both challenges and responses to terrorist and violent extremist (TVE) use of online platforms. We will set out the context by outlining the emerging threat picture in terms of exploitation of existing and new technologies by TVE actors based on our open-source monitoring. We will then present on our research mapping far-right terrorist propaganda dissemination online based on Terrorist Content Analytics Platform (TCAP) data. Finally, we will highlight the challenge of responding to incidents where attacker-produced content is circulating online, setting out our new TCAP Incident Response policy and how it reinforces existing multistakeholder crisis mechanisms.


Panel 2D: Masculinities

Chair: Ninian Frenguelli (Swansea University)

The “Male State” (MS) – the “manosphere” with a Russian accent Dr Anna Kruglova (University of Salford)

Abstract: While the phenomenon of incel violence has now attracted considerable scholarly attention, it remains largely under-researched. One gap in the field is that the majority of studies have been conducted within the English-speaking online space. However, misogynist groups and movements exist in other countries as well and pose an equally serious threat not just to women but also to LGBTQ+ and non-white people. One such group is the “Male State”, an extremist organisation based in Russia. Its members not only openly call for violence against women but actively engage in it, from death threats to direct physical attacks. The group has a strong presence on Telegram, with more than 60,000 members subscribed to its channel. Since the group is relatively new, no research has yet examined the “Male State”’s online activities and its use of Telegram. The presentation will locate this movement on the map of the global manosphere by analysing the group’s visual and textual propaganda on Telegram as well as the dynamics of its online interactions.

Identifying Ideological Cross-Pollination Within and Between Alt-Right and Anti-Feminist Extremist Platforms Online Simone Long (University of Exeter)

Abstract: Recent years have seen considerable growth in the number of radicalisation cases associated with an ambiguous combination of various alt-right beliefs and anti-feminist convictions. Such ideologies, often referred to as “mixed, unstable, and unclear” (MUU), have emerged as a result of technological affordances characteristic of Web 2.0 and have been at the centre of many significant extremist attacks. Despite this growing trend, there remains a lack of extensive empirical work exploring the potential cross-pollination of ideas between anti-feminist and far-right online spaces.
As such, this paper will present a data-driven understanding of how such MUU ideologies emerge and evolve through dynamic interactions between these different platforms. Specifically, it will computationally investigate user posts made to a small sample of alt-right and anti-feminist platforms. Analysis will then focus on comparing the ideologies of these communities, as well as how users (re-)constitute notions of identity, drawing on theories of radicalisation and social identity. By utilising Natural Language Processing (NLP) techniques to analyse entire corpora of text, this paper will demonstrate the utility of conceptualising these different extremist communities as groups operating within distinct but nevertheless overlapping “ecosystems”, especially by highlighting how patriarchal narratives intersect with the popular ethno-religious supremacist discourses perpetuated by far-right groups.


Mainstreaming Male Supremacy: A Comparison of Men’s Rights Activists and Terrorists William Arnold (American University)

Abstract: Over the last ten years, there have been a number of terrorist attacks by alt-right and men’s rights actors: young men with misogynistic motivations. At the same time, the number of people holding these extreme beliefs has substantially increased, in part due to social media and the popularity of men’s rights influencers. This study explores the extent to which terrorists with extreme men’s rights/alt-right ideologies use the same language as the broader online men’s rights movement. The paper analyses the manifestos of men’s rights and MUU terrorists and compares them with transcripts of the popular YouTuber Andrew Tate, a mainstream articulation of the men’s rights movement. By performing a comparative thematic content analysis between the terrorist and ‘mainstream’ movements, this study draws out key disparities in the language that the two groups use, including the specificity of calls for violence, the use of militaristic language, and ideological statements. These differences can help to illuminate the extent to which the language of extremism and terrorism is present in mainstream social media content, and suggest that the language of potential terrorists may be differentiated and spotted from amongst more mainstream voices in the men’s rights space.

A visual and netnographic analysis of the interaction between far-right masculinist influencers and their audiences Joshua Farrell-Molloy (Malmö University)

Abstract: The ‘Right-wing Bodybuilder’ (RWBB) subculture is a digital far-right community made up of manosphere-adjacent fitness gurus and esoteric nationalists who focus on promoting alternative men’s health and nutrition advice. In recent years, RWBB has developed an established presence on Twitter, with key influencers ‘Bronze Age Pervert’ (BAP) and ‘Raw Egg Nationalist’ emerging as leading thinkers and masculinity influencers among the so-called ‘Dissident Right’.
The sparse research on RWBB is restricted to its ideological texts rather than the subculture’s participants. Meanwhile, research on online extremism in general typically hyper-fixates upon extremist content and narratives. This article goes beyond these analyses to explore how participants interact with content in extremist subcultures, establishing group boundaries through participation in community rituals, alongside the banal, everyday interrelationships between the physical and online worlds that define experience in virtual communities. The presentation will address these gaps with a qualitative analysis using a combination of netnography and visual methodologies. It will examine how group dynamics, particularly in-group cohesion and in-group hierarchies, are established and maintained through the performance of rituals and practices, such as the sharing of daily life routines, which aim to construct the authenticity of participants.


Breakout Session 3

Panel 3A: Incels

Chair: Jack Springett-Gilling (Swansea University)

Into the Mainstream: Understanding the Communication Strategies of Incel Content on TikTok and YouTube Anda Solea (University of Portsmouth) [Co-author: Prof Lisa Sugiura (University of Portsmouth)]

Abstract: TikTok and YouTube, two of the leading mediums in the social media landscape, have seen a surge in hateful anti-feminist content. Incels (involuntary celibates), an online subcultural group, have become an increasing security concern following their association with several mass-casualty attacks and cyber violence predominantly directed at women. Once mostly contained on niche men’s forums, blackpilled incel communities are gaining prominence on mainstream platforms. The present study examines the tactics employed by accounts disseminating the blackpill ideology on TikTok and YouTube. Employing Multimodal Discourse Analysis, we explore how different forms of communication, such as text, images, and audio, are used to convey the blackpill ideology and appeal to wider audiences. Additionally, we investigate video metadata, such as hashtags, likes, views, and comments, to understand the role of popular video-sharing platforms and their unique features in amplifying the visibility of incel content and driving engagement. The study offers a comparative analysis between TikTok and YouTube, uncovering the differences and similarities in the content and style of incel videos and their reception. The implications of the increased visibility of incel tropes, theories and misinformation are considered, and recommendations for content moderation strategies aimed at tackling the spread of misogynistic narratives online are discussed.
The Chad in the Mirror: A mixed-methods analysis of self-perceptions and grievances in video and textual incel content JJ West (American University) [Co-author: Kaitlyn DaVisio (American University)]

Abstract: Involuntary celibates (incels) exist in online milieus, circulating and reinforcing their male supremacist ideology amongst like-minded others. In 2014, the first self-identified incel attacker used these beliefs to justify widespread violence, and a number of subsequent attackers have followed suit. The present study seeks to evaluate differences in internal and external perceptions between online incel personas and real-world perpetrators of mass violence associated with the community. Using a combination of language and sentiment analysis and qualitative codebooking, we examine variation in the presentation of grievances and self-perceptions from self-identifying incels. We further compare content from incel forum posts on social media with pre-attack manifestos and video blogs to identify common narrative themes. Given their continued prevalence on social media, understanding the myriad ways male supremacist ideologies manifest themselves is a crucial first step towards identifying potential routes to intervention.


“You didn’t read yourself into this, you felt it”: Assessing the use of counter- and alternative narratives on support-focused incel subreddits Allysa Czerwinsky (University of Manchester)

Abstract: Alongside growing discussions around misogynist incels, attention is increasingly being paid to issues of deradicalisation and factors behind exiting the community. Despite this, understanding pathways out of inceldom has proven difficult, as the community’s main forums offer limited opportunities for progressive discussion around exiting and ban participation from both former incels and non-incels. However, content from support-focused subreddits like r/IncelExit and the now-defunct r/IncelsWithoutHate can provide important insight into how conversations between self-identified incels and other users help counter the narratives embedded in the community’s guiding ideology, paving a way out for motivated community members. Through a mediated narrative analysis of text posts collected from r/IncelExit and r/IncelsWithoutHate, this paper assesses how counter- and alternative narratives impact discussions around exiting for self-identified incels. Specifically, I trace the narratives that promote progressive conversations between incels and others, highlighting the importance of validating and empathising with a person’s subjective circumstances while still engaging in critical discussions about the community’s guiding ideology. I also highlight instances where counter- or alternative narratives were ineffective, stymying conversations and limiting opportunities to receive support. This research adds to the growing body of literature around exit trajectories for individuals involved in misogynist extremism, and offers key insights into effective narrative strategies that can be used to guide support and exit resources for motivated incels and others who subscribe to male supremacist ideologies.


Panel 3B: TikTok Research API Workshop

Chair: Dr Nayanka Perdigao (TikTok)

Panellists: Kathryn Grant (TikTok) Shiva Hullinakatte (TikTok) Dr Nikki Soo (TikTok) Dr Zhanna Terechshenko (TikTok)

Abstract: In this workshop, researchers will learn about TikTok’s Research API and transparency commitments. Presenters will demonstrate the tool in real time and participants will have the opportunity to ask questions and provide feedback.


Panel 3C: Through the Looking Glass: The Methodological and Ethical Implications of Using Visual Material Within Research

Chair: Dr Ashton Kingdon (University of Southampton)

Panellists: Dr Aaron Winter (Lancaster University) Katie Passey (Moonshot) Dr Ashley Mattheis (Dublin City University) Dr Cori E. Dauber (University of North Carolina – Chapel Hill) Hirah Azhar (University of Southampton) Dr Christopher Fuller (University of Southampton)

Meili Criezis (American University) Dr Lewys Brace (Exeter University)

Abstract: As research becomes increasingly interdisciplinary, it is important to recognise that advances in research and development are most likely to happen at the intersections between multiple fields. This workshop is an interactive, interdisciplinary, multi-stakeholder roundtable discussion. It is designed to be deliberately inclusive and conversational, with audience members actively encouraged to ask questions of the panel and to share their own experiences and ideas within a ‘fishbowl’ format that generates honest reflections and healthy debate. The session will shine a light on the realities of undertaking research into extremist and sensitive imagery disseminated on social media. The panel brings together PhD students, early career researchers, academics, and industry professionals from across the world, and will allow the opportunity to discuss sensitive issues honestly and openly, as well as for informal networking. During the workshop, participants will discuss their perspectives on best practices for engaging with visual materials, for utilising mixed methods and interdisciplinary approaches, and for mitigating the day-to-day challenges their sectors face when engaging with this material. Focus will also be placed on the ethical considerations that come with researching sensitive imagery, including access, anonymity, storage, researcher safety, mental health, and cybersecurity. Participants will be asked for their reflections on addressing the trauma that comes from researcher exposure to sensitive images, on how researchers can report on and share content on sensitive topics in an ethical way, and on the ways in which imagery is used by bad actors, especially to intimidate and traumatise marginalised communities.


Panel 3D: Telegram

Chair: Dr Kamil Yilmaz (Swansea University)

Accelerationists’ Exploitation of Digital Platforms Erica Barbarossa (Center on Terrorism, Extremism, and Counterterrorism) Isabela Bernardo (Center on Terrorism, Extremism, and Counterterrorism)

Abstract: On April 26, 2024, the United Kingdom highlighted the severe threat of militant accelerationism when it proscribed the terroristic online network known as the Terrorgram Collective. The militant accelerationist movement—a group of actors employing various tactics and strategies to hasten societal collapse—heavily relies on digital platforms to advance this agenda. For social media platforms, the accelerationist movement poses significant safety and reputational risks regarding recruitment, radicalization, and monetization. Accelerationist actors utilize both explicit and implicit images, hashtags, and terminology to promote their hate-based ideology and activities. Indicators of accelerationism are prevalent across nearly all social media platforms, indicating a high level of awareness among these bad-faith actors regarding platforms’ Terms of Service (ToS) and content moderation practices. Posts on mainstream platforms often use coded or implicit language to obfuscate their intent; however, many of these posts can lead to highly violative content on more permissive sites within 1-3 clicks. This aligns with one of their primary online objectives: to redirect users to Telegram, where a network of channels hosts explicitly hateful and terrorist content with the goal of radicalizing and mobilizing users to violence. In our presentation, we will provide an overview of how accelerationists exploit digital platforms in their objective to dismantle liberal society.

Trust no one: a reflexive thematic analysis of right-wing extremist Telegram content mentioning children Mackenzie Hart (Simon Fraser University) [Co-author: Dr Garth Davies (Simon Fraser University)]

Abstract: Modern society has enshrined children as the ultimate symbol of all that is good and pure. Historic episodes of moral panic, like the Satanic Panic of the late 1980s, help demonstrate the potency of this symbol amongst parents and non-parents alike.
While children have commonly been recognized as particularly vulnerable to radicalization, our findings suggest that parents themselves are a vulnerable group, vis-à-vis their children. This study explores how right-wing extremists (RWE) understand children by analysing content about children produced by and shared through RWE Telegram channels. Findings reveal that RWE Telegram content does not target children, but their parents. Using reflexive thematic analysis (RTA), four themes were generated from the data – veneration of the natural, ritual & systemic harms, action & responsibility, and ‘parents know best’. These themes are united by the single overarching theme of trust. Parental distrust of anyone or anything outside of the family is fostered through and evidenced by conspiracy theories and their designation of a conspiratorial other. Crucially, research suggests that a parent’s belief in conspiracy theories and related distrust in institutions motivate action. Relatedly, the concept of altruistic fear has also been highlighted as an explanation for parents’ behaviour.


Christgram – White Christian Extremist Communities on Telegram Jakob Guhl (Institute for Strategic Dialogue)

Abstract: One of the key platforms on which violent extreme right communities today network and share propaganda or instructional material is the messaging app Telegram. This presentation will focus on the “Christgram” sub-communities within Terrorgram, which have so far mostly been overlooked by the research community and by journalists who cover online extremism. For these communities, white Christian identity plays an important role in constructing an identity, helping them to distinguish clearly between in-group and out-group. The focus of the presentation will be on the visual style, iconography, music, historical narratives and scriptural references shared within Christgram communities. While these elements often do not serve any immediate political purpose, they reflect the wider cultural background and preferences of these communities. Given the apparently renewed interest by some extreme right groups in Christianity, more attention should be paid to the use of religious language, aesthetics and references to scripture to justify violence, interpret history and contemporary society, and provide advice to the movement’s adherents.

Election Manipulation in Brazil’s 2022 General Elections: The Role of WhatsApp and Telegram on the Attacks Against Electoral Integrity and the Threats to Democracy Dr Débora Salles (Federal University of Rio de Janeiro) [Co-author: Dr Lorena Regattieri (Just and Sustainable Technologies Consultant)]

Abstract: This case study embarks on a rigorous examination of Brazil’s 2022 general elections, unveiling an orchestrated playbook of election manipulation. The core focus is the deliberate insinuation of fraud in the electoral system, fuelled by meme wars, political propaganda, participatory crowds and a sophisticated networked manipulation cycle meticulously woven through WhatsApp groups and Telegram groups and channels.
In addition to comprehending the ‘how’ and ‘why’ of these manipulations, we discuss the practical implications of such strategies for electoral campaign regulation, exploring how Brazil’s electoral commission, the Superior Electoral Court (Tribunal Superior Eleitoral - TSE), attempted to contain the spiral of disinformation and how the technical affordances of WhatsApp and Telegram can be instrumentalized in upcoming electoral cycles around the world. Ultimately, this study delves into the broader objective of discrediting democratic values and diminishing public participation.



www.tasmconf.com
