Panel 1D: Video
Accelerating Towards the End of the World: Exploring the Narratives in Prepper Video Content and User Response on TikTok
Kate Tomkins (University of Southampton)
Abstract: The need to understand accelerationism is becoming increasingly critical for countering the extremist threat in the United Kingdom, as a growing number of extreme-right actors and agencies have channelled accelerationist narratives since the emergence of the Coronavirus pandemic. While the connection between social media platforms and radicalisation is well documented, there is a notable lack of literature on how the fringe subculture of doomsday preppers intersects with far-right accelerationist discourse. The present paper attempts to bridge this gap by mapping the narratives of prepper content and user responses on TikTok. Using a mixed-methods design, a critical-realist-informed thematic analysis of prepper video content and a textual network analysis of aggregate user responses suggest that accelerationist-inspired narratives influence content and responses on TikTok both explicitly and implicitly. The analysis revealed that content creators emphasise current societal tensions through imagery and language, signifying an immediate and dynamic danger. Conspiracy theories of nuclear threats, the Great Reset, a New World Order, and societal collapse were prominent and were influenced by morphogenetic and morphostatic factors, including economic uncertainty and a perception of persecution. These narratives were further developed in user responses, potentially fostering echo chambers and the dissemination of extremist content, and contributing to their prevalence within the prepper community.

Right-Wing HateTok as a Portal: Cross-Platform Recruitment and Propaganda Strategies of German Right-Wing Extremist Ecosystems
Erik Hacker (SCENOR)
Daniela Pisoiu (SCENOR)
Abstract: In the context of the ever-changing environment of social media platforms, our research suggests that German-speaking far-right players are building ecosystems across social media sites not only to ensure their sustained presence in the face of deplatforming, but also to exploit differences in content moderation policies. For recruitment, right-wing digital ecosystems consciously spread implicit propaganda and hate speech on mainstream platforms to reach a larger pool of people and to attract vulnerable youth to niche platforms via linking. The project “Right-wing Extremist Eco-Systems Driving Hate Speech: Dissemination and Recruitment Strategies” (RECO_DAR) contributes to the understanding of how right-wing extremist hate speech evolves and spreads online, using an innovative mixed-methods approach that combines computational techniques and qualitative frame analysis. The paper analyses 40 prominent German-speaking far-right users, their wider ecosystems, and clusters on TikTok, and compares their TikTok strategies with their behaviour on niche platforms by following the external links they post in bios, captions, and comments. The first insights show a fragmented far-right scene with isolated clusters, a growing emphasis on implicit visual hate speech on TikTok, an uptick in far-right players posing as alternative news outlets, and the dominance of anti-LGBTQ+ sentiments.
Recommendation Systems and the Amplification of Online Harms
Dr Joe Whittaker (Swansea University)
Ellie Rogers (Swansea University)
[Co-authors: Dr Sara Correia-Hopkins (Swansea University) & Dr Nicholas Micallef (Swansea University)]
Abstract: There is considerable policy concern over the role of recommendation systems in the potential amplification of problematic content online. This study assesses the empirical research on this phenomenon. Drawing on a scoping review of 43 studies, it asks four key questions: i) Is illegal content being amplified? ii) Is “legal but harmful” content being amplified? iii) What are the experiences of users in so-called “filter bubbles”? iv) What methods and data collection approaches are being employed to investigate this phenomenon?