TASM 2024 - Panels and Abstracts

Regulating Online Platforms: From Addressing Terrorist Content to Tackling Online Harms Reem Ahmed (University of Hamburg)

Abstract: There have been significant developments in platform regulation in recent years, with increasing efforts to regulate not only terrorist content but also legal-but-harmful content. This shift towards harmful content raises a number of questions regarding the boundaries of legal speech and freedom of expression, which have been explored extensively within the platform governance literature. However, further analysis is needed to understand the (political) processes behind such developments in discourse and practice. Focusing on the UK’s Online Safety Act 2023, this study assesses the extent to which similar logics and discourses of counter-terrorism have been applied to the broader debates on platform regulation and content moderation. Specifically, this paper examines whether the language around terrorist, extremist, and harmful content has converged over time, and how such threats and risks have been imagined and articulated by policymakers. It does so by analysing British parliamentary debates, statements, and documents surrounding the discourse on online extremism and the Online Safety Act. Drawing on previous work that has critically examined the discourses of (online) radicalisation, extremism, and terrorism, this paper reflects on the implications of the application and understanding of online harms for the broader counter-terrorism agenda within and beyond the UK context.

Power and Process: What does Meta’s Oversight Board tell us about state actors’ referral of content for review and removal under Terms of Service? Dr Alastair Reed (Swansea University) [Co-author: Dr Adam Henschke (University of Twente)]

Abstract: Drawing on the case reports published by Meta’s Oversight Board, this paper explores the content moderation processes engaged by Meta and the relationships between Meta and state actors, following state actors’ referral of content for review and removal under terms of service.
We have selected four Oversight Board cases that examine, directly or indirectly, Meta’s content moderation practices and content referral by state actors. Part 1 of the paper outlines what these case reports have told us about this process, and Part 2 provides an ethical analysis, identifying key ethical challenges for social media companies facing situations similar to Meta’s.
