TASM 2024 - Panels and Abstracts

Vicarious Trauma and Radicalization Risks: How to Improve Conditions for Social Media Moderators
Scott H. Vlachos (Council for Emerging National Security Affairs)
Sarah M. Lynch (American University)

Abstract: Content moderators working for vendors contracted by social media companies are often exposed to traumatizing and radicalizing content. To date, the impact of this content has received little academic analysis due to the secretive nature of the industry. Content moderators must sign nondisclosure agreements, and the wellbeing practices both required and implemented by each vendor are opaque. This paper provides a literature review of the current struggles facing the industry as outlined by current and former content moderators who have spoken publicly about their workplace experiences. A common critique of vendors is that they fail to provide adequate care to address mental health concerns. Some vendors make available non-licensed “wellness coaches”, while others retain the services of psychologists who have the power to recommend termination for employees deemed unfit for work. Anecdotes provided by employees demonstrate a concern for colleagues who have displayed signs of increasing violence at the workplace and susceptibility to extremist propaganda. The paper provides recommendations for vendors to reduce trauma and address potential radicalization by implementing new moderation tools, giving workers greater flexibility in selecting the types of content they are responsible for moderating, and introducing more flexible quotas that allow moderators to surpass a target number.

Moderating expectations: reflections on the impact of partnerships for reducing digital harms in the ‘global south’
Janeen Fernando (United Nations)

Abstract: The ‘global south’, particularly in Asia, comprises the largest growth markets for social media platforms, with an estimated 60 million further new users to be added in 2024.
Yet content moderation remains a largely centralized and opaque process, conducted at the corporate headquarters of companies that are often not answerable to their less influential markets and lack understanding of the varied operating contexts in their approach to trust and safety concerns. This gap, and subsequent allegations of serious harms, has seen unorthodox partnerships emerge between local actors in developing countries and social media platforms, at times with positive results in more contextualized and responsive content moderation. However, these improvements remain fragile and subject to the ‘benevolence’ of social media platforms, rather than representing any obligation for continuity. Costs are often externalized to others seeking to manage the harms caused by poorly resourced content moderation. This uneasy and unequal partnership will continue to be important to local actors in the global south, even as social media companies appear to be limiting support and governments push to close regulatory gaps through legislation that will struggle to be fit for purpose.