Core 11: The Change Makers' Manual

Digital Innovation & Entrepreneurship

ADOPTING AI TOOLS

How to avoid buyer’s remorse over AI tools by Hila Lifshitz-Assaf

These insights will allow the AI to gauge nuance and context. One friend chatting to another about their frustrations is different from an isolated teenager seeking information about suicide. Intimate photos sent between consenting adults are a world away from those sent by a young girl who has been groomed by an older man. The key is how you distinguish one from the other.

Technology platforms – social networks, search engines and other online services – could use our technology as an unobtrusive layer of understanding tailored to their own recommendation engines. This means technology that can understand the context of comments, the social bonds between specific individuals, their ages, and their relationships. It could flag, for example, the age gap between a man and a girl, the short time they have been online ‘friends’, and the differences between their social groups.

“Content without context doesn’t mean much. Tech giants know this”

To date, lawmakers have largely allowed social media platforms to mark their own homework. This has prompted outrage, both from those who care for vulnerable youngsters and from defenders of free speech. As ever, the path between moderation and censorship is a tightrope. Meanwhile, in the UK, the Government’s flagship legislation for internet regulation – the Online Safety Bill, nearly four years in the making – is treading a difficult line between free speech and protection as it enters what many hope is the final stage.

Even if tech businesses know that an extra layer of Responsible AI could better govern reams of content, they currently have little incentive to impose it. These are commercial platforms with commercial considerations. Fewer users means less cash. In recent years, whistleblowers have spoken out against Meta’s algorithms and moderation methods and their harmful impact on individuals, accusing the company of putting profit over people.

But legislators need a better understanding of the technology they are seeking to govern. If they understood what was possible – the power to create ‘intelligent’ technology that reads between the lines and sifts benign communication from the sinister – they could use the laws they are seeking to pass to demand that internet giants implement these safeguards.

For every high-profile case such as Molly’s, there will be hundreds – probably even thousands – of children who are severely traumatised by harmful content they encounter online but do not make the headlines. Every day, millions of parents fear the bleak impact of social media upon their kids. It is our responsibility to protect them, and we are running out of time to do it.
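The contextual signals described above – the age gap between two accounts, how recently they connected, and how little their social circles overlap – could be combined into a simple pairwise risk check. The sketch below is purely illustrative: the thresholds, field names, and flags are assumptions for this example, not details of any platform's system or of the technology described in the article.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """Minimal stand-in for a social media account."""
    age: int
    friends: set[str]  # usernames in this account's social circle

def contact_risk_signals(a: Profile, b: Profile, days_connected: int) -> list[str]:
    """Return human-readable flags for one connection between two accounts.

    All thresholds (10-year gap, 14 days, zero mutual friends) are
    illustrative assumptions, not values taken from the article.
    """
    flags = []
    # Large age gap where one party is a minor
    if abs(a.age - b.age) >= 10 and min(a.age, b.age) < 18:
        flags.append("large age gap involving a minor")
    # The two accounts only recently became online 'friends'
    if days_connected < 14:
        flags.append("recently connected")
    # No overlap between their social groups
    if not (a.friends & b.friends):
        flags.append("no mutual friends")
    return flags
```

A connection between a 14-year-old and a 42-year-old who became friends three days ago with no mutual contacts would raise all three flags, while two adults of similar age with a long-standing, overlapping social circle would raise none. A real system would weigh many more signals, but the point stands: the risk lives in the context of the relationship, not in any single message.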

Learn more about WBS research on Digital Innovation and Entrepreneurship.

Sustainable Development Goals

Warwick Business School | wbs.ac.uk


