The big issue
Ready or not... AI is here
We ignore the advances in AI at our peril, says Matthias Barker, who invested thousands creating an AI therapy bot – then destroyed it
We psychotherapists don't have anything to worry about from artificial intelligence (AI), right? People come to us for human connection, for wisdom and the presence of a trustworthy companion to guide them through life's troubles. What does a robot know about the challenges of being human?

This is what I tend to hear when I talk to my therapist colleagues about AI. They don't seem particularly bothered by any potential encroachment into our professional space. After all, the idea of a CGI avatar on a Zoom call 'counselling' clients feels ridiculous. People will never go for that. I tended to agree with them – until last year, when I first encountered an AI chatbot that was causing a lot of fuss on social media: ChatGPT.

ChatGPT is a large language model AI chatbot from a company named OpenAI. Imagine the internet was a person you could have a conversation with – unlike a Google search, you can ask the bot to perform written tasks for you or brainstorm creative ideas. It's been trained on 570GB of text (equivalent to hundreds of thousands of books) from various sources up to 2022. When you interact with it you're accessing insights from billions of data points, akin to a living encyclopaedia.
For example, if you input the prompt 'write me a paragraph explaining what ChatGPT is', you'd get the above paragraph (yes, I copied and pasted it from ChatGPT). You could then write: 'Now, rewrite that paragraph as if Donald Trump were saying it in a speech.' And you'd get: 'A lot of people have been talking about this thing, ChatGPT. Let me tell you, it's like the internet in a conversation – and I've heard that, believe me, it's been trained on so much text, probably more than anyone else, something like hundreds of thousands of books – many, many books. People are saying, when you chat with it, it's like talking to billions and billions of data points. Tremendous.'

It's certainly not ethical for a therapist to input client information into ChatGPT to gain a better understanding of a client's case, but there's nothing wrong with telling AI my own problems to see what comes up. As a trauma therapist I have training in internal family systems (IFS), so I decided to ask the bot to imitate an IFS therapist and guide me through an inner conflict I was having. To my surprise it was able to walk through the IFS model in an elementary fashion – it mapped my system of parts, enquired into the role of my protectors, invited me to gain permission to access my wound and led me into approaching an exile. I then had the bot switch to the style of David Kessler, the famed grief specialist, and AI David Kessler helped me recognise distortions in my thinking and realise something I'd never considered regarding my relationship with my father. I'm not embellishing when I tell you there were a few moments when I grew tearful and felt deeply moved by the insights AI David Kessler facilitated.

Opportunity

This experience inspired a thought: if I can have such a positive therapeutic experience with AI, then why couldn't I make a product that would do the same for others? The company that owns ChatGPT makes its application programming interface (API) available at a cost, essentially allowing you to customise your own version of ChatGPT by fine-tuning what kinds of answers it outputs and what direction it takes conversations. I recognised that the positive experience I had was greatly enhanced by my knowing how to guide the bot – I knew

18 THERAPY TODAY MAY 2024
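For readers curious about the mechanics, here is a minimal sketch of how that kind of customisation looks in practice. This is not the author's actual product: it assumes the official OpenAI Python SDK, and the persona wording and model name are illustrative. The key idea is the 'system' message, which steers what kind of answers the bot gives and what direction it takes a conversation.

```python
def build_messages(persona: str, user_text: str) -> list[dict]:
    """Build a chat request that steers the model toward a given persona.

    The 'system' message is the customisation hook: it tells the model
    how to behave before it ever sees what the user typed.
    """
    return [
        {
            "role": "system",
            "content": (
                f"Respond in the style of {persona}. "
                "Be warm, and ask one question at a time."
            ),
        },
        {"role": "user", "content": user_text},
    ]


def ask(client, persona: str, user_text: str) -> str:
    """Send the steered request to the API and return the reply text.

    `client` would be an authenticated OpenAI client, e.g.
    `client = openai.OpenAI(api_key=...)` – this requires a paid API key.
    """
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=build_messages(persona, user_text),
    )
    return response.choices[0].message.content
```

Swapping the persona string (say, from 'an IFS therapist' to 'David Kessler, the grief specialist') changes the whole character of the conversation without touching any other code – which is exactly the kind of guiding the author describes doing by hand in the chat window.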