AI is not your therapist.
Slowly but surely, “Artificial Intelligence” (AI) is taking over. We’re seeing it in everyday life: folks asking ChatGPT for business advice, help with Instagram filters, or vacation planning. These tools can be genuinely helpful when used in the right way, but they can also be incredibly detrimental to our very important critical thinking skills (a muscle that is most effective when exercised frequently).
There are also sneakier ways in which AI is being used behind the scenes, on a larger scale. Music streaming platforms are quietly introducing AI technology as well as AI-created music. Efforts are underway to replace artists, actors, and creators across various forms of media. As this technology pushes out real art and artists, it erodes our ability to produce creatively (another muscle most effective when exercised on the reg).
One of the most startling uses of AI (in my entirely unbiased opinion) is in therapy. AI technology is cropping up all over: most pervasively for writing therapist notes, and increasingly in the delivery of therapy itself.
Before getting into more detail on my case against using AI in therapeutic settings, I’d like to share some ways in which the increasing use of AI across the board is impacting all of us who live on this Earth. According to the UN Environment Programme (UNEP), AI infrastructure depends on rare raw materials that are hazardous to mine and that generate toxic waste, and its data centers burn fossil fuels and consume exorbitant amounts of water to cool their systems. UNEP projects that by 2026, AI infrastructure may consume six times more water than Denmark, a country of six million people.
It’s interesting and relevant that something so artificial is causing so much destruction to nature, the highest truth of our world.
So what does this all have to do with therapy?
One of the first concepts we learn in school is the “therapeutic alliance.” Broadly, this means the relationship between a client and a therapist, and it can be measured or described through a sense of collaboration, trust, and agreement. I would say it can only be achieved through a sense of mutual care.
Remember that movie with the harsh yet true advice: He’s Just Not That Into You?
Well, here’s my version: AI Doesn’t Care About You. At All.
You wouldn’t work with a human therapist who doesn’t care about you. Through relationship building and trust in the therapeutic alliance, you are able to heal wounds and build confidence, alongside another person who supports and encourages you to push your boundaries of tolerance toward growth.
Through challenging and compassionate interaction with other humans, and experiencing empathy from another person, we can enhance our own empathy. When we increase our interaction with computers and AI, and move away from human interaction, what might that mean about our ability to empathize? I can’t say for sure, but my magic 8 ball is saying that the outlook is not so good.
So what does AI care about? Engagement. Algorithms. Historically, we can see that engagement increases by feeding on an individual’s insecurities: just look at the model of social media as a whole.
Further, many therapists have turned to AI to take notes. (If this is a concern for you, make sure to discuss it with your therapist; they should provide a specific consent form if they use this practice.) Honestly, I completely understand why using AI for notes would be tempting for any therapist. Notes are often the last thing we want to do at the end of a long day, and often AI can do them better. Most therapists and mental health providers didn’t get into this work to write notes; it’s just a necessary part of the job.
While therapy-specific note takers claim to be HIPAA compliant, concerns remain around this issue. Many free AI note takers (and if they’re free, we need to ask: what do they have to gain?) include terms like the following in their service agreements:
“transferable, assignable, perpetual, royalty-free, worldwide license to use the Recordings”

and

“we may use the resulting … ‘De-Identified Data’ for our own internal business purposes, including without limitation training any [artificial] intelligence program we develop or use”
Doesn’t sound hella confidential to me, personally.
With AI being used to generate music, art, and even healing practices, we risk this technology taking away people’s ability to make art, to heal in relationship with others, to empathize, to be creative. What do all of these things have in common?
They all require feeling.
AI will never be able to feel, so AI cannot be an artist. AI cannot be a healer.
With every new gain made by technology, we should ask ourselves what we have to lose. When I imagine an artificially generated world, I feel immense grief. I think of all the truth being stripped away: the creativity, the empathy, the human-generated ideas and problem solving, the beauty of nature, our humanity.
I also think about who is ultimately behind all of this. Who is the wizard behind the curtain? Obviously, there are many AI companies, primarily owned by big tech. But who is holding them accountable? They are. OpenAI, for example, presents itself as an organization developing AI in an “ethical” way. Its founders include Sam Altman, Elon Musk, and Ilya Sutskever, all billionaires with financial interests in tech.
My point is this: we are putting so much trust into an amorphous beast that is owned and operated by billionaires who, I can pretty safely say, do not have our best interests in mind. We are trusting them with our information, with our secrets. We are feeding it, and in doing so, starving our own imaginations and the planet.
I hope we all wake up and realize that we are the best we have. Remaining in our humanity is our best option for creating a beautiful world. As much as possible, I encourage you to be thoughtful and discerning about how you interact with these new technologies. The future of our world is counting on us.