'AI-Thou': Can There be Relational Depth with an AI Therapist?

At the recent conference of the UKCP (United Kingdom Council for Psychotherapy), it became increasingly evident that AI-delivered therapy is not a matter of ‘if’ but ‘when’. We will have ChatGPT-like technologies, ‘humanised’, providing therapy to people. But will there always be something missing? Will AI ever be as good as real human therapists? The question is not so much about now—with all the bloopers of ChatGPT, Siri, and other AI platforms, as well as their horrendous environmental impact—but about the near and further future, when AI will be refined and developed and able to do even more amazing things than it can now. Nevertheless, I guess, like a lot of therapists, I am still hoping that there is something unique and special that only human therapists can provide.

And if there is, it is likely to lie in the capacity for providing a deep, interpersonal connection. AI, no doubt, will be able to remember what clients say better than we can, to detect patterns and emotions in their language, and to provide potential diagnoses or formulations in ways that far exceed our capabilities. In this way, I am sure that AI will become a tool to support our therapeutic work (though issues like confidentiality and data protection will need ironing out first). But, perhaps, the one thing that AI will never be able to do is to relate in the human-to-human way we have defined as relational depth: ‘A state of profound contact and engagement between two people, in which each person is fully real with the Other, and able to understand and value the Other’s experiences at a high level.’

Relational depth can be understood as a state of relating in which each person experiences the other as a ‘Thou’ rather than an ‘It’. This links back to Martin Buber’s concept of the I-Thou attitude. The I-Thou stance has a number of facets:

  (a) We stand alongside the other rather than survey and study them as a thing;
  (b) We encounter the other as a dynamic stream of being rather than as an object;
  (c) We relate to the other as a whole rather than as a collection of parts;
  (d) We recognise the other’s freedom and indeterminism;
  (e) We encounter the other as a particular, unique being rather than as a representation of particular classes: an ‘Other’;
  (f) We deeply accept this Other;
  (g) We relate as a whole, ourselves, rather than from our parts; and
  (h) We let go of our established ways of doing things and allow ourselves to be transformed in the relationship.

Could a client relate to an AI therapist as a Thou? Perhaps. They could experience them as a unique, accepting, empathic stream of being. The client could relate as a whole, revealing their innermost thoughts and feelings, and allowing themselves to dive into the ‘relationship’ and be transformed. But the client would also know, at least at some level, that the AI therapist was not free: that it was determined rather than self-directed—programmed to be what it is. There would be no volitional agency meeting them, such that clients might feel drawn into relating to the AI therapist as a ‘Thou’ (the ELIZA effect), but I think they would also know that, at some level, they were experiencing an It.

And there is also the question of whether the AI therapist could experience the client as a ‘Thou’. I think, here, the answer would definitely be ‘no’. AI, however it was programmed, would always need to break the person down into parts, and to understand them through heuristics and rules. It could never go beyond its own knowledge base to comprehend the client in their freedom and indeterminism. By definition, AI would always need principles and procedures by which to make sense of—and respond to—the client; it could only go beyond them through new principles and procedures. What’s more, in our research on relational depth, we see that a key feature of it is a genuine care towards the other: that the therapist really, genuinely wants the client to do well. AI, however well programmed, and however caringly it acts, can never experience a genuine care. Things, and people, cannot matter to an AI. So, for the AI therapist, the client would always be an It rather than a Thou.

When people describe the experience of relational depth, they often talk about its interconnected quality: not just ‘I know you’ but ‘I know that you know me’. In those moments of meeting, we feel that the other is meeting us: that we are their Thou just as they are ours. You cannot relate at depth to something or someone who, you know, is not relating at depth to you. Will clients know that, to their AI therapist, they will always be an It? I think, again, that awareness is always going to be there at some level. We can imagine our AI therapist really recognises us in our uniqueness, that they are able to go beyond their procedures and heuristics to experience us in novel, a-procedural ways; but won’t the client, always, at some level know? And, if so, that knowledge of their ultimate it-ness in the eyes of their AI therapist will disrupt any true sense of relational depth.

Apologies for the crudeness of this analogy, but the subject of sex toys does come to mind. Sex toys can give stimulation and pleasure, sometimes more so than other humans, but there is also a distinctive human quality missing. It’s not making love: it’s not having that deep physical-psychological intimacy that, I think, can only be experienced with another human.

Relational depth requires reciprocity. And it requires what has been termed co-intentionality: the sense of another agentic, directional being coming towards us. Daniel Stern describes the way that, in these ‘moments of meeting’, the client and therapist—or, earlier on, the parent/carer and child—come together to share wants, needs, and goals. They align, move towards the same shared possibilities. Could an AI therapist understand and share a client’s goals for therapy? Yes, possibly. And they could certainly align on the tasks of therapy. But would a client ever have a sense of being joined by another agentic being? Like many others, in my book Integrating Counselling and Psychotherapy, I argue that the most distinctive feature of living organisms is their directionality: their movement forward to fulfil wants and needs. When I look at my cat Bonnie, I can experience a relational depth because I am sensing that she wants something from me, is moving towards me and into me as a means of getting somewhere. We can join. But, if the other is passive, dead, responsive-only, then that sense of co-doing-something-together can never quite be there. I’m in the lead—and, fundamentally, alone.

I think this question of whether AI can relate at depth is one that needs asking, and exploring, more and more as the future unfolds; because it is essential to the issue of where humans will be—and what humanity will mean—in an AI-infused (or AI-dominated) world. Almost certainly, AI is not going to stop; it’s going to evolve, and probably at a faster and faster rate. So however scary that might be, it’s an issue that we probably need to stare in the face. I’d love to know what people think: do leave comments. Perhaps the issue comes down to the ‘at some level’: to what extent will we be willing—and able—to deny what we ‘know’ to be the truth? People perceived ELIZA, the ‘Rogerian’ chatbot developed in the 1960s, as having empathy and experiences. But did they really? Did they really feel and know this at an existential level? Perhaps, indeed, the non-humanity of AI will become the fifth existential given (alongside death, isolation, meaninglessness, and freedom): something that we all know, unconsciously, is true, but often deceive ourselves into believing does not exist. ‘I will not die… there is a given meaning to my life… and my AI therapist really cares for me as the unique being that I am.’ Perhaps, for the future, retrieving the actuality of human care and connection will be the greatest challenge of them all.

A final thought: if human connection is irreplaceable, then, as a humanistic and relationally oriented therapist, I can take some optimism that our approach may have enduring importance in the years to come. It’s fairly easy to imagine AI, in the future, delivering sophisticated and personalised CBT techniques, and perhaps also psychodynamic interpretations based on the client’s verbal and nonverbal expressions. And such interventions, I am sure, could be incredibly helpful. But, from a humanistic and relational perspective (including contemporary relational psychodynamics), where the heart of the therapeutic enterprise is in-depth relating, there’s always—at least, as I’ve argued here—going to need to be a human providing it. CBT and classical psychodynamic practices, delivered by AI, may do much to help people pinpoint and address the particular problems they face; but where the deepest client issues are to do with isolation, alienation, and disconnection—with feeling unaccepted by the world and cut off from relationships and community—only a human being may be able to provide the ‘deep soul nourishment’ necessary to heal. AI, ultimately, might be able to replace all forms of ‘psycho-technology’ in the counselling and psychotherapy field, but relational depth may always be beyond its grasp.


Postscript

Interestingly, AI (ChatGPT) itself agrees with me on this one. This is how it answered the question, ‘Will an AI therapist ever be able to have relational depth?’

Relational depth refers to a profound level of connection between two individuals, where they experience mutual understanding, authenticity, and emotional resonance. This concept is deeply rooted in human qualities like empathy, shared vulnerability, and the ability to adapt to subtle interpersonal dynamics.

While an AI therapist can simulate many aspects of human interaction—such as listening, offering thoughtful responses, and even expressing simulated empathy—it may face challenges achieving true relational depth due to several inherent limitations:

Challenges for AI in Achieving Relational Depth

  1. Lack of Subjective Experience:
    Relational depth often arises from shared human experiences, emotions, and the therapist's personal awareness of their own vulnerabilities. AI, being devoid of consciousness or subjective experience, cannot genuinely share in or fully comprehend these dimensions.

  2. Authenticity:
    A key aspect of relational depth is the perception of the therapist as a real, authentic person. Knowing that AI responses are algorithmic may create a barrier for some clients in fully trusting or connecting at a deeper level.

  3. Subtle Emotional Nuances:
    While AI can analyze verbal and non-verbal cues (e.g., tone, word choice, facial expressions), its interpretations are limited by the quality of its programming and datasets. It may miss or misinterpret the unique subtleties of human emotion, which are crucial in building profound connections.

  4. Mutuality:
    Relational depth is often characterized by mutual openness and vulnerability. AI lacks the capacity to share its own vulnerabilities or develop a dynamic, evolving relationship.

Potential Strengths of AI in Therapy

That said, AI does have significant potential to contribute meaningfully to therapeutic processes, particularly for individuals who might feel stigmatized or uncomfortable with human therapists:

  1. Non-judgmental Presence:
    Clients might feel less judged or intimidated by an AI, enabling them to open up more freely about certain issues.

  2. Accessibility and Consistency:
    AI can provide immediate, around-the-clock support and maintain consistency in its responses, which can be comforting for some individuals.

  3. Data-Driven Insights:
    Advanced AI systems can analyze patterns in client speech and behavior to offer tailored interventions, sometimes faster than a human might.

Future Possibilities

AI may approach relational depth if:

  • It becomes adept at mimicking emotional resonance.

  • Users' perceptions of AI evolve to accept its unique "authenticity."

  • Future advancements allow for personalized learning, where AI adapts over time in a way that feels dynamic and deeply attuned.

However, it’s unlikely AI will ever fully replicate the ineffable qualities of a deep human-to-human connection. Instead, AI therapists might excel as complements to human therapists, offering preliminary support, data collection, or ongoing care in between human-led sessions.