I suspect Leigh Manning was not the only person-centred therapist to find our recent research on relational depth and outcomes ‘fantastically boring and completely meaningless’ when I posted the paper on social media earlier this month. ‘Rather than discovering something new, studies like this are in fact tiresome efforts to cling ever more rigidly to a pet theory or approach,’ wrote Leigh. ‘I could go on, but what’s the point?’ (15/9/2025, quoted with permission).
There are things here that I agree with Leigh on, and this blog is not a personal critique of Leigh, but a counter to his position, because I think he’s expressing—very eloquently—a view that, at some level, is shared in the person-centred field and beyond. Just to add, Leigh and I have been having an interesting and, I think, affable backchannel conversation as this blog has been developing, and he specifically asked to be named in it when I sought his consent to cite him: ‘Don’t feel any pressure to hold back.’ You can read Leigh’s response to the blog in the comments below. Previous reflections of mine on the role of research in person-centred therapy can be found here.
So to start with areas of agreement, I think Leigh is absolutely right that ‘allegiance effects’ are pervasive throughout the psychotherapy research field, with researchers (including, I’m sure, myself at times) setting out to prove what they believe to be true, and interpreting findings in ways that tend to support their a priori beliefs. From a person-centred standpoint, too, research evidence (particularly number-based) can reduce complex lived experiences to de-individualised ‘results’, devoid of any real human content. ‘No research comes close to encapsulating what therapy is about,’ Leigh goes on to state:
Even I have no idea what it [therapy] is about what I do, how I do it, when I do it, that helps people as it seems to. These days, thankfully, I do not feel the need to know. Instead, I listen very closely to my clients, I observe what occurs to me in the moment and we go from there. We stumble around, we stall, things get clunky, but ultimately, if things go well, change occurs. Each therapy session is an encounter that is unmeasurable, unquantifiable, and irreplicable. It occurs between two (or more) enormously complex people. The only people who have any hope of knowing how this works are themselves immersed in the encounter. This to me is what makes therapy beautiful – the mystery of it. The evidence-based brigade try, unnecessarily and in vain, to find patterns in clouds.
I think that’s put beautifully, and I love the relational and holistic sentiments, but I also fundamentally disagree with Leigh on the value of research findings to person-centred therapists: indeed, to counsellors and psychotherapists of any orientation.
Admittedly, I might have a bit of a bias here, having spent the last year or so sweating blood and tears over the new edition of my Facts are Friendly book. I’ve also had a career as a researcher as well as a practitioner, so the chances of my agreeing that research is ‘fantastically boring and completely meaningless’ are slim. Indeed, the truth is that it hurt, at a personal level, to read Leigh’s initial response. But I also think that, for the person-centred field, and from a person-centred perspective, it’s essential that we don’t dismiss research evidence.
A Part of the Whole
That’s not to say that I think research gives us all the answers. Far from it. But it’s a false dichotomy to conclude that, because it doesn’t give us all the answers, it therefore doesn’t give us any answers. For me, there are multiple sources of knowledge from which to gain guidance on how to practise most helpfully. Leigh says that the only people who know what’s going on in the encounter are those immersed in it. I don’t agree. I am sure they have a particular, and perhaps privileged, vantage point, but there is also an immense range of other perspectives from which to view—and gain knowledge about—this interaction. Sociologists, for instance, would have one particular view (say, in terms of the particular discourses being used), or neurobiologists (on the biochemical processes involved), or ethicists (in terms of the standards of behaviour towards each other). And then there’s the evidence from psychotherapy researchers, who might note, for instance, that clients’ experiencing of empathy from the therapist is predictive of good outcomes; or that the therapist’s use of exposure techniques might cause discomfort to the client in the short term, but tends to lead to sustained long-term benefits. That’s not the answer, but why is that one source of knowledge ‘completely meaningless’? I think it’s always tempting to adopt all-or-nothing positions, but my reading of Rogers’s understanding of full functioning is that we move towards appreciating the complexity of the world and the multiplicity of positions from which we can know and understand it.
What do Clients Really Experience?
I guess, in response, you could say that Rogers does advocate that the only reality is subjective: as Leigh says, it’s the people themselves who know, in their experiencing, what is ultimately going on. But we fall into an interesting paradox here when we consider therapist and client perspectives, because how does the therapist know what the client is experiencing? If we take the client’s experiencing as the most important indicator of what the therapy is like and how helpful it is, then the therapist is never going to truly know it (as Rogers says); they can only come to an approximate understanding of it. And on what are they going to make that assessment?
Is it in the interaction: that, by being deeply involved in the complex, entangled web of relating, we will intuit what the client is experiencing? Yes, undoubtedly to some extent, but research findings show over and over again that therapists, with the best will in the world, often misperceive clients’ experiences, as well as their wants and needs. Why? Probably because our own experiences, beliefs, and perceptions are always, at least to some extent, going to get in the way—what the analysts call ‘countertransference’. We can’t ever cleanly, purely perceive what clients are experiencing, and therefore relying on our own intuitive senses can never tell us the whole story. Indeed, one of the reasons I’m so passionate about research evidence is that I think a 100% reliance on our perceptions of clients’ experiences can end up becoming a self-reinforcing, therapist-centred mode of practising (the very opposite of ‘client-centred’), in which our own assumptions, biases, and beliefs get imposed upon clients in implicit and, potentially, damaging ways.
Are We Just Confirming What We Believe to Be True?
Say I believe that insulin shock therapy is what my client needs. I really believe that: maybe it helped me at some point in my earlier life, maybe it was part of my training and it’s become the main therapy that I now earn a living from, maybe I’ve got shares in an insulin company so I’ve got a personal interest in proving its effectiveness. And so, when I see a client experiencing psychosis, or perhaps any client who is experiencing psychological distress, I ‘perceive’ that what they need is insulin shock therapy. And if that’s my honest, intuitive, deeply felt view, then is there anything wrong with it? Of course there is: ‘while widely used from the 1930s to the 1950s, insulin shock therapy was abandoned by the 1960s due to its danger and ineffectiveness’ (Google AI Overview). But if I just rely on my own sense of what is going on, or perhaps ask my clients, ‘Has what I’ve done helped?’, then what’s to stop me continuing in this harmful practice? Will I see harm being conducted? Probably not, because, like most people, I’m biased towards self-serving perceptions.
We might say that person-centred therapy is nothing like insulin shock therapy, and of course it isn’t, but how do we know? We perceive PCT as helpful… OK, but so did practitioners of insulin shock therapy, or Scientology, or Scared Straight behavioural management programmes. Our clients say PCT’s helpful… OK, but given the power dynamic in therapy, it can be incredibly difficult for clients in any therapy to say, or indicate, that something hasn’t helped. So I’d guess, for instance, that if a Scared Straight practitioner asked one of their young participants whether they thought the programme had helped, they’d be pretty certain to say ‘yes’. So what, outside of ourselves, can we use to get some kind of perspective beyond our own, potentially self-serving, perceptions? Supervisors… yes, but if they’re from the same school of therapy as ourselves, then they’re likely to want to see what we do as helpful too. Trainers, friends, colleagues… again, all potentially helpful sources, but there’s also the danger that this becomes something of an echo chamber, and what is often missing is the voice of the client. Leigh says, ‘No research comes close to encapsulating what therapy is about,’ but when you read the deeply layered qualitative studies of researchers like Heidi Levitt, for instance on what clients find helpful in therapy, or interpretative phenomenological studies that dig down into clients’ lived experiences of mental health interventions (see here), you find some powerful, moving, and engrossing accounts of what it means to be a client in the therapeutic encounter.

So this, for me, is where research evidence comes in. It is a perspective on therapy that, in many instances, gives us a client’s-eye view of what is going on, both in terms of processes and outcomes. And it helps us see things that, from the therapist’s chair in the immediate encounter, we may not be aware of. The total answer? The final answer? No, of course not. But, as above, neither is the therapist’s intuitive perception of what the client is experiencing. So all of these are just pieces of a larger jigsaw puzzle, and an ‘evidence-informed’ approach, just like a supervisor-informed approach, or a personal experience-informed approach, can help us see the greater whole.
So when Leigh writes that, ‘What [research] cannot do, no matter the methodology or researcher, is anything meaningful to improve the so-called “outcomes” of therapy’ it feels far too black-and-white to me. Indeed, if such a claim were true, we would never have the six conditions that Rogers hypothesised in 1957 to be necessary and sufficient for therapeutic personality change to happen—a hypothesis built, in large part, on careful, studious analysis of psychotherapy session recordings. It seems to me that, in the person-centred field itself, people sometimes forget that the approach was built on research evidence. We’ve had almost seven decades now of more research, more discoveries, more evocative and informative methods. There’s so much there now to learn from. Would a Rogers today, analysing the conditions for necessary and sufficient change, ignore the tens of thousands of research findings now available? Would he say that only research from before 1957 counted? I’d like to think that what he’d express, consistent with his philosophy, is a stance of openness and an appreciation of complexity: no, research findings aren’t the be-all and end-all of guidance on practice, but neither are they to be wholly dismissed.
I want to share a story that really moved me (details disguised to preserve anonymity). A few years ago, I was on the independent steering committee for a randomised controlled trial of a CBT programme for a particular form of eating problem. The researchers had worked incredibly hard, for about four years, to develop and test their intervention, and they were pretty heartbroken when they came to us at the end of the study to say that, on most indicators, the results showed that the programme hadn’t helped. Perhaps, because I felt sorry for them, I started talking at the meeting about the kinds of indicators they might want to highlight to show some kind of effect, or to examine the reasons why limited improvements were found. But, no, they were resolute: they’d found no effects and they weren’t going to try and fudge that; they had to be true to their findings. To me, that was such a fine example of what it means to be person-centred: open, humble, willing to be wrong… qualities that, for me, seem so essential to a person-centred way of being. It’s the spirit Rogers expresses when he talks about his own research and the facts being friendly: ‘in our early investigations I can well remember the anxiety of waiting to see how the findings came out. Suppose our hypotheses were disproved! Suppose we were mistaken in our views! Suppose our opinions were not justified!’ However, he goes on to write:
At such times, as I look back, it seems to me that I regarded the facts as potential enemies, as possible bearers of disaster. I have perhaps been slow in coming to realize that the facts are always friendly. Every bit of evidence one can acquire, in any area, leads one that much closer to what is true. And being closer to the truth can never be a harmful or dangerous or unsatisfying thing. So while I still hate to readjust my thinking, still hate to give up old ways of perceiving and conceptualizing, yet at some deeper level I have, to a considerable degree, come to realize that these painful reorganizations are what is known as learning, and that though painful they always lead to a more satisfying because somewhat more accurate way of seeing life.
Supporting Social Justice
Underneath the concerns of colleagues like Leigh is, I think, a sense that research is allied to a socially conservative agenda: that is, a mechanised, de-humanised, de-individualising way of seeing human ‘subjects’ that can underpin—and reinforce—a wider agenda of dominance and social control. I share this concern. However, what I think this misses is the possibility that empirical research can also play an important role in challenging forces of dominance: in questioning assumed ‘truths’ and in affirming minority experiences. In a just-published article entitled Understanding the Attack on Science as a Discrediting of Minoritized Lived Experiences: The Vital Importance of Qualitative Methods at This Time, Heidi Levitt argues that ‘there are four routes via which qualitative research affirms minoritized experiences and supports people to understand them’:
(a) Qualitative research uncovers underlying mechanisms that were not hypothesized before but can improve interventions and access to them. (b) Qualitative research provides compelling, narrative-based information that counters harmful beliefs and myths about minoritized populations. (c) Qualitative methodology inherently values epistemic privilege and diversity of experience within the phenomena studied. (d) Qualitative research helps people to see their commonalities with others and their shared humanity.
Of course, this is not to suggest that all research serves such a social justice agenda. Only that it can, and that to entirely dismiss research is to reduce the number of tools we have available to forge a more person-centred world.
The Need to Show Effectiveness
And even if we’re not interested in research findings for our own practice, or out of a concern for social justice, my sense is that, for the survival and development of the person-centred approach, we desperately need to know more about research evidence and engage with the research field. Whether we like it or not, today, commissioners, funders, and clients want professionals to justify, with evidence, what they do. And I think they’ve got a right to. When I go and see a physiotherapist, for instance, I want them to use the methods that have shown the best outcomes to date for my problems—if they did a treatment on me that they believed was effective, even if all the evidence ran against it, I’d be annoyed. And beyond that, politically, if it’s all just about internal perceptions, then what’s to stop someone like Trump saying that paracetamol causes autism? He, no doubt, intuitively feels that that is the case, and if we say that research evidence doesn’t count, what do we turn to to assess the validity of such a claim? So why should we, as therapists, expect things to be any different when lay people look at us and our work? You could say that the intricacies of the therapeutic relationship are much more complex and individual than, say, paracetamol and autism, but, then, the latter—like most real-world processes—is pretty complex too. I worry, sometimes, that there’s something of an exceptionalism in the therapy field: that we shouldn’t have to be held to the same standards as we would want others to be held to.
As a person-centred field, we have a massive hole where contemporary research evidence could be, and that absence, I would argue, has done immense damage to our field. In the last few decades, for instance, person-centred therapy has been virtually pushed out of primary care in the UK because the evidence is for CBTs, not person-centred approaches. What’s next? Schools, bereavement services, or even private practice, where, in some European countries, insurance rules make it almost impossible to deliver non-‘evidence-based’ therapies? Today, in places like Germany, the person-centred community is struggling massively and has radically reduced in size. Perhaps I’m doom-mongering, but if person-centred therapies are removed from public services, it is clients who are, ultimately, going to suffer—and particularly those clients without the socio-economic resources to afford private practice. What I also find frustrating is that, based on what we know, if person-centred therapists were to start evaluating their practice more, and use research evidence to refine and develop their approach (as, for instance, the emotion-focused therapy field has done), there’s a very good chance that we’d be able to show how helpful our work is, and also find ways in which, evidentially, it could be even better.
This is probably too polarised, but I think, as a person-centred community, we have a choice about what we want to be in the future. Do we want to be (or be seen as) something like crystal healing: a therapy provided outside of public services, which we claim is indefinable and un-categorisable, and based on the evidence ‘of our own eyes’—that is, beyond any research validation? Or do we want to sit alongside other therapies like CBT and IPT as credible and publicly available practices, which are seen as genuinely being able to help people with particular problems? By completely rejecting research evidence, we are in danger of falling into the former category, and I think that’s to our loss and to the loss of clients who may benefit from our services. And I don’t think we can have it both ways: I don’t think we can say that we want to be publicly commissioned and supported at a level equivalent to evidence-based therapies, but not have to provide the evidence to sit alongside them. Again, it feels exceptionalist to me. Standing back, why should any one therapy have the right to be taken seriously by public bodies if it’s not willing to enter the public fray and demonstrate its worth?
Ethics
I also think there are some serious ethical issues here. Supposing—and this is totally hypothetical—there was research showing that, two years after completing person-centred therapy, clients were more depressed than before they started. But supposing that, in the therapy itself, they seemed to be doing very well. In that case, would it still be OK to use our own intuitive sense of how the sessions had gone as the sole source of guidance on what we should be doing? Worse, to give the most extreme example, supposing a research study showed that clients in person-centred therapy were actually more likely to take their own lives after therapy had ended. Again, would it be OK to ignore such evidence if it ran against our own subjective perceptions of the therapeutic interaction? Of course, there might be many reasons for such findings to emerge, and it wouldn’t necessarily mean that the person-centred therapy was causing the problems, but to entirely dismiss such evidence on the grounds that it was academic research, without even considering it? Ethically, such a position would really concern me and, I would guess, most people outside of the field. And if we, or our trainers, are not engaged with the research evidence field, how do we know that such findings are not already out there? So, for me, I think we have an ethical responsibility to be open to, and knowledgeable about, research evidence—just as we’re expected to be open to our supervisors, our trainers, and the things our clients say. And, perhaps, the professional bodies have a role in ensuring that therapists have that basic understanding.
I think the point here, again, is that ‘intuition’ is not neutral. Leigh, in his eloquent response to this blog, below, says that his intuitive perception is the evidence: it shows what is real. And I do not doubt that it can be very informative. But to assume that it can act as the sole source of truth denies the many factors that may also influence this intuitive sense. Indeed, research itself shows that our intuitive senses of things can sometimes be biased, misleading, and wrong. Daniel Kahneman summarises the plethora of research here in his book, Thinking, Fast and Slow. For evolutionary reasons, we have to make ‘fast’, intuitive judgments all the time (what Kahneman calls ‘System 1’ thinking), but this kind of thinking is prone to numerous errors. For instance, we are more likely to ‘intuitively’ assess an attractive person as being kind or intelligent, and a xenophobic person will have a ‘gut feeling’ that foreigners are bad. ‘System 2’ thinking (careful, logical reasoning) tends to reach more accurate conclusions. Then there are all the self-serving and confirmatory biases, again very well established by research, through which we tend to interpret events in ways that support our self-esteem. Of course, from a counter-position, such research can be discounted on the grounds that it is empirical research, and not the ‘research’ of one’s intuitive felt sense. Or it might be argued that such evidence may be applicable to others but not oneself. But how do any of us know that our intuitive feelings are not biased in some way? To claim that we know they are not biased because we do not experience them as biased is entirely circular, and separates us off, I think, from a larger ethical responsibility to open up to the voice, experience, and perception of others. Research, at its best, represents the cumulative experiences, perspectives, and outcomes of clients, and to be entirely closed to that, whatever it indicates, in favour of our own gut feelings, seems to me an abdication of our ethical responsibilities.
A Non-Judgmental Acceptance?
I think what concerns me most in the stance that Leigh represents, which becomes clearer in his response to this blog, is a fundamental lack of non-judgmental acceptance towards those who engage in research and academic writing. He states, for instance, that:
The reality is that psychology research papers, including in the field of psychotherapy, are almost always bone-dry, clinical, jargon-filled, statistic-obsessed documents that seem purposefully designed to obfuscate, impress, and blind the reader with a sort of meaningless science-salad.
Subsequently, he writes about researchers as being vain, wanting to impress, and motivated by a desire to maintain their careers. So where, here, is a person-centred acceptance and valuing of others, an appreciation of their actualising tendency, and a respect for the lives and activities of those who may try to grow and develop knowledge and understanding in different ways? In my experience, there are not ‘good’ clinicians (wise, intuitive, caring) and ‘bad’ researchers (career-driven, vain, and statistics-obsessed). Rather, there are human beings, all of us, all trying to do the best we can in the worlds we inhabit—surely that is the essence of a person-centred outlook. So, rather than rubbishing a community that approaches the world, and uses language, in a different way, it seems incumbent on us, as clinicians, to engage the research community with respect and empathy. If a researcher said that the work of person-centred clinicians was ‘fantastically boring’ or ‘completely meaningless’, I think person-centred therapists would be rightly incensed: not by the challenge, but by the tone of it and the lack of respect for difference. Even if, as members of the person-centred community, we do not want to engage with research, at the very least I think we should exemplify an attitude of respect and non-judgmental acceptance towards others.
Why Don’t Therapists Engage with Research Evidence?
This is what I honestly think… I think part of the reason that so many therapists are ‘allergic’ to research evidence lies in all the things that Leigh is putting out there: we value the complexity and immediacy of the lived relationship, and have an intuitive sense of its healing capacity. But I also think that many of us come into the field from non-scientific backgrounds, and we don’t get the training to really understand—or be able to engage with—what a lot of the research studies are saying. And if you look at some of the leading journals in the field, even with my PhD in psychology and years of researching and writing, I often haven’t much of a clue what’s going on: ‘The study employed a multilevel structural equation model with Bayesian estimation to simultaneously account for therapist effects, session-level fluctuations in alliance, and client-level moderators’… What’s worse, many of these findings are tucked away in journals behind paywalls that make it impossible for everyday therapists to find out the latest news about what’s going on. So the field of research evidence can feel like a foreign, unapproachable, incomprehensible, and infinitely complex one—one where we feel way out of our depth, overwhelmed—and dismissing it can be a way of reducing the dissonance and discomfort that that can evoke. If research evidence doesn’t matter, then I don’t have to worry about the thousands of findings out there that I may not understand, or that might challenge my way of thinking and practising. Phew!
OK, so now comes the plug: in my book Essential Research Findings in Counselling and Psychotherapy: The Facts are Friendly (Sage), with a second edition out in spring 2026 (and a training event on the 11th April 2026), I do try to summarise the current research findings in a way that is digestible and accessible to all. And there are other texts out there, like Bergin and Garfield’s Handbook of Psychotherapy and Behavior Change, that give reviews of the research evidence that are even more extensive, comprehensive, and in-depth. But I also think, as we’ve argued in a recent commentary, that there’s an onus on counselling and psychotherapy trainers to really know the research findings, and to be able to communicate this information to trainees. More than that, to communicate to trainees that a knowledge of this research evidence is one important element of professional knowledge and development. I might be wrong about this, but my guess is that most therapy trainers in the UK (1) don’t know much about the research findings in the therapy field, and (2) aren’t particularly interested in them. And while that remains the case, I don’t think new practitioners are ever going to take research findings seriously.
Actions
An ongoing critique of research, particularly of its positivist assumptions, is an important element of a person-centred relationship to research. As Leigh does, we need to question the privilege that can be given to particular kinds of research evidence, and to emphasise the importance of human lived experience, gathered both systematically and non-systematically. But, in addition, if we want to engage with the research field, some of the things we can do are as follows:
Person-centred trainers can ensure that they are familiar with the core research evidence on therapy, and ensure that it is integrated into their curricula.
Academics in the person-centred field can consider conducting research that can help to genuinely develop our knowledge and understanding of person-centred practice.
Where research is a mandated element of person-centred courses, trainers and trainees can look towards conducting research that has the potential to make a real difference to clients’ lives (e.g., research on what clients find helpful in therapy).
Person-centred therapists can consider how they could be involved in developing research that could support the evidence base for person-centred therapies, for instance through practice research networks.
Concluding Thoughts
Person-centred philosophy, for me, is about growth, and change, and being open to the world so that we can become more complex, and fuller, and live life in all its diversity. And if we apply that to what it means to be a therapist, I think it means being open to the world in all its manifold elements, and not dismissing or disregarding or judging a priori any one source of knowledge. It can all be in there, as a complex, nuanced, multi-faceted whole. So there’s nothing, for me, that is inherently ‘boring and completely meaningless’ (except, perhaps, curtain fabrics!), and I guess you can see, in this blog, that I find it hurtful and frustrating when something I love is demeaned as such. As Leigh says, what goes on in the therapeutic relationship is, ultimately, unmeasurable, unquantifiable, irreplicable, and complex. But, for me, that doesn’t mean turning away from research findings. Rather, it means drawing on every shred of knowledge we can to help work out, with our clients, what it is that can best support them to flourish and grow.