13th August 2019

Applying common sense to show the benefits of relevant lived experience

Lived experience • Research methods

Dan Robotham

One of the questions that people, particularly researchers and academics, often ask about ‘patient and public involvement’ (PPI) is whether the involvement of people with relevant lived experience makes any difference. If so, where is the evidence?

I’ve always thought that this was a bit of a non-argument. In research, involving people with lived experience is as much about values as anything else.

It is less important whether there is ‘evidence of impact’ in the traditional scientific sense. The benefits of involving people who understand the topic from a variety of perspectives should be filed under ‘common sense’.

Surely involving people with relevant lived experience will result in a more useful, better or at least different (if longer and more expensive) research process? As researchers, we are so trained to look for evidence that we can sometimes underestimate the importance of common sense.

For those who suggest that more evidence is needed, I had the good fortune of hearing Petra Videmšek talk about her research recently.

She is a social worker and a lecturer at the University of Ljubljana, and her PhD research examined social care-run ‘group homes’ in Slovenia.

As I understand it, the group home model in Slovenia is supposed to be a transition between institutions and independent living. She formed a group of ‘Experts by Experience’ to conduct the research, in which people from one group home were involved in interviewing people living in other group homes about their experiences.

More open, honest and personal

What really interested me, though, was that Petra then ran an experiment within this process. Being a social worker and a lecturer on a social work university course, she recruited social work students to collect qualitative interview data in similar settings.

Petra then compared the data collected by the students with the data collected by the ‘Experts by Experience’. She found that the interviews done by the latter were deeper: the interviewers were better at opening people up (without censorship), showed a better understanding of context, and covered a broader range of topics, including topics the students had not thought of.

In her words, “the answers were more direct, personal, honest and open compared to those obtained by student researchers”.

It would be interesting to repeat the experiment but compare interviews done by experts by experience with those done by professional researchers without relevant lived experience. But even without that comparison, we can still apply the principle of common sense.

It shouldn’t be surprising that people who have experienced similar things to research participants can empathise and are therefore better equipped to gather data. Petra’s study is one piece of evidence which can be used to refute arguments about the ‘lack of evidence’ for involvement.


You can read more about Petra’s study here.

Dan Robotham is deputy research director at McPin.