Our Methods Papers

Developing and testing new models for service user and carer involvement in mental health research is a core part of the McPin Foundation’s mission. Our methods reviews are an opportunity for us to describe these approaches and reflect on what we have learnt.

Our first series of methods reviews is based on an evaluation commissioned by Mind. We carried out two ‘Your experience in Mind’ surveys to understand how people using local Minds felt about the services and the support they received.

Employing service users as ‘Survey Champions’ was at the heart of the evaluation model we adopted. They helped design the survey, encouraged participation in the evaluation, reviewed findings and commented on the final report.

Our first Survey Champion methods review (July 2015) explained how the model worked and the impact it had on the quality of the evaluation.

One of our peer researchers working on the study reflected:

“I was really pleased to see Survey Champions rise to the challenge… One real advantage of using Survey Champions was their local knowledge and the time they could spend promoting the survey. They used a very ‘hands-on’ approach, talking to as many service users as possible, visiting social groups and activity sessions.

…this approach really helped recruitment, and the local Minds with a Survey Champion had an average of 69 surveys returned, while those that didn’t have a Champion had 52 surveys returned”.

Our second paper, “Learning a New Role” (September 2016), looked at the impact that collaborating on the survey had on the wellbeing of the Survey Champions themselves. The feedback was that, although challenging, the process had been a positive experience for the Survey Champions. Many of them reported learning new skills or building confidence, and a number had used their experience in applications for other jobs or training.

McPin Methods Papers

  1. Survey Champions Model: ‘Your Experience in Mind’
  2. ‘Learning a New Role’: How does employment on a collaborative evaluation impact on peer evaluators?