
How lived experience can help revamp mental health tools

Even well-established tools can benefit from service user input, Public Involvement Lead Carolyn Asher found during a recent study

Carolyn Asher 

There are lots of good reasons to involve service users in shaping mental health measurement tools.

Including service users in the redesign of mental health measures matters: people with poor mental health are the ones assessed by these psychological measures; clinicians and academics do not have all the answers; it is morally right to involve the people on the receiving end; and they bring a unique perspective.

Unfortunately, service user involvement does not always happen, which is why the Early Youth Engagement (EYE-2) study I've recently worked on as Public Involvement Lead is worth a closer look.

It rejected this tendency to leave out people who use services in the creation or adaptation of a psychological measure, developing a more person-centred and sensitive tool that could be used by clinicians and research assistants.

Supporting people experiencing psychosis

The EYE-2 project is about improving services for people who have a first episode of psychosis, so that more people stay with the service and benefit from its support.

The project builds on the work of the first Early Youth Engagement project with Sussex Partnership NHS Foundation Trust, and in Kent and Surrey, which developed a new approach with young people, their parents and Early Intervention in Psychosis (EIP) staff.

The project ran across five site locations – London, Manchester, Hampshire, Thames Valley and Cambridge-Norfolk. Participants in the study (people aged 18-35 under EIP services) completed regular questionnaires about their mental health, usually with the support of their care co-ordinator. 

The HoNoS (Health of the Nation Outcome Scales) was used at several points in the intervention to measure how people were doing. Below is an example of one of the topics covered by HoNoS.

Problems with relationships: Rate most severe problem associated with active or passive withdrawal from social relationships and/or non-supportive, destructive or self-damaging relationships

  • 0 No significant problems during the period
  • 1 Minor non-clinical problem
  • 2 Definite problems in making or sustaining supportive relationships; evident to others
  • 3 Persisting major problems due to active or passive withdrawal from social relationships, and/or relationships that provide little or no comfort or support
  • 4 Severe and distressing social isolation and/or withdrawal from social relationships

(Sourced from Assessments.pdf (peardonvillehouse.ca))

“The whole structure of the HoNoS, as it stood, was not person-centred, or appropriate for telephone interviews.”

A need for standardised conversations

As you can see from the example, it relies on the doctor – or, in this case, the research assistant – making decisions based on their conversation with the person. This conversation is not standardised and can change each time it is done. One doctor may ask certain questions to score the HoNoS effectively, while another may ask something quite different.

As each individual section is scored, a total for the individual is tallied. This can then be compared with previous scores or analysed, with a high score indicating a higher need for support.
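To make the tallying concrete, here is a minimal sketch of how item-by-item ratings add up to a total that can be compared against an earlier score. The item names and numbers are hypothetical, for illustration only; they are not taken from the actual HoNoS.

```python
# Hypothetical HoNoS-style item ratings, each scored 0-4
# (0 = no problem, 4 = severe problem).
item_scores = {
    "relationships": 3,
    "mood": 2,
    "daily_living": 1,
}

# The total is simply the sum of the individual item scores.
total = sum(item_scores.values())

# A (made-up) earlier total for the same person, for comparison.
previous_total = 8

print(total)                   # 6
print(total < previous_total)  # True: lower score suggests reduced need
```

A higher total indicates a higher need for support, so comparing totals over time gives a rough picture of whether someone's situation is improving.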

The scale is highly subjective, with the final score resting on the doctor's opinion, so it was felt it would be very hard for the EYE-2 research assistants to complete.

The research assistants wouldn’t know the participants that well, if at all, especially if the participants had disengaged from the service and therefore the study. There were also concerns there would be no consistency between the study sites.

I think the whole structure of the HoNoS, as it stood, was not person-centred, or appropriate for telephone interviews, and required changes. This was echoed by those whose opinion we sought as part of the study.

Developing a fair system

To change it, we gathered input from carers and service users of the EIP services in the form of LEAPs (Lived Experience Advisory Panels), and held meetings to gather questions that could be asked by the research assistants over the phone.

The service users and carers helped create these questions, suggesting wording that would help the research assistants judge how to score each factor.

This was still subjective, but the questions were now aimed at providing a more structured approach to finding out the information, so that all sites were using the same questions.

Working in this way, we ensured that we heard the voices of those impacted by early intervention services, and those receiving the EYE-2 intervention, as often as we could, and that they had a participatory and creative role in this co-produced piece of work.

“It seemed to flow and didn’t have any of the challenging wording that was on some of the previously revised versions.”

Revamping the HoNoS

Six months later the LEAPs met again to discuss the newly created – and much more in-depth – HoNoS questionnaire, which combined everyone's suggestions. The LEAPs then reviewed this questionnaire to come up with a more usable structure and conversational style.

This included altering the wording and order the questions were presented in, as it was felt the original HoNoS started with a topic that could cause issues when asked, such as talking about ‘overactive’, ‘aggressive’, ‘disruptive’ or ‘agitated’ behaviours.

After the second round of LEAP input, the questionnaire was revamped again by the study team. This version was then put to the Public Involvement Leads, who all also have lived experience. This resulted in a final version which was used by the research assistants in the study.

The research assistants reported that it was easier to use than the original version. All LEAP participants were thanked for their involvement in the study and kept informed as to what had happened with their input.

Before the research assistants used this new version of the HoNoS, they completed practice interviews with the Public Involvement Leads over the phone, including myself.

We assumed characters, often based on real-life incidents from my own and others' lives, giving the research assistants confidence and practice at completing the HoNoS, as well as the chance to thoroughly test the revised version.

I found that it seemed to flow and didn’t have any of the challenging wording that was on some of the previously revised versions. This shows that the public involvement was important, necessary and effective.

“Every researcher should be looking at involving the public throughout their study in a way that coproduces the materials, methods and data collection.”

So what happens next?

As the study's grant funding did not include a review of the revised HoNoS's effectiveness, and the study is now coming to an end, it would be useful for future work to explore more thoroughly how effective and well-received the new tool was.

As a Service User Research Assistant and Public Involvement Coordinator, I come across far too many studies that either haven't involved the public enough to make a sustainable difference or have circumvented the process by not doing it thoroughly.

Involvement of the public is morally right, and funders and research bodies should make it easier and more realistic to get the funding to do it properly.

Every researcher should be looking at involving the public throughout their study in a way that coproduces the materials, methods and data collection, as well as the writing up and the dissemination of the results.

For more information on the work McPin does, follow us on Twitter @McPinFoundation or sign up to our newsletter.


Carolyn Asher is the Public Involvement Coordinator and Service User Research Assistant for Southern Health NHS Foundation Trust. She works for Southern Health’s Research and Development Department and worked on the EYE-2 Project as the Public Involvement Lead.