Over the weekend we read about how the text support service Shout has shared users’ anonymised conversations with academic researchers, reportedly without getting explicit informed consent, including from people as young as 13.
We know from our work that people object when things are done with their data without their knowledge or agreement. Young people have told us they have concerns about their information being shared, and that this can put them off seeking support.
We don’t know the wording that was provided, but our view is that consent to share underpins the ethical conduct of research & service evaluation. A website’s privacy policy might cover an organisation legally, but open, explicit informed consent is needed to retain users’ trust.
We don’t know the research team involved. We’d be interested to know whether they or Shout involved young people and their parents in the decision to use anonymised data in this way.
Big data is a powerful tool with real potential, but it must be used with the utmost care and with lived experience expertise alongside it. Interdisciplinary research that involves people with lived experience can help avoid blind spots.
We understand those involved – Shout staff & the researchers – felt they were operating ethically and had approval from a university-based ethics group. But you only have to look at the Twitter reaction from mental health service users & others to understand how wrong this feels to them.
A bigger conversation needs to happen about data sharing and research ethics if we are to avoid eroding trust in services that seek to support people in distress.