Why some research still needs a human in the room
At Claremont we’ve been engaging in the current conversation about whether AI interviewers can do qualitative research. We recognise the benefits AI can bring: reducing bias, reaching more voices, and delivering quick, systematic analysis. We also liked the labels of narrative or conversational quant used by Jiten Madia to describe what AI can do for insight gathering.
As Claremont’s qualitative research lead, I thought I’d share some reflections on this in relation to a recent piece of research I’ve been doing with 100 women, talking about their experiences of the healthcare system, specifically in relation to women’s health issues such as menstruation, contraception and menopause.
As anyone with first-hand experience of the topic might expect, these conversations were complex and emotional. Women began sharing stories about things that had happened to them which they may never have discussed with anyone before, or perhaps not for a long time. These were painful stories, both physically and emotionally, of misdiagnosis, dangerous delays and dismissive practices.
Throughout these conversations I consistently heard how gender dynamics, disregard, emotional discomfort and a lack of clinical curiosity damage trust and actively deter women from accessing care. Women told me time and time again that they felt ignored, belittled and written off when seeking help.
As a human rather than an AI facilitator, I knew there were things I needed to do differently in response to these stories.
- I slowed the conversation down, gave more room to those stories and adapted the questions to suit the mood
- I flagged what was happening to our client and asked for additional signposting materials to give women specific advice and guidance
- I encouraged women to take breaks from the conversation, make a cup of tea, go off camera – whatever they needed to do
- I forewarned women that some of the content might be difficult and that they didn’t need to share personal experiences if they weren’t comfortable doing so
- I adapted the planned use of tools such as Mentimeter, which at times felt useful and at others felt disruptive
- I reflected on what I’d heard in other sessions and brought additional questions into the conversation to identify emerging patterns and themes
There were also positive unintended consequences from these groups which can only flourish when everyone in the virtual room is a human being. Women showed visible shock and concern for one another, expressed support and empathy, and gave advice on how to access better care and information.
AI facilitators (and synthetic participants) cannot interact in these distinctly human ways. The risk for research like this isn’t only that the process is extractive and lacking in nuance, but that it potentially causes harm. These women have already been misunderstood and overlooked – they don’t need AI moderators giving them the same treatment. They need someone real to listen to them.