Community listening: A relational, people-first process that shouldn’t be turned into a tickbox exercise


On May 5, our Community of Practice gathered for a conversation about the place of digital technologies in community listening processes. We were joined by more than 90 practitioners working in humanitarian organizations, development NGOs, philanthropic foundations and funders, research hubs, universities, and community-based organizations. 

Our guest speakers, Soledad Muñiz, Jessica Mayberry, and Gabriella Prandini, kindly took us up on the challenge of answering “What has changed over the past decade with regard to tech-enabled community listening? What remains the same? What’s different about community listening now that we have AI, if anything?” It was a conversation that highlighted how community listening should, at its core, be designed with and for communities. In this blog post, we share some of our key takeaways from the discussion.  

When does digital technology add value at the community level?

“Community listening is not just generating data”. That’s how Soledad Muñiz, Director of Programmes at InsightShare, opened our conversation. Reflecting on how the meaning of participation and community listening has been diluted in the past few years and wrongly equated with data collection or research, she pointed out how our focus as a sector should be redirected towards community listening as a process that supports local organizations and communities to move forward with their own agendas.

“No matter if you work in activism, international development, or the humanitarian sector, at the end of the day, you are supporting locally-led change, and that should be your aim.” Soledad explained how, in her work, community listening is a “process to achieve anti-racist, anti-colonial forms of development and humanitarian processes and ways of working… listening to what the community wants and co-designing, rather than imposing agendas and just measuring progress or feedback.”

Soledad argued that community listening is essentially a relational process. In that sense, to figure out if and how digital technologies can become a part of community listening processes, Soledad proposes that we first ask ourselves, “Is it a relational conversation? Is it a two-way path, where not just one gives information and the other one makes decisions? Are we both exchanging information? Are we both learning from each other and jointly making decisions? That is crucial for us.” 

The use of digital technologies in community listening can be beneficial if – and only if – it’s meaningful to people, at the service of communities, and strengthens local voices. To make that happen, Soledad shared three questions that she asks herself when designing programmes: 

  1. Who defines what’s worth listening to? Who owns the data?
  2. When is digital technology adding value at the community level? When is it inclusive and accessible?
  3. How can we decolonise our approaches to aid and development through community listening?

What can the ethical use of AI look like for community listening?

Jessica Mayberry, from Video Volunteers, also highlighted how the uncritical introduction of digital technologies, especially AI, in community listening work can be harmful. “The concept of community listening is so easy to exploit,” shared Jessica. “Right now, an explosion of people are saying how important it is to listen to communities, but there’s not a commensurate explosion in work where this listening is actually happening.” 

Jessica’s words prompted a reflection on how, when we combine the reality that development sector structures perpetuate power imbalances with the hype around AI’s promises for the sector, we could end up in performative processes: “This could mean another tick box effort to prove how much community involvement we have without actually doing it.”

To Jessica, the use of AI in community listening can only “be done ethically if it’s truly in service of and emerging from social movements and local community-based groups.” For example, the team at Video Volunteers is using AI to transcribe their whole archive of 30,000 videos. They built a custom AI pipeline that could read the YouTube links, generate a transcript and translation, and rate the quality of the transcripts. Notably, this worked really well in Hindi but did not work very well in Bengali. Though the project was not excessively expensive, it took time (more than a year), required upskilling existing staff, and meant hiring technical staff. For Video Volunteers, using AI in fit-for-purpose, community-centered ways has generated relevant results for their work, allowing them to identify health-related insights they had missed earlier and categorize climate-related content into sub-themes. 
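For readers curious what such a pipeline looks like in practice, here is a minimal sketch in Python. The helper functions (`transcribe`, `translate`, `rate_quality`) and the word-count quality heuristic are illustrative assumptions of ours, not Video Volunteers’ actual implementation; a real pipeline would plug in a speech-to-text model, a translation model, and a model-based quality score at those points.

```python
from dataclasses import dataclass


@dataclass
class PipelineResult:
    """One processed video: transcript, translation, and a quality score."""
    url: str
    transcript: str
    translation: str
    quality: float  # 0.0 (unusable) to 1.0 (clean)


def transcribe(url: str) -> str:
    # Placeholder: in practice, a speech-to-text (ASR) model would run
    # on the audio track of the linked video.
    return "नमस्ते, यह एक उदाहरण है"


def translate(text: str) -> str:
    # Placeholder: in practice, a machine-translation model or API.
    return "Hello, this is an example"


def rate_quality(transcript: str) -> float:
    # Toy heuristic: penalise empty or very short transcripts.
    # A real pipeline might use ASR confidence scores instead.
    return min(len(transcript.split()) / 10, 1.0)


def process(urls: list[str]) -> list[PipelineResult]:
    """Run each link through the transcribe → translate → rate steps."""
    results = []
    for url in urls:
        transcript = transcribe(url)
        results.append(PipelineResult(
            url=url,
            transcript=transcript,
            translation=translate(transcript),
            quality=rate_quality(transcript),
        ))
    return results
```

The quality-rating step matters most at archive scale: it is what lets a team triage 30,000 automatic transcripts and route the low-scoring ones (such as the Bengali material Jessica mentions) to human review.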

“My actual recommendations are not particularly fashionable or easy to do: We need to invest in social movements and local leaders and their communication skills, not dictate the topics the communities you listen to should be talking about, and finally, compensate community members, always.”

Jessica explained that this multi-year process required a lot of foundational work in two areas: “The first was the ethics. For us, doing this work ethically meant doing it with communities. The second was that we began not with tech, but with research into the nature of community voice. And this is where we created a methodology for voice with the Aapti Institute.” 

“With AI, we have a greater responsibility towards communities”

Gabriella Prandini is the Managing Director of Talk to Loop, a digital platform that enables communities to provide feedback, complaints, and requests for assistance at any given time. A key feature of their platform is the absence of a fixed questionnaire. “By choosing the questions, you’re in a way also guiding the answers”, Gabriella explained. “What happens with our platform is that people can provide feedback through the website, messaging, and, most importantly, through voice as well, in their own local language. They can provide feedback whenever they want, about whatever they want, so there’s no one there trying to extract information from them.” 

At Talk to Loop, incoming feedback is handled by moderators from the local contexts the platform operates in, who understand the language, the culture, and the context. When it comes to the use of AI, Gabriella pointed out: “The fact that it is possible to collect data so much faster, I think we have a greater responsibility towards these communities.”

Reflecting on where AI fits into their work, Gabriella pointed out that AI may have a role when it comes to data analysis, but highlighted the ways in which AI is unfit for use in their moderation methodology: “We don’t foresee using AI in the near future for moderation [in our work]. We still think it’s very, very important that it’s a person who either answers the call or receives the voice note, because again, it’s not about speaking the language. (…) It’s very important that we ensure that these communities are heard by people who understand the culture. And I don’t think AI is anywhere near. With the whole discrepancies related to inequality within AI… we’re miles away from trusting AI to moderate content from our communities.”

***

As our speakers reminded us during the call, community listening works best when it’s designed with and for communities to advance their own agendas, discussions, and needs. When it comes to thinking about the place of digital technologies in this work, keeping Soledad’s question top of mind seems essential: When is digital technology really adding value at the community level? 

Keep an eye out for our event series about community listening

In the coming months, we’ll organize more events focusing on community listening and digital technologies. If you’re interested in joining these conversations, make sure you sign up here.
