Tech Salon recap: listen more and shift away from Western-centric framing to better address online violence against women and girls

As the online world continues to evolve, with AI, deepfakes, and much more advancing at speed, it’s crucial to keep pace and ensure that known harms are addressed. Yet to date, responses by platforms and others to online violence against women and girls have been reactive and slow. Crucially, they have largely left out the voices of the people most affected, including those living outside of the US.
On March 13th, we partnered with CNN’s As Equals to host a Technology Salon looking at what happens when girls’ and women’s rights are not protected online.
Our lead discussants were: Ladan Anoushfar, CNN As Equals Editor; Pallabi Munsi, CNN Reporter covering doxing in Myanmar; Isabelle Amazon-Brown, The MERL Tech Initiative; Abeera Akhtar, Digital Product Developer, UNICEF; and Shanna Marzilli, President and CEO of Plan International USA.
Top takeaways from the discussion
Women and girls are dealing with online violence in every sphere
This includes ‘everyday’ harassment simply for using the internet and mobile devices, as well as targeted violence aimed at silencing women activists, politicians, and journalists. When women and girls are driven offline by harassment and violence, they become less visible overall. Plan International and CNN As Equals produced a report on this topic in 2024. They found that:
- 58% of girls and women surveyed reported being harassed or abused online
- 75% of the 600 interviewed had been exposed to unwanted sexual content, some from as young as age 13
- One in four who were abused online felt physically unsafe as a result
- One in five who were abused online stopped engaging in politics or current affairs
- One in five who were inundated with misinformation said it reduced their trust in election results
- Platforms are not doing enough to address this issue — reporting online violence often leads nowhere
We need to listen more to girls and young women who are navigating online violence
Young women interviewed for the Plan-CNN report generally felt that adults were unequipped to help them address online violence. Adults often frame ‘online’ and ‘offline’ as two separate spheres, misunderstanding how seamlessly the two blend; for many young people, “what happens at school is also happening in our bedrooms,” and there is little difference between online and offline, explained one Salon discussant.
Another discussant cautioned against being overprotective of young people or trying to ban devices or use of social media and AI applications. Research with over 100 girls and young women conducted for Girl Effect in four countries looked at how this group understood and conceptualized AI, what concerns they had with commercial AI tools, and how NGOs building tech tools should protect them. Girls’ optimism around the ways they could use AI was sometimes at odds with NGO narratives that focus on the dangers of AI. “When development organizations work to protect girls and women, we need to create spaces for their own voices and agency. How can they use these tools for themselves? Are the issues we are defining the same ones that girls are raising? Are we being too paternalistic – or in this case maternalistic – in our approaches?”
Girls participating in the Girl Effect study had varying concerns depending on their geographic location and context. For example, in South Africa they were concerned about AI and labor issues whereas in Ethiopia they were more worried about the environment. It’s important for those working at NGOs to acknowledge the variety of levels of exposure to AI and different narratives about AI when working with girls and young women, when designing ethical principles and guidelines, and when designing tech for girls and women to use as part of programs. We also need to remember that civil society tech is a drop in the bucket compared to commercial tools that people around the world are using. Girls basically told the study authors “It’s great that you are building safe tools for us, but you also need to teach us how to use commercial tech safely.”
A youth representative attending the Salon agreed. Like it or not, she said, adoption of AI among young people is skyrocketing. She said that at her university, “everyone around me uses AI for creativity, for innovation. My professor asked the class last year ‘how many of you use ChatGPT?’ and it was maybe a few kids. This semester, everyone raised their hand — they use it more than Google.”
Framing tends to be too Western-centric
A key point that surfaced at the Salon was that the conversations, the problem framing, and the solutions offered are too Western-centric. Tech-facilitated gender-based violence, including image-based abuse, looks different depending on culture and context. Yet work to combat it tends to take a highly Western-centric view and misses aspects of intersectionality.
In addition, tech-facilitated gender-based violence is not even recognized as abuse in many parts of the world. A Salon participant from Zimbabwe explained that “there’s no recognized connection between online and offline abuse at Gender Help Desks.” If a girl or woman reports someone using a sexualized slur against them in their native language, officials “just stare at you. They are not getting it.”
A participant from Sierra Leone lamented that her home country “doesn’t even have laws to protect women and girls from online violence. There’s no sense of what it even looks like. If someone sends you a picture you don’t want to see and you report it, the police will laugh in your face. They don’t even see it as a crime. They will say ‘and what exactly should I arrest this person for?’” She wondered how to close the gap between a discussion in New York City and the women and girls back home. “How do we criminalize some of this stuff? How do we create policies and engage governments to make them create laws so that people understand this is a crime?” She highlighted that the conversations about this issue are happening without everyone at the table. “What’s going on in Kenya is different from what’s happening in Liberia or Guinea.”
Another participant shared nuances of what girls in Afghanistan are concerned about in terms of being online and experiencing online violence. “They are constantly being monitored online and on any devices they use. They are scared about things being connected to their phone and email, and they worry a lot. There is fear associated even with things like looking up ‘how is my body changing during puberty?’”
Closing out this part of the discussion, someone raised the topic of tech-facilitated elder abuse as an issue of growing concern that shouldn’t be ignored. “We don’t tend to think of older adults as sexual beings or as targets of rape. Tech-facilitated elder abuse is something we are only now becoming aware of.” Older people can be harassed or threatened via text or email, as well as tracked and surveilled. Those working on tech-facilitated elder abuse also see cases of human trafficking for benefits and financially motivated abuse. It is often perpetrated by an adult family member suffering from substance abuse or mental illness who has access to devices and personal information that are used against the older adult without their knowledge.
Non-Consensual Intimate Imagery Abuse is top of mind for many
Much of the conversation at the Salon trended towards the problem of non-consensual intimate imagery (NCII) abuse, an area where definitions have been culturally narrow in the past. In many places, said one Salon participant, an image doesn’t need to include nudity for it to be dangerous for a woman or a girl. Another person agreed, “Online violence is related to dignity. What does dignity mean in local context? An abusive image doesn’t have to be a total nude. Maybe just exposing a forearm is a challenge.” Safety by design, if done from the Global North, misses many of the nuances of other contexts.
The term ‘Non-Consensual Intimate Imagery’ has replaced earlier terminology (‘revenge porn’) in an effort to capture some of these nuances and more adequately describe this type of abuse. Trust and Safety teams working with some of the major platforms have contextualized NCII in different countries and contexts to include any photo taken in an ‘intimate’ setting and then shared outside of that setting without consent, often with intent to harass, extort, or shame.
Passing legislation related to NCII is challenging because of the various scenarios in which NCII may appear. It can be an adolescent sharing photos of another adolescent. The original image might have been sent consensually to one person who then shared it non-consensually with a few others, with many, or publicly online. Such an image might also have been acquired as part of an adult grooming a child, or as part of an effort to sextort a child, and NCII can veer into child sexual abuse material (CSAM). Adult women are also affected by NCII, as noted earlier in this post. One participant at the Salon gave the example of a female politician who decided not to run for office after receiving threats that intimate images of her would be put online.
Some Salon participants highlighted the nuances in these various scenarios and urged the development of adequate state laws that don’t criminalize kids as felons for creating or passing along images to a couple of friends. They advocated for diversion programs for underage kids while also addressing the CSAM industry and organized crime. Others said that survivors often feel upset when people are put into ‘diversion’ or rehabilitation programs instead of being prosecuted. “Survivors want them to be punished.”
Families dealing with NCII are also often lost in a maze of processes. “There is no one organization that helps you deal with it from start to finish.” The question for one Salon participant was how to make perpetrators pay for harming another person in this way. For the victim, NCII can follow them forever, so it’s important that the perpetrator knows that what they did is wrong. “Survivors feel that they are empowered by getting a claim and a suit.” A key point was the need for national resources to support those affected by NCII at each stage of the process.
In the US, a proposed “Take It Down” Act is gaining traction. It would require platforms to remove reported images within 48 hours. Many of the organizations at the Salon are in favor of the Act. Others have criticized the effort, citing loopholes and concerns about how the current US administration might abuse the Act to take down other types of content.
What we can do
To conclude on a hopeful note, we discussed what we might do to address these issues. Some ideas follow:
- Help young people see their value and their humanity. We need to teach people, especially young people, to see the humanity in others, suggested one Salon participant. This is a major role for parents, who may be both the most scared and the most empowered in a child’s life. “What do I teach my daughter? Her rights are important. She needs to know to tell someone if someone is harming her.” We need to work more with children and young people on healthy interpersonal relationships so that there is less violence and abuse.
- Identify and share simple ways for young people to protect themselves. We might not have adequate tech or legal solutions, said one person at the Salon, but we do need harm reduction. We can teach young people to protect themselves and mitigate risk. For example, “if you’re going to share a nude, at least don’t put your face in the photo.”
- Get well-known people to engage as advocates for addressing online gender-based violence. One Salon participant gave the example of Nighat Dad, who started the Digital Rights Foundation in Pakistan and is on Facebook’s oversight board. Her work and her standing as a well-known figure have pushed this issue forward in Pakistan. Deepfakes reported to Pakistan’s cybercrime unit get a quick response.
- Counter tech harms with tech. Journalists can conduct open-source investigations and use AI to more quickly find and analyze the data and narratives running on social media. This can help with reporting back on what is happening on platforms and how much violence against women and girls there is, said one person. (A rough sketch of what a first triage step might look like appears after this list.)
- Offer direct services. One participant said they are working on a tech self-safety diagnostic tool for older adults and offering the option to speak with a professional if they are being harmed.
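To make the “counter tech harms with tech” idea above a bit more concrete, here is a minimal Python sketch of a first-pass triage step an open-source investigation might run: scanning a batch of posts against a lexicon of abusive terms and flagging candidates for human review. Everything in it (the terms, the sample posts, the triage helper) is hypothetical, and real monitoring work would rely on locally built, language-specific lexicons or trained classifiers rather than simple keyword matching.

```python
# Illustrative sketch only: keyword-based triage of social media posts.
# The lexicon and sample posts are hypothetical placeholders.
import re
from collections import Counter

# In practice this list would be built with local partners, since slurs
# and threats are language- and culture-specific (a key Salon point).
ABUSE_TERMS = ["slur_a", "slur_b", "threat_phrase"]
PATTERN = re.compile("|".join(map(re.escape, ABUSE_TERMS)), re.IGNORECASE)

def triage(posts):
    """Return posts matching the lexicon, plus counts of matched terms."""
    flagged = []
    term_counts = Counter()
    for post in posts:
        hits = PATTERN.findall(post["text"])
        if hits:
            flagged.append(post)
            term_counts.update(hit.lower() for hit in hits)
    return flagged, term_counts

if __name__ == "__main__":
    sample = [  # hypothetical data
        {"id": 1, "text": "A perfectly ordinary post."},
        {"id": 2, "text": "An abusive post containing slur_a."},
    ]
    flagged, counts = triage(sample)
    print(f"{len(flagged)} of {len(sample)} posts flagged for human review")
    print(counts.most_common())
```

The point of the sketch is the human in the loop: automation only narrows the haystack, and a person still reviews every flagged post, which matters given how context-dependent abusive language is.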
Technology Salons run under the Chatham House Rule, so no attribution has been made in this post. If you’d like to join us for a Salon, sign up here. If you’d like to suggest a topic, please get in touch! This Salon was co-hosted and sponsored by CNN As Equals. Please contact us if you would like to discuss sponsoring a Salon or offering financial support for our work!