
We have a data problem

by Emily Tomkys, ICT in Programmes at Oxfam GB

Following my presentation at MERL Tech, I have realised that it’s not only Oxfam who have a data problem; many of us have a data problem. In the humanitarian and development space, we collect a lot of data – whether via mobile phone or a paper process, the amount of data each project generates is staggering. Some of this data goes into our MIS (Management Information Systems), but all too often data remains in Excel spreadsheets on computer hard drives, unconnected cloud storage systems or Access and bespoke databases.

(Watch Emily’s MERL Tech London Lightning Talk!)

This is an issue because the majority of our programme data is analysed in silos on a survey-to-survey basis and at best on a project-to-project basis. What about when we want to analyse data between projects, between countries, or even globally? It would currently take a lot of time and resources to bring data together in usable formats. Furthermore, issues of data security, limited support for country teams, data standards and the cost of systems or support mean there is a sustainability problem that is in many people’s interests to solve.

The demand from Oxfam’s country teams is high – one of the most common requests the ICT in Programme Team receives centres around databases and data analytics. Teams want to be able to store and analyse their data easily and safely, and there is growing demand for cross-border analytics. Our humanitarian managers want to see statistics on the type of feedback we receive globally. Our livelihoods team wants to be able to monitor prices at markets on a national and regional scale. This motivated us to look for a data solution, but it’s something we know we can’t take on alone.

That’s why MERL Tech represented a great opportunity to check in with peers about potential solutions and areas for collaboration. For now, our proposal is to design a data hub where, no matter what type of data we hold (unstructured, semi-structured or structured) and no matter how we collect it (mobile data collection tools or paper), our data can be integrated into a database. This isn’t about creating new tools – rather, it’s about focusing on interoperability and smooth transitions between tools and storage options. We plan to set this up so data can be pulled through into a reporting layer that offers a mixture of options for quantitative analysis, qualitative analysis and GIS mapping. In short, we need to give our micro-programme data a home: everything in one place, regardless of source or format, and easy to pull through for analysis.
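To make the idea concrete, here is a minimal sketch in Python of what the ingestion side of such a data hub could look like. This is not Oxfam’s actual design: the file names, field names and shared schema are all assumptions for illustration. The point is simply that records from very different collection processes land in one queryable place, with the raw record preserved for later re-analysis.

```python
# Minimal data hub ingestion sketch (illustrative only).
# Assumes two hypothetical exports: a CSV digitized from a paper survey
# and a JSON export from a mobile data collection tool. Both are mapped
# onto one shared SQLite table, keeping the original record as JSON so
# semi-structured fields are not lost.
import csv
import json
import sqlite3

conn = sqlite3.connect("datahub.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS records (
        source    TEXT,  -- which tool or process produced the record
        country   TEXT,  -- shared field enabling cross-border analysis
        indicator TEXT,  -- shared field, e.g. market price, feedback type
        value     TEXT,
        raw       TEXT   -- original record as JSON, preserved verbatim
    )
""")

def ingest(source, rows):
    """Map heterogeneous rows onto the shared schema."""
    for row in rows:
        conn.execute(
            "INSERT INTO records VALUES (?, ?, ?, ?, ?)",
            (source, row.get("country"), row.get("indicator"),
             row.get("value"), json.dumps(row)),
        )

with open("paper_survey.csv", newline="") as f:   # hypothetical file
    ingest("paper", list(csv.DictReader(f)))

with open("mobile_export.json") as f:             # hypothetical file
    ingest("mobile", json.load(f))

conn.commit()

# The reporting layer can then query one place, e.g. a cross-country count:
for row in conn.execute("SELECT country, COUNT(*) FROM records GROUP BY country"):
    print(row)
```

In practice the reporting layer would sit on top of a store like this, so that quantitative, qualitative and GIS tools all read from the same source rather than from scattered spreadsheets.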

In this way we can explore data holistically, spot trends on a wider scale, really know more about our programmes and act accordingly. Not only should this reduce our cost of analysis, but it will also let us analyse our data more efficiently and effectively. Moreover, taking a holistic view of the data life cycle will enable us to do data protection by design, and the result will be easier to support because the process and the tools being used will be streamlined. We know that one tool does not and cannot do everything we require when we work in such vast contexts, so a challenge will be how to streamline while factoring in contextual nuances.

Sounds easy, right? We will be starting to explore our options and working on the data hub in the coming months. MERL Tech was a great start for making connections, but we are keen to hear from others about how you are approaching “the data problem” and eager to set something up that can also be used by other actors. So please add your thoughts in the comments or get in touch if you have ideas!

How to Develop and Implement Responsible Data Policies


A friend reminded me at the MERL Tech Conference that a few years ago when we brought up the need for greater attention to privacy, security and ethics when using ICTs and digital data in humanitarian and development contexts, people pointed us to Tor, encryption and specialized apps. “No, no, that’s not what we mean!” we kept saying. “This is bigger. It needs to be holistic. It’s not just more tools and tech.”

So, even if as a sector we are still struggling to understand and address all the different elements of what’s now referred to as “Responsible Data” (thanks to the great work of the Engine Room and key partners), at least we’ve come a long way towards framing and defining the areas we need to tackle. We understand the urgency of the issue: the volume of data in the world is growing exponentially, and the data in our sector is becoming increasingly digitized.

This year’s MERL Tech included several sessions on Responsible Data, including Responsible Data Policies, the Human Element of the Data Cycle, The Changing Nature of Informed Consent, Remote Monitoring in Fragile Environments and plenary talks that mentioned ethics, privacy and consent as integral pieces of any MERL Tech effort.

The session on Responsible Data Policies was a space to share with participants why, how, and what policies some organizations have put in place in an attempt to be more responsible. The presenters spoke about the different elements and processes their organizations have followed, and the reasoning behind the creation of these policies. They spoke about early results from the policies, though it is still early days when it comes to implementing them.

What do we mean by Responsible Data?

Responsible data is about more than just privacy or encryption. It’s a wider concept that includes attention to the data cycle at every step, and puts the rights of people reflected in the data first:

  • Clear planning and purposeful collection and use of data with the aim of improving humanitarian and development approaches and results for those we work with and for
  • Responsible treatment of the data and respectful and ethical engagement with people we collect data from, including privacy and security of data and careful attention to consent processes and/or duty of care
  • Clarity on data sharing – what data, from whom and with whom and under what circumstances and conditions
  • Attention to transparency and accountability efforts in all directions (upwards, downwards and horizontally)
  • Responsible maintenance, retention or destruction of data.

Existing documentation and areas to explore

There is a huge bucket of concepts, frameworks, laws and policies that already exist in various other sectors and that can be used, adapted and built on to develop responsible approaches to data in development and humanitarian work. Some of these are in conflict with one another, however, and those conflicts need to be worked out or at least recognized if we are to move forward as a sector and/or in our own organizations.

Some areas to explore when developing a Responsible Data policy include:

  • An organization’s existing policies and practices (IT and equipment; downloading; storing of official information; confidentiality; monitoring, evaluation and research; data collection and storage for program administration, finance and audit purposes; consent and storage for digital images and communications; social media policies).
  • Local and global laws that relate to collection, storage, use and destruction of data, such as: Freedom of information acts (FOIA); consumer protection laws; data storage and transfer regulations; laws related to data collection from minors; privacy regulations such as the latest from the EU.
  • Donor grant requirements related to data privacy and open data, such as USAID’s ADS Chapter 579 or International Aid Transparency Initiative (IATI) stipulations.

Experiences with Responsible Data Policies

At the MERL Tech Responsible Data Policy session, organizers and participants shared their experiences. The first step for everyone developing a policy was establishing wide agreement and buy-in for why their organizations should care about Responsible Data. This was done by developing Values and Principles that form the foundation for policies and guidance.

Oxfam’s Responsible Data policy has a focus on rights, since Oxfam is a rights-based organization. The organization’s existing values made it clear that ethical use and treatment of data was something the organization must consider to hold true to its ethos.

It took around six months to get all of the global affiliates to agree on the Responsible Program Data policy, a quick turnaround compared to other globally agreed documents because all the global executive directors recognized that this policy was critical.

A core point for Oxfam was the belief that digital identities and access will become increasingly important for inclusion in the future, and so the organization did not want to stand in the way of people being counted and heard. However, it wanted to be sure that this was done in a way that balanced inclusion with privacy and security.

The policy is a short document that is now in the process of operationalization in all the countries where Oxfam works. Because many of Oxfam’s affiliate headquarters reside in the European Union, it needs to consider the new EU regulations on data, which are extremely strict and, for example, require that everyone be given an option to withdraw consent.

This poses a challenge for development agencies, which normally do not hold the kind of detailed databases on ‘beneficiaries’ that they hold on private donors. Shifting thinking about ‘beneficiaries’ and treating them more as clients may be in order as one result of these new regulations. As Oxfam moves into implementation, challenges continue to arise.

For example, data protection in Yemen is different than data protection in Haiti. Knowing all the national level laws and frameworks and mapping these out alongside donor requirements and internal policies is extremely complicated, and providing guidance to country staff is difficult given that each country has different laws.

Girl Effect’s policy has a focus on privacy, security and safety of adolescent girls, who are the core constituency of the organization.

The policy became clearly necessary because although the organization had a strong girl safeguarding policy and practice, the effect of digital data had not previously been considered, and the number of programs that involve digital tools and data is increasing. The Girl Effect policy currently has four core chapters: privacy and security during design of a tool, service or platform; content considerations; partner vetting; and MEAL considerations.

Girl Effect looks at not only the privacy and security elements, but also aims to spur thinking about potential risks and unintended consequences for girls who access and use digital tools, platforms and content. One core goal is to stimulate implementers to think through a series of questions that help them to identify risks. Another is to establish accountability for decisions around digital data.

The policy has been in process of implementation with one team for a year and will be updated and adapted as the organization learns. It has proven to have good uptake so far from team members and partners, and has become core to how the teams and the wider organization think about digital programming. Cost and time for implementation increase with the incorporation of stricter policies, however, and it is challenging to find a good balance between privacy and security, the ability to safely collect and use data to adapt and improve tools and platforms, and user friendliness/ease of use.

Catholic Relief Services has an existing set of eight organizational principles: Sacredness and Dignity of the human person; Rights and responsibilities; Social Nature of Humanity; The Common Good; Subsidiarity; Solidarity; Option for the Poor; Stewardship.

It was a natural fit to see how these values, already embedded in the organization, could extend to the idea of Responsible Data. Data is an extension of the human person; therefore, it should be afforded the same respect as the individual. The principle of ‘common good’ easily extends to responsible data sharing.

The notion of subsidiarity says that decision-making should happen as close as possible to the place where the impact of the decision will be the strongest, and this is nicely linked with the idea of sharing data back with communities where CRS works and engaging them in decision-making. The option for the poor urges CRS to place a preferential value on privacy, security and safety of the data of the poor over the data demands of other entities.

The organization is at the initial phase of creating its Responsible Data Policy. The process includes the development of the values and principles, two country learning visits to understand the practices of country programs and their concerns about data, development of the policy, and a set of guidelines to support staff in following the policy.

USAID recently embarked on its process of developing practical Responsible Data guidance to pair with its efforts in the area of open data. (See ADS 579). More information will be available soon on this initiative.

Where are we now?

Though several organizations are moving towards the development of policies and guidelines, it was clear from the session that uncertainties are the order of the day, as Responsible Data is an ethical question, often relying on tradeoffs and decisions that are not hard and fast. Policies and guidelines generally aim to help implementers ask the right questions, sort through a range of possibilities and weigh potential risks and benefits.

Another critical aspect that was raised at the MERL Tech session was the financial and staff resources that can be required to be responsible about data. On the other hand, for those organizations receiving funds from the European Union or residing in the EU or the UK (where despite Brexit, organizations will likely need to comply with EU Privacy Regulations), the new regulations mean that NOT being responsible about data may result in hefty fines and potential legal action.

Going from policy to implementation is a challenge that involves both capacity strengthening in this new area as well as behavior change and a better understanding of emerging concepts and multiple legal frameworks. The nuances by country, organization and donor make the process difficult to get a handle on.

Because staff and management are already overburdened, the trick to developing and implementing Responsible Data Policies and Practice will be finding ways to strengthen staff capacity and to provide guidance in ways that do not feel overwhelmingly complex. Though each situation will be different, finding ongoing ways to share resources and experiences so that we can advance as a sector will be one key step for moving forward.

This post was written with input from Maliha Khan, Independent Consultant; Emily Tomkys, Oxfam GB; Siobhan Green, Sonjara; and Zara Rahman, The Engine Room.

Rethinking Informed Consent in Digital Development


Most INGOs have not updated their consent forms and policies for many years, yet the growing use of technology in our work, for many different purposes, raises many questions and uncertainties that are difficult to address.

  • Our old ways of requesting and managing consent need to be modernized to meet the new realities and changing nature of digital data.
  • Is informed consent even possible when data is digital and/or opened?
  • Do we have any way of controlling what happens with that data once it is digital?
  • How often are organizations violating national and global data privacy laws?
  • Can technology be part of the answer?

At the MERL Tech conference, we looked at these issues in a breakout session on rethinking consent in the digital age.

What Is Consent?

Let’s take a moment to clarify what kind of consent we are talking about in this post. Being clear on this point is important because there are many parallel conversations on consent in relation to technology.

For example, some people are exploring the use of consent frameworks or rhetoric in ICT user agreements, asking whether signing such user agreements can really be considered consent. Others are exploring the issue of consent for content distribution online, in particular personal or sensitive content such as private videos and photographs.

And while these (and other) consent debates are related and important to this post, what we are specifically talking about is how we, our organizations and projects, address the issue of consent when we are collecting and using data from those who participate in programs or monitoring, evaluation, research and learning (MERL) that we are implementing.

This is as timely as ever because introducing new technologies and kinds of data means we need to change how we build consent into project planning and implementation. In fact, it gives us an amazing opportunity to build consent into our projects in ways that our organizations may not have considered in the past.

While it used to be that informed consent was the domain of frontline research staff, the reality is that getting informed consent – where there is disclosure, voluntariness, comprehension and competence of the data subject – is the responsibility of anyone ‘touching’ the data.


Two Examples of Digital Consent

Over the past two years, Girl Effect has been incorporating a number of mobile and digital tools into its programs. These include both the Girl Effect Mobile (GEM) and the Technology Enabled Girl Ambassadors (TEGA) programs.

Girl Effect Mobile is a global digital platform that is active in 49 countries and 26 languages. It is being developed in partnership with Facebook’s Free Basics initiative. GEM aims to provide a platform that connects girls to vital information, entertaining content and to each other.

Girl Effect’s digital privacy, safety and security policy directs the organization to review and revise its terms and conditions to ensure that they are ‘girl-friendly’ and respond to local context and realities, and that in addition to protecting the organization (as many T&Cs are designed to do), they also protect girls and their rights.

The GEM terms and conditions were initially a standard T&C. They were too long to expect girls to look at them on a mobile, the language was legalese, and they seemed one-sided. So the organization developed a new T&C with simplified language and removed some of the legal clauses that were irrelevant to the various contexts in which GEM operates.

Consent language was added to cover polls and surveys, since Girl Effect uses the platform to conduct research and for its monitoring, evaluation and learning work. In addition, summary points are highlighted in a shorter version of the T&Cs, with a link to the full T&Cs. Girl Effect also develops short articles about online safety, privacy and consent as part of the GEM content, as a way of engaging girls with these ideas as well.

TEGA is a girl-operated mobile-enabled research tool currently operating in Northern Nigeria. It uses data-collection techniques and mobile technology to teach girls aged 18-24 how to collect meaningful, honest data about their world in real time. TEGA provides Girl Effect and partners with authentic peer-to-peer insights to inform their work.

Because Girl Effect was concerned that girls being interviewed might not understand the consent they were providing during the research process, it used the mobile platform to expand on the consent process. A feature was added in which the TEGA girl researchers play an audio clip that explains the consent process. Afterwards, the girls being interviewed answer multiple-choice follow-up questions to show whether they have understood what they have agreed to.
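As a rough illustration of how a comprehension check like this might work, here is a short Python sketch. The questions, answer options and pass rule are invented for the example; they are not TEGA’s actual implementation.

```python
# Illustrative comprehension-check consent flow, loosely modeled on the
# TEGA feature described above. All questions and the pass rule are
# hypothetical; the real tool plays an audio explanation first.
CHECKS = [
    {
        "question": "What will your answers be used for?",
        "options": ["A) Research to improve programmes",
                    "B) Sharing on social media"],
        "correct": "A",
    },
    {
        "question": "Can you stop the interview at any time?",
        "options": ["A) No", "B) Yes, whenever you want"],
        "correct": "B",
    },
]

def comprehension_check():
    """Return True only if every question is answered correctly."""
    for check in CHECKS:
        print(check["question"])
        for option in check["options"]:
            print(" ", option)
        answer = input("Your answer (A/B): ").strip().upper()
        if answer != check["correct"]:
            return False  # signal that consent should be re-explained
    return True

if __name__ == "__main__":
    print("[audio clip explaining the consent process plays here]")
    if comprehension_check():
        print("Understanding confirmed; interview can begin.")
    else:
        print("Understanding not confirmed; replay the explanation.")
```

The design choice worth noting is that a wrong answer does not simply end the interview; it routes back to re-explanation, so the check supports comprehension rather than just gatekeeping.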

(Note: The TEGA team report that they have incorporated additional consent features into TEGA based on examples and questions shared in our session).

Oxfam, in addition to developing its Responsible Program Data Policy, has been exploring ways in which technology can help address contemporary consent challenges.

The organization had doubts about how well its informed consent statement (which explains who the organization is, what the research is about and why Oxfam is collecting data, and asks whether the participant is willing to be interviewed) was understood, and whether informed consent is really possible in the digital age.

All the same, the organization wanted to be sure that the consent information was being read out in full by enumerators (the interviewers). There were also questions about how much this might vary between enumerators, as well as across different contexts and countries of operation.

To explore whether communities were hearing the consent statement fully, Oxfam is using mobile data collection with audio recordings of the statement in the local language, and flagging ‘speed violations’: cases where the time spent on the consent page is shorter than the audio file played. This is by no means foolproof, but what Oxfam has found so far is that the audio file is often not played in full, or not at all.
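The check Oxfam describes can be approximated in a few lines of code. The sketch below is an assumption-laden illustration, not Oxfam’s implementation: the paradata field names, the audio length and the tolerance are all invented. It simply flags submissions where the consent screen was open for less time than the audio clip runs.

```python
# Sketch of a 'speed violation' check on consent-screen paradata.
# Hypothetical values: field names, audio length and tolerance are
# assumptions for illustration only.
AUDIO_LENGTH_SECONDS = 95   # length of the local-language consent audio
TOLERANCE = 0.9             # allow some leeway for timestamp noise

submissions = [  # hypothetical paradata from a mobile data collection tool
    {"id": "r1", "consent_screen_seconds": 120},
    {"id": "r2", "consent_screen_seconds": 15},
    {"id": "r3", "consent_screen_seconds": 90},
]

def flag_speed_violations(rows):
    """Return ids where the consent page closed before the audio could finish."""
    threshold = AUDIO_LENGTH_SECONDS * TOLERANCE
    return [r["id"] for r in rows if r["consent_screen_seconds"] < threshold]

print(flag_speed_violations(submissions))  # -> ['r2']
```

A flagged record does not prove the statement was skipped, only that it could not have been played in full, which is why Oxfam treats the method as indicative rather than foolproof.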

Efforts like these are only the beginning, but they help to develop a resource base and stimulate more conversations that can help organizations and specific projects think through consent in the digital age.

Additional resources include this framework for Consent Policies developed at a Responsible Data Forum gathering.

Other Digital Consent Ideas

Additionally, because of how quickly technology and data use are changing, one idea that was shared was that rather than using informed consent frameworks, organizations may want to consider defining and meeting a ‘duty of care’ around the use of the data they collect.

This can be partly accomplished through the creation of organizational-level Responsible Data Policies. There are also interesting initiatives exploring new ways of enabling communities to define consent themselves – like this data licenses prototype.


The development and humanitarian sectors really need to take notice, and to adapt and update their thinking constantly to keep up with technology shifts. We should also be doing more to share these experiences. By working together on these kinds of wicked challenges, we can advance without duplicating our efforts.

This post is co-authored by Emily Tomkys, Oxfam GB; Danna Ingleton, Amnesty International; and Linda Raftree, Independent.