
Thoughts from MERL Tech UK

Post by Christopher Robert, Dobility (SurveyCTO)

MERL Tech UK was held in London this week. It was a small, intimate gathering by conference standards (just under 100 attendees), but jam-packed full of passion, accumulated wisdom, and practical knowledge. It’s clear that technology is playing an increasingly useful role in helping us with monitoring, evaluation, accountability, research, and learning – but it’s also clear that there’s plenty of room for improvement. As a technology provider, I walked away with both more inspiration and more clarity for the road ahead.

Some highlights:

  • I’ve often felt that conferences in the ICT4D space have been overly focused on what’s sexy, shiny, and new over what’s more boring, practical, and able to both scale and sustain. This conference was markedly different: it went even further than prior MERL Tech conferences in shifting away from the pathology of “pilotitis” toward a hard-nosed focus on what really works.
  • There was more talk of data responsibility, which I took as another welcome sign of maturation in the space. This idea encompasses far more than the data security and honoring of confidentiality assurances that we at Dobility/SurveyCTO have long championed, and it amounted to a rare delight: rather than us trying to push greater ethical consideration on others, for once our peers were pushing us to raise the bar even further. My own ideas about data responsibility were challenged, and I came to realize that data security is just one piece of a larger ethical puzzle.
  • There are far fewer programs and projects re-inventing the wheel in terms of technology, which is yet another welcome sign of maturation. This is helping more resources flow into the improvement and professionalization of a small but diverse set of technology platforms. Too much donor money still seems to be spent building new technologies where effective, well-established, and sustainable options already exist, but it’s getting better.
  • However, it’s clear that there are still plenty of ways to re-invent the wheel, and plenty of opportunities for greater collaboration and learning in the space. Most organizations are having to go it alone in terms of procuring and managing devices, training and supporting field teams, designing and monitoring data-collection activities, organizing and managing collected data, and more. Some larger international organizations that adopted digital technologies early have built up impressive institutional capacity, but every organization still has its gaps and challenges: later adopters don’t have that historical capacity to draw from, and smaller organizations don’t have the same kind of centralized institutional capacity.
  • Fortunately, MERL Tech organizers and participants like Oxfam GB and World Bank DIME have not only built tremendous internal capacity, but also been extremely generous in thinking through how to share that capacity with others. They share via their blogs and participation in conferences like this, and they are always thinking about new and more effective ways to share. That’s both heartening and inspiring.

I loved the smaller, more intimate nature of MERL Tech UK, but I have quickly come to somewhat regret that it wasn’t substantially larger. My first London day post-MERL-Tech was spent visiting with some other SurveyCTO users, including a wonderfully well-attended talk on data quality at the Zoological Society of London, a meeting with some members of Imperial College London’s Schistosomiasis Control Initiative, and a discussion about some new University of Cambridge efforts to improve data and research on rare diseases in the UK. Later today, I’ll meet with some members of the TUMIKIA project team at the London School of Hygiene and Tropical Medicine, and I now wish that all of these others had been at MERL Tech. I’m trying to share lessons as best I can, but it’s obvious that so many other organizations could both contribute to and profit from the kinds of conversations and sharing that were happening at MERL Tech.

Personally, I’ve always been distrustful of product user conferences as narrow, ego-driven, sales-and-marketing kinds of affairs, but I’m suddenly seeing how a SurveyCTO user conference could make real (social) sense. Our users are doing such incredible things, learning so much in the process, building up so much capacity – and so many of them are also willing to share generously with others. The key is providing mechanisms for that sharing to happen. At Dobility, we’ve just kept our heads down and stayed focused on providing and supporting affordable, accessible technology, but now I’m seeing that we could play a greater role in facilitating progress in the space. With thousands of SurveyCTO projects now in over 130 countries, the amount of learning – and the potential social benefit of sharing more – is enormous. We’ll have to think about how we can get better and better at helping. And please comment here if you have ideas for us!

Thanks again to Oxfam GB, Comic Relief, and everybody else who made MERL Tech UK possible. It was a wonderful event.

How to Develop and Implement Responsible Data Policies


A friend reminded me at the MERL Tech Conference that a few years ago when we brought up the need for greater attention to privacy, security and ethics when using ICTs and digital data in humanitarian and development contexts, people pointed us to Tor, encryption and specialized apps. “No, no, that’s not what we mean!” we kept saying. “This is bigger. It needs to be holistic. It’s not just more tools and tech.”

So, even if as a sector we are still struggling to understand and address all the different elements of what’s now referred to as “Responsible Data” (thanks to the great work of the Engine Room and key partners), at least we’ve come a long way towards framing and defining the areas we need to tackle. We also understand the increasing urgency of the issue: the volume of data in the world is growing exponentially, and the data in our sector is becoming more and more digitized.

This year’s MERL Tech included several sessions on Responsible Data, including Responsible Data Policies, the Human Element of the Data Cycle, The Changing Nature of Informed Consent, Remote Monitoring in Fragile Environments and plenary talks that mentioned ethics, privacy and consent as integral pieces of any MERL Tech effort.

The session on Responsible Data Policies was a space to share with participants why, how, and what policies some organizations have put in place in an attempt to be more responsible. The presenters spoke about the different elements and processes their organizations have followed, and the reasoning behind the creation of these policies. They spoke about early results from the policies, though it is still early days when it comes to implementing them.

What do we mean by Responsible Data?

Responsible data is about more than just privacy or encryption. It’s a wider concept that includes attention to the data cycle at every step, and puts the rights of people reflected in the data first:

  • Clear planning and purposeful collection and use of data with the aim of improving humanitarian and development approaches and results for those we work with and for
  • Responsible treatment of the data and respectful and ethical engagement with people we collect data from, including privacy and security of data and careful attention to consent processes and/or duty of care
  • Clarity on data sharing – what data, from whom and with whom and under what circumstances and conditions
  • Attention to transparency and accountability efforts in all directions (upwards, downwards and horizontally)
  • Responsible maintenance, retention or destruction of data.

Existing documentation and areas to explore

There is a huge bucket of concepts, frameworks, laws and policies that already exist in various other sectors and that can be used, adapted and built on to develop responsible approaches to data in development and humanitarian work. Some of these are in conflict with one another, however, and those conflicts need to be worked out or at least recognized if we are to move forward as a sector and/or in our own organizations.

Some areas to explore when developing a Responsible Data policy include:

  • An organization’s existing policies and practices (IT and equipment; downloading; storing of official information; confidentiality; monitoring, evaluation and research; data collection and storage for program administration, finance and audit purposes; consent and storage for digital images and communications; social media policies).
  • Local and global laws that relate to collection, storage, use and destruction of data, such as: Freedom of information acts (FOIA); consumer protection laws; data storage and transfer regulations; laws related to data collection from minors; privacy regulations such as the latest from the EU.
  • Donor grant requirements related to data privacy and open data, such as USAID’s ADS Chapter 579 or International Aid Transparency Initiative (IATI) stipulations.

Experiences with Responsible Data Policies

At the MERL Tech Responsible Data Policy session, organizers and participants shared their experiences. The first step for everyone developing a policy was establishing wide agreement and buy-in for why their organizations should care about Responsible Data. This was done by developing Values and Principles that form the foundation for policies and guidance.

Oxfam’s Responsible Data policy has a focus on rights, since Oxfam is a rights-based organization. The organization’s existing values made it clear that ethical use and treatment of data was something the organization must consider to hold true to its ethos.

It took around six months to get all of the global affiliates to agree on the Responsible Program Data policy, a quick turnaround compared to other globally agreed documents, because all of the global executive directors recognized that this policy was critical.

A core point for Oxfam was the belief that digital identities and access will become increasingly important for inclusion in the future, and so the organization did not want to stand in the way of people being counted and heard. However, it wanted to be sure that this was done in a way that balanced inclusion with privacy and security considerations.

The policy is a short document that is now in the process of being operationalized in all the countries where Oxfam works. Because many of Oxfam’s affiliate headquarters reside in the European Union, it needs to consider the new EU regulations on data, which are extremely strict, requiring, for example, that everyone be provided with an option to withdraw consent.

This poses a challenge for development agencies, which normally do not keep the kind of detailed databases on ‘beneficiaries’ that they keep on private donors. Shifting thinking about ‘beneficiaries’ and treating them more as clients may be in order as one result of these new regulations. As Oxfam moves into implementation, challenges continue to arise.

For example, data protection in Yemen is different than data protection in Haiti. Knowing all the national level laws and frameworks and mapping these out alongside donor requirements and internal policies is extremely complicated, and providing guidance to country staff is difficult given that each country has different laws.

Girl Effect’s policy has a focus on privacy, security and safety of adolescent girls, who are the core constituency of the organization.

The policy became clearly necessary because although the organization had a strong girl safeguarding policy and practice, the effect of digital data had not previously been considered, and the number of programs that involve digital tools and data is increasing. The Girl Effect policy currently has four core chapters: privacy and security during design of a tool, service or platform; content considerations; partner vetting; and MEAL considerations.

Girl Effect looks at not only the privacy and security elements, but also aims to spur thinking about potential risks and unintended consequences for girls who access and use digital tools, platforms and content. One core goal is to stimulate implementers to think through a series of questions that help them to identify risks. Another is to establish accountability for decisions around digital data.

The policy has been in the process of implementation with one team for a year and will be updated and adapted as the organization learns. It has seen good uptake so far from team members and partners, and has become core to how the teams and the wider organization think about digital programming. Cost and time for implementation increase as stricter policies are incorporated, however, and it is challenging to find a good balance between privacy and security, the ability to safely collect and use data to adapt and improve tools and platforms, and user friendliness/ease of use.

Catholic Relief Services has an existing set of eight organizational principles: Sacredness and Dignity of the human person; Rights and responsibilities; Social Nature of Humanity; The Common Good; Subsidiarity; Solidarity; Option for the Poor; Stewardship.

It was a natural fit to see how these values that are already embedded in the organization could extend to the idea of Responsible Data. Data is an extension of the human person, therefore it should be afforded the same respect as the individual. The principle of ‘common good’ easily extends to responsible data sharing.

The notion of subsidiarity says that decision-making should happen as close as possible to the place where the impact of the decision will be the strongest, and this is nicely linked with the idea of sharing data back with communities where CRS works and engaging them in decision-making. The option for the poor urges CRS to place a preferential value on privacy, security and safety of the data of the poor over the data demands of other entities.

The organization is at the initial phase of creating its Responsible Data Policy. The process includes the development of the values and principles, two country learning visits to understand the practices of country programs and their concerns about data, development of the policy, and a set of guidelines to support staff in following the policy.

USAID recently embarked on its process of developing practical Responsible Data guidance to pair with its efforts in the area of open data (see ADS 579). More information on this initiative will be available soon.

Where are we now?

Though several organizations are moving towards the development of policies and guidelines, it was clear from the session that uncertainties are the order of the day, as Responsible Data is an ethical question, often relying on tradeoffs and decisions that are not hard and fast. Policies and guidelines generally aim to help implementers ask the right questions, sort through a range of possibilities and weigh potential risks and benefits.

Another critical aspect raised at the MERL Tech session was the financial and staff resources that can be required to be responsible about data. On the other hand, for organizations receiving funds from the European Union or residing in the EU or the UK (where, despite Brexit, organizations will likely need to comply with EU privacy regulations), the new regulations mean that NOT being responsible about data may result in hefty fines and potential legal action.

Going from policy to implementation is a challenge that involves both capacity strengthening in this new area as well as behavior change and a better understanding of emerging concepts and multiple legal frameworks. The nuances by country, organization and donor make the process difficult to get a handle on.

Because staff and management are already overburdened, the trick to developing and implementing Responsible Data Policies and Practice will be finding ways to strengthen staff capacity and to provide guidance in ways that do not feel overwhelmingly complex. Though each situation will be different, finding ongoing ways to share resources and experiences so that we can advance as a sector will be one key step for moving forward.

This post was written with input from Maliha Khan, Independent Consultant; Emily Tomkys, Oxfam GB; Siobhan Green, Sonjara and Zara Rahman, The Engine Room.