
5 tips for operationalizing Responsible Data policy

By Alexandra Robinson and Linda Raftree

MERL and development practitioners have long wrestled with complex ethical, regulatory, and technical aspects of adopting new data approaches and technologies. The topic of responsible data (RD) has gained traction over the past 5 years or so, and a handful of early adopters have developed and begun to operationalize institutional RD policies. Translating policy into practical action, however, can feel daunting to organizations. Constrained budgets, complex internal bureaucracies, and ever-evolving technology and regulatory landscapes make it hard to even know where to start.

The Principles for Digital Development provide helpful high level standards, and donor guidelines (such as USAID’s Responsible Data Considerations) offer additional framing. But there’s no one-size-fits-all policy or implementation plan that organizations can simply copy and paste in order to tick all the responsible data boxes. 

We don’t think organizations should do that anyway, given that each organization’s context and operating approach is different, and policy means nothing if it’s not rolled out through actual practice and behavior change!

In September, we hosted a MERL Tech pre-workshop on Operationalizing Responsible Data to discuss and share different ways of turning responsible data policy into practice. Below we’ve summarized some tips shared at the workshop. RD champions in organizations of any size can consider these when developing and implementing RD policy.

1. Understand Your Context & Extend Empathy

  • Before developing policy, conduct a non-punitive assessment (a.k.a. a landscape assessment, self-assessment or staff research process) on existing data practices, norms, and decision-making structures. This should engage everyone who will be using or be affected by the new policies and practices. Help everyone relax and feel comfortable sharing how they’ve been managing data up to now so that the organization can then improve. (Hint: avoid the term ‘audit,’ which makes everyone nervous.)
  • Create ‘safe space’ to share and learn through the assessment process:
    • Allow staff to speak anonymously about their challenges and concerns whenever possible
    • Highlight and reinforce promising existing practices
    • Involve people in a ‘self-assessment’
    • Use participatory workshops (e.g. work with a team to map a project’s data flows or conduct a Privacy Impact Assessment or a Risk-Benefits Assessment) – this allows everyone who participates to gain RD awareness while also learning new practical tools, along with highlighting any areas that need attention. The workshop lead or “RD champion” can also then get a better sense of the wider organization’s knowledge, attitudes, and practices related to RD.
    • Acknowledge (and encourage institutional leaders to affirm) that most staff don’t have “RD expert” written into their JDs; reinforce that staff will not be ‘graded’ or evaluated on skills they weren’t hired for.
  • Identify organizational stakeholders likely to shape, implement, or own aspects of RD policy and tailor your engagement strategies to their perspectives, motivations, and concerns. Some may feel motivated financially (avoiding fines or the cost of a data breach); others may be motivated by human rights or ethics; while others might be most concerned with RD as it relates to reputation, trust, funding, and PR.
  • Map organizational policies, major processes (like procurement, due diligence, grants management), and decision-making structures to assess how RD policy can be integrated into these existing activities.

2. Consider Alternative Models to Develop RD Policy 

  • There is no ‘one size fits all’ approach to developing RD policy. As the (still small, but promising) number of organizations adopting policy grows, different approaches are emerging. Here are some that we’ve seen:
    • Top-down: An institutional-level policy is developed, normally at the request of someone on the leadership team/senior management. It is then adapted and applied across projects, offices, etc. 
      • Works best when there is strong leadership buy-in for RD policy and a focal point (e.g. an ‘Executive Sponsor’) coordinating policy formation and navigating stakeholders
    • Bottom-up: A group of staff are concerned about RD but do not have support or interest from senior leadership, so they ‘self-start’ the learning process and begin shaping their own practices, joining together, meeting, and communicating regularly until they have wider buy-in and can approach leadership with a use case and budget request for an organization-wide approach.
      • Good option if there is little buy-in at the top and you need to build a case for why RD matters.
    • Project- or Team-Generated: Development and application of RD policies are piloted within one or more targeted projects or on a single team. Based on this smaller slice of the organization, the project or team documents its challenges, process, and lessons learned to build momentum for and inform the development of future organization-wide policy.
      • Promising option when organizational awareness and buy-in for RD is still nascent and/or resources to support RD policy formation and adoption (staff, financial, etc.) are limited.
    • Hybrid approach: Organizational policy/policies are developed through pilot testing across a reasonably representative sample of projects or contexts. For example, an organization with diverse programmatic and geographical scope develops and pilots policies in a select set of country offices that can offer different learning and experiences; e.g., a humanitarian-focused setting, a development-focused setting, and a mixed setting; a small, a medium-sized, and a large office; 3-4 offices in different regions; offices that are funded in various ways; etc.
      • Promising option when an organization is highly decentralized and works across diverse country contexts and settings. Supports the development of approaches that are relevant and responsive to diverse capacities and data contexts.

3. Couple Policy with Practical Tools, and Pilot Tools Early and Often

  • In order to translate policy into action, couple it with practical tools that support existing organizational practices. 
  • Make sure tools and processes empower staff to make decisions and relate clearly to policy standards or components; for example:
    • If the RD policy includes a high-level standard such as, “We ensure that our partnerships with technology companies align with our RD values,” give staff tools and guidance to assess that alignment. 
  • When developing tools and processes, involve target users early and iteratively. Don’t worry if draft tools aren’t perfectly formatted. Design with users to ensure tools are actually useful before you sink time into tools that will sit on a shelf at best, and confuse or overburden staff at worst. 

4. Integrate and “Right-Size” Solutions 

  • As RD champions, it can be tempting to approach RD policy in a silo, forgetting it is one of many organizational priorities. Be careful to integrate RD into existing processes, align RD with decision-making structures and internal culture, and do not place unrealistic burdens on staff.
  • When building tools and processes, work with stakeholders to develop responsibility assignment charts (e.g. RACI, MOCHA) and determine decision makers.
  • When developing responsibility matrices, estimate the hours each stakeholder (including partners, vendors, and grantees) will dedicate to a particular tool or process. Work with anticipated end users to ensure that processes:
    • Can realistically be carried out within a normal workload
    • Will not excessively burden staff and partners
    • Are realistically proportionate to the size, complexity, and risk involved in a particular investment or project

5. Bridge Policy and Behavior Change through Accompaniment & Capacity Building 

  • Integrating RD policy and practices requires behavior change and can feel technically intimidating to staff. Remember to reassure staff that no one (not even the best-resourced technology firms!) has responsible data mastered, and that perfection is not the goal.
  • In order to feel confident using new tools and approaches to make decisions, staff need knowledge to analyze information. Skills and knowledge required will be different according to role, so training should be adapted accordingly. While IT staff may need to know the ins and outs of network security, general program officers certainly do not. 
  • Accompany staff as they integrate RD processes into their work. Walk alongside them, answering questions along the way, but more importantly, helping staff build confidence to develop their own internal RD compass. That way the pool of RD champions will grow!

What approaches have you seen work in your organization?

Data Security and Privacy – MERL Tech presentation spurs action

By Stacey Berlow of Project Balance. The original was posted on Project Balance’s blog.

I had the opportunity to attend MERL Tech (September 7-8, 2017, Washington, DC). I was struck by the number of very thoughtful and content-driven sessions. Coming from an IT/technology perspective, it was so refreshing to hear about the intersection of technology and humanitarian programs and how technology can provide the tools and data to positively impact decision making.
One of the sessions, “Big data, big problems, big solutions: Incorporating responsible data principles in institutional data management” was particularly poignant. The session was presented by Paul Perrin from University of Notre Dame, Alvaro Cobo & Jeff Lundberg from Catholic Relief Services and Gillian Kerr from LogicalOutcomes. The overall theme of the presentation was that in the field of evaluation and ICT4D, we must be thoughtful, diligent and take responsibility for protecting people’s personal and patient data; the potential risk for having a data breach is very high.


Paul started the session by highlighting the fact that data breaches which expose our personal data, credit card information, and health information have become a common occurrence. He then brought the conversation back to monitoring and evaluation and research, and the gray area between the two that leads to confusion about data privacy. Paul’s argument is that when evaluation data is later used for research without proper approval from those receiving services, the risk of misuse and incorrect data handling increases significantly.

Alvaro and Jeff talked about a CRS data warehousing project and how they have made data security and data privacy a key focus. The team looked at the data lifecycle – repository design, data collection, storage, utilization, sharing and retention/destruction – and they are applying best data security practices throughout. And finally, Gillian described the very concerning situation that at NGOs, M&E practitioners may not be aware of data security and privacy best practices, or may lack the funds to meet minimum security standards, and so leave this critical data aspect behind as “too complicated to deal with.”

The presentation team advocates for the following:

  • Deeper commitment to informed consent
  • Reasoned use of identifiers
  • Need to know vs. nice to know
  • Data security and privacy protocols
  • Data use agreements and protocols for outside parties
  • Revisit NGO primary and secondary data IRB requirements

This message resonated with me in a surprising way. Project Balance specializes in developing data collection applications, data warehousing and data visualization. When we embark on a project we are careful to make sure that sensitive data is handled securely and that client/patient data is de-identified appropriately. We make sure that client data can only be viewed by those who should have access, and that tables or fields within tables that hold identifying information are encrypted. Encryption is used for internet data transmission and, depending on the application, the entire database may be encrypted. And in some cases the data capture form that holds a client’s personal and identifying information may require the user of the system to log in again.
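The post doesn’t include code, but a minimal sketch of one of these practices, replacing direct identifiers with keyed pseudonyms so records stay linkable without exposing raw values, might look like this in Python (the key, the record fields, and the `pseudonymize` helper are illustrative assumptions, not Project Balance’s actual implementation):

```python
import hashlib
import hmac

# Hypothetical illustration of field-level de-identification: direct
# identifiers are replaced with a keyed pseudonym so records can still
# be linked across tables without storing the raw value.

SECRET_KEY = b"keep-this-key-outside-the-database"  # placeholder key


def pseudonymize(value: str) -> str:
    """Return a stable, non-reversible token for an identifying field."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()


record = {"patient_id": "MRN-00123", "district": "Lusaka", "weight_kg": 64}

deidentified = {
    "patient_token": pseudonymize(record["patient_id"]),  # linkable, not readable
    "district": record["district"],  # retained: needed for analysis
    "weight_kg": record["weight_kg"],
}
```

HMAC-based pseudonyms are stable (the same input always yields the same token), which preserves linkage across tables, while the secret key, held outside the database, prevents re-identification by anyone who only has database access.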

After hearing the presentation I realized Project Balance could do better. As part of our regular software requirements management process, we will now create a separate and specialized data security and privacy plan document, which will enhance our current process. By making this a defined requirements gathering step, the importance of data security and privacy will be highlighted and will help our customers address any gaps that are identified before the system is built.

Many thanks to the session presenters for bringing this topic to the fore and for inspiring me to improve our engagement process!

Tools, tips and templates for making Responsible Data a reality

by David Leege, CRS; Emily Tomkys, Oxfam GB; Nina Getachew, mSTAR/FHI 360; and Linda Raftree, Independent Consultant/MERL Tech; who led the session “Tools, tips and templates for making responsible data a reality.”

The data lifecycle.

For this year’s MERL Tech DC, we teamed up to do a session on Responsible Data. Based on feedback from last year, we knew that people wanted less discussion on why ethics, privacy and security are important, and more concrete tools, tips and templates. Though it’s difficult to offer specific do’s and don’ts, since each situation and context needs individualized analysis, we were able to share a lot of the resources that we know are out there.

To kick off the session, we quickly explained what we meant by Responsible Data. Then we handed out some cards from Oxfam’s Responsible Data game and asked people to discuss their thoughts in pairs. Some of the statements that came up for discussion included:

  • Being responsible means we can’t openly share data – we have to protect it
  • We shouldn’t tell people they can withdraw consent for us to use their data when in reality we have no way of doing what they ask
  • Biometrics are a good way of verifying who people are and reducing fraud

Following the card game, we asked people to gather around 4 tables with a die and a printout of the data lifecycle where each phase corresponded to a number (planning = 1, collecting = 2, storage = 3, and so on…). Each person rolled the die and, based on their number, told a “data story” of an experience, concern or data failure related to that phase of the lifecycle. Then the group discussed the stories.

For our last activity, each of us took a specific pack of tools, templates and tips and rotated around the 4 tables to share experiences and discuss practical ways to move towards stronger responsible data practices.

Responsible data values and principles

David shared Catholic Relief Services’ process of developing a responsible data policy, which they started in 2017 by identifying core values and principles and how they relate to responsible data. This was based on national and international standards such as the Humanitarian Charter including the Humanitarian Protection Principles and the Core and Minimum Standards as outlined in Sphere Handbook Protection Principle 1; the Protection of Human Subjects, known as the “Common Rule” as laid out in the Department of Health and Human Services Policy for Protection of Human Research Subjects; and the Digital Principles, particularly Principle 8 which mandates that organizations address privacy and security.

As a Catholic organization, CRS follows the principles of Catholic social teaching, which directly relate to responsible data in the following ways:

  • Sacredness and dignity of the human person – we will respect and protect an individual’s personal data as an extension of their human dignity;
  • Rights and responsibilities – we will balance the right to be counted and heard with the right to privacy and security;
  • Social nature of humanity – we will weigh the benefits and risks of using digital tools, platforms and data;
  • Common good – we will open data for the common good only after minimizing the risks;
  • Subsidiarity – we will prioritize local ownership and control of data for planning and decision-making;
  • Solidarity – we will work to educate, inform, and engage our constituents in responsible data approaches;
  • Option for the poor – we will take a preferential option for protecting and securing the data of the poor; and
  • Stewardship – we will responsibly steward the data that is provided to us by our constituents.

David shared a draft version of CRS’ responsible data values and principles.

Responsible data policy, practices and evaluation of their roll-out

Oxfam released its Responsible Program Data Policy in 2015. Since then, they have carried out six pilots to explore how to implement the policy in a variety of countries and contexts. Emily shared information on these pilots and the results of research carried out by the Engine Room called Responsible Data at Oxfam: Translating Oxfam’s Responsible Data Policy into practice, two years on. The report concluded that staff who have engaged with Oxfam’s Responsible Data Policy find it both practically relevant and important. One of the recommendations of this research showed that Oxfam needed to increase uptake amongst staff and provide an introductory guide to the area of responsible data.

In response, Oxfam created the Responsible Data Management pack, (available in English, Spanish, French and Arabic), which included the game that was played in today’s session along with other tools and templates. The card game introduces some of the key themes and tensions inherent in making responsible data decisions. The examples on the cards are derived from real experiences at Oxfam and elsewhere, and they aim to generate discussion and debate. Oxfam’s training pack also includes other tools, such as advice on taking photos, a data planning template, a poster of the data lifecycle and general information on how to use the training pack. Emily’s session also encouraged discussion with participants about governance and accountability issues like who in the organisation manages responsible data and how to make responsible data decisions when each context may require a different action.

Emily shared the following resources:

A packed house for the responsible data session.

Responsible data case studies

Nina shared early results of four case studies mSTAR is conducting together with Sonjara for USAID. The case studies are testing a draft set of responsible data guidelines, determining whether they are adequate for ‘on the ground’ situations and if projects find them relevant, useful and usable. The guidelines were designed collaboratively, based on a thorough review and synthesis of responsible data practices and policies of USAID and other international development and humanitarian organizations. To conduct the case studies, Sonjara, Nina and other researchers visited four programs which are collecting large amounts of potentially sensitive data in Nigeria, Kenya and Uganda. The researchers interviewed a broad range of stakeholders and looked at how the programs use, store, and manage personally identifiable information (PII). Based on the research findings, adjustments are being made to the guidelines. It is anticipated that they will be published in October.

Nina also talked about CALP/ELAN’s data sharing tipsheets, which include a draft data-sharing agreement that organizations can adapt to their own contracting documents. She circulated a handout which identifies the core elements of the Fair Information Practice Principles (FIPPs) that are important to consider when using PII.

Responsible data literature review and guidelines

Linda mentioned that a literature review of responsible data policy and practice has been done as part of the above-mentioned mSTAR project (which she also worked on). The literature review will provide additional resources and analysis, including an overview of the core elements that should be included in organizational data guidelines, an overview of USAID policy and regulations, emerging legal frameworks such as the EU’s General Data Protection Regulation (GDPR), and good practice on how to develop guidelines in ways that enhance uptake and use. The hope is that both the Responsible Data Literature Review and the Responsible Data Guidelines will be suitable for adopting and adapting by other organizations. The guidelines will offer a set of critical questions and orientation, recognizing that ethical and responsible data practices will always be context-specific and cannot be a “check-box” exercise, given the complexity of all the elements that combine in each situation.

Linda also shared some tools, guidelines and templates that have been developed in the past few years, such as Girl Effect’s Digital Safeguarding Guidelines, the Future of Privacy Forum’s Risk-Benefits-Harms framework, and the World Food Program’s guidance on Conducting Mobile Surveys Responsibly.

More tools, tips and templates

Check out this responsible data resource list, which includes additional tools, tips and templates. It was developed for MERL Tech London in February 2017 and we continue to add to it as new documents and resources come out. After a few years of advocating for ‘responsible data’ at MERL Tech to less-than-crowded sessions, we were really excited to have a packed room and high levels of interest this year!