Big Data, Big Responsibilities
by Catherine Gwin
Big data comes with big responsibilities: both the funder and the recipient organization have ethical and data security obligations.
Big data allows organizations to count and bring visibility to marginalized populations and to improve decision-making. However, concerns about data privacy, security, and integrity pose challenges for data collection and data preservation. What does informed consent look like in data collection? What potential risks do we bring to the populations we serve? What are the risks of compliance?
Throughout the MERL Tech DC panel, “Big Data, Big Responsibilities,” Mollie Woods, Senior Monitoring, Evaluation and Learning (MEL) Advisor of ChildFund International and Michael Roytman, Founder and Board Director of Dharma Platform, unpacked some of the challenges based on their experiences. Sam Scarpino, Dharma’s Chief Strategy Officer, served as the session moderator, posing important questions about this area.
The session highlighted three takeaways organizations should consider when approaching data security.
1) Language Barriers between Evaluators and Data Scientists
Both Roytman and Woods agreed that the divide between evaluators and data scientists stems from a lack of knowledge of each other’s field. How do you ask a question you don’t know you need to ask?
In Woods’ experience, the Monitoring and Evaluation (M&E) team and the IT team each have a role in data security, but they work independently. The fast-evolving M&E field leaves little time to stay attuned to data security needs, and the organization’s limited resources can keep the IT team from supporting programmatic data security.
A potential solution ChildFund has considered is investing in an IT person with a focus on MERL who has experience and knowledge in the international or humanitarian sphere. However, many organizations fall short when it comes to financing data security. In addition, identifying an individual with these skills can be challenging.
2) Data Collection
Data breaches expose confidential information, which puts vulnerable populations at risk of exploitative use of their data and potential harm. As we gather data, this raises the question of what informed consent looks like. Are we communicating to beneficiaries the risks of releasing their personal information?
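One practical way to limit what a breach can expose is to strip or pseudonymize direct identifiers before data is stored or shared for analysis. The sketch below is a minimal illustration of that idea, not a description of ChildFund’s or Dharma’s actual systems; the field names and key handling are hypothetical, and it uses only Python’s standard library.

```python
import hmac
import hashlib

# Hypothetical sketch: replace direct identifiers with keyed hashes before a
# record leaves the collection step, so the analysis dataset never holds names.
# The secret key must live outside the dataset (e.g. in a key vault); anyone
# holding both the key and the data could otherwise reverse the pseudonyms.
SECRET_KEY = b"replace-with-a-key-stored-outside-the-data"  # placeholder value

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for a personal identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

raw_record = {"name": "Jane Doe", "district": "Example District", "survey_score": 42}
shareable_record = {
    "participant_token": pseudonymize(raw_record["name"]),  # token, not the name
    "district": raw_record["district"],
    "survey_score": raw_record["survey_score"],
}
print(shareable_record)
```

Because the same identifier always maps to the same token, records can still be linked across survey rounds without ever storing the underlying name.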
In Woods’ experience, ChildFund approaches data security through a child-safeguarding lens across stakeholders and program participants, where all are responsible for data security. Its child safeguarding policy covers data security protocols and privacy; however, Woods noted that dissemination and implementation across countries remains an open question. Many in-country civil society organizations lack the capacity, knowledge, and resources to implement data security protocols, especially if they work in a country that has no laws, regulations, or frameworks related to data security and privacy. ChildFund is currently advocating for refresher trainings on the policy so that everyone involved in its global partnerships stays current with organizational changes.
3) Data Preservation
Data breaches are a privacy concern when an organization’s data includes individuals’ sensitive information, putting beneficiaries at risk of exploitation by bad actors. Roytman explained that specific actors, risks, and threats affect specific kinds of data, and that humanitarian aid organizations are not always a primary target. Nonetheless, this shouldn’t distract organizations from potential risks; it should open discussion around how to identify and mitigate them.
Protecting sensitive data requires a proper security system, something that not all platforms provide, especially if they are free. Ultimately, security is an investment of money and time to avoid and mitigate risks and potential threats. To increase support and investment in security, ChildFund is working with Dharma to pilot a small program that demonstrates the use of big data analytics with a built-in data security system.
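As a rough illustration of what “built-in” protection can mean in practice, the sketch below encrypts a sensitive field before it is written to shared storage. This is a hypothetical example, not Dharma’s implementation; it assumes the third-party Python `cryptography` package, and in a real deployment the key would be issued and held by a key management service rather than generated inline.

```python
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Hypothetical sketch: encrypt a sensitive survey response before storage,
# so the storage platform never holds the plaintext.
key = Fernet.generate_key()        # in practice, managed by a key service
fernet = Fernet(key)

sensitive_answer = "HIV status: positive".encode("utf-8")
token = fernet.encrypt(sensitive_answer)   # the ciphertext is what gets stored
restored = fernet.decrypt(token)           # reading it back requires the key

assert restored == sensitive_answer
```

The tradeoff is the one the panel named: confidentiality improves, but availability now depends on keeping the key both safe and reachable.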
Roytman suggested approaching ethical concerns by applying the CIA triad: Confidentiality, Integrity, and Availability. There will always be tradeoffs, he said. If we don’t properly invest in data security and mitigate potential risks, there will be additional challenges to data collection. If we don’t understand data security, how can we ensure informed consent?
Many organizations find themselves doing more harm than good due to lack of funding. Big data can be an inexpensive way to collect large quantities of data, but if it leads to harm, there is a problem. This is a complex issue to resolve; however, as Roytman concluded, the opposite of complexity is not simplicity but transparency.
See Dharma’s blog post about this session here.