Chain Reaction: How Does Blockchain Fit, if at All, into Assessments of Value for Money of Education Projects?

by Cathy Richards

In this panel, “Chain Reaction: How does blockchain fit, if at all, into assessments of value for money of education projects,” hosted by Christine Harris-Van Keuren of Salt Analytics, panelists gave examples of how they’ve used blockchain to store activity and outcomes data and to track the flow of finances. Valentine Gandhi from The Development Café served as the discussant.

Value for money analysis (or benefit-cost analysis, cost-economy, cost-effectiveness, cost-efficiency, or cost-feasibility) is defined as an evaluation of the best use of scarce resources to achieve a desired outcome. In this panel, participants examined the value for money of blockchain by taking on an aspect of an adapted value-for-money framework. The framework takes into account resources, activities, outputs, and outcomes. Panel members were specifically asked to explain what they gained and lost by using blockchain as well as whether they had to use blockchain at all.

Ben Joakim is the founder and CEO of Disberse, a new financial institution built on distributed ledger technology. Disberse aims to ensure greater privacy and security for the aid sector, which serves some of the most vulnerable communities in the world. Joakim notes that in the aid sector, traditional banks are often slow and expensive, which can be detrimental during a humanitarian crisis. In addition, traditional banks can lack transparency, which increases the potential for the mismanagement and misappropriation of funds. Disberse works to tackle those problems by creating a financial institution that is not only efficient but also transparent and decentralised, thus allowing for greater impact with available resources. Additionally, Disberse offers multi-currency accounts, foreign currency exchange, instant fund transfers, end-to-end traceability, donation capabilities, regulatory compliance, and cash transfer systems. Since its inception, Disberse has delivered pilots in several countries, including Swaziland, Rwanda, Ukraine, and Australia.

David Mikhail of UNCDF discussed the organization's use of blockchain technologies in the Nepal remittance corridor. In 2017 alone, Nepal received $6.9 billion in remittances, funds responsible for 28.4% of the country's GDP. One of the main challenges for Nepali migrant families is a lack of financial inclusion, characterized by credit interest rates as high as 30%, the lack of a documented credit history, and the lack of sufficient collateral. Families also have a difficult time building capital once they migrate: the high costs of migration, high-interest loans, non-stimulative spending that limits their ability to save and invest, and the lack of a credit history combine to make it difficult for migrants to break free of the poverty cycle. In response, the organization asked itself whether it could create a new credit product tied to remittances to provide capital and fuel domestic economic development. In theory, this solution would drive financial inclusion by channeling remittances through the formal sector. The product would not only leverage blockchain to create a documented credit history, but would also direct the flow of remittances into short- and long-term savings or credit products that would help migrants generate income and assets.
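The panel did not describe UNCDF's implementation, but the core idea of a tamper-evident remittance history can be illustrated with a hash-chained, append-only ledger. This is a hypothetical sketch: the class and field names are invented for illustration, and a real deployment would anchor entries to a distributed ledger rather than an in-memory list.

```python
import hashlib
import json


def _entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous entry's hash, chaining them."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


class RemittanceLedger:
    """Append-only record of remittance receipts.

    Each entry embeds a hash of its contents plus the previous entry's
    hash, so any later tampering with an amount or date is detectable.
    """

    def __init__(self):
        self.entries = []

    def record(self, sender: str, recipient: str, amount: float) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        entry = {"sender": sender, "recipient": recipient, "amount": amount}
        entry["hash"] = _entry_hash(
            {k: v for k, v in entry.items() if k != "hash"}, prev
        )
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Re-walk the chain and confirm every link still matches."""
        prev = "genesis"
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != _entry_hash(body, prev):
                return False
            prev = entry["hash"]
        return True
```

A consistent history of entries like these is what could stand in for the documented credit history that migrant families currently lack.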

Tara Vassefi presented on her experience at Truepic, a photo and video verification platform that aims to foster a healthy civil society by pushing back against disinformation. They do this by bolstering the value of authentic photos through the use of verified pixel data from the time of capture and through the independent verification of time and location metadata. Hashed references to time, date, location and exact pixelation are stored on the blockchain. The benefits of using this technology are that the data is immutable and it adds a layer of privacy and security to media. The downsides include marginal costs and the general availability of other technologies. Truepic has been used for monitoring and evaluation purposes in Syria, Jordan, Uganda, China, and Latin America to remotely monitor government activities and provide increased oversight at a lower cost. They’ve found that this human-centric approach, which embeds technology into existing systems, can close the trust gap currently found in society.
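The mechanics Vassefi described (hashing pixel data together with capture metadata, then storing only the digest) can be sketched in a few lines. This is a simplified illustration, not Truepic's actual pipeline: the function names are invented, and a real system would sign and anchor the digest on a blockchain at capture time.

```python
import hashlib
import json


def capture_fingerprint(pixel_bytes: bytes, metadata: dict) -> str:
    """Produce one digest covering the pixels and the capture metadata.

    Only this digest, not the photo itself, would be anchored on-chain,
    which is what adds the layer of privacy the panel mentioned.
    """
    # Canonicalise metadata so the same capture always hashes identically.
    canonical_meta = json.dumps(metadata, sort_keys=True).encode("utf-8")
    return hashlib.sha256(pixel_bytes + canonical_meta).hexdigest()


def verify_capture(pixel_bytes: bytes, metadata: dict, anchored: str) -> bool:
    """Recompute the fingerprint and compare it to the anchored digest."""
    return capture_fingerprint(pixel_bytes, metadata) == anchored
```

Because SHA-256 is collision-resistant, changing a single pixel or a single metadata field after capture produces a different digest, which is what makes the stored reference effectively immutable evidence.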

Being data driven… can it be more than a utopia?

by Emily Tomkys Valteri, the ICT in Program Accountability Project Manager at Oxfam GB. In her role, Emily drives Oxfam’s thinking on the use of Information and Communications Technologies (ICT) for accountability and supports staff with applications of ICTs within their work. 

Every day the human race generates enough data to fill 10 million Blu-ray discs; stacked up, they would reach four times the height of the Eiffel Tower. Although the data we process at Oxfam is tiny in comparison, sometimes the journey towards being "data driven" feels like following the yellow brick road to the Emerald City. It seems like a grand ideal, but for anyone who knows the film, inflated expectations are set to be dashed. Does data actually help organisations like Oxfam better understand the needs of communities affected by disaster or poverty? Or do we need to pull back the curtain and manage our expectations about getting the basics right? When there are no ruby slippers, we need to understand what we can do with data and improve the way data is managed and analysed across countries and projects.

The problem

Oxfam works in over 90 countries using a variety of data management and analysis tools that are developed or purchased in country. In the past, we have experimented with software licenses and database expertise, but we have now started aiming for a more joined-up approach. It's our belief that good systems which build in privacy by design can help us stay true to the values in our rights-based Responsible Program Data Policy and Information Systems Data Security guidelines, which are about treating the people the data is about with dignity and respect.

One of our most intractable challenges is that Oxfam's data is analysed in system silos. Data is usually collected and viewed through a project-level lens, and different formats and data standards make it difficult to compare across countries, regions or even globally. When data remains in source systems, analysing across them is slow and manual, meaning that any meta-analysis is rarely done. One of the key tenets of Responsible Data is to collect only data you can use, and to make the most of that information to effectively meet people's needs. Oxfam collects a lot of valuable data and we think we need to do more with it: analyse it more efficiently and effectively, at the national level and beyond, to drive decision making in our programmes.

The solution

In response, Oxfam has begun creating the DataHub: a system which integrates programme data into a standard set of databases and presents it to a reporting layer for analysis. It bakes in principles of privacy and compliance with new data protection laws by design. Working with our in-house agile software development team, we conducted four tech sprints, each lasting two weeks. Now we have the foundations. Data from one of our standard collection tools, SurveyCTO, is pushed via a webhook into our unstructured database, Azure Cosmos DB. Within this database, the data is organised into collections, currently set up by country. From here, the data can be queried using Power BI and presented to programme teams for analysis. Although we only have one source system feeding quantitative analysis for now, the bigger picture will have many source systems and a variety of analysis options available.
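The ingestion step above can be sketched as a small routing function. This is a hypothetical illustration, not Oxfam's code: the payload field names (`country`, `form_id`, `SubmissionDate`, `answers`) are assumptions rather than SurveyCTO's actual webhook schema, and a plain dictionary stands in for Azure Cosmos DB.

```python
from collections import defaultdict

# Stand-in for Azure Cosmos DB: one collection (list) per country,
# mirroring how the DataHub organises its collections.
datahub = defaultdict(list)


def ingest_submission(payload: dict) -> dict:
    """Route one webhook payload into its country collection.

    In production this would run behind an HTTP endpoint that the
    survey tool's webhook posts to after each submission.
    """
    country = payload.get("country", "unknown")
    document = {
        "form_id": payload.get("form_id"),
        "submitted_at": payload.get("SubmissionDate"),
        "answers": payload.get("answers", {}),
    }
    datahub[country].append(document)
    return document
```

Keeping the documents in a consistent shape at ingestion is what lets a reporting layer such as Power BI query across collections later without per-project rework.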

To get to where we are today, Oxfam's ICT in Programme team worked closely with the Information Systems teams to develop a solution in line with strategy and future trends. Despite the technology being new to Oxfam, the solution is relatively simple, and we ensured good process and interoperability, and that the tools available to us were fit for purpose. This collaborative approach gave us the organisational support to prioritise these activities as well as the resources required to carry them out.

This journey wasn't without its challenges, some of which are still being worked on. The EU General Data Protection Regulation (GDPR) came into force in May 2018, and Oxfam has had to design the DataHub with it in mind. At this stage, data is anonymised during integration, so no Personally Identifiable Information (PII) enters the DataHub, thanks to a series of configurations and processes we have put in place. Training and capacity is another challenge: we need to encourage a culture of valuing the data. The system will only benefit teams and the organisation if they make use of it, investing time and resources in learning it.
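The anonymise-during-integration step can be sketched as a configurable filter applied before any record is written to the hub. This is a minimal illustration under assumed field names; Oxfam's actual configuration and the exact PII fields it strips are not described in the post.

```python
# Illustrative, configurable list of fields treated as PII.
PII_FIELDS = {"name", "phone_number", "gps_location", "national_id"}


def anonymise(record: dict, pii_fields: set = PII_FIELDS) -> dict:
    """Return a copy of the record with configured PII fields removed.

    Running this during integration means PII never reaches the
    DataHub's storage layer, rather than being scrubbed after the fact.
    """
    return {key: value for key, value in record.items()
            if key not in pii_fields}
```

Dropping fields at the boundary, rather than inside the hub, keeps the GDPR surface area small: downstream analysis tools never see data they would otherwise need a lawful basis to process.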

We are excited about the potential of the DataHub and the success we have already had in setting up the infrastructure to enable more efficient data analysis, more responsive programming, and savings in resources. We are keen to work with others and share ideas. We know there is a lot of work ahead to foster a data-driven organisation, but we're starting to feel that, with the right balance of technology, process and culture, it's more realistic than we might have first hoped.