
Being data driven… can it be more than a utopia?

by Emily Tomkys Valteri, the ICT in Programme Accountability Project Manager at Oxfam GB. In her role, Emily drives Oxfam’s thinking on the use of Information and Communications Technologies (ICT) for accountability and supports staff in applying ICTs in their work.

Every day the human race generates enough data to fill 10 million Blu-ray discs; stacked up, they would reach four times the height of the Eiffel Tower. Although the data we process at Oxfam is tiny in comparison, the journey towards being “data driven” sometimes feels like following the yellow brick road to the Emerald City. It seems like a grand ideal, but anyone who knows the film can guess that inflated expectations are set to be dashed. Does data actually help organisations like Oxfam better understand the needs of communities affected by disaster or poverty? Or do we need to pull back the curtain and manage our expectations about getting the basics right? When there are no ruby slippers, we need to understand what we can do with data and improve the way data is managed and analysed across countries and projects.

The problem

Oxfam works in over 90 countries, using a variety of data management and analysis tools that are developed or purchased in country. In the past we have experimented with software licences and database expertise, but we are now aiming for a more joined-up approach. It is our belief that good systems which build in privacy by design can help us stay true to the values in our rights-based Responsible Program Data Policy and Information Systems Data Security guidelines – values about treating the people the data is about with dignity and respect.

One of our most intractable challenges is that Oxfam’s data is analysed in system silos. Data is usually collected and viewed through a project-level lens, and different formats and data standards make it difficult to compare across countries, regions or even globally. While data remains in source systems, analysing across them is slow and manual, meaning that meta-analysis is rarely done. One of the key tenets of Responsible Data is to collect only the data you can use and to make the most of that information to effectively meet people’s needs. Oxfam collects a lot of valuable data, and we think we need to do more with it: analyse it more efficiently and effectively, at national level and beyond, to drive decision making in our programmes.

The solution

In response, Oxfam has begun creating the DataHub: a system which integrates programme data into a standard set of databases and presents it to a reporting layer for analysis, baking in privacy and compliance with new data protection laws by design. Working with our in-house agile software development team, we conducted four two-week tech sprints, and we now have the foundations. Data from one of our standard collection tools, SurveyCTO, is pushed via a webhook into our unstructured database, Azure Cosmos DB. Within this database, the data is organised into collections, currently set up by country. From there, the data can be queried using Power BI and presented to programme teams for analysis. Although we have only one source system feeding quantitative analysis for now, the bigger picture will include many source systems and a variety of analysis options.
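To make the integration step concrete, here is a minimal sketch of what a webhook receiver along these lines could look like, written in Python with Flask and the azure-cosmos SDK. The endpoint path, payload field names and container naming are illustrative assumptions, not Oxfam’s actual configuration:

```python
from flask import Flask, request
from azure.cosmos import CosmosClient

app = Flask(__name__)

# Connect to the Cosmos DB account (placeholder credentials).
client = CosmosClient("https://<account>.documents.azure.com", credential="<key>")
database = client.get_database_client("datahub")

@app.route("/surveycto-webhook", methods=["POST"])
def receive_submission():
    submission = request.get_json()
    # Route each record to a per-country collection, as described above.
    # "country" is a hypothetical field name in the survey payload.
    country = submission.get("country", "unknown")
    container = database.get_container_client(country)
    # Cosmos DB items need a unique "id"; reuse the submission's own
    # unique key (SurveyCTO exports carry a "KEY" field).
    submission["id"] = submission.get("KEY", "")
    container.upsert_item(submission)
    return "", 204
```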

To get to where we are today, Oxfam’s ICT in Programme team worked closely with the Information Systems teams to develop a solution in line with our strategy and with future trends. Although the technology was new to Oxfam, the solution is relatively simple, and we made sure the process was sound, the components interoperable and the tools available to us fit for purpose. This collaborative approach gave us the organisational support to prioritise these activities, as well as the resources required to carry them out.

This journey wasn’t without its challenges, some of which are still being worked on. The EU General Data Protection Regulation (GDPR) comes into force in May 2018, and Oxfam has had to design the DataHub with this in mind. At this stage, data is anonymised during integration, so no Personally Identifiable Information (PII) enters the DataHub, thanks to a series of configurations and processes we have put in place. Training and capacity are another challenge: we need to encourage a culture of valuing data. The DataHub will only benefit teams and the organisation if they make use of it, investing time and resources in learning it.
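As a sketch of what that anonymisation step can look like in practice, the function below drops direct identifiers and keeps only a salted one-way hash, so repeat submissions can still be linked without exposing who a person is. The list of PII fields is an illustrative assumption; a real pipeline would drive this from the survey’s own metadata rather than a hard-coded set:

```python
import hashlib

# Hypothetical identifier fields; in practice these would come from
# the survey definition, not a hard-coded list.
PII_FIELDS = {"respondent_name", "respondent_id", "phone_number",
              "gps_location", "address"}

def anonymise(record: dict, salt: str) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    clean = {key: value for key, value in record.items()
             if key not in PII_FIELDS}
    # Keep a salted one-way hash of the respondent key so repeat
    # submissions can be linked without storing the identity itself.
    if "respondent_id" in record:
        digest = hashlib.sha256((salt + str(record["respondent_id"])).encode())
        clean["respondent_hash"] = digest.hexdigest()
    return clean
```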

We are excited about the potential of the DataHub and the success we have already had in setting up the infrastructure to enable more efficient data analysis and more responsive programming, as well as saving resources. We are keen to work with others and share ideas. We know there is a lot of work ahead to foster a data-driven organisation, but we’re starting to feel that, with the right balance of technology, process and culture, it’s more realistic than we might have first hoped.

We have a data problem

by Emily Tomkys, ICT in Programmes at Oxfam GB

Following my presentation at MERL Tech, I have realised that it’s not only Oxfam who have a data problem; many of us have a data problem. In the humanitarian and development space, we collect a lot of data – whether via mobile phone or a paper process, the amount of data each project generates is staggering. Some of this data goes into our MIS (Management Information Systems), but all too often data remains in Excel spreadsheets on computer hard drives, unconnected cloud storage systems or Access and bespoke databases.

(Watch Emily’s MERL Tech London Lightning Talk!)

This is an issue because the majority of our programme data is analysed in silos, on a survey-to-survey and, at best, a project-to-project basis. What about when we want to analyse data between projects, between countries, or even globally? It would currently take a lot of time and resources to bring the data together in usable formats. Furthermore, issues of data security, limited support for country teams, data standards and the cost of systems and support mean there is a sustainability problem that it is in many people’s interest to solve.

The demand from Oxfam’s country teams is high – one of the most common requests the ICT in Programme team receives centres on databases and data analytics. Teams want to be able to store and analyse their data easily and safely, and there is growing demand for cross-border analytics. Our humanitarian managers want to see statistics on the type of feedback we receive globally. Our livelihoods team wants to be able to monitor market prices at national and regional scale. This motivated us to look for a data solution, but it’s something we know we can’t take on alone.

That’s why MERL Tech represented a great opportunity to check in with peers about potential solutions and areas for collaboration. For now, our proposal is to design a data hub where, no matter what type of data we have (unstructured, semi-structured or structured) and no matter how we collect it (with mobile data collection tools or on paper), our data can be integrated into a database. This isn’t about creating new tools – rather it’s about focusing on interoperability and smooth transitions between tools and storage options. We plan to set this up so data can be pulled through into a reporting layer offering a mixture of options for quantitative analysis, qualitative analysis and GIS mapping. We also need to give our micro-programme data a home: everything in one place, regardless of source or format, and easy to pull through for analysis, as in the sketch below.
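One simple way to picture that “one place, regardless of source or format” idea is a common envelope: every incoming record, whatever its structure, is wrapped with the metadata the reporting layer needs to filter by source, country and project. This is a minimal sketch, and the field names are illustrative assumptions:

```python
import datetime
import uuid

def to_envelope(payload, source: str, country: str, project: str) -> dict:
    """Wrap any record, structured or not, in a common metadata envelope."""
    return {
        "id": str(uuid.uuid4()),    # unique key for the data store
        "source": source,           # e.g. "surveycto" or "paper_entry"
        "country": country,
        "project": project,
        "ingested_at": datetime.datetime.utcnow().isoformat(),
        "payload": payload,         # the original record, untouched
    }
```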

In this way we can explore data holistically, spot trends on a wider scale and really know more about our programmes – and act accordingly. Not only should this reduce our cost of analysis, it should also let us analyse our data more efficiently and effectively. Moreover, taking a holistic view of the data life cycle will enable us to do data protection by design, and the result will be easier to support because the process and the tools will be streamlined. We know that no single tool does or can do everything we require when we work in such varied contexts, so one challenge will be streamlining while still allowing for contextual nuance.

Sounds easy, right? We will start exploring our options and working on the data hub in the coming months. MERL Tech was a great place to make connections, but we are keen to hear how others are approaching “the data problem” and eager to set up something that other actors can use too. So please add your thoughts in the comments or get in touch if you have ideas!