
An Agile MERL Manifesto

By Calum Handforth, Consultant at Agriculture, Learning and Impacts Network (ALINe)

Too often in MERL (monitoring, evaluation, research, and learning), we lead with a solution instead of focusing on the problem itself. These solutions are often detailed and comprehensive, but not always aligned with the interests of beneficiaries and projects – or the realities of contexts.

At the Agriculture, Learning and Impacts Network (ALINe), we’re exploring an agile approach to MERL: one centred on MERL users and able to generate rapid, actionable insights. It’s an iterative approach that responds to the challenges and realities of implementing MERL. It’s about learning and responding fast.

The ‘Agile’ approach has its roots in project management, where it’s usually linked to the development of digital tools. Agile was a response to what were seen to be bloated and inefficient ways of delivering software – and improvements – to users. It focuses on the importance of being user-centred. It’s about piloting and iterating to deliver products that customers need, and responding to change instead of trying to specify everything at the outset. These concepts were defined in the Agile Manifesto that launched this movement. The Agile approach is now used to design, develop and deliver a huge amount of technology, software, and digital tools.

So, should we be thinking about an Agile MERL Manifesto? And what should it contain? We’ve got three main ideas that drive much of our work:

First, put the user at the heart of MERL. We need to know our audience, and their contexts and realities. We need to build MERL tools and approaches that align with these insights, and aim for co-design wherever possible. We need to properly understand the needs of our users and the problem(s) our MERL tools need to solve. This is also the case with the results that we’re generating: are they easy to understand and presented in a helpful format; are they actionable; can they be easily validated; are they integrated into ongoing project management processes; and are they tailored to the specific roles of different users in a system? And who needs to use the data to make what decisions?

With these foundations, our MERL tools need to be working to identify the right impact – whether it’s about the ‘big numbers’ of people reached or incomes increased, or at the level of outcomes and intermediate changes. The latter are particularly useful as these insights are often more actionable from a day-to-day management or decision-making perspective. We also need to be measuring over the right timeframe to capture these impacts or changes.

Second, collect the data that matters. We’ve all seen cases where surveys or other tools have been used that ask all the questions – except the right one. So we need to strip everything back and make sure that we can get the right data for the right decisions to be made. This is where feedback systems, which we have focused on extensively, can be important. These tend to focus on asking a smaller number of questions more frequently to understand the views and perspectives of users so as to inform decision-making.

Recently, we’ve worked on the monitoring of mobile phone-delivered agricultural and nutritional information across six countries. As part of this, we ran regular ‘Rapid Feedback Surveys’ that gave the User Experience team at each of the Mobile Network Operators a platform to ask users questions about their experience with the service. This enabled actionable improvements to the service – for example, tweaking content or service design to better meet users’ needs. We’ve also been using the Progress out of Poverty Index (PPI) – a 10-question poverty measurement tool customised for more than 50 countries – to gain valuable demographic insights and ensure that the project is reaching its intended beneficiaries.

More widely, in order to understand how different agricultural technologies promoted by the public extension system in Ethiopia are working out for farmers, we developed a lightweight tool called the ‘technology tracker’ to gather perceptual feedback from smallholder farmers about these technologies. The tool asks a short set of questions on key dimensions of technology performance (ease of understanding, cost of materials, labour requirements, production quantity and quality, and profitability), along with the major challenges faced. This allows government workers to easily compare different technologies and diagnose the key problems to be addressed to make them more successful.
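To make that comparison concrete, here is a minimal sketch of the idea – the technology names, dimension labels and 1–5 ratings below are entirely hypothetical illustrations, not the tracker’s actual questions or scales. It averages each dimension per technology and flags the lowest-scoring dimension as the likely problem to address:

```python
from statistics import mean

# Hypothetical perceptual feedback records from smallholder farmers.
# Each record rates one technology on illustrative dimensions (1-5 scale).
responses = [
    {"technology": "row planting", "ease": 4, "cost": 2, "labour": 3, "yield": 5},
    {"technology": "row planting", "ease": 5, "cost": 3, "labour": 2, "yield": 4},
    {"technology": "improved seed", "ease": 3, "cost": 2, "labour": 4, "yield": 4},
]

DIMENSIONS = ("ease", "cost", "labour", "yield")

def summarise(records):
    """Average each dimension per technology so scores can be compared."""
    by_tech = {}
    for r in records:
        by_tech.setdefault(r["technology"], []).append(r)
    return {
        tech: {dim: mean(r[dim] for r in recs) for dim in DIMENSIONS}
        for tech, recs in by_tech.items()
    }

summary = summarise(responses)
# The lowest-scoring dimension flags the key problem for each technology.
for tech, scores in summary.items():
    weakest = min(scores, key=scores.get)
    print(tech, "- weakest dimension:", weakest)
```

In practice the same aggregation could run on a tablet or a simple spreadsheet; the point is that a handful of structured questions is enough to rank technologies and surface their weakest dimension.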

These ideas are gaining increased traction in international development, as in the case of the Lean Data approach being explored by the impact investment fund, Acumen.

Third, be responsive to change. Methodologies don’t always work out. So adapt to the challenges thrown at you, and understand that methodologies shouldn’t be static – we need continual refinement to ensure that we’re always measuring the things that matter as problems and realities shift. We need to treat iteration as central to the process of developing MERL tools and systems, rather than clinging to a big-design-up-front approach. However, since figuring out what the right data is can be complex, starting with something simple but useful and iterating to refine it can help. In Agile, there’s the concept of the Minimum Viable Product – what’s your basic offering that generates suitable value for your customers? In Agile MERL, what should be the Minimum Viable Tool to get the insights that we need? It’s about starting with lightweight practical tools that solve immediate problems and generate value, rather than embarking on a giant and unwieldy system before we’ve managed to gain any traction and demonstrate value.

Agile MERL is about both the design of MERL systems and the wider task of learning from the data they generate. This also means learning when things don’t work out. To borrow a tech phrase from the environment that Agile grew out of: fail fast, and fail often. But learn from failure – and document it. The social enterprise One Acre Fund uses a Failure Template to record how and why interventions and approaches didn’t work, and publishes failure reports on its website. Transparency is important here, too: the more these insights are shared, the more effective all of our work can be. There will be less duplication, and interventions will be based on stronger evidence of what works and what doesn’t. There’s an important point here about organisational culture being responsive to this approach – and we need to push donors to understand the realities of MERL: it’s rarely ever perfect.

Agile MERL, as with any other MERL approach, is not a panacea. It’s part of the wider MERL toolkit, and it has limitations. In particular, we need to ensure that, in the quest for lean data collection, we are still getting valid insights that are robust enough to be used by decision-makers on critical issues. Moreover, while change and iteration need to be embraced, there is still a need to create continuous and comparable datasets. In some cases, the depth or comprehensiveness of research required may rule out a more lightweight approach. Even in these situations, though, the core tenets of Agile MERL remain relevant: MERL should continue to be user-driven, useful and iterative. We need to be continuously testing, learning and adapting.

These are our initial thoughts, which have guided some of our recent projects. We’re increasingly working on projects that use Agile-inspired tools and approaches: whether tech, software or data-driven development. We feel that MERL can learn from the Agile project management environments that these digital tools were designed in, which have used the Agile Manifesto to put users at the centre. 

Agile MERL – together with tools like mobile phones and tablets for data collection – democratises MERL by making it more accessible and useful. Not every organisation can afford a $1m household survey, but most can use approaches like the 10-question PPI survey, the Rapid Feedback Survey or the technology tracker in some capacity. Agile MERL stops MERL from being a tick-box exercise. Instead, it can help users recognise the importance of MERL, and encourage them to put data and evidence-based learning at the heart of their work.

Watch Calum’s MERL Tech video below!