
Using analytics as a validation tool: rethinking quality and reliability of data collection

Rebecca Rumbul, the Head of Research at mySociety, gave a Lightning Talk at MERL Tech London in which she described the potential for using Google Analytics as a tool for informing and validating research.

First, she explained her organization’s work. Broadly speaking, mySociety is a non-profit social enterprise with a mission to invent and popularize digital tools that enable citizens to exert power over institutions and decision makers. She noted that the organization exists solely online and, as a result, gathers a significant amount of data from its software’s users in the 44 countries where it operates.

mySociety is currently using this data to examine whether civic technology is worth continuing to pursue. To do this, the team is taking rational, measured approaches designed to help them evaluate and compare their products and see to what extent those products have valuable real-world effects.

One tool that Rebecca’s organization makes extensive use of is Google Analytics. It allows mySociety’s research team to see who is using their software, where those users are from, whether they are new or returning, and how many sessions are happening at any one time. Beyond this, it also provides basic demographic information. In short, Google Analytics alone gives the team ample data to work with.
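To make the kind of data involved concrete, here is a minimal sketch of querying the Google Analytics Reporting API (v4) for sessions and new-versus-returning users by country, in Python. The service-account key file and view ID are placeholders; this is an illustration of the API, not mySociety’s actual setup.

```python
# Minimal sketch: pull sessions and users by country and visitor type
# from the Google Analytics Reporting API v4. The key file path and
# view ID below are placeholders, not a real configuration.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "service-account.json"  # placeholder path
VIEW_ID = "123456789"              # placeholder Universal Analytics view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

# Sessions and users over the last 30 days, broken down by country
# and by new vs. returning visitors.
response = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": VIEW_ID,
                "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
                "metrics": [
                    {"expression": "ga:sessions"},
                    {"expression": "ga:users"},
                ],
                "dimensions": [
                    {"name": "ga:country"},
                    {"name": "ga:userType"},  # "New Visitor" / "Returning Visitor"
                ],
            }
        ]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    country, user_type = row["dimensions"]
    sessions, users = row["metrics"][0]["values"]
    print(f"{country} ({user_type}): {sessions} sessions, {users} users")
```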

One application of this data is to take trends that emerge and use them to frame new research questions. For example, if more women than men are searching for a particular topic on a given day, this phenomenon could merit further exploration.

Additionally, Google Analytics can act as a validation tool. For example, if the team wants to conduct a new survey, it provides a set of data that can complement the survey’s results. It enables one to cross-check the survey results against Google’s data to determine the extent to which the survey may have suffered from errors like self-selection bias. With it, one can develop a better sense of whether there are issues with the research or whether the data can be relied upon.
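As an illustration of what such a cross-check might look like in practice (not a method described in the talk), the sketch below compares a survey sample’s gender split against the shares Google Analytics reports for the whole user base, using a chi-square goodness-of-fit test. All figures are invented for illustration.

```python
# Sketch: cross-check a survey sample against analytics-reported shares.
# All numbers are hypothetical, for illustration only.
from scipy.stats import chisquare

# Gender split among survey respondents (hypothetical counts).
survey_counts = [130, 70]  # women, men

# Gender shares for all site users, as reported by Google Analytics
# demographics (hypothetical proportions).
analytics_shares = [0.55, 0.45]

total = sum(survey_counts)
expected = [share * total for share in analytics_shares]

stat, p_value = chisquare(survey_counts, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A small p-value flags a mismatch between the survey sample and the
# site's overall audience, a possible sign of self-selection bias.
```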

Google Analytics, despite its flaws, enables researchers to think more deeply about their data, have frank discussions, and frame research questions. All of this can be very valuable to evaluation efforts in the development sector.

For more, see Rebecca’s Lightning Talk below!

Six priorities for the MERL Tech community

by Linda Raftree, MERL Tech Co-organizer

Participants at the London MERL Tech conference in February 2017 crowdsourced a MERL Tech History timeline (which I’ve shared in this post). Building on that, we projected out our hopes for a bright MERL Tech Future. Then we prioritized our top goals as a group (see below). We’ll aim to continue building on these as a sector going forward and would love more thoughts on them.

  1. Figure out how to be responsible with digital data and not put people, communities, or vulnerable groups at risk. Subtopics included: share data with others responsibly without harming anyone; agree a minimum ethical standard for MERL and data collection; agree principles for minimizing the data we collect so that only essential data is captured; develop duty-of-care principles for MERL Tech and digital data; develop ethical data practices and policies at the organization level; shift the power balance so that digital data convenience costs are paid by organizations, not affected populations; and develop a set of quality standards for evaluation using tech.
  2. Increase data literacy across the sector, at the individual level and within the various communities where we are working.
  3. Overcome the extraction challenge and move towards true downward accountability. Do good user/human-centered design and planning together, and be ‘leaner’ and more user-focused at all stages of planning and MERL. Subtopics included: development of more participatory MERL methods; bringing consensus decision-making to participatory MERL; realizing the potential of tech to shift power and knowledge hierarchies; greater use of appreciative inquiry in participatory MERL; and more relevant use of tech in MERL — less data, more empowering, less extractive, more used.
  4. Integrate MERL into our daily operations to avoid the thinking that it is something ‘separate;’ move it to the core of operations management and make sure we have the necessary funds to do so; demystify it and make it normal! Subtopics included: we’ve stopped calling “MERL” a “thing” and the norm is to talk about monitoring as part of operations; data use is enabling real-time coordination; and no more paper-based surveys.
  5. Improve coordination and interoperability as related to data and tools, both between and within organizations. Subtopics included: more interoperability; more data-sharing platforms; all data open, with suitable anonymization; universal exchange of machine-readable M&E data (e.g., standards? IATI? a platform?); sector-wide IATI compliance; tech solutions that enable sharing of qualitative and quantitative data; systems used across agencies (e.g., to refer feedback); coordination; organizations sharing more data; and interoperability of tools. It was emphasized that donors should incentivize this and ensure that there are resources to manage it.
  6. Enhance user-driven and accessible tech that supports impact and increases efficiency, that is open source and can be built on, and that allows for interoperability and consistent systems of measurement and evaluation approaches.

In order to move on these priorities, participants felt we needed better coordination and sharing of tools and lessons among the NGO community. This could be through a platform where different innovations and tools are appropriately documented so that donors and organizations can more easily find good practice, useful tools and get a sense of ‘what’s out there’ and what it’s being used for. This might help us to focus on implementing what is working where, when, why and how in M&E (based on a particular kind of context) rather than re-inventing the wheel and endlessly pushing for new tools.

Participants also wanted to see MERL Tech as a community that is collaborating to shape the field and to ensure that we are a sector that listens, learns, and adopts good practices. They suggested hosting MERL Tech events and conferences in ‘the South’ and building out the MERL Tech community to include greater representation of users and developers in order to achieve optimal tools and management processes.

What do you think – have we covered it all? What’s missing?