5 Insights from MERL Tech 2015


By Katherine Haugh, a visual note-taker who summarizes content in a visually simple manner while keeping the complexity of the subject matter. Originally published on Katherine's blog on October 20, 2015, and on ICT Works on January 18, 2016.


Recently, I had the opportunity to participate in the 2015 MERL Tech conference that brought together over 260 people from 157 different organizations. I joined the conference as a “visual note-taker,” and I documented the lightning talks, luncheon discussions, and breakout sessions with a mix of infographics, symbols and text.

Experiencing several "a-ha" moments myself, I thought it would be helpful to go a step further than just documenting what was covered and add some insights of my own. Five clear themes stood out to me: 1) there is such a thing as "too much data"; 2) "lessons learned" is like a song on repeat; 3) humans > computers; 4) sharing is caring; 5) social impact investment is crucial.

1) There is such a thing as “too much data.”

MERL Tech 2015 began with a presentation by Ben Ramalingam, who explained that "big data is like teenage sex. No one knows how to do it and everyone thinks that everyone else is doing it." The quote was the most widely tweeted of the conference, eliciting laughter and nods of approval, and Ben's underlying point was well-received: the fervor for collecting more and more data has, ironically, limited organizations' ability to meaningfully understand their data and carry out data-driven decision-making.

Additionally, I attended the breakout session on "data minimalism" with Vanessa Corlazzoli, Monalisa Salib (from USAID LEARN), and Teresa Crawford, which further emphasized this point.

The session covered the ways we can identify key learning questions and pinpoint need-to-have data (not nice-to-have data) to answer those questions. [What this looks like in practice: a survey with only five questions. Yes, just five questions.] This approach to data collection reinforces the need to think critically at each step about what is absolutely necessary, as opposed to collecting as much as possible and only later thinking about what is "usable."

2) “Lessons learned” is like a song on repeat.

Similar to a popular song, the term “lessons learned” has been on repeat for many M&E practitioners (including myself). How many reports have we seen that conclude with lessons learned that are never actually learned? Having concluded my own capstone project with a set of “lessons learned,” I am at fault for this as well. In her lightning talk on “Lessons Not Learned in MERL,” Susan Davis explained that, “while it’s OK to re-invent the wheel, it’s not OK to re-invent a flat tire.”

It seems that we are learning the same "lessons" over and over again in the M&E-tech field and never implementing or adapting in accordance with those lessons. Susan suggested we retire the "MERL" acronym and update it to "MERLA" (monitoring, evaluation, research, learning, and adaptation).

How do we bridge the gap between M&E findings and organizational decision-making? Dave Algoso has some answers. (In fact, just to get a little meta here: Dave Algoso wrote about "lessons not learned" last year at M&E Tech, and now we're learning about "lessons not learned" again at MERL Tech 2015. Just some food for thought.) A tip from Susan for not re-inventing a flat tire: find other practitioners who have done similar work and look over their "lessons learned" before writing your own. Stay tuned for more on this at FailFest 2015 in December!

3) Humans > computers.

Who would have thought that at a tech-related conference, a theme would be the need for more human control and insight? Not me. That's for sure! A funny aside: I have (for a very long time) been fearful that the plot of the Will Smith movie "I, Robot" would become a reality. I now feel slightly more assured that this won't happen, given the consensus at this conference and others on the need for humans in the M&E process (and in the world). As Ben Ramalingam so eloquently explained, "you can't use technology to substitute humans; use technology to understand humans."

4) Sharing is caring.

Circling back to the lessons-learned-on-repeat point, "sharing is caring" is definitely one we've heard before. Jacob Korenblum emphasized the need for more sharing in the M&E field and suggested three mechanisms for publicizing M&E results: 1) understanding the existing ecosystem (e.g., the decision between using WhatsApp in Jordan or in Malawi); 2) building feedback loops directly into M&E design; and 3) creating and tracking indicators related to sharing. Dave Algoso also expands on this concept in TechChange's TC111 course on Technology for Monitoring and Evaluation; Dave explains that bridging the gaps between the different levels of learning (individual, organizational, and sectoral) is necessary for building the overall knowledge of the field, which spans beyond the scope of a single project.

5) Social impact investment is crucial.

I've heard this at other conferences I've attended, like the Millennial Action Project's Congressional Summit on Next Generation Leadership, among many others. As a panelist on "The Future of MERL Tech: A Donor View," Nancy McPherson got right down to business: she addressed the elephant in the room by asking who the data is really for and what projects are really about. Nancy emphasized the need for role reversal if we as practitioners and researchers are genuine in our pursuit of "locally-led initiatives." I couldn't agree more. In addition to explaining that social impact investing is the new frontier for donors in this space, she also gave a brief synopsis of trends in the evaluation field (a topic that my brilliant colleague Deborah Grodzicki and I will be expanding on; stay tuned!).