Tag Archives: data

Tips for solar charging your data collection

Post by Julia Connors of Voltaic Systems. Email Julia with questions: julia@voltaicsystems.com

What is solar for M&E?

Solar technology can be extremely useful for M&E projects in areas with minimal or inconsistent access to power. Portable solar chargers can eliminate power constraints and keep phones and tablets reliably charged up in the field.

In this post we’ll discuss:

  • How to decide if solar is right for your project
  • How to properly size a solar charging system to meet your needs

Do you really need solar?

In many cases solar is not necessary and will simply add complexity and cost to your project. If your team can return every day to a central location with access to power, then the tablet's own battery is sufficient in most scenarios. If not, we recommend implementing standard power saving tips to reduce power consumption while out collecting data.



If you do have daily access to the grid but find that users need to recharge at least once while out or need to spend more than one day without power, then add an external battery pack. This cost-effective option allows your team to have extra power without carrying a full solar charging system. To size a battery for your needs, skip down to ‘Step 3’ below.

If you don’t have reliable access to grid power, the next section will help you determine which size solar charging system is best for you.

Sizing your solar charger system

The key to making solar successful in your project is finding the best system for your needs. If a system is underpowered then your team can still run out of power when they’re collecting data. On the other hand, if your system is too powerful it will be heavier and more expensive than needed. We recommend the following three steps for sizing your solar needs:

  1. Estimate your daily power consumption
  2. Determine your minimum solar panel size
  3. Determine your minimum battery size

Step 1: Estimate your daily power consumption

Once you have chosen the device you will be using in the field, it’s easy to determine your daily power consumption. First you’ll need to figure out the size of your device’s battery (in Watt hours). This can often be found by looking on the back of the battery itself or doing a quick Google search to find your device’s technical specifications.

Next, you’ll need to determine your battery usage per day. For example, if you use half of your device’s battery on a typical day of data collection, then your usage is 50%. If you need to recharge twice in one day, then your usage is 200%.

Once you have those numbers, use the formula below to find your daily power consumption:

Size of Device's Battery (Wh) x Battery Usage (per day) = Daily Power Consumption (Wh/day)
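
For example (the numbers here are illustrative, not taken from a specific device): a tablet with a 20 Wh battery that is run through one and a half charges on a typical day consumes 20 Wh x 1.5 = 30 Wh/day. A minimal Python sketch of the same calculation:

```python
def daily_power_consumption(battery_wh, usage_per_day):
    """Daily power consumption (Wh/day) = device battery size (Wh) x battery usage per day."""
    return battery_wh * usage_per_day

# Illustrative example: a 20 Wh tablet battery used at 150% per day
print(daily_power_consumption(20, 1.5))  # 30.0 Wh/day
```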

Step 2: Determine your minimum solar panel size

The larger your device, the bigger the solar panel (measured in Watts) you’ll need. This is because larger solar panels can generate more power from the sun than smaller panels. To determine the best solar panel size for your needs, use our formula below:

Daily Power Consumption (from Step 1) / Expected Hours of Good Sun* x 2 (Standard Power Loss Variable) = Solar Panel Minimum (Watts)

*We typically use 5 hours as a baseline for good sun and then adjust up or down depending on the conditions. High temperatures, clouds, or shading will reduce the power produced by the panel.

Since solar conditions change frequently throughout the day, we recommend choosing a panel that is 2-4 times the minimum size required.
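
As a sketch, continuing the illustrative 30 Wh/day example and assuming the 5-hour baseline for good sun, the panel-sizing formula and the 2-4x recommendation look like this in Python:

```python
def minimum_panel_watts(daily_consumption_wh, good_sun_hours=5, power_loss_factor=2):
    """Minimum panel size (W) = daily consumption (Wh/day) / hours of good sun x power loss factor."""
    return daily_consumption_wh / good_sun_hours * power_loss_factor

minimum = minimum_panel_watts(30)         # 12.0 W minimum for 30 Wh/day and 5 hours of good sun
recommended = (2 * minimum, 4 * minimum)  # 24-48 W once the 2-4x margin is applied
print(minimum, recommended)
```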


Step 3: Determine minimum battery size

External batteries offer extra power storage so that your device will be charged when you need it. The battery acts as a perfect backup on cloudy and rainy days so it’s important to choose the right size for your device.

It can vary, but typically about 30% of power is lost in the transfer from the external battery to your device. Therefore, to determine the battery capacity needed for one day of use, we’ll use our power consumption data from Step 1 and divide by 0.7 (100% – 30% power loss).

Watt hours per day / 0.7 = Watt-hour battery capacity needed for 1 day of use
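
Continuing the same illustrative example, a device consuming 30 Wh/day needs roughly 43 Wh of battery capacity for one day. A sketch of the calculation:

```python
def minimum_battery_wh(daily_consumption_wh, transfer_efficiency=0.7):
    """Battery capacity (Wh) for one day = daily consumption (Wh/day) / transfer efficiency (~70%)."""
    return daily_consumption_wh / transfer_efficiency

print(round(minimum_battery_wh(30), 1))  # ~42.9 Wh of battery capacity for one day of use
```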


Picking the right system for your project

Now that you’ve done the math, you’re one step closer to choosing a solar charging system for your project. Since solar chargers come in many different forms, the last step to determining your perfect system is to think about how your team will be using the solar chargers in their work. It’s important to factor in storage for device/cables and how the user will be carrying the system.

Most users aren't that technical, so having a pack that stores both the battery and the device can simplify their experience (rather than handing over a battery and a panel that they need to figure out how to organize during their day). By simply finding the right style and size, you'll see higher usage rates and make your team's solar-powered data collection go more smoothly.

We have a data problem

by Emily Tomkys, ICT in Programmes at Oxfam GB

Following my presentation at MERL Tech, I have realised that it’s not only Oxfam who have a data problem; many of us have a data problem. In the humanitarian and development space, we collect a lot of data – whether via mobile phone or a paper process, the amount of data each project generates is staggering. Some of this data goes into our MIS (Management Information Systems), but all too often data remains in Excel spreadsheets on computer hard drives, unconnected cloud storage systems or Access and bespoke databases.

(Watch Emily’s MERL Tech London Lightning Talk!)

This is an issue because the majority of our programme data is analysed in silos on a survey-to-survey basis and at best on a project-to-project basis. What about when we want to analyse data between projects, between countries, or even globally? It would currently take a lot of time and resources to bring data together in usable formats. Furthermore, issues of data security, limited support for country teams, data standards and the cost of systems or support mean there is a sustainability problem that is in many people’s interests to solve.

The demand from Oxfam's country teams is high – one of the most common requests the ICT in Programme Team receives centres around databases and data analytics. Teams want to be able to store and analyse their data easily and safely, and there is growing demand for cross-border analytics. Our humanitarian managers want to see statistics on the type of feedback we receive globally. Our livelihoods team wants to be able to monitor prices at markets on a national and regional scale. This motivated us to look for a data solution, but it's something we know we can't take on alone.

That's why MERL Tech represented a great opportunity to check in with other peers about potential solutions and areas for collaboration. For now, our proposal is to design a data hub where, no matter what type of data we have (unstructured, semi-structured or structured) and no matter how we collect it (mobile data collection tools or paper), our data can integrate into a database. This isn't about creating new tools – rather it's about focusing on interoperability and a smooth transition between tools and storage options. We plan to set this up so data can be pulled through into a reporting layer, which may offer a mixture of options for quantitative analysis, qualitative analysis and GIS mapping. We also need to give our micro-programme data a home: everything in one place regardless of its source or format, and easy to pull through for analysis.
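
To make the interoperability idea concrete, here is a minimal sketch of one way heterogeneous exports could be mapped into a single common record before landing in a shared store and reporting layer. This is purely illustrative and not Oxfam's design; all field names and the normalise_record helper are hypothetical.

```python
# Illustrative only: a common record format that exports from different sources
# (mobile data collection tools, digitised paper forms, spreadsheets) could be
# mapped into before being stored and analysed together. Names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict

@dataclass
class SurveyRecord:
    project_id: str         # which programme or project the data belongs to
    country: str            # enables cross-country and global analysis
    collected_at: datetime  # when the response was collected
    source: str             # e.g. "mobile", "paper", "spreadsheet"
    responses: Dict[str, Any] = field(default_factory=dict)  # the answers themselves

def normalise_record(raw: Dict[str, Any], source: str) -> SurveyRecord:
    """Map one raw export row into the common format (hypothetical mapping)."""
    return SurveyRecord(
        project_id=raw["project"],
        country=raw["country"],
        collected_at=datetime.fromisoformat(raw["date"]),
        source=source,
        responses={k: v for k, v in raw.items() if k not in {"project", "country", "date"}},
    )

# Example: rows from different tools end up in the same shape for analysis
print(normalise_record({"project": "P-001", "country": "Malawi", "date": "2016-09-01", "q1": "yes"}, "mobile"))
```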

In this way we can explore data holistically, spot trends on a wider scale and really know more about our programmes and act accordingly. Not only should this reduce our cost of analysis, we will be able to analyse our data more efficiently and effectively. Moreover, taking a holistic view of the data life cycle will enable us to do data protection by design and it will be easier to support because the process and the tools being used will be streamlined. We know that one tool does not and cannot do everything we require when we work in such vast contexts, so a challenge will be how to streamline at the same time as factoring in contextual nuances.

Sounds easy, right? We will be starting to explore our options and working on the data hub in the coming months. MERL Tech was a great start for making connections, but we are keen to hear from others about how you are approaching "the data problem" and eager to set something up which can also be used by other actors. So please add your thoughts in the comments or get in touch if you have ideas!

5 Insights from MERL Tech 2016

By Katherine Haugh, a visual note-taker who summarizes content in a visually simple manner while keeping the complexity of the subject matter. Originally published on Katherine's blog on October 20, 2015 and here on ICT Works on January 18, 2016.


Recently, I had the opportunity to participate in the 2015 MERL Tech conference that brought together over 260 people from 157 different organizations. I joined the conference as a “visual note-taker,” and I documented the lightning talks, luncheon discussions, and breakout sessions with a mix of infographics, symbols and text.

Experiencing several "a-ha" moments myself, I thought it would be helpful to go a step further than just documenting what was covered and add some insights of my own. Five clear themes stood out to me: 1) There is such a thing as "too much data"; 2) "Lessons learned" is like a song on repeat; 3) Humans > computers; 4) Sharing is caring; 5) Social impact investment is crucial.

1) There is such a thing as “too much data.”

MERL Tech 2015 began with a presentation by Ben Ramalingham, who explained that "big data is like teenage sex. No one knows how to do it and everyone thinks that everyone else is doing it." The line was the most widely tweeted quote at the conference, eliciting a lot of laughter and nods of approval, and Ben's point was well received by the audience. The fervor for collecting more and more data has, ironically, been limiting the ability of organizations to meaningfully understand their data and carry out data-driven decision-making.

Additionally, I attended the breakout session on "data minimalism" with Vanessa Corlazzoli, Monalisa Salib from USAID LEARN, and Teresa Crawford, which further emphasized this point.

The session covered the ways we can identify key learning questions and pinpoint the need-to-have data (not the nice-to-have data) required to answer those questions. [What this looks like in practice: a survey with only five questions. Yes, just five questions.] This approach to data collection enforces the need to think critically at each step about what is absolutely necessary, as opposed to collecting as much as possible and then thinking about what is "usable" later.

2) “Lessons learned” is like a song on repeat.

Similar to a popular song, the term “lessons learned” has been on repeat for many M&E practitioners (including myself). How many reports have we seen that conclude with lessons learned that are never actually learned? Having concluded my own capstone project with a set of “lessons learned,” I am at fault for this as well. In her lightning talk on “Lessons Not Learned in MERL,” Susan Davis explained that, “while it’s OK to re-invent the wheel, it’s not OK to re-invent a flat tire.”

It seems that we are learning the same “lessons” over and over again in the M&E-tech field and never implementing or adapting in accordance with those lessons. Susan suggested we retire the “MERL” acronym and update to “MERLA” (monitoring, evaluation, research, learning and adaptation).

How do we bridge the gap between M&E findings and organizational decision-making? Dave Algoso has some answers. (In fact, just to get a little meta here: Dave Algoso wrote about "lessons not learned" last year at M&E Tech, and now we're learning about "lessons not learned" again at MERL Tech 2015. Just some food for thought.) A tip from Susan for not re-inventing a flat tire: find other practitioners who have done similar work and look over their "lessons learned" before writing your own. Stay tuned for more on this at FailFest 2015 in December!

3) Humans > computers.

Who would have thought that at a tech-related conference, a theme would be the need for more human control and insight? Not me, that's for sure! A funny aside: I have (for a very long time) been fearful that the plot of the Will Smith movie "I, Robot" would become a reality. I now feel slightly more assured that this won't happen, given the consensus at this conference and others on the need for humans in the M&E process (and in the world). As Ben Ramalingham so eloquently explained, "you can't use technology to substitute humans; use technology to understand humans."

4) Sharing is caring.

Circling back to the point about lessons learned on repeat, "sharing is caring" is definitely one we've heard before. Jacob Korenblum emphasized the need for more sharing in the M&E field and suggested three mechanisms for publicizing M&E results: 1) understanding the existing ecosystem (e.g. the decision between using WhatsApp in Jordan or in Malawi), 2) building feedback loops directly into M&E design, and 3) creating and tracking indicators related to sharing. Dave Algoso also expands on this concept in TechChange's TC111 course on Technology for Monitoring and Evaluation; Dave explains that bridging the gaps between the different levels of learning (individual, organizational, and sectoral) is necessary for building the overall knowledge of the field, which spans beyond the scope of a singular project.

5) Social impact investment is crucial.

I've heard this at other conferences I've attended, like the Millennial Action Project's Congressional Summit on Next Generation Leadership, among many others. As a panelist on "The Future of MERL Tech: A Donor View," Nancy McPherson got right down to business: she addressed the elephant in the room by asking questions about "who the data is really for" and "what projects are really about." Nancy emphasized the need for role reversal if we as practitioners and researchers are genuine in our pursuit of "locally-led initiatives." I couldn't agree more. In addition to explaining that social impact investing is the new frontier for donors in this space, she also gave a brief synopsis of trends in the evaluation field (a topic that my brilliant colleague Deborah Grodzicki and I will be expanding on; stay tuned!).