
Five lessons learned from applying design thinking to data use

by Amanda Makulec, Data Visualization Lead, Excella Consulting and Barb Knittel, Research, Monitoring & Evaluation Advisor, John Snow Inc. Amanda and Barb led “How the Simpsons Make Data Use Happen” at MERL Tech DC.


Workshopping ways to make data use happen.

Human-centered design isn’t a new concept. We’ve heard engineers, from aerospace to software, quietly snicker as they’ve seen the enthusiasm for design thinking explode within the social good space in recent years. “To start with the end user in mind? Of course! How else would you create a product someone wants to use?”

However, in our work designing complex health information systems, dashboards, and other tools and strategies to improve data use, the idea of starting with the end user does feel relatively new.

Thinking back to graduate school nearly ten years ago, dashboard design classes focused on the functional skills, like how to use a pivot table in Excel, not on the complex processes of gathering user requirements to design something that could not only delight the end user, but be co-designed with them.

As part of our Designing for Data Use and data visualization design workshops, we’ve collaborated with design firms to find new ways to crack the nut of developing products and processes that help decision-makers use information. Using design thinking tools like ranking exercises, journey maps, and personas has helped users identify critical barriers to data use and find innovative ways to address them.

If you’re thinking about integrating design thinking approaches into data-centered projects, here are our five key considerations to take into account before you begin:

  1. Design thinking is a mindset, not a workshop agenda. When you’re setting out to incorporate design thinking into your work, consider what that means throughout the project lifecycle, from continuous engagement and touchpoints with your data users to iterating on what you build together long after the first workshop ends.
  2. Engage the right people. You need a diverse range of perspectives and experiences to uncover problems and co-create solutions. This means thinking of the usual stakeholders using the data at hand, but also engaging those adjacent to the data. In health information systems, this could be the clinicians reporting on the registers, the mid-level managers at the district health office, and even the printer responsible for distributing paper registers.
  3. Plan for the long haul. Don’t limit your planning and projections of time, resources, and end user engagement to the initial workshops. Coming out of those workshops, you’ll likely have prototypes that require continued attention to actually build and implement.
  4. Focus on identifying and understanding the problem you’ll be solving. You’ll never be able to solve every problem and overcome every data use barrier in one workshop (or even in one project). Work with your users to develop a specific focus and thoroughly understand the barriers and challenges from their perspectives so you can tackle the most pressing issues (or choose deliberately to work on longer-term solutions to the largest impediments).
  5. The journey matters as much as the destination. One of the greatest aha moments coming out of these workshops has been from participants who see opportunities to change how they facilitate meetings or manage teams by adopting some of the activities and facilitation approaches in their own work. Adoption of the prototypes shouldn’t be your only metric of success.

The Designing for Data Use workshops were funded by (1) USAID and implemented by the MEASURE Evaluation project and (2) the Global Fund through the Data Use Innovations Fund. Matchboxology was the design partner for both sets of workshops, and John Snow Inc. was the technical partner for the Data Use Innovations sessions. Learn more about the process and learning from the MEASURE Evaluation workshops in Applying User Centered Design to Data Use Challenges: What we Learned and see our slides from our MERL Tech session “The Simpsons, Design, and Data Use” to learn more.

The Good, the Bad, and the Ugly of Using IATI Results Data

This is a cross-post from Taryn Davis of Development Gateway. The original was published here on September 19th, 2017. Taryn and Reid Porter led the “Making open data on results useful” session at MERL Tech DC.

It didn’t surprise me to learn that when Ministry of Finance officials conduct trainings on the Aid Management Platform for Village Chiefs, CSOs, and citizens throughout the districts of Malawi, they are almost immediately asked:

“What were the results of these projects? What were the outcomes?”

It didn’t just matter what development organizations said they would do — it also mattered what they actually did.

We’ve heard the same question echoed by a number of agriculture practitioners interviewed as part of the Initiative for Open Ag Funding.  When asked what information they need to make better decisions about where and how to implement their own projects, many replied:

“We want to know — if [others] were successful — what did they do? If they weren’t successful, what shouldn’t we do?”

This interest in understanding what went right (or wrong) came not from wanting to point fingers, but from a genuine desire to learn. In considering how to publish and share data, the importance of (and interest in!) learning cannot be overstated.

At MERL Tech DC earlier this month, we decided to explore the International Aid Transparency Initiative (IATI) format, currently being used by organizations and governments globally for publishing aid and results data. For this hands-on exercise, we printed different types of projects from the D-Portal website, along with any evaluation documents included in the publication. We then asked participants to answer the following questions about each project:

  1. What were the successes of the project?
  2. What could be replicated?
  3. What are the pitfalls to be avoided?
  4. Where did it fail?

Taryn Davis leading participants through using IATI results data at MERL Tech DC

We then discussed whether participants were (or were not) able to answer these questions with the data provided. Here is the Good, the Bad, and the Ugly of what participants shared:

The Good

  1. Many were impressed that these data, particularly the evaluation documents, were shared and made public at all, not hidden behind closed doors.
  2. For those analyzing evaluation documents, the narrative was helpful for answering our four questions, versus having just the indicators without any context.
  3. One attendee noted that this data would be helpful in planning project designs for business development purposes.

The Bad

  1. There were challenges with data quality. For example, some data were missing units, making values hard to interpret: was the number “50” a percentage, a dollar amount, or another unit?
  2. Some found the organizations’ evaluation formats easier to understand than what was displayed on D-Portal. Others were given evaluations with a more complex format, making it difficult to identify key takeaways. Overall, readability varied, and format matters: sometimes fewer columns are more readable, and there is a fine line between too little information (missing units) and a fire hose of information (gigantic documents).
  3. Since the attachments included more content in narrative format, they were more helpful in answering our four questions than just the indicators that were entered in the IATI standard.
  4. There were no visualizations for a quick takeaway on project success. A visual aid would help users understand “successes” and “failures” more quickly, without spending as much time digging and comparing, so they could spend more time looking at specific cases and focusing on the narrative.
  5. Some data were missing time periods, making it hard to know how relevant the data would be for those interested in using them.
  6. The data were often disorganized and included spelling mistakes.

The Ugly

  1. Reading the data “felt like reading the SAT”: challenging to comprehend.
  2. The data and documents weren’t typically forthcoming about challenges and lessons learned.
  3. Participants weren’t able to discern any real, tangible learning that could be practically applied to other projects.

Fortunately, the “Bad” elements can be addressed relatively easily. We’ve spent time reviewing organizations’ results data published in IATI and providing feedback to improve data quality and make the data cleaner and easier to understand.
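Much of that review lends itself to simple automated checks. Below is a minimal Python sketch, built around an invented IATI-style result snippet, that flags the kinds of gaps described above: indicators with no stated measure (is “50” a percent or a count?) and reporting periods with no dates. The element and attribute names follow the IATI activity standard, but the sample values and the checks themselves are illustrative assumptions, not a full validator or anyone’s actual published data.

```python
# A minimal sketch, not a full IATI validator: parse an IATI-style <result>
# element and flag the gaps noted above -- indicators with no measure
# (is "50" a percent or a count?) and periods with no dates. Element and
# attribute names follow the IATI activity standard; the sample values and
# the checks are illustrative assumptions, not real published data.
import xml.etree.ElementTree as ET

SAMPLE = """
<result type="1">
  <title><narrative>Improved seed adoption</narrative></title>
  <indicator>  <!-- no measure attribute: is the value a count or a percent? -->
    <title><narrative>Farmers adopting improved seed</narrative></title>
    <period>
      <period-start iso-date="2016-01-01"/>
      <period-end iso-date="2016-12-31"/>
      <target value="80"/>
      <actual value="50"/>
    </period>
    <period>  <!-- no period dates: hard to know how relevant the data still are -->
      <target value="90"/>
      <actual value="75"/>
    </period>
  </indicator>
</result>
"""


def flag_gaps(result_xml: str) -> list[str]:
    """Return human-readable flags for common results-data gaps."""
    issues = []
    result = ET.fromstring(result_xml)
    for i, indicator in enumerate(result.findall("indicator"), start=1):
        # IATI indicators carry a "measure" attribute; without it, units are unclear.
        if indicator.get("measure") is None:
            issues.append(f"indicator {i}: no measure attribute, so units are unclear")
        for j, period in enumerate(indicator.findall("period"), start=1):
            if period.find("period-start") is None or period.find("period-end") is None:
                issues.append(f"indicator {i}, period {j}: missing period dates")
            if period.find("actual") is None:
                issues.append(f"indicator {i}, period {j}: no actual value reported")
    return issues


if __name__ == "__main__":
    for issue in flag_gaps(SAMPLE):
        print(issue)
```

Running the sketch simply prints one line per gap; in practice, an automated pass like this would complement, not replace, the human review and publisher feedback described above.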

However, the “Ugly” elements are really the key issue for organizations that want to share their results data. To move beyond a “transparency gold star” and achieve shared learning and better development, organizations need to ask themselves:

“Are we publishing the right information, and are we publishing it in a usable format?”

As we noted earlier, it’s not just the indicators that data users are interested in, but how projects achieved (or didn’t achieve) those targets. Users want to engage with the “L” in Monitoring, Evaluation, Research, and Learning (MERL). For organizations, this might be as simple as reporting “Citizens weren’t interested in adding quinoa to their diet so they didn’t sell as much as expected,” or “The Village Chief was well respected and supported the project, which really helped citizens gain trust and attend our trainings.”

This learning is important for organizations internally, enabling them to understand and learn from their own data, and for the wider development community. In hindsight, what do you wish you had known about implementing an irrigation project in rural Tanzania before you started? That’s what we should be sharing.

In order to do this, we must update our data publishing formats (and mindsets) so that we can answer questions like, “How did this project succeed? What can be replicated? What are the pitfalls to avoid? Where did it fail?” Answering these kinds of questions, and enabling actual learning, should be a key goal for all projects and programs, and it should not feel like an SAT exam every time we do so.

Image Credit: Reid Porter, InterAction