
Can digital tools be used to help young mothers in Kenya form new habits?

Guest post from Haanim Galvaan, Content Designer at Every1Mobile

A phone is no longer just a phone. It’s your connection to the rest of the world, it’s your personal assistant, and now, it’s your best friend who gives you encouragement and reinforcement for your good habits.

At least that’s what mobile phones have become for those who make use of habit-boosting apps.

If you’re trying to quit smoking and want to build a streak of puff-free days, the HabitBull app can help you do that. Want to establish a habit in your team that makes use of social accountability? Try Habitica. Do you want positive reinforcement for your activities in a motivational, rewarding voice? Productive is the app for that.

But what if you’re a young mum, living in the urban slums of Nairobi and you want to improve the health and wellbeing of your children? Try U Afya’s 10-Day Challenge.

U Afya is an online community for young mothers and mothers-to-be to learn about topics related to health, hygiene and family life. The site takes a holistic approach to giving young mothers the knowledge and confidence they need to enact certain healthy behaviours. It’s a place to discuss, give and receive advice, take free online courses, and now, to establish good habits with a custom-built habit tracking tool.

The 10-Day Handwashing Challenge was launched using new habit-tracking functionality. Users were encouraged to perform an activity related to handwashing each day, e.g. wash your hands with soap for 20 seconds. The challenges were formulated around the Lifebuoy “5 Key Moments” model. Participants were required to log their activity on the site by completing a survey.

Each day the site fed users a different hygiene-related tip, as well as links to additional content. At the end of the challenge, users were pushed to take a pledge and make a commitment to handwashing.

U Afya’s Habit Tracker is different from other habit-boosting apps in that it is not an app! It has been built into a low-data-usage site optimised for the data-sensitive target audience in the Nairobi slums. The tracker provides a rich, visual experience that makes use of simple functionality compatible with both feature phones and smartphones.

We created a sense of urgency.

Users were required to log their activity for 10 days within a 30-day period. Attaching a “deadline” added a measure of urgency to the activity and left no room for procrastination. The message is: establish your habit now or you never will!
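To make the completion rule concrete, here is a minimal sketch of how such a check could work. It is illustrative only, not the actual U Afya implementation; the function name, parameters and dates are hypothetical.

    from datetime import date, timedelta

    def challenge_completed(log_dates, start_date, required_days=10, window_days=30):
        # A user completes the challenge by logging activity on at least
        # `required_days` distinct days within `window_days` of starting.
        window_end = start_date + timedelta(days=window_days)
        logged_days = {d for d in log_dates if start_date <= d < window_end}
        return len(logged_days) >= required_days

    # Hypothetical user who logged on 10 different days inside the window
    logs = [date(2018, 5, 1) + timedelta(days=i) for i in range(10)]
    print(challenge_completed(logs, start_date=date(2018, 5, 1)))  # True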

It is based on behaviour change levers.

The 10-Day Handwashing Challenge and its accompanying content around the site were all based on the behaviour change approach employed by Lifebuoy in Way of Life, namely Awareness, Commitment, Reinforcement and Reward.

The approach was executed in the following ways:

Awareness: Introducing the handwashing theme with engaging, educational content that linked to and from the 10-Day Handwashing Challenge:

  • Diseases caused by lack of handwashing (article)
  • 5 Tips for washing your hands correctly (article)
  • Global Handwashing Day! – The 5 Key times to wash our hands (article)
  • How much do you know about handwashing? (quiz)

Commitment: Encouraging users to take the Handwashing Pledge

Reinforcement: The habit tracker, which prompted users to come back and self-report their daily activity

Reward: Participants stood the chance to win a hygiene gift bag

[Image: contents of the hygiene gift bag given to the 5 winners]

The results

86 users started the challenge and 26 completed it within the 30-day challenge period, a completion rate of 30% overall. Considering that users had to return to the challenge 10 times, that completion rate is quite high.

The biggest drop-off happened between Day 1 and Day 2, with 28 users falling away; drop-off then decreased gradually throughout the 10 days. The graph below shows that most users who made it to Day 5 ended up completing the challenge: only 11 users dropped off between Day 5 and Day 10.

26 out of 86 users created a habit.
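The arithmetic behind these figures is simple; the short sketch below reproduces the quoted completion and drop-off rates (the helper function is illustrative, not part of the U Afya platform).

    def dropoff_rate(before, after):
        # Share of participants lost between two consecutive days
        return (before - after) / before

    started, completed = 86, 26    # figures quoted above
    day2_remaining = started - 28  # 28 users fell away after Day 1

    print(f"Completion rate: {completed / started:.0%}")                            # 30%
    print(f"Day 1 to Day 2 drop-off: {dropoff_rate(started, day2_remaining):.0%}")  # 33%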

In addition to participation data, feedback was gathered by interspersing survey questions throughout the challenge. This questioning revealed that 91% of challenge-takers feel they can afford to buy soap for their families.

Feedback:

Users gave overwhelmingly positive feedback about the challenge.

“It was so educating and hygienically I have improved. It’s now a routine to me, washing hands in any case”

Learnings:

Keep it simple

It’s not always necessary to create a fancy app to push a new activity. The U Afya 10-Day Challenge was built on a platform that users were already familiar with. By building it into their existing environment, we offered them something new and exciting on their visit.

Users were required to do one thing each day and report it with one action, i.e. taking a single-question survey. Requiring minimal effort from your users can maximise uptake.

Overall, the approach was simplicity: simplicity in the design of the functionality, simplicity in the daily action and simplicity in creating a habit.

With this approach the U Afya 10-Day Handwashing Challenge helped 26 young mothers to create a new habit of washing their hands every day at key moments.

Conclusion:

This first iteration of U Afya’s 10-Day Handwashing Challenge was a pilot, but the results suggest that it is possible to use low-cost, low-tech means to encourage habit formation. It is also possible for sophisticated behaviour change theory and practice to reach some of the most vulnerable groups, using the very phones they have in their hands.

It is also a useful tool to help us understand the impact of our behaviour change campaigns in the real world.

Next steps

All the user feedback and learnings mentioned above will be analysed to understand how the approach can be strengthened to reach even more people, increase compliance and encourage positive habit creation.

Cost-benefit comparisons of IVR, SMS, and phone survey methods

In his MERL Tech London Lightning Talk back in February, Jan Liebnitzky of Firetail provided a research-backed assessment of the costs and benefits of using interactive voice response surveys (IVR), SMS surveys, and phone surveys for MERL purposes.

First, he outlined the opportunities and challenges of using phones for survey research:

  • They are a good means for providing incentives. And research shows that incentives don’t have to be limited to airtime credits. The promise of useful information is sometimes the best motivator for respondents to participate in surveys.
  • They are less likely to reach subgroups. Though mobile phones are ubiquitous, one challenge is that groups like women, illiterate people and people in low-connectivity areas do not always have access to them. Thus, phones may not be as effective as one would hope for reaching the people most often targeted by aid programs.
  • They are scalable and have expansive reach. Scripting and outsourcing phone-based surveys to call centers takes time and capacity. Fixed costs are high, while marginal costs for each new question or respondent are low. This means that they can be cost-effective (compared to on-the-ground surveys) if implemented at a large scale or in remote and high-risk areas with problematic access.

Then, Jan shared some strategies for using phones for MERL purposes:

1. Interactive Voice Response Surveys

    • These are pre-recorded, automated surveys. Respondents can reply to them by voice or with the numerical keypad.
    • IVR has been used in interactive radio programs in Tanzania, where listening posts were established for the purpose of interacting with farmers. Listening posts are multi-channel, web-based platforms that gather and analyze feedback and questions from farmers who listen to particular radio shows. The radio station runs the IVR, and farmers can call in to the radio show to submit their questions or responses. These are effective because they are run through trusted radio shows. However, it is important that farmers receive answers to the questions they ask, as this incentivizes future participation.

2. SMS Surveys

    • These make use of mobile messaging capabilities to send questions and receive answers. Usually, the SMS survey respondent will either choose between fixed multiple-choice answers or write a freeform response. Responses, however, are limited to 160 characters (a minimal sketch of handling such replies appears after this list).
    • One example of this is U-Reporter, a free SMS social monitoring tool for community participation in Uganda. Polls are sent to U-Reporters who answer back in real time, and the results are then shared back with the community.

3. Phone Surveys

    • Phone surveys are run through call centers by enumerators. They function like a face-to-face interview, but over the phone.
    • As an example, phone surveys were used as a monitoring tool by an agriculture extension services provider. Farmers in the area subscribed to receive texts from the provider with tips about when and how to plant crops. From the list of subscribers, prospective respondents were sampled and in-country call centers were contracted to call up to 1,000 service users to inquire about quality of service, behaviour changes and adoption of new farming technologies.
    • The challenges here were that the data were only as good as the call staff’s training. Also, there was an 80% drop-off rate, partly due to the language limitations of the call staff.
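To make the SMS survey format described above a little more concrete, here is a minimal, hypothetical sketch of how an incoming reply could be classified as a fixed-choice or freeform answer. It is not taken from U-Reporter or any specific platform; the function and the example choices are assumptions.

    MAX_SMS_LENGTH = 160  # single-message limit mentioned above

    def classify_reply(text, choices=("1", "2", "3")):
        # `choices` is a hypothetical set of valid keypad answers for one poll question.
        reply = text.strip()
        if len(reply) > MAX_SMS_LENGTH:
            reply = reply[:MAX_SMS_LENGTH]  # defensively truncate over-long input
        if reply in choices:
            return ("choice", reply)
        return ("freeform", reply)

    print(classify_reply(" 2 "))                 # ('choice', '2')
    print(classify_reply("We need more seeds"))  # ('freeform', 'We need more seeds')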

Finally, Jan provided a rough cost and effectiveness assessment for each method:

  • IVR survey: medium cost, high response
  • SMS survey: low cost, low response
  • Phone survey: high cost, medium response

Jan closed with a question: What is the value of these methods for MERL?

His answer: The surveys are quick and dirty and, to their credit, they produce timely data from remote areas at a reasonable cost. If the data are put to use, they can be effective for monitoring. However, these methods are not yet adequate for use in evaluation.

For more, watch Jan’s Lightning Talk below!