Monday, August 5, 2013

Jennifer Liebschutz

Measure for Measure Series: Going Digital with Impact Measurement

Editor’s note: Measure for Measure is a NextBillion series that focuses on trends, tools and viewpoints in impact measurement. Check out The Big Idea page for additional posts in the series.

Over Digital Divide Data’s (DDD) 12-year history, measuring the social impact of our enterprise has been essential. Our organization creates digital jobs for low-income youth in developing countries and enables them to complete higher education. Our primary goal is to enable them to earn sustained higher incomes, so we need to know how well we’re achieving this objective. The information we glean from our current beneficiaries and alumni helps us learn how our program affects participants in the short and long term, in turn helping us to continually improve our program. Our donors and supporters also value this data, as it clarifies how their donations make a difference.

In the past, DDD hired an outside research firm to interview our alumni and current program participants face-to-face. This process was time-consuming for both the interviewers and interview subjects, and it was difficult for us to obtain data from alumni living in far-flung provinces and rural areas. An additional downside was the relatively high cost of this survey process.

This year, we switched to an online survey. We’ve just completed data collection and don’t yet have full results. However, it’s clear that the new process has addressed many of these issues and ensured that we can reach a high percentage of our target population. We also have a built-in advantage: much of our survey population is computer savvy.

To make the switch to digital, we streamlined the previous survey, eliminating unnecessary questions and simplifying others so no explanations were required. We translated each question into the local languages so respondents could answer without assistance, and we tested each question for clarity. To ensure program alumni completed the survey, we added incentives: a tablet computer for one lucky winner, plus mobile phone credit for a few others. We continued to engage an independent research firm, but in a more limited role, to ensure that our survey methodology was sound, to format the survey, to reach out to alumni by phone and email, and to collect responses.

The process has been an extremely effective way for us to collect data. Almost 90 percent of the alumni we contacted completed the survey. We attribute the high response rate to a continued positive connection to our organization, the ease of completing the survey, and the prize incentive. Collecting data online also simplified the next steps, as responses entered online can be easily exported to a spreadsheet or other software for analysis.

Nevertheless, data collection is not without challenges. Some of our alumni change phone numbers and email addresses frequently, making it difficult for us to maintain contact. Others live in rural areas and do not have access to a computer, so we had to administer the survey by phone. Some have moved to other countries for work, and we’ve lost contact with them. A few were simply not interested in participating in our survey. These factors make it especially challenging for us to collect data from the population who participated in our program five or more years ago.

We hope to reach even more of our program alumni in the future. To do this, we’ll manage our alumni contact database better, as well as better maintain our alumni network by holding frequent gatherings. In addition to improving data collection, we believe this will help our alumni feel a stronger sense of community, network professionally and support one another, and understand why we need their continued feedback to improve our program for future beneficiaries.

As an organization whose business is to deliver digital services to numerous clients, it seems fitting that we are now collecting data from our beneficiaries digitally. We are optimistic based on our initial experience collecting impact data online, and we look forward to analyzing the results and sharing more about what we’re learning.

Jennifer Liebschutz is the development associate for Digital Divide Data (DDD) in Phnom Penh. She identifies fundraising opportunities in Asia, works to increase DDD’s regional visibility, and assists with alumni and external relations.

Categories
Education, Impact Assessment
Tags
Impact Assessment, measurement, skill development