
Wednesday, July 10, 2019

James Militzer

Turning Impact Measurement on its Head: A Q&A with 60 Decibels Co-Founders Sasha Dichter and Tom Adams

What’s the best way to measure the impact of an enterprise or initiative? And how can this data be gathered in a cost-effective way – and utilized to improve services? The social sector has always struggled to answer those questions, and this has contributed to an unfortunate reality: Many entrepreneurs and investors still don’t have useful data about the impact of their work – or a clear idea of how to acquire it.

As the sector has grown, so have the consequences of this uncertainty. Yet despite the efforts of multiple organizations, there is still no commonly accepted approach for gathering and reporting high-quality impact data, or setting benchmarks for impact. This is making it difficult for stakeholders to compare the effectiveness of different business models and investments.

Acumen recently announced an intriguing new effort to address this challenge: It’s spinning out a standalone social enterprise called 60 Decibels, based upon its Lean Data approach to impact measurement. Under the leadership of Sasha Dichter and Tom Adams, 60 Decibels (which is named after the volume of human speech) aims to leverage mobile technology to communicate directly with low-income customers and beneficiaries, providing high-quality impact data both quickly and efficiently. Through this approach, Acumen and 60 Decibels hope to “set a new standard for how impact is measured by social enterprises, impact investors and the international development sector as a whole” – an ambitious undertaking, to be sure. We spoke with Dichter and Adams about the new enterprise’s innovative model, the challenges it’s likely to face, and the potential benefits it offers to investors, enterprises and the customers they serve.

 

James Militzer: Talk a bit about the origin of 60 Decibels: How did Acumen’s work in Lean Data lay the groundwork for this new social enterprise?


Sasha Dichter: It’s probably best to answer this by starting at the beginning. Since its founding, Acumen always knew that better data on impact would be fundamental to fulfilling its potential to create positive change for the lives of low-income customers. Of course, data itself doesn’t create impact, but it’s essential for us to both understand and improve our performance.

Acumen helped launch and build some of the first impact measurement infrastructure in our sector, including the IRIS standards. We’d also developed some of our own measurement methodologies, such as the Best Alternative Charitable Option (BACO), and built a software platform called PULSE for capturing social, financial and operational data.

But, 10 years into Acumen’s work, in 2012, we were getting stuck. We knew what we wanted to measure, in terms of lists of better-defined metrics, but how to get that data was eluding us. We’d ask our investees for the data, but they were busy building businesses, and they had neither the time nor the teams to figure out how to get the sort of social impact data we were looking for. Round and round we’d go, making asks for data, but ending up with little more than sales figures to talk about impact. Information about who was being reached and how much their lives were, or were not, improving was either missing or, at best, anecdotal.

 

Tom Adams: Lean Data grew out of a desire to turn this issue on its head: Rather than make an ‘ask’ of our investees, we wanted to provide a data ‘offer’: We had internal capabilities to measure impact, and wanted to see how we could make a new form of impact measurement — which became Lean Data℠ — one of Acumen’s most valuable post-investment services. For Acumen’s new Lean Data team, the investees became our clients. We focused on how to make gathering impact data both fast and affordable, and – most importantly – how to make it of genuine interest and value to our investees.

Seeing this success, starting in 2015, early philanthropic funders of Lean Data – primarily Omidyar Network and DFID – asked us to help measure their impact. External demand kept growing, and we began to see the many ways that Lean Data could add value to investors and companies, getting them the social impact data that they wanted but that seemed out of reach.

 

JM: Why did you decide to launch 60 Decibels as a spun-off social enterprise, rather than an initiative within Acumen? What are the advantages (and any disadvantages) of that approach?

SD: We created Lean Data as a division within Acumen, and grew it there over five years. That approach worked very well: We had an organizational DNA that cared about impact, funding to do our work and develop the approach, and a ready portfolio within which we could test and iterate on this approach.

The decision to spin out was a strategic one about the trajectory of Lean Data. The question we collectively asked was: What path should we take to have the greatest chance at large-scale impact?

We quickly realized that the best way to achieve this scale was as an independent, standalone entity. For leading organizations that do similar work, this objectivity and independence is key: LEED certification isn’t done by real estate developers, nor is Fair Trade certification done by Starbucks.

It’s only been a few months since we spun out of Acumen, so it’s early days to really know what the advantages and disadvantages are. Of course, there were great advantages to being part of an established, highly regarded organization such as Acumen, and there are new opportunities (and challenges) in being independent. We’ve been able to raise external funding as a new organization, which gives us the capital to invest in growing our offering. And, of course, there’s a degree of focus and clarity that comes with being an independent organization, and a new sort of fire under you to succeed!

 

JM: In the 60 Decibels white paper, you describe it as “a proposal for how to reboot impact measurement.” Why is impact measurement in need of rebooting – what’s currently not working, how is 60 Decibels better, and why is it so important for social business/impact investing to find a solution to this challenge now?

TA: To our mind, something weird has been going on.

In general, we’ve never lived in a time when people are more excited about data. Plus, investors and businesses are thinking ever-more keenly about the wider social effects of their work. But somehow when you put data and impact in the same phrase, it becomes something “nice-to-have” rather than mission critical – something we ought to do rather than something we’re excited about.


Think about it: This thing that’s meant to tell us whether we’re succeeding at our collective mission and purpose has been cast as a heavy, complex compliance activity that’s to be endured, when what it should be is a key driver of performance.

So, to start, we think impact measurement needs to feel different: It should feel essential, something that’s going to answer our most important questions and that’s going to be immediately useful in our day-to-day. It should feel exciting: ‘Great! What do our customers have to say? How are they doing? Have things improved?’ These are the questions we all care about that have gotten lost in all the false complexity we’ve created, as a sector, around measuring impact.

The truth is, as long as impact measurement feels like a tax or a necessary evil, we’ll minimize it, doing as little of it as possible because it’s considered a distraction. On the other hand, if impact measurement is value-creating, then it has the chance to become core to how we do business, how we deploy capital, how we operate.

Imagine if everyone had rich impact data, embedded in their investing and operating models, around which they were making daily decisions. We’ve no doubt that in that world we’d do a much better job aligning capital and other resources to create more impact.

And why now? Because the world is crying out for a more equitable, more inclusive, more stakeholder-oriented version of capitalism. But how can we achieve that without better ways to account for and manage the social value we’re creating?

 

JM: Walk us through a typical 60 Decibels impact measurement effort, from an entrepreneur’s perspective: What exactly do they need to do to acquire and utilize this data, and how do you support them in refocusing any existing measurement strategies they may have?

SD: One of the most important things for us is to make our projects as burden-free as possible for entrepreneurs. The people we work for are busy, and so are their teams! A project can require as little as 2-3 hours from our customers, and at the end they have great customer impact data at their fingertips.

This is possible because we’re doing remote surveys – mostly calls to customers’ mobile phones – so the process doesn’t require a lot of logistical or operational support from the company.

Also, because we’ve done hundreds of projects, we have battle-tested questions that we know work, so the process of figuring out what to ask is quite straightforward.

Our clients’ job is just to do two things: First, they need to tell us how they think about social impact so we can understand what they want to measure; and second, they need to share customer contact details with us, so we can contact their customers. The rest is up to us, and then it’s our job to get them their results in 4-6 weeks so they can actually use the data we collect to make decisions.

 

JM: How does the process look from an investor’s perspective: What role will they ideally play in conducting, facilitating or financing these measurement efforts?

SD: We’re seeing lots of interest from impact investors who want richer, more actionable customer data about the social impact of their investment portfolios. Investors also have the advantage of having large numbers of companies they’re working with, so we structure investor projects to be optimized for comparative impact performance data, which is particularly powerful.

To give an example, last fall we worked with Omidyar Network to do an ‘Education Sprint,’ conducting 24 Lean Data Projects across 11 countries, speaking to a total of 4,800 customers. We gathered, synthesized and presented this data in just 12 weeks!

Most interesting were the ‘clusters’ of results — groups of companies with similar business and impact models where we could compare their impact results. That becomes an immediate, concrete opportunity to learn and adjust, based on who the best and worst performers are for specific impact metrics.

In terms of the role investors play, in addition to funding the projects, our point of contact at the investor often leads stakeholder management, expectation-setting, and getting buy-in. As we said before, impact measurement has a pretty bad reputation with entrepreneurs, so convincing them that this will be different can still take a bit of time and effort!

 

JM: How does the process look from the customer’s perspective? What do they need to do, how will they be encouraged to do it – and is there any risk that their self-reported impact results might be skewed toward the positive, due to a desire to avoid conflict with the valued business partners who are requesting the data?

TA: It’s worth clearing some things up about self-reported data, as I think there is generally some confusion about it. Firstly, by their nature, a lot of questions about impact will require humans to report on them: Much of what we think of as social impact measurement is perception-based. Secondly, most of the data from evaluations we assume are ‘robust’ was generated in precisely the same way – surveys.

So there’s nothing about self-reporting that’s inherently problematic. However, one can certainly do surveys, remote or otherwise, either well or poorly. Because we have such a high volume of shorter projects, we have a lot of chance to iterate and improve, and that’s helped us improve our technique a lot over the last four years — and we expect to keep on improving!

When we speak to customers, we tell them that we’re a third party calling on behalf of a company that wants to understand their perspective. Customers always have the option to remain anonymous, or not to participate. Getting back to this question of skewed results, we’ve seen that skew is less likely when you’re calling as a third party, and when customers have the opportunity to stay anonymous. Finally, one of the most important things we do is to write questions that don’t have an obvious ‘right’ answer that would put the customer in a better or worse light depending on how they respond. This helps minimize skew in the results, as does triangulation – asking two to three questions whose responses usually have a high positive correlation – which helps us pick up on that sort of bias.

That said, of course self-reported data will, on occasion, have some bias built into it. The most important thing is to minimize, understand and be transparent about that bias. Moreover, even when there is potential for bias, this is where the benchmarks and standardized surveys are so valuable. Even if all the data was hypothetically skewed by a percent or two, you’d still be able to see how you compare against others. That is a much more powerful driver of decisions and actions — which, lest we forget, ultimately is what this is all about.

 

JM: In your white paper, you say you’re building ‘some of the first benchmarks of impact performance. These benchmarks can be used by both enterprises and investors to compare their impact performance against their peers.’ This would seem to require widespread adoption throughout the sector to be truly effective: Do you see 60 Decibels becoming the standard impact measurement approach, and if so, how can it succeed in attaining this level of adoption, in a market where many other measurement options exist?

SD: We do hope that, within the sectors in which we operate — energy, agriculture, financial inclusion, fintech, workforce development and healthcare, along with clusters of projects in property rights, governance and emerging technology — our standardized questions start to be widely adopted. As this happens, they can become part of the standards of impact measurement that we are collectively developing as a sector.

Of course, we won’t ever be the only impact measurement approach, nor should we be. We’re solving a very specific challenge: We want to make it easy for people trying to create social impact to listen to the people who matter most.

Our hope is that by making it easy, we can create a shift in the default behavior in our sector: that if you care about impact and you have customers, it should be expected that you listen to them. We hope that the questions we’re developing, as well as the infrastructure we’re building to make it burden-free to talk to these customers, gets our sector to that new default behavior more quickly.

If that happens, the data we’re generating will help fill all the impact frameworks — whether you’re using the Impact Management Project, aligning to the SDGs, or using another approach — with much richer data. Think of it as the ‘Intel Inside’ of today’s and tomorrow’s impact measurement approaches!

 

JM: In terms of popularizing the 60 Decibels approach, do you see it as something that will happen mainly due to impact investors requesting/requiring it, or to entrepreneurs independently realizing that it’s in their best business interests to acquire this data?

TA: In practice, we expect it will be a bit of both: that’s what changing expectations means. However, unless entrepreneurs feel like this is really useful to them, we don’t think we’ll ever get the sort of widespread, meaningful adoption we need. Conversely, if we can make this valuable to investors, adoption will be fast and self-reinforcing, because it will be fueled by competition.

 

JM: What are your thoughts on another prominent effort to simplify and standardize impact measurement, TPG Rise Fund’s Impact Multiple of Money? What are the advantages of 60 Decibels’ approach, and how much space is there for multiple approaches, if the goal is sector-wide standardization?

SD: TPG Rise Fund’s Impact Multiple of Money (IMM), now housed at another spinoff, Y Analytics, builds on a great history in our sector, particularly the SROI work done by Jed Emerson and many others. It’s really powerful to think about quantifying social value in this way — and it’s a great bridge to be able to speak in the language of dollars and to have universal, comparable metrics.

We think there’s great complementarity between IMM and Lean Data. Especially pre-investment, IMM gives clarity on potential impact based on business projections and the best available academic research. That said, our view is that this sort of approach, grounded in available third-party research — which is the approach we used until 2013 at Acumen — should be complemented by gathering end-customer data.

This isn’t a small point. In practice, time and again we’ve seen that impact is highly context-specific: The impact of a solar panel in India, where there’s a national grid but intermittent power, is very different from the impact of the same panel in Tanzania (where most people live off the grid). So we’d see an IMM analysis as a great starting point for what impact could be, and we’d love to see that data supplemented with the sort of customer data 60 Decibels gathers from each company, to see what actual impact those customers are experiencing. That difference between expectation and reality is everything to the customers we aim to serve.

This additional data will be important for two reasons. First, it will tell investors their actual realized impact. Just as you wouldn’t judge the financial success of an investment based on its potential financial return — by looking at similar businesses — you can’t determine the actual impact performance of your investment just by looking at an IMM based on data from a study.

Second, by gathering additional data from customers and giving that data to operating companies or NGOs, we empower these organizations to adjust what they do to better serve these customers to create more impact. That’s a virtuous cycle we can create over the life of an investment.

 

JM: The 60 Decibels white paper talks about how ‘management of social impact gets overpowered by the more rigorous accountability of operational and financial targets’ for many enterprises. Are better measurement tools enough to change this dynamic, absent a broader shift in mentality among entrepreneurs and investors, that puts impact on (at least) equal footing with financial returns? (In other words: Is effective impact measurement not happening because it’s hard, or because it’s inevitably a lower priority for investors/entrepreneurs than their financial bottom line?)

TA: We think this whole framing of impact versus financial returns is a bit of a red herring. If someone creates a business, and their ‘why’ is to meaningfully address a social problem, they care a heck of a lot about moving the needle on that problem.

The issue isn’t about making people care more, it’s about where accountability lies: With terrible impact data and great operational and financial data, you inevitably end up prioritizing the latter in your day-to-day work.

I think we’d argue even more strongly that until we’ve got equivalently good information about impact performance, it will be impossible for impact to be treated as seriously as financial performance.

Of course, the important thing is that people get serious about the collection of that impact data in the first place. In the past it was possible to say “Well, it’s just too difficult.” We hope to prove that it isn’t, whilst simultaneously showing just how useful this data truly is.

If you’re an entrepreneur who started a company or a non-profit to make a change in the world, then you care about that change. If we can give you great data that actually helps you run your company better, and lets you know how you’re doing on creating that change… well, I don’t think there’s a social entrepreneur in the world who wouldn’t want to get their hands on that data.

 

James Militzer is an editor at NextBillion.

 

Image courtesy of 60 Decibels.
