Guest Articles

September 20

Heather Esper

Four Insights for Better Measurement: What Businesses Can Learn from the Public and Nonprofit Sectors

Whether it’s a Fortune 500 corporation with a team of accountants or a small and growing business (SGB) tracking sales in a ledger, the business sector collects and uses data very differently from the public and nonprofit sectors. In the latter, comprehensive monitoring, evaluation and learning plans are increasingly the norm, and the use of adaptive management continues to grow. In contrast, businesses of all sizes lack the time for these measurement approaches, and SGBs often lack the financial resources as well. When businesses do collect data, it needs to create value by informing an operational decision or capturing a key impact that the company can use in marketing.

In my experience working in both the business and nonprofit worlds, each sector can learn something from how the other uses data to improve organizational performance. In a new white paper I co-wrote with Eric Svaan, “Improving Organization Performance Through Cross-Sector Learning: Lessons from Measurement,” I discuss how businesses, particularly SGBs, could benefit from applying some of the measurement practices common in the public and nonprofit worlds to their work. Below, I’ll share four of those insights.


1. Strengthen the analysis of qualitative data

Although businesses regularly collect qualitative data, such as open-ended feedback and input from customers and other stakeholders, they rarely process that data rigorously. Some quickly scan through qualitative data to inform their takeaways, which can lead to biased interpretations: Without systematic analysis, there is a risk of drawing conclusions from the data that merely reinforce already-existing opinions. Others go a step further and count how many stakeholders fall into different categories of responses to survey questions. For example, as a first step, companies may categorize responses to a particular open-ended question as positive, negative or neutral. As a second step, they could look at the demographics of the stakeholders who gave positive, negative or neutral responses, in order to develop ideas for how to adjust business operations and improve these results in the future.
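As an illustrative sketch, the two steps above can be expressed in a few lines of Python. The responses, sentiment codes and demographic groups here are invented for demonstration, not drawn from any real dataset:

```python
from collections import Counter

# Hypothetical survey responses: each pairs an open-ended answer with
# a hand-assigned sentiment code and a demographic attribute.
responses = [
    {"text": "Delivery was fast and friendly", "code": "positive", "group": "urban"},
    {"text": "Prices keep rising", "code": "negative", "group": "rural"},
    {"text": "It's fine, nothing special", "code": "neutral", "group": "urban"},
    {"text": "Staff resolved my issue quickly", "code": "positive", "group": "urban"},
    {"text": "Hard to reach customer service", "code": "negative", "group": "rural"},
]

# Step 1: count responses per sentiment category.
sentiment_counts = Counter(r["code"] for r in responses)

# Step 2: cross-tabulate sentiment by demographic group, to see
# which groups are driving the negative feedback.
cross_tab = Counter((r["group"], r["code"]) for r in responses)

print(sentiment_counts)
for (group, code), n in sorted(cross_tab.items()):
    print(f"{group:6s} {code:9s} {n}")
```

Even a minimal tabulation like this makes it harder to cherry-pick: every response is counted, and patterns (here, that negative responses cluster in one group) surface from the data rather than from a quick scan.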

At the Performance Measurement and Improvement group of the William Davidson Institute (WDI), we believe businesses can benefit from a more comprehensive approach, using structured coding practices to systematically process qualitative data. Coding helps organize, sort, group and summarize topics, so key ideas and themes can emerge from the gathered data. A number of qualitative analysis programs are available, including some robust low-cost options such as Dedoose, MAXQDA, NVivo and ATLAS.ti. Once data is coded in a qualitative analysis program, users can create different data displays to identify trends and patterns. These data displays are useful ways to systematically review qualitative data in order to draw unbiased conclusions. Team-wide qualitative coding processes also help to avoid bias when multiple staff members are involved in interpreting the information.


2. Consider what perspectives are missing

Businesses often collect data only from current employees, customers and/or suppliers. This omits key perspectives: dissatisfied current and former customers, and employees who have left the company. To avoid this, enterprises can conduct stakeholder mapping and analysis in order to identify all key stakeholders in their business. Once the stakeholder map is completed, companies can quickly detect whether certain groups were left out of past data collection. From there, they can consider additional investigation of those stakeholder groups and apply what they learn toward improving business operations.

Businesses also sometimes fail to include important stakeholder perspectives due to a lack of engagement on the stakeholder’s part. But whether the stakeholder is internal or external, if their input is valuable to the business, it is worth the extra effort to get their perspective. In these situations, it can be helpful to use creative data collection techniques, such as participatory methods, to capture targeted information from these individuals. These methods could include requiring staff to attend a “pause and reflect” session to capture their perspectives on what is and isn’t working, or holding a “data party” where the business incentivizes customers (with a small amount of money, refreshments or product discounts) to help the company understand the “how and why” behind the data it is collecting.


3. Include a holistic set of key financial and social metrics

Unless required to do so, most businesses choose not to track social metrics, prioritizing financial metrics instead. Enterprise managers may see social evaluation activities as a box to check for accountability, rather than a way to improve their operations. But in our work with clients at WDI’s Performance Measurement and Improvement group, we’ve learned that tracking social metrics can often deliver financial value. That’s why we always seek opportunities to examine how social metrics influence financial metrics. For example, in an impact assessment we conducted, we found that employee self-efficacy (i.e., employees’ belief in their own ability to accomplish tasks) was related to both retention and increased sales. Thus, we recommended the business increase its focus on developing employee self-efficacy (such as through training sessions or supervisors’ interactions with employees) with the aim of further increasing retention and sales. For businesses that are interested in expanding their measurement efforts beyond financial metrics, a good first step in identifying potential social metrics worth tracking is to develop a theory of change.


4. Leverage learning agendas to organize learning

A growing trend in the nonprofit and public sectors is to develop learning agendas in order to prioritize learning questions — i.e., gaps in the knowledge of the company and/or the broader field. Learning agendas are living documents that leverage feedback loops to adapt learning questions, and their associated activities and products, based on new evidence. Businesses should consider using learning agendas to prioritize and organize learning questions based on the company’s broader strategy and needs. Doing so helps the enterprise focus on the questions that matter most to its operations and success.

A learning agenda can also be a useful tool to share with funders and academics interested in working with the company. It allows the business to clearly communicate what research is important to its operations, and reduces the likelihood of getting distracted by less essential questions. It is likewise helpful when a current funder approaches the company with a new request for data: the agenda can clearly demonstrate whether the request aligns with the company’s priorities — and if it doesn’t, it can help the business explain why it’s unable to pursue the requested research.

An additional value of a learning agenda is avoiding the sometimes-piecemeal nature of problem-solving. As a strategic document, it can help companies not only decide the order in which to answer questions, but also determine how to integrate and disseminate learning across the business. In short, a learning agenda empowers enterprises to be strategic and intentional with their learning.

For businesses, and especially SGBs, there is tremendous value in adopting measurement and learning practices from the public and nonprofit sectors. With these tools and others, business leaders who may be wary of jumping into the often-overwhelming sea of data can safely test the waters — and improve their decision-making in the process. If you are interested in discussing these tools and concepts further, please contact me here.


Heather Esper is Director of the Performance Measurement and Improvement team at the William Davidson Institute at the University of Michigan (WDI). WDI is NextBillion’s parent organization.


Photo courtesy of Freddy Fam.



