In 2014, City Year—the well-known national education nonprofit that leverages young adults in national service to help students and schools succeed—was outgrowing the methods it used for collecting, managing, and using performance data. As the organization established its strategy for long-term impact, leaders identified a business problem: The current system for data collection and use would need to evolve to address the more complex challenges the organization was undertaking. Staff throughout the organization were citing pain points one might expect, including onerous manual data collection and long lag times to get much-needed data and reports on student attendance, grades, and academic and social-emotional assessments. After digging deeper, leaders realized they couldn’t fix the organization’s challenges with technology or improved methods without first addressing more fundamental issues. They saw that City Year lacked a common “language” for the data it collected and used. Staff varied widely in their levels of data literacy, as did the scope of data-sharing agreements with the 27 urban school districts where City Year was working at the time. What’s more, its evaluation group had gradually become a default clearinghouse for a wide variety of service requests from across the organization that the group was neither designed nor staffed to address. The situation was much more complex than it appeared.

With significant technology roadmap decisions looming, City Year engaged with us to help it develop its data strategy. Together we came to realize that these symptoms reflected a single issue, one that exists in many organizations: City Year’s focus on data wasn’t targeted to address the very different kinds of decisions that each staff member—from the front office to the front lines—needed to make. Its strategy served national-level needs well; at that level, data were used for broad, aggregated, periodic tracking to inform reports to funders and to evaluate overall program effectiveness. But reports and dashboards built to meet those top-down requirements didn’t provide the operational insights the majority of users needed. In the field, reporting wasn’t optimized to support the work of City Year’s 3,000 AmeriCorps members, who were providing direct academic and social-emotional supports to students in nearly 300 schools. To drive better outcomes, they needed access to a consistent, high-quality data set, delivered more frequently and in a format that would help them monitor an individual student’s progress and make decisions about that student’s intervention needs. That real-time, on-the-ground decision-making is fundamental to City Year’s ability to improve educational outcomes for students and schools. Yet delivery and evaluation of services lacked a consistent data set, and measurement structures and standards didn’t support this core activity.

We’ve seen many organizations face similar challenges as they strive to measure their social impact and make timely adjustments for continuous improvement. Given the high level of effort required to carry out social impact measurement, the resulting measures ought to provide leaders with useful, relevant information that makes decision-making clearer and easier. Yet for so many organizations, measurement and evaluation has become an albatross. We see many leaders’ well-intentioned efforts to measure performance become saddled with unrealistic expectations, imprecise tools, and misaligned incentives. All too commonly, measurement activities drift away from what should be their central goal: to help individuals across the organization make better decisions.

Many of us in the social sector have probably seen elements of this dynamic. Many organizations create impact reports designed to satisfy external demands from donors, but these reports say little about the operational or strategic choices the organizations face every day, much less address harder-to-measure, system-level outcomes. As a result, over time and in the face of constrained resources, measurement is relegated to a compliance activity, disconnected from identifying and collecting the information that directly enables individuals within the organization to drive impact. Gathering data becomes an end in itself, rather than a means of enabling ground-level work and learning how to improve the organization’s impact.


Overcoming this all-too-common “measurement drift” requires that we challenge the underlying orthodoxies that drive it and reorient measurement activities around one simple premise: Data should support better decision-making. This enables organizations to not only shed a significant burden of unproductive activity, but also drive themselves to new heights of performance.

In the case of City Year, leaders realized that to really take advantage of existing technology platforms, they needed a broader mindset shift. Through our work together, City Year was able to flip a number of unspoken orthodoxies about data that had gradually and unintentionally built up within the organization:

  1. From “metrics should describe backward-looking impact” to “metrics can and should also enable forward-looking learning.” City Year shifted from reviewing performance data at the end of a marking period or school year to collecting student grades and assessment scores as soon as they post at the school—weekly in many cases—and mapping them to see trends.
  2. From “the measurement and evaluation group should handle data collection and measurement” to “staff at all levels of the organization should have access to data, comfort with data, and see data as integral to discovering insights for action.” City Year implemented training, broader system access, and enhanced data security and privacy protocols to help ensure that school-based staff could monitor student performance to improve intervention strategies.
  3. From “data should be collected primarily to measure impact” to “analytics should be in the hands of practitioners to strengthen impact.” City Year developed real-time student-progress monitoring reports to be used by AmeriCorps members and school-based “impact managers” to ensure that the right students were supported with the right interventions.
  4. From “all users are a priority” to “the priority users are those directly supporting schools and students.” City Year worked with districts to establish expanded data requests and data-sharing agreements that provide automated access to individual student-level performance data rather than only aggregated school-level data. 

City Year resolved to put its most critical decision-makers—its AmeriCorps members who serve nearly 200,000 students every day—at the center of the organization’s approach to measurement. We helped City Year reshape its approach to impact data to emphasize real-time monitoring over backward-looking measurement. This ultimately required changes to data protocols, processes, and behaviors. And it provided clarity around the collection and management of data that would most effectively support City Year’s broader goals.

Two years later, City Year has fundamentally shifted its approach to measurement and data. It has invested in new capabilities that enable school districts to export student performance data directly into City Year’s management platform using secure, automated methods. That data is then combined with the intervention data AmeriCorps members upload via a mobile app. Perhaps most importantly, the organization now views frontline AmeriCorps members and the program staff who support them as the primary users to consider when making data and analysis decisions. For AmeriCorps members, data previously available just once or twice a year to assess impact—such as intervention dosage levels, student academic performance results, and social-emotional assessment information—is now available to inform real-time adjustments to services. And it has driven results: City Year has seen a 50 percent increase in students receiving the target number of hours of English language arts (ELA) support, for example, and the performance of students receiving ELA interventions has been 1.7 times higher than their expected growth rate. Together, these tools allow AmeriCorps members to monitor student progress more closely, with the requisite security, improving both their daily decision-making and their impact on students.

City Year’s journey shows how galvanizing an organization around a decision-centric approach to measurement can turn measurement and evaluation into a powerful tool for driving greater impact. It also raises questions about how other organizations have tackled similar challenges and what insights might be uncovered from analyzing the issue more broadly. 
