
Strengthening Alternative 911 Emergency Response
Alternative emergency response programs have the potential to connect people in crisis to teams of unarmed responders who provide compassionate care and needed services, reduce overreliance on law enforcement, and decrease strain on other first responders. As jurisdictions establish these programs, leaders must use data in real time for performance management and continuous improvement, which can increase programs’ potential to deliver on the promise of alternative response. For example, an alternative emergency response program might aim to connect individuals experiencing a mental health crisis to ongoing counseling. By tracking both the number of people referred to mental health services and whether they actually access treatment, program leaders can better understand whether the program is connecting people to the supports they need to avoid future mental health crises. At the same time, members of the public and policymakers can gauge whether the program is delivering on its promise.
The Harvard Kennedy School Government Performance Lab’s (GPL) data-driven performance management tools help alternative emergency response program leaders and their partners identify the most important data to measure, develop a deeper understanding of program performance, and take informed action to improve outcomes.
Measuring and reviewing data is crucial to driving change throughout an alternative emergency response program — not just at one moment in time, such as the conclusion of a pilot or the launch of a program expansion. Program managers and their partners can use data for:
To determine which metrics to prioritize measuring, alternative emergency response program leaders should compare their program’s goals to the metrics provided below and identify which metrics align with the program’s intended impact.
Evaluation of Alternative Emergency Response Programs: To date, only a small number of rigorous evaluations of alternative emergency response programs have been conducted. To evaluate the impact of an alternative response program, outcomes must be compared to what would have happened in its absence. When there are not enough resources to dispatch alternative response to all eligible calls, comparing outcomes between eligible calls that receive alternative response and those that do not may allow jurisdictions to estimate a program’s effect. Other approaches that jurisdictions can use to estimate the effect of alternative response include comparing outcomes between the area where the program was launched and a similar, nearby geographic area that does not have alternative response services, or comparing outcomes between times of day when the alternative response team is active and times when it is not. If you are interested in evaluation of alternative emergency response programs, please contact us at govlab@hks.harvard.edu.
The tables below include 29 common, actionable metrics used by alternative emergency response teams to assess and improve their programs. The GPL selected these metrics based on its experience conducting projects with more than 30 jurisdictions in the Alternative 911 Emergency Response Implementation Cohort and by reviewing publicly available data from 17 alternative emergency response programs. This list is not exhaustive and will be updated as jurisdictions test additional metrics.
Metrics can help alternative response program leaders answer core questions about their program operations and the people they serve. Each metric below is organized by the core question it addresses:
Each metric also includes information on three characteristics, listed below.