Anchor institutions – large, place-based organizations such as universities and hospitals – have long played a significant economic role in their communities. These anchors, which can also include religious institutions, municipal governments, community foundations, sports teams, arts and cultural organizations and major corporations, are often the largest employers, purchasers and real estate developers in their communities.
After interviewing over 70 anchor institutions for a recent research project funded by the Surdna Foundation, ICIC found that only a handful are measuring their impact on the local community. Surprisingly, none of the anchors interviewed have established a comprehensive set of metrics to measure the returns to their organization from their community engagement.
But measuring shared value – the community impact and the benefits flowing to the organization from its community-development initiatives – matters.
Given this finding, ICIC convened a group of experts who discussed ways to advance the adoption of measuring shared value. During yesterday’s What Works for Cities webinar, Ted Howard of the Democracy Collaborative and Tony Sorrentino of the University of Pennsylvania joined ICIC Senior Vice President Kim Zeuli to discuss this issue in greater detail.
As Zeuli discussed during her presentation, measuring ROI matters because it establishes a business case for community investment. Gathering metrics demonstrates the value of anchor strategies and builds the case for expanding anchor commitments. For instance, if a university invests 5% of its total procurement dollars in the immediate neighborhood, and benefits by improving its supplier network and stabilizing supplies, it is easier to make the case for investing additional procurement dollars. When anchors fail to collect data, it is just as easy for decision-makers to cut anchor strategies from their budgets. This risk grows as leadership shifts: many anchors attributed the success of their community development strategies to strong leadership, which leaves those strategies vulnerable when that leadership departs. Gathering and assessing data also provides motivation for greater anchor collaboration.
The Democracy Collaborative created an “Anchor Dashboard” that anchor organizations can implement to track the impact that initiatives are having on the community. The Dashboard includes metrics for economic development; health, safety and environment; and community building and education.
Ted Howard highlighted how University Hospitals (UH) in Cleveland has begun to measure its impact. Recognizing its importance to the local economy – UH is the second-largest employer in Northeast Ohio with more than 20,000 employees and $800 million in purchasing – UH committed to funneling its dollars into the local economy. As UH embarked on the construction of $1.2 billion worth of new facilities, it set out benchmarks for local procurement. UH exceeded expectations: 93% of construction dollars went to local vendors, 17% of which went to minority-owned businesses. UH had a third-party organization verify its numbers to ensure the accuracy of data.
The University of Pennsylvania is a national leader in anchor-led community development. In the 1970s and 1980s, Penn struggled to attract students, researchers and faculty because its West Philadelphia neighborhood had fallen into such a state of disrepair. Penn decided that if the City was unable to take charge of the neighborhood's revitalization, it would have to take on the responsibility itself.
As Tony Sorrentino discussed during the webinar, in 1994, Penn created its “West Philadelphia Initiatives” around five areas of concern: cleanliness and safety; high-quality housing; commercial development; economic inclusion; and high-quality public education. Using Census data, Penn examined three indicators of neighborhood change: demographic trends, socio-economic trends and housing trends. The results showed that by 2000, and even more so by 2010, the West Philadelphia neighborhood was improving.
In response to criticism that it was causing gentrification, Penn analyzed data from the University City neighborhood as a whole, the catchment area of a new K-8 school, and the University City neighborhood excluding that catchment. The data showed that crime was down while homeownership, commercial development and local purchasing had all increased. Importantly, the shift in racial and socio-economic composition correlated with the creation of the new K-8 school, not “Penn-trification” as some had argued. Without the data, Penn would not have been able to communicate these trends to concerned residents.
Anchor institutions may be hesitant to quantify the performance of their community initiatives; some may worry that results will not be impressive enough or that the results will be misinterpreted. However, understanding the return to both the institution and the community is critical.
Zeuli closed with a set of recommendations for advancing the adoption of metrics by organizations executing anchor strategies.
In addition, others in the field should develop a standard set of internal indicators for each anchor to allow for benchmarking. Given that none of the anchors ICIC interviewed are currently measuring their ROI, we cannot expect a shift in behavior overnight. Yet to make anchor strategies sustainable, it is critical that anchors begin measuring their returns.