Designing evaluations that support sustainable partnerships between funders and community agencies
On May 29th, during the Canadian Evaluation Society’s annual conference, Senior Researcher Molly Doan gave a presentation on designing evaluations that support sustainable partnerships between funders and community agencies.
The presentation, titled Building bridges between funders, community agencies, and evaluators: lessons learned from an evaluation of United Way Greater Toronto’s Youth Success Strategy (YSS), was prepared in partnership with United Way Greater Toronto and based on a case study of the YSS Program Grants Stream evaluation.
Over the past two years, United Way Greater Toronto and Blueprint have worked in collaboration with community agencies to design and implement an evaluation of the YSS Program Grants Stream to generate evidence about what works in youth employment and training and build agency capacity for evaluation, learning, and continuous improvement.
During the presentation, Molly facilitated a conversation on how evaluation can be used to build sustainable bridges between funders and community agencies by fostering two-way communication and a commitment to shared learning.
Four key lessons from the YSS evaluation were discussed during the session:
1. Changing the conversation around evaluation
Agencies want to use data and evaluation to inform their work, but don’t always have the time, resources, or capacity to do so. This challenge is growing as grantees receive support from a greater number of funders, each with their own data collection requirements. To shift the evaluation mindset of both funders and agencies away from compliance and towards a culture of learning and continuous improvement, evaluators must create a safe, open space for all parties to share lessons learned, discuss challenges, identify strengths, and explore opportunities for improvement.
2. Taking a dual-client approach
The funder-grantee power dynamic can make it challenging to have honest conversations about programs. As evaluators, it is important to ensure that we aren’t just building our own relationships with funders and agencies, but also supporting the relationships and communication between them. By acting as a neutral intermediary and focusing on understanding and bridging the needs of agencies and funders, we can help frame the conversation in a way that supports collective learning and the achievement of common goals.
3. Devoting time to collaboration and co-design
Agencies generally want to be involved in the design of an evaluation framework. Collaboration and co-design can increase agency buy-in and sense of ownership, foster a culture of shared responsibility, and streamline the process by homing in on what is truly critical to measure and what reflects the reality of the programs involved. But because most agencies face resource constraints, it is important to be strategic about how and when this collaboration occurs. A phased approach to evaluation design might begin by learning about each organization’s goals and context individually; then, once a preliminary draft of evaluation questions, outcome areas, and the data collection approach is ready, the organizations can be brought together to ensure nothing important has been missed. A smaller ‘advisory group’ can then co-design and pilot data collection tools, with regular feedback loops scheduled as needed.
4. Investing in workable data solutions
Transitioning from a fragmented data collection and reporting approach to a centralized online data system with real-time reporting increases both transparency and access to data and evaluation results. This makes it easier for agencies to meet their reporting requirements, while giving them more time to focus on program delivery. By incorporating the perspectives of multiple agencies into a common evaluation framework, all parties gain better insight into the progress and success of their programs.
You can read more about the YSS Program Grants Stream evaluation here.