
What Does Developmental Evaluation Look Like?

If you are entrepreneurial and a thrill seeker, then evaluating social innovation is for you! It is like getting into a pontoon to shoot the rapids on a river: fast paced, unpredictable, with the funder, grantees, and the evaluators all in the same boat.

FSG’s webinar on the article “Evaluating Social Innovation” featuring Hallie Preskill, John Cawley, Meg Long, and moderated by Tanya Beer of the Center for Evaluation Innovation attracted over 300 attendees who wanted to learn more about evaluating innovations using a developmental evaluation (DE) approach. More than 75 questions were submitted by webinar participants, many more than could be addressed during the call. So, we would like to continue the conversation in a blog series on the practice of DE.

A number of questions asked us to further describe what the DE experience is really like. You asked, “What did the evaluation team look like?”, “How are grantees involved?”, and “How does the funder work with the evaluation team?” Last week, we published a post on the importance of listening in the practice of DE. This week, Meg and John share their experiences to answer your questions.

“How is the practice of developmental evaluation different from other evaluation approaches?”

The difference begins with the composition of the evaluation team and how it interacts with funders and grantees. The specific experience of the OMG Center on Collaborative Learning in evaluating the Community Partnerships Portfolio provides an illustration. Meg says, “The OMG team included eight people: a Director, two project managers, three coordinators, a Director of Research and an analyst for the quantitative work. Since there were two initiatives within one portfolio, CLIP and PPS, we had a project manager assigned to each. In addition to managing the workplan and the team, the project manager maintained close contact with the intermediary. Every one of our team members, with the exception of the Director of Research, participated in field work and ongoing qualitative data collection. The Director of Research oversaw quantitative data collection and maintained a close relationship with all of the institutional research and designated data partners from each site.”

A DE approach can be strengthened by utilizing a cross-disciplinary team. Meg explains, “Our team includes sociologists, organizational development experts, individuals trained in family therapy and culturally relevant evaluation (CRE), anthropologists, and facilitators/trainers, in addition to experts in higher education and community development.”

DE often requires intensive field work to collect data and share learning. “As time passed, we made sure that team members got to visit a variety of sites to stimulate cross-site thinking and an understanding and appreciation for the depth and variety of sites’ approaches,” explains Meg. “There was always one team member that was ‘assigned’ to a site and remained with that site for the duration of the investment.”

A similar cross-site learning approach was taken during the developmental evaluation of YouthScape, funded by the J.W. McConnell Family Foundation. As John explains, “Local developmental evaluators were coached by a national developmental evaluator who chaired regular calls designed to problem solve and to provide peer support. The details were not shared nationally (ensuring confidentiality and trust with local sites), but the national developmental evaluator would share patterns on calls and briefing notes with the funder and backbone organization. It allowed the funder and backbone organization to address these more systemic issues in a timely way and to make course corrections.”

Lastly, program staff often find themselves having an increased level of involvement in DE. As Meg describes, “Grantees, as well as program staff, need to be aware that they will be involved very differently, including helping to co-design data collection tools, interpret findings, comment on written products, discuss and refine outcomes, and continuously keep the team informed of shifting assumptions and contextual influences. It is a very intimate relationship [between the evaluator and the staff running the program being evaluated].”

Thank you, Meg and John, for providing us with a richer picture of what the DE team experience looks like in practice.

In Part II of the blog series, we’ll continue to examine what DE can and does look like in practice by focusing on the “products” of DE. How is communication different when conducting or commissioning DE?

Katelyn Mack

Former Director, FSG