By Kelly M. Hannum and Jara Dean-Coffey
This article is the first in a two-part series on developing an evaluative mindset.
Conducting evaluations in foundations can be tricky and, frankly, not everyone’s idea of a good time. Through evaluation we seek to document events and experiences, gather feedback, assess impact, and, where possible, find the “truth” and determine value in varied, often conflicting, experiences and perspectives. Things can get confusing and awkward, and in the end, is it even worth it?
Our position is that while specific evaluations may not be worth it, developing an evaluative mindset is worth the effort. In fact, we propose that an evaluative mindset, by which we mean regularly engaging in evaluative thinking, is necessary for meaningful evaluation and is an important part of the suite of leadership skills. Before we dive deeper, we want to distinguish evaluation from evaluative thinking. Evaluation is an applied inquiry process that uses systematic processes to determine something’s merit, worth, and/or significance (Fournier, 2005).
Buckley, Archibald, Hargraves, and Trochim (2015) define evaluative thinking as:

“critical thinking applied in the context of evaluation, motivated by an attitude of inquisitiveness and a belief in the value of evidence, that involves identifying assumptions, posing thoughtful questions, pursuing deeper understanding through reflection and perspective taking, and informing decisions in preparation for action.”
In short, evaluation focuses on the tasks of gathering information to make a judgment, while evaluative thinking is being strategically curious and thinking critically in service of your stated aims and values within a specific context. Ideally, the two go together, but they don’t have to and often don’t. In our view, not enough emphasis has been placed on evaluative thinking, and we suggest that foundation staff (and their nonprofit partners) would benefit from developing an evaluative mindset, and that doing so is often more important than evaluation activities. Without an evaluative mindset, evaluation activities are unlikely to be worth the effort.
Most of us who deal with evaluation results have, at some point, been in a room looking at data or a report, uncertain about its purpose and wondering who, if anyone, really cares. We may have results, but do they tell us anything we need to know or can use? Both of us have worked with organizations awash in data and no better for it. There are long surveys created on the fly, with lots of questions that result in unwieldy databases. Staff often have neither the time nor the skills to crunch all this data into usable intel. There are data collection processes that burden people and systems. It takes time. It takes money. It also uses up goodwill, particularly when working with and in communities that are getting surveys and such from lots of other folks. And in many cases it isn’t delivering what is needed. On the other hand, there are the highly focused systems that gather only data with a known use (such as an annual reporting of activities or website testimonials) or follow a well-known approach, but fail to provide information that truly “gets at” the value created from different perspectives, or information that fuels useful reflection on practice.
For evaluation activities to be useful, they need to reflect and be integrated with a foundation’s values and strategy. There needs to be a culture that seeks, makes sense of, and uses data efficiently and effectively. To do that well, there has to be a shared (or at least transparent) understanding of what value is sought and what values are important in the creation and assessment of it. External evaluations can be wonderful things, but if you are relying on the external evaluator to figure out what is important and why, and the best way to use the findings to inform your decisions and actions, you are giving away too much power. It’s going to be inefficient and expensive. Chances are, you are not going to be happy. We have seen it too many times. Contracting for evaluation activities makes a lot of sense; contracting for evaluative thinking doesn’t. It’s a bit like hiring out self-awareness: you can (and probably should) get help developing it, but ultimately only you can do the work.
The good news is that foundations are uniquely positioned to push evaluative thinking and to use evaluation strategically because, for the most part, foundations can do what they need to do to be effective (Hall, 2004). The external pressures that typically create a system of accountability are less present in foundations. Organizations like the Center for Effective Philanthropy, the Center for Evaluation Innovation, the National Committee for Responsive Philanthropy, and others seek to influence foundation practices through data and insights, but no one requires foundations to evaluate in any specific way. This freedom is a double-edged sword.
On the one hand, it creates the opportunity to challenge the status quo and push for evaluative thinking rather than settle for evaluation activities, recognizing that this can mean difficult conversations about who determines value and how that value is created and assessed. On the other hand, it also creates the conditions for never having one’s convictions and assumptions challenged, and for operating on a belief in impact without careful consideration of it. Given our current political context, in which political parties and media organizations are doubling down on polarizing tactics and undercutting critical examination of evidence, foundations can play a needed and important role in modeling and supporting processes for understanding, generating, and assessing value in inclusive and meaningful ways. In the next part of this article, we explore what it takes to develop an evaluative mindset.
Kelly M. Hannum, Ph.D., is the President of Aligned Impact LLC, an evaluation consulting firm based in Greensboro, NC. She has worked with clients all over the world and across sectors to enhance their development efforts.
Jara Dean-Coffey, MPH, is Founder and Principal of the Luminare Group (formerly jdcPartnerships), a minority woman-owned and women-led practice in the San Francisco Bay Area established in 2002. Her work is in three areas: strategy and evaluation, capacity building, and pushing practice. Jara has worked extensively with public, social, and philanthropic sector organizations.
Buckley, J., Archibald, T., Hargraves, M., and Trochim, W. (2015). “Defining and teaching evaluative thinking: Insights from research on critical thinking.” American Journal of Evaluation, December.
Fournier, D. M. (2005). “Evaluation.” In S. Mathison (Ed.), Encyclopedia of Evaluation (pp. 139-140). Thousand Oaks, CA: Sage.
Hall, P. (2004). A historical perspective on evaluation in foundations. In M. T. Braverman, N. A. Constantine, & J. K. Slater (Eds.), Foundations and evaluation: Contexts and practices for effective philanthropy (pp. 27-49). San Francisco: Jossey-Bass.