Join Corey Newhouse, founder of Public Profit, to explore the intersection of values and evaluation in the nonprofit sector. Corey provides profound insights into the significance of aligning organizational values with the process of data collection and evaluation, challenging the extractive roots of traditional evaluation practices. Together Alexandra and Corey delve into the crucial role of values in shaping evaluation questions, data collection methods, and analysis, highlighting the need for a more inclusive and equitable approach. Corey talks about the Equitable Evaluation Initiative’s principles and explores practical strategies for integrating values into the evaluation process. From language access to equity gap scores, they discuss tangible steps that nonprofits can take to humanize the data collection process and enhance its rigor. Whether you’re a nonprofit professional or an advocate for social change, gain a deeper understanding of how to transform evaluation into a collaborative, values-driven endeavor that maximizes impact and promotes equity in the nonprofit world.
Resources:
- Public Profit
- Equitable Evaluation Initiative
- Center for Culturally Responsive Evaluation and Assessment
- We All Count
Transcript of today’s episode (auto-generated):
[00:00:00] Hello and welcome to Heart, Soul, and Data, where we explore the human side of analytics to help amplify the impacts of those out to change the world, with me, Alexandra Maning.
[00:00:18] Alexandra: Thank you so much for joining me today. I am thrilled to be joined by Corey, who’s going to talk to us about how we can do evaluation better in so many ways. So Corey, why don’t you go ahead and introduce yourself? Tell us where you’re joining the recording from, and how you came to the whole data and evaluation space.
[00:00:37] Corey: Absolutely. And thanks so much for having me. I’m really excited to be here as well. My name is Corey Newhouse. I’m the founder and principal of Public Profit. We work with mission-driven organizations to help them use data to make better decisions, and ultimately to deepen and broaden the impact that they have in the community.
[00:00:56] I’m currently in lovely Eugene, Oregon, in the northwest part of the United States. Much of our team is based in the San Francisco Bay Area, just a little bit south of me.
[00:01:07] Alexandra: I love it. Thank you. So let’s talk a little bit about capacity. When it comes to evaluation, we don’t always have the skills in house to do it correctly. So I wanted to talk about how we can think of ways of building our capacity around doing evaluation, because oftentimes the biggest opportunity we have for doing evaluation better is just building the ability
[00:01:34] to do evaluation, or to do evaluation at the next level. So I wondered if you could talk a little bit about some of the ways that we as smaller organizations can think about that capacity building.
[00:01:44] Corey: Absolutely. And I think there are probably two or three different directions that I would encourage smaller organizations to take. One is to recognize that most organizations are already collecting a lot of great information, whether that is attendance records, case management notes, interviews, or periodic reflections from the staff about what’s working well.
[00:02:08] So many of our colleagues in nonprofits are such exceptional practitioners, so reflective, so thoughtful, that they’re generating data all the time. Sometimes the difference, or that next step, is just getting into some good habits around collecting that information more regularly.
[00:02:29] Writing it down, getting in the habit of doing some regular reflections. I think of that as getting around the cycle: lots of organizations of all kinds collect a whole lot of information, and getting the hang of taking that pause not just to collect it
[00:02:47] but to analyze it and to make meaning of it, I think, is really helpful.
[00:02:48] Another strand for smaller nonprofits to think about is how they might leverage the support of an evaluation consultant, someone like Public Profit, to help them build up some infrastructure that they can then run on their own after that initial setup.
[00:03:09] As you alluded to, program evaluation can be really intense to set up. It can have a fairly high startup cost in terms of time. So just getting that jumpstart, in terms of having someone external helping you refine the data collection tools that you have, and then setting something up that is more replicable over time, helps a lot.
[00:03:32] And then the last strand that I’ll suggest, and this is something I think a lot of small nonprofits do, is to partner with someone, a graduate student or a professor, who is doing applied evaluation or applied research as part of a course requirement or their studies, to implement a specific project, which is usually free or very low cost for the nonprofit.
[00:03:56] And I think in some specific settings, that can be a really, really fantastic jumpstart for small nonprofits as well.
[00:04:03] Alexandra: I really liked all three of those; they’re such great, different areas where we can get involved to advance this capacity. That pause-and-reflect habit is one that I think is really important, and one that we miss so quickly. I was just listening to an accountant talking about how to do small business taxes better, and she mentioned that being good at doing your accounting isn’t about being good at numbers.
[00:04:31] It’s about being good at understanding what the numbers are telling you.
[00:04:36] And that just stuck with me, because we are so used to saying, oh, accounting is a numbers thing, but really the numbers aren’t that complicated, and there’s software that’ll do it for you. Where the value comes in is sitting down and looking at your balance sheet, or looking at the money coming in and the money going out, and understanding what it’s telling you and how that should impact the financial decisions that you make over the next month, over the next six months, et cetera.
[00:05:00] And I think, to your point, that’s where we can get stuck with data as nonprofits, whether we’re looking at program data or donation data or whatever it might be. The data might be there, or we might spend a lot of time trying to figure out how to collect it or what to do with it in our CRM, but we miss that point where we stop and say, what is this actually telling us?
[00:05:20] Corey: Yeah. And I think that some of that has to do with the nonprofit starvation cycle, that so many organizations are just really underfunded, really under capacity. And I think it also has a lot to do with missteps that evaluation professionals like myself have made over the years, in which
[00:05:39] program evaluation for way too many organizations is an extractive process that is meant to satisfy funders and nothing more. And what’s more demotivating than that? It turns evaluation into paperwork, where once a quarter we’ve got to run around with our hair on fire to fill out the report that nobody knew was coming, or somebody forgot was coming.
[00:06:05] Of course that’s not interesting. Of course that’s not helpful. And similarly, I think evaluators too often have been really focused on either their own intellectual curiosity or meeting those funders’ needs, rather than really thinking about how data can inform and enhance practice and the things that people who are doing the work really care about.
[00:06:28] And some of that is a change in the type of data that’s collected, but I think a lot of it is just really a change in mindset: for whom are we collecting this information? Who is doing the learning from it? And certainly there are plenty of funders that have a long way to go when it comes to appreciating that they should be second or third in line when it comes to who’s getting the data and why.
[00:06:51] But I do think that there are, again, some really exciting opportunities with that mindset shift, both for nonprofit professionals and for the evaluation consultants or professors or graduate students, the other folks that they’re collaborating with.
[00:07:06] Alexandra: And when you talk about moving beyond just what we can do internally, there’s that consultant space, right? You talked about how another way to expand your capacity is to bring in capacity that doesn’t even exist within your organization. And I think that is a really important thing for us to be aware of as nonprofits: sometimes it doesn’t even make sense for us to have that capacity in house, and it can be more effective, more efficient, to have it reside in a consultant, a third party that we can collaborate with.
[00:07:34] So can you talk a little bit more about how we can use evaluation consultants more effectively?
[00:07:41] Corey: Yeah, absolutely. I think that there are a couple different dimensions that come to mind. One is, if you do want to work with an evaluation consultant, to have them be around as early in the process as possible, whether that’s the start of a new school year and you have a school-year program, or you’re launching a new initiative, or whatever the start might be.
[00:08:08] Even if they’re just around to get that context, listen in, and understand the genesis of the thing that you’re doing, and ideally to be able to help you set up some of those systems, like attendance, like surveying, like case notes, that are really preparing your organization for success in the future.
[00:08:24] So you’re going to have that usable information. The flip side, in terms of ways that you can work with them: when an evaluation consultant is brought in at the end, or sometimes months or even years after the thing is over, in my opinion that really just pushes us into compliance mode. Yes, there are always things to learn,
[00:08:43] yes, there are always things to improve, but usually if something’s over, it’s over. And so we’re missing out on that opportunity for continuous learning, continuous improvement, and co-creation. So I think that, as much as possible, having your evaluation consultant around at the beginning, whatever that is, helps a lot.
[00:09:03] And if you’re on a limited budget, totally understood, you can definitely talk with your consultant about ways to ramp up their time. So maybe they’re more in observer mode as things are starting up, offering some specific advice on what your intake survey should look like, or something like that,
[00:09:21] and then really dialing in later. But again, just having them there to understand the context is going to be really helpful. And then the second big thing, and this again gets back to some of the more extractive history of the evaluation profession, which many, many, many of us are working hard to shift, is to really
[00:09:40] understand the ways in which your evaluation consultant is going to help your organization build its capacity. And that can take a lot of different shapes. At the very least, any data collection tool that they’re developing belongs to you as an organization.
[00:09:58] You should be able to keep it. Ideally, they’re doing reports or other deliverables in software that your organization can use if you need to take it on, and they’re doing good documentation of the things that they’re doing, so you can replicate it if you need to.
[00:10:13] And I think most importantly, they’re really working with you as an organization to co-create the evaluation questions, what are those questions that we want to be able to ask and answer, as well as to interpret the data collectively. And that doesn’t have to be a huge lift; it doesn’t mean that all of a sudden everyone on your team is a professional evaluator.
[00:10:37] But it shouldn’t be this thing that’s just happening off to the side, because that again is a huge missed opportunity, both for your organization to benefit from this collaboration and build your capacity, and, equally important, for the findings of the evaluation to be useful for you.
[00:10:56] The easiest way to blow a whole bunch of money is to hire an evaluation consultant and not interact with them until the end of the project. I guarantee it: their work is not going to help you to continue to improve, or to demonstrate to your funders and community what you’re doing.
[00:11:14] It does need to be a collaboration.
[00:11:16] Alexandra: Sort of going back to what we started with, that pause and reflect: you want to make sure that you have a plan and an intention, and then an execution, of using the evaluation that your consultant, your partner, is doing.
[00:11:31] Corey: Yes, yes, yes. And that’s not to say that every recommendation your evaluator makes is going to be, oh my gosh, I never would have thought of that, or, oh my gosh, we’re going to do that tomorrow. But generally speaking, you’re working with this group or this person to help yield new insights, to help verify things that you have a good gut sense about, to open up new opportunities.
[00:11:54] And again, that can’t happen if they’re just off on their own, doing their own thing.
[00:12:01] Alexandra: It’s come up several times already that one of the places where we’re most likely to do evaluation wrong is to leave it siloed, to leave it as this external activity that isn’t integrated into the activity of the program, or into how we actually run our organization, the operations of our nonprofit.
[00:12:23] So can you talk a little bit more about how, beyond just capacity building, we can make sure that we’re incorporating the evaluation that we are doing more consistently and sustainably into our ongoing operations?
[00:12:37] Corey: Yeah. And there are a couple of different tensions to navigate in that, and it’s not for no reason that it is unusual for organizations to be able to make lots of data-informed decisions and incorporate them into their daily lives. So for any organization that is really struggling with that, or sees a better way of being:
[00:12:56] it’s not because it’s easy and somebody just hasn’t told you how to do it yet. It’s hard, and it takes some commitment. That said, there are a few things that I’ve seen with organizations we work with that really help them turn that corner. One is what I think of as the equivalent of growth mindset,
[00:13:17] but for organizations. Dr. Carol Dweck has this wide body of research around what’s called a growth mindset versus a fixed mindset for individuals. The very basic idea, and if y’all don’t know Dr. Dweck, go look her up, she’s amazing, is that people with a growth mindset see
[00:13:40] their limitations as an opportunity to improve. They feel that if you put in more effort, you are going to get better at it. Of course, it has limits: I’m 5 foot 2; I can put a lot of effort into basketball, and I’m not going to be in the NBA. But fundamentally, I believe that if I put in the work on something that’s important to me, I’m going to get better.
[00:13:58] A fixed mindset, on the other hand, is what it sounds like: the idea that we have what we have, we are who we are, and additional work or effort does not really influence the things that we’re able to do or our accomplishments. And I think that there’s a parallel there in organizational culture: do we just do what we’re doing and turn the wheel, because that’s what our plan is, or are we a learning organization?
[00:14:23] Are we learning people who want to keep getting better, who want to keep getting those signals? Now, to do that requires a lot of psychological safety for the people who are in the organization. It requires really good management, so that the consequences of a mistake are not fatal professionally, but are seen as what they are: a mistake.
[00:14:44] That connects to perfectionism and white supremacy characteristics in organizations; there are many, many more podcasts’ worth of content there. But I think that it is possible, and I think that organizational growth mindset really cues people and teams up for success, to be using data on a more regular basis, and to feel comfortable and excited about, what are the signals that we’re getting?
[00:15:07] What’s the information coming back to us, to help us keep getting better at what we’re doing and keep refining? The timeliness of information is another big challenge for nonprofit organizations. It takes
[00:15:18] time to do observations, it takes time to do interviews, it takes time to do surveys. Often, in our experience, one of the challenges nonprofits face is that by the time they get that information back, all the people who filled out that survey are out of the program, or there’s a whole new staff, or enough has changed that it’s nice to know, but isn’t as actionable. And some of that is just
[00:15:45] going to happen until we build some time machines that let us jump back and have the report before we do the thing. But I do think that technology enables some really great opportunities for those quicker bits of feedback, whether that’s through polling.
[00:16:03] I think a lot of educational and youth development organizations have great ways of getting input from the community quickly, whether that’s a feedback wall or quick conversations in the hallway or other ways to get those signals that may not be as heavy-duty or comprehensive as a survey of everyone in the program, but can still get you some good signals. And then the third thing, another big challenge and tension that nonprofits face, I think as a result of the first two, is that the data itself is incomplete and not very good. And it makes sense: if you are in your job and you are told to do X, Y, and Z, and you don’t really know why, and you never get it back, and it doesn’t help you do your job,
[00:16:48] why do it? I wouldn’t do it. And so one of the big things that we really encourage nonprofits to do is to start using the data they have, even if it’s out of date, even if it’s a little weird, and get in that habit of looking at the data, so that your team starts to build that tradition of looking at the information.
[00:17:09] The first time, you’re mostly going to talk about what’s missing and why it’s wrong. And, oh my gosh, what a nice opportunity to say, how might we get fuller information? How might we get more up-to-date information? It’s about really leaning into that deep passion and commitment that mission-driven professionals have for their work and for their mission, and having the data
[00:17:30] be a complement to that, rather than evidence or proof that they’re doing what they’re supposed to be doing.
[00:17:37] Alexandra: One thing that I think surprises a lot of people when we have these conversations around how to use data better is how little of it is about technical intervention, right? We’re not talking about using this statistical test or writing this Excel formula. Everything you listed there has to do with how we as humans are interacting with the data.
[00:17:56] And there’s a small piece, like you said, of making sure that the data are timely, which could involve some technology, and making sure the data are more complete, which can have to do with some of the actual nitty-gritty of data. But for the most part, this is about making sure that we’re engaging with our data in a way that gets it to where we as humans need it,
[00:18:15] when we need it, in ways that we can engage with it, and that we connect to it with the right mindset, as we mentioned earlier. And so I just loved so much of that, because it is again driving home the point that success with data, beyond the actual technical part of collecting it and processing it, is really about who we are as humans and what we bring to the table.
[00:18:39] Corey: Yes, 1000%. There are these ideas of adaptive issues and technical issues, and evaluation is 90% an adaptive issue and 10% technical. And again, I think that’s where a good evaluation consultant can really be helpful, because we have the technical knowledge, right?
[00:19:00] We studied up on it; we know about the statistical tests, we know about the technology platforms, et cetera, et cetera. We’re really blessed that in this day and time it’s pretty dang easy to generate data. That used to be part of the hard part. It is no longer. So yes, absolutely,
[00:19:17] there are technical aspects to it. And I think that sometimes people get scared or a little hung up on some of the technical elements, the elements connected to traditional ideas of rigor, because they took that one statistics class in high school or college, and they have these vague memories of, oh my God, it has to be this way, or somebody is going to throw a hood over my head and drag me out of the room.
[00:19:43] And yes, absolutely, good evaluation, good data, does have to meet some technical specifications, so that we’re drawing conclusions based on representative data, on the ideal or as close as we can get to the ideal. And again, as you say, so much of it is about doing our best to get to that technical place, that place of methodological rigor. But it’s far better to have a little bit of data that is really actionable and that you’re using regularly than to have tons and tons of data that you don’t need or don’t use at all.
[00:20:19] Alexandra: I think another big part of this as well is that these human aspects, like you said, the adaptive aspects, need to be thought about at the beginning, not once you’ve done all of it. Because while they are the human parts, and not necessarily an integrated part of the technical data collection, they may influence some of the choices that you make about the data to collect or not collect,
[00:20:41] and impact how you even approach the construction of an evaluation project. And so, you know, you mentioned very early on that one of the big mistakes people make is not bringing in more advanced capacity, or expanding that capacity, early on, thinking they can just fix it at the end or make things work at the end.
[00:21:01] And it just struck me that, for example, with timeliness, you may actually choose to collect different data, because you realize that the data you originally had in mind is going to show up by the time it’s no longer helpful, so you may as well not even have it. So collect something else that’s actually going to get you a faster turnaround. Or that growth mindset,
[00:21:20] right? You may not think that has anything to do with the data you collect, but in fact, you may want to make sure you’re getting data that will inform learning from mistakes, or inform improvement efforts and the measurement of that connection between effort and benefit. So that struck me as something interesting too.
[00:21:36] Corey: And to build on that, I think two things. One, a lot of our mission-driven clients are taking a really explicit focus on diversity, equity, inclusion, and belonging in their organizations, and there’s actually a really fascinating connection between evaluation and DEIB.
[00:21:53] Often in conversations about what it is like to be a more inclusive, belonging-centered organization, we talk about enabling mistakes, letting go of the idea of perfection, widening our understanding of whose opinion should be centered when we’re making decisions. That all interconnects with evaluation, or it can.
[00:22:13] And then the other thing I’ll say, and I think this is again one of the long-standing issues in the evaluation field, related to its extractive roots and, I think, to the times in which data were harder to come by: a lot of times the evaluation skips right past the thing that the organization’s actually doing.
[00:22:35] And so, I’m a huge proponent, and as I was listening to you it made me think of this: if we’re collecting information about the quality and the consistency of our services, that’s far more useful to us. That’s stuff that’s more in our control, and it is more directly connected to
[00:22:55] that final assessment, or a person getting a job, or whatever your outcome is. And it’s, again, that more timely information; it’s going to get back to you in the timeframe that you need. I literally saw, a few years ago, a presentation of an evaluation design that had an arrow going around the program itself in terms of the data they were collecting.
[00:23:17] I will not name names, but I was gobsmacked. There was all of this stuff about who the participants are, and all this information about their grades and their test scores, and it literally had an arrow going around the thing itself. And I thought to myself, oh my gosh, that is going to be so useless for the organization. That’s not going to tell them a thing. And anything that comes out on the end, you’re not going to be able to attribute to people’s participation in the thing, because you don’t know a darn thing about the thing itself.
[00:23:46] Corey: And again, I think there are lots of reasons why those habits are so ingrained in us. And I think that’s to your point about being involved a little bit earlier, thinking about whose questions are getting asked and answered. It’s not that folks in mission-driven organizations don’t care deeply about things like young people succeeding in school, or people getting jobs, or whatever their outcome is.
[00:24:10] But they also need those good signals, good information about whether what they’re doing is working or not, who’s showing up and who’s not, so they can be making those changes too.
[00:24:20] Alexandra: A lot of what you just talked about gets back to looking at two versions of values that the organization has, right? The values in terms of: what is it that they’re investing in? What is at the center of what they do and why they exist? And then the other half, or the other version of values, goes back to your inclusiveness values and those sorts of conversations.
[00:24:45] And so I’m wondering if you could talk a bit more about that actual process of bringing values into the data collection process, into the evaluation definition process. How do we actually marry those and make sure that they stay together, rather than, as you said, a more extractive data process that has nothing to do with the values of the organization?
[00:25:05] Corey: So at the conceptual level, there is really astounding, just groundbreaking work happening, both from a group called the Equitable Evaluation Initiative, as well as a broader area of practice called CREA, culturally responsive evaluation and assessment. And I want to talk about the principles from the Equitable Evaluation Initiative, which I think do such an amazing job of reframing a lot of what evaluation is.
[00:25:41] And for those who can’t see me, which is all of you, I’m reading off of their website, so don’t get too impressed that I know these by heart. Principle one is that evaluation and evaluative work should be in service of equity. Two, evaluative work should be designed and implemented commensurate with the values underlying equity work, meaning multiculturally valid and oriented toward participant ownership.
[00:26:05] Three, evaluative work can and should answer critical questions about the ways in which historical and structural decisions have contributed to the condition, the effect of a strategy on different populations and on underlying systemic drivers, and the ways in which cultural context is tangled up in both those structural conditions and the change initiative itself. That’s heavy stuff, and really reorienting. It is really different than: did it work or not? Was the work plan implemented as such? And I think, again, this principle one, that evaluation and evaluative work should be in service of equity, is really powerful.
[00:26:43] If we accept some of these big principles, that then, at the next level, is going to change our thinking about whose perspective is centered in deciding what the evaluation questions are. I heard from a colleague recently at the Grantmakers for Effective Organizations conference.
[00:26:59] She said, I’ve learned to just stop and look at my evaluation questions and think: who is going to use the answers to these questions? And it stopped me in my tracks. I was like, oh dang, who will? Even a really simple gut check like that, I think, can be so powerful. I think it also really affects how you think about what evidence is and how you collect data.
[00:27:21] So what does it look like to get most proximate to the people or communities that are being affected by the thing? What does it look like to not collect everything in writing, asynchronously? What does it look like to embrace story and narrative and visuals, a much wider variety of information, as evidence?
[00:27:45] Those are not easy things to do, and I think for most organizations, ours included, our clients included, we’re making those incremental steps. It’s not an all-surveys-are-bad, all-narrative-is-good kind of dichotomy. And I think it’s really important for nonprofits and their collaborators to think about that, and to look for more opportunities to incorporate their values, their organizational values, into the ways that they’re designing and setting up evaluations, and the ways that they’re conducting them.
[00:28:21] Alexandra: I agree. That is some pretty heavy stuff, right?
[00:28:23] That’s very, very intense. And I’m curious, you mentioned a couple of good places where we can start to make that translation from a big, heavy idea to actually how you do that. Because as I listened to that, my first reaction was: how on earth do you even make that into a reality?
[00:28:42] How do you translate this idea into actual actions that you’re taking that would result in living those principles? And you mentioned considering more closely what the actual evaluation questions are, and I think that’s a really good practical place to start: explicitly identifying the questions you’re asking in your evaluation.
[00:28:59] Because sometimes we take those for granted, and we didn’t actually stop and think: are those the questions we want? Why did we pick those? Who picked those? Who defined them? And are they the ones we want? And I liked that addition of, who’s using the answers? Because that’s a separate question from what the questions are.
[00:29:15] Like, who will actually use the answers? Who will get the answers? Who gets to have access to those? I was wondering if you had a few other suggestions of practical applications of those lived values in evaluation.
[00:29:31] Corey: Yeah. There’s, I think, a really wide variety of different tactics and strategies that can be used, and some, I think, are really straightforward. Things like language access: if you’re doing surveys, and almost everybody does, are they in the languages that your community speaks?
[00:29:49] Again, I won’t name names, but I’m shocked at the number of times organizations miss this, and it’s a resource question, I 1000% understand that. Are your surveys in the languages and at the reading levels that make sense to people? For communities that have more oral cultures,
[00:30:07] are you making these kinds of input opportunities accessible that way? Those, I think, are some table-stakes things, if you will, that organizations can fold in. I think there’s also the question of, and this again is related to some of these equitable evaluation principles, where does the accountability lie? Who is being evaluated? We are very habituated to evaluating the nonprofit, and we’re very habituated to evaluating the individual, usually the client, even if it is implicit.
[00:30:38] What would it look like to evaluate the funder? What would it look like to evaluate the adults providing this service? Where are we locating that responsibility, implicitly or explicitly? And again, I think it’s just the waters we’re swimming in, as people in a highly individualized capitalist country with deep roots in racism:
[00:30:59] we are habituated to make some pretty snap decisions there. So again, if we back up and just think differently along these same lines, who’s getting the answers and why, who’s deciding the questions, who’s deciding what the answer is, it can really improve things.
[00:31:15] Another thing I want to mention is the We All Count project, from a woman named Heather Krause, who is astounding and amazing. If you all don’t know her, go check her out. She does a lot of really practical work around data equity, things like how to collect information about people’s racial, ethnic, and gender identities in more inclusive ways. One of the things we learned from her is something called an equity gap score, and it is dead simple.
[00:31:44] It’s so brilliant. The score is simply the ratio between the group with the highest value of something and the group with the lowest value of something. So for example, if people on the west side of town weigh 200 pounds on average, and, making this up, people on the east side of town weigh 100 pounds on average, the equity gap score is two.
[00:32:05] What the equity gap score does is act as a nice bias interrupter, in that it can show you things at scale, because you can calculate it really easily on survey data, participation data, anything you want. Anything with an equity gap score much more than one deserves a closer look.
[00:32:21] Sometimes the implications are pretty minimal: you have a little bitty sample size, or it’s a measure that isn’t that salient. But it’s a really great way to habituate that bias interrupter in your analysis, so that you are being more consistent in the ways that you’re directing your attention.
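Editor’s note: to make the arithmetic concrete, here is a minimal sketch of how an equity gap score might be computed on grouped data, following the ratio Corey describes above. The function name, the sample data, and the review threshold are illustrative assumptions, not taken from We All Count’s materials.

```python
# Minimal sketch of an equity gap score: the ratio of the highest
# group average to the lowest group average for some measure.
# Names, data, and the review threshold are illustrative assumptions.

def equity_gap_score(values_by_group: dict[str, list[float]]) -> float:
    """Ratio of the highest group mean to the lowest group mean."""
    means = [sum(vals) / len(vals) for vals in values_by_group.values() if vals]
    return max(means) / min(means)

# Made-up example mirroring the one above: average weight by side of town.
weights = {
    "west side": [195.0, 205.0, 200.0],  # mean 200
    "east side": [98.0, 102.0, 100.0],   # mean 100
}

score = equity_gap_score(weights)
print(f"Equity gap score: {score:.2f}")  # prints 2.00

# Anything much more than 1 deserves a closer look before drawing conclusions.
if score > 1.25:  # threshold is an assumption; tune it to your context
    print("Flag this measure for review.")
```

The same one-line ratio can be applied in a spreadsheet or run across every survey question or participation measure to flag where attention is needed.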
[00:32:38] So again, I think there are a lot of different ways. And then, you know, folks like our colleagues at the Center for Culturally Responsive Evaluation and Assessment are doing some really amazing work around incorporating indigenous forms of knowledge into evaluation.
[00:32:56] How do we really rethink, at a deep level, what constitutes evidence and how we collect it? So again, I think there’s a really wide range of things, some of which, wonderfully, are fairly straightforward to implement: you can make your surveys and your intake forms less terrible pretty easily.
[00:33:16] And then some others do require more work, more innovation, and deeper thinking.
[00:33:21] Alexandra: I love Heather Krause, and one of the things that she also talks about is that looking at how we collect data isn’t just about being fair. It’s also a part of rigor, right? If you’re asking questions that routinely exclude a particular group, or you’re asking questions in a way that’s going to make it harder for part of your population to answer than another, you’re building inaccuracies into your data.
[00:33:46] Corey: Yes. And I learned something really similar from Jara Dean-Coffey, who’s the founder of the Equitable Evaluation Initiative: that culturally appropriate methods are rigor. It’s not a nice-to-have. It’s not a, you live in the blue bubble, that’s cute. It is rigor. And again, there’s a whole lot of work that we can all be doing when it comes to actually living into that. To see that as anything other than rigor, in this very diverse, multicultural society that we live in, is foolish.
[00:34:20] Alexandra: I remember listening to an evaluator who was doing a sort of broader evaluation of programs working with people who are experiencing homelessness. They had a screening about sleep deprivation, and it was like 16 questions or whatever. And as she started looking at this, she realized how many of the questions didn’t apply to somebody who didn’t own a car, right?
[00:34:43] One of the questions was, do you fall asleep driving a car? I’m like, how is someone who doesn’t have a car going to answer that question? They’ll probably say no, because they don’t have a car, but that’s the wrong answer, right? Maybe they’re so exhausted they would fall asleep if they were driving a car,
[00:34:58] but you’re not going to pick that up. So again, just that idea that this has to be part of rigor, not a nice-to-have. So if we back up: I think these are so many very practical ways that we can bring values into the way that we do evaluation. Are there some other last recommendations you might have, just generally, about how we can do evaluation better? Because I think, as you say, we do leave a lot to be desired sometimes in how we conduct and engage with our evaluation.
[00:35:31] Corey: I think at the heart of it is just really pushing ourselves to be thoughtful about: what will we do with the answer? Whose question of practice is getting answered? And that’s not to say that there’s no reason to answer a funder’s question, or question of practice, at all.
[00:35:49] But I think we should really push ourselves there, because it does take time and effort on many people’s behalf. Depending on the context, it can be traumatizing or re-traumatizing for people to give you the information that you’re asking for. So let’s be real hard on ourselves about what we’re going to do with the answer, so that we’re really making the most of that experience.
[00:36:12] And then, as part of that, let’s really push ourselves to humanize the process as much as we can. Evaluation data collection is a process of abstraction; that is what it is. But there are ways to humanize it, ways to make it a less extractive process, a more collaborative process, that aligns so much with the stated values of so many mission-driven organizations,
[00:36:37] and with so much of what they’re already doing when it comes to their services and the ways they show up for people. Evaluation is oftentimes just the trailing part of that transformation in an organization.
[00:36:50] Alexandra: I was reminded of a story my mom told me after trying to go donate blood. She got so fed up trying to fill in the intake form that she finally just walked out again,
[00:37:01] because there was one issue where the intake nurse had put her birthday in wrong somewhere early on, and in order for her to fix it, she had to start the whole thing all over again; it couldn’t be edited. And there were several other things where she was like, I’m going to spend 25 minutes filling in your forms?
[00:37:20] And so when I think about humanizing how we collect data, part of it is just: don’t make it so onerous that you piss off the people who are trying to be engaged with it. Which is a small part of humanizing, but it just made me think about that.
[00:37:32] Corey: And it raises those questions about whose time is more valuable, and who is it on to fix it? Is it the beleaguered nurse at the blood donation station, or your mom? Those can be tough calls. But again, I think we’re so habituated to just associating
[00:37:48] money or wealth with value, and whiteness with wealth and therefore with value, that we don’t even question those assumptions. And as you say, it’s probably going to be a perpetual state of improvement and short-term balance, rather than everything being fixed.
[00:38:09] But there’s a lot of opportunity for improvement.
[00:38:13] Alexandra: Well, and what I thought about, too, wasn’t just, should it be my mom who fixes it or the nurse, but: why doesn’t their system let them edit one question, right? Why should anybody have to go in and redo the entire thing, rather than just being able to go in and say, I need to change this one field?
[00:38:28] Let me type in a quick edit, or make a comment that will update that one field, right? So it’s an example of the system not taking care of the humans, and forcing the humans to adapt to that system.
[00:38:38] Corey: Yeah, and it makes me think again about the nonprofit starvation cycle, because, dollars to donuts, they’re running a database program that was cutting edge in 1996, and there’s one person in the state who knows how to keep it up, and heaven help us when she retires.
[00:38:57] Alexandra: Exactly. Exactly. Well, thank you so much for your time today. I deeply appreciate all of your insights and the wonderful things that you’ve shared. For people who want to connect with you, follow you, learn from you, maybe work with you, how can they find you?
[00:39:11] Corey: Yeah, thank you so much. So again, my name is Corey Newhouse, and my organization is called Public Profit. You can visit us at www.publicprofit.net. Also, I am one of two Corey Newhouses in North America, according to Google, and I’m the one with curly reddish-brown hair, so I’m pretty easy to find online.
[00:39:33] If you type in my name, it’ll take you very close to, if not directly to, the Public Profit website. Also, if you’d like to connect with me on LinkedIn, I’m very active there and would love to be in touch.
[00:39:43] Alexandra: Awesome. Thank you so much.
[00:39:46] Corey: Alright, thank you.
[00:39:46] You have been listening to Heart, Soul, and Data. This podcast is brought to you by Merakinos, an analytics education, consulting, and data services company devoted to helping nonprofits and social enterprises amplify their impacts and thrive through data. You can learn more at merakinos.com.
Corey Newhouse
Corey Newhouse has been a dedicated advocate for educational equity and social justice since the 1990s. As the visionary founder of Public Profit, she spearheads a mission to empower purpose-driven organizations. Her dynamic leadership shapes the team’s strategic direction, external relationships, and business development. Prior to that, Corey was a Senior Policy Associate with Children Now, supporting the policy team with data and evaluation, and an Associate with HTA, a strategy and fundraising consulting firm.