#APPAMWestat Speaker Interview: Virginia Knox

May 30, 2019 08:14 AM
We interviewed Virginia Knox, Vice President at MDRC, about the role of mixed methods research in evaluating social programs and her vision for the methodology. Knox, like her colleague Stephen Bell before her, previews her talk in her answers below. Join us at the June 5th #APPAMWestat forum for her full presentation.

Why do you think quantitative analysis is often prized over qualitative? Why is a mix of the two better?

VK: This is an intriguing question since, of course, we all combine quantitative and qualitative analysis every day to make decisions. As an individual, when deciding between walking to work and taking the subway, I weigh some factors that are easy to quantify (cost, time, calories burned, carbon emissions) and some that are qualitative or harder to measure (my enjoyment of nice weather). Policy analysis and program evaluation, though, help us make collective decisions by systematically weighing the benefits and costs of alternative ways to use public resources. I think the evaluation field relies considerably on quantitative analyses because ranking what one option accomplishes relative to another requires quantitative measures of performance. The limitations of quantitative analyses depend on the goals of a particular evaluation, but we often need qualitative perspectives to fully understand the problem that an intervention or policy is trying to solve, to meaningfully understand the nature of the intervention as it operates, and later to accurately interpret the “how and why” that underlie any quantitative assessments of program performance.

You’ve spent a big part of your career evaluating social programs. In your experience, what benefit does evaluation of social programs gain from combining quantitative and qualitative analysis?

VK: There are too many benefits to list in this short space, but perhaps the most important is that open-ended qualitative inquiry adds a dose of humility to an evaluation. Rather than assume that evaluators can know from the outset all of the ways in which a new program or policy might affect people’s outcomes, a more inductive approach — such as conducting qualitative interviews or observations of the setting — explicitly acknowledges that we have a lot to learn from managers, program staff, or participants who are directly engaged in the activity we’re evaluating. They can often provide insights about why a new approach is succeeding or falling short, if we design our evaluations to include their perspectives. (For more on the benefits of combining inductive and deductive reasoning, see a recent post by Branda Nowell and Kate Albrecht in our Implementation Research Incubator.)

What are some ways that you’d like to see the envelope pushed more in mixed-methods evaluation?

VK: At MDRC, we are trying to move beyond multi-method evaluations, which simply include both quantitative and qualitative components in a study. When it is feasible and appropriate, we want to develop mixed-methods evaluations that intentionally integrate findings from quantitative and qualitative methods, either to improve the study as it unfolds sequentially or to combine deep, nuanced inquiry with broad, generalizable conclusions. For example, a set of interviews with program staff or observations conducted early in a study can inform the final version of a quantitative survey instrument. This is just one example of the mixed-methods study designs we will discuss on June 5th.

Virginia Knox is scheduled to present Pushing the Envelope of Mixed Methods Evaluation to Learn from Prospective Impact Studies at 9:50 AM at the #APPAMWestat forum.

