University of Wisconsin, Madison
Both researchers and practitioners shared their experience implementing randomized controlled trials (RCTs) during the Saturday morning super-session. The session began with an introduction by Louise Geraghty from the Abdul Latif Jameel Poverty Action Lab. Louise posed several questions to the panelists about why they chose to implement RCTs, how their organizations and institutions approached collaborative evaluation, and what lessons we can take from their successes and challenges.
Two different research-evaluation teams shared their perspectives. First, Weston Merrick from the Minnesota Office of Management and Budget and Adam Sacarny from Columbia University shared their experience collaborating on an RCT to evaluate an intervention aimed at health care providers in Minnesota. The intervention sought to encourage providers to use a prescription monitoring program intended to reduce over-prescription of opioids. The RCT model was well suited to this case because it allowed the team to isolate the effect of a specific intervention from other existing programs and to directly address the policy goal of encouraging providers to use the monitoring program.
The second team of evaluators discussed their work with educators in Puerto Rico. Emily Goldman from La Universidad del Sagrado Corazón discussed her collaboration with Damarys Varela Veléz and Janine Ortiz from the Puerto Rico Department of Education. This team set out to evaluate EDUGESPRO, a principal training program. After several unprecedented events struck Puerto Rico, including multiple hurricanes and the COVID-19 pandemic, the research team had to quickly adapt their strategies, but they also discovered strong political will for robust evidence to guide Puerto Rico's recovery.
Both teams highlighted the importance of long-term relationship building to successful evaluations. Emily noted the importance of embedding herself in the Puerto Rico Department of Education and building trust between her academic team and the implementers and administrators actually carrying out much of the work. The Minnesota team likewise emphasized the importance of building robust data and people infrastructure as early in a project as possible and of rehearsing any tricky data or analytic processes before they are implemented.
Finally, the panelists closed with some important advice for those designing research projects or evaluations: ensure that your research is tailored to the specific policy goals of the practitioners, and that any outcome you produce will mutually benefit all stakeholders.