
Member Spotlight: Roger J. Chin


Using Implementation Science to Advance the Adoption of Evidence-Based Programs

October 10, 2014 12:00 PM

Recently, Mathematica Policy Research hosted a forum that examined the emerging field of implementation science. Using Implementation Science to Advance the Adoption of Evidence-Based Programs was moderated by Diane Paulsell of Mathematica Policy Research and featured Debra Joy Pérez, The Annie E. Casey Foundation; Lauren Supplee, Office of Planning, Research and Evaluation (OPRE), Administration for Children and Families, U.S. Department of Health and Human Services; and Allison Metz, The National Implementation Research Network. Ann Peterson, Director of Mathematica’s Center for Improving Research Evidence (CIRE), gave the opening remarks.

Implementation science creates generalizable knowledge that can be applied across contexts and settings to answer common research questions, such as:

  • When is an organization ready to adopt a new innovation, and what are the best practices for doing so?
  • What supports and systems are needed for effective implementation?
  • What organizational and contextual factors support or impede effective implementation?
  • What dosage and quality of services must be provided to produce meaningful impacts?
  • What strategies are effective for engaging participants in services?
  • How can innovations be adapted for replication in diverse practice settings and with different populations?

Because implementation science is a newly emerging field, its definition and the types of research it can encompass vary according to setting and sponsor. “Some practice guidance is available,” said Paulsell, “but there are still many gaps in our knowledge. Guidance may not take organizational constraints into account.” She noted that more systematic implementation research is needed in applied practice settings to better inform practice and technical assistance. “There is also a need for common operational definitions of constructs, to allow for cross-model and cross-discipline comparisons.”

Supplee described the common evidence guidelines from the National Science Foundation that OPRE uses for implementation science. Foundational and early-stage/exploratory research helps develop, test, or refine theories and establish logical connections. Design and development research leads to additional work to understand theory or to rigorously test a strategy. “This could be qualitative, descriptive statistics, or impact results,” said Supplee. Efficacy, effectiveness, and scale-up research, when implemented in trials, helps researchers better understand the conditions under which a trial was executed, including the counterfactual condition.

Metz discussed active implementation and co-creating capacity. “Effective interventions, implementation methods, and enabling contexts result in socially significant outcomes,” she said. “Active implementation frameworks provide the foundation for co-creation of roles and responsibilities.” Implementation teams can provide an accountable structure for moving any new innovation through the stages of implementation, with a focus on data-based decision making, alignment of funding and policy, and problem solving for sustainability.

"In this conversation, one of the things we want to be mindful of is that when we think about implementation science and adaptation, a really important issue is that any intervention really should be assessed through the lens of adaptation," said Pérez. "You can't expect something that was done in one context to actually have applicability in a multitude of other contexts without taking into some level of adaptation." She elaborated, explaining that there exists a temptation to see adaptation as simply the issues of variability beyond race, ethnicity, and population--but it's much more than that. It has to do with the beliefs, the values, and the experiences of the community that intercede in that intervention.

“What can be said about a different kind of knowing that doesn't require a randomized control group?” said Pérez. “I want us to think about implementation science as one mechanism to make a contribution without being tied to a false standard.” She indicated that as implementation science moves forward, a key role of leaders in the evaluation sector is to train and mentor a new generation of researchers to really understand community. “We need to embrace these different modes of evaluation as part and parcel of this current movement in implementation science.”

 
