Friday, October 19, 2018

BPC Obtains Documents Revealing Labor Dept Cut Funds for Program Evaluation

Through a Freedom of Information Act request, the Bipartisan Policy Center obtained documents revealing that the Department of Labor sharply cut funding for evaluating whether its programs are operating effectively. The documents show the Labor Department quietly slashed the funding it transfers to its chief evaluation officer for program evaluation to $2 million this fiscal year, down from $13 million in fiscal year 2018 and $27 million in fiscal year 2017.


Systematic Reviews as a Tool in Evidence-Based Decision Making

February 7, 2013 12:00 PM

On January 30, 2013, the Mathematica Policy Research Center for Improving Research Evidence (CIRE) presented the public forum Systematic Reviews as a Tool in Evidence-Based Decision Making: Improving Research and Informing Practice. With constant pressure to “do more with less,” policymakers and program administrators are turning to the existing research base for guidance on funding decisions, assistance with program development, and evidence of program effectiveness. For the research to be useful, however, decision makers must be able to draw accurate lessons from what can often be a large and bewildering assortment of relevant studies. Systematic reviews can be particularly useful in this regard because they identify, assess, and synthesize key pieces of evidence on policy or program effectiveness.

The panel, introduced by Roberto Agodini, Director of CIRE, included presenters Kathryn Stack, Office of Management and Budget (OMB) and Jill Constantine, Mathematica, and panelists Joy Lesnick, Institute of Education Sciences, U.S. Department of Education; Naomi Goldstein, U.S. Department of Health and Human Services (HHS); and Antionette Rath, Mount Laurel Township (NJ) Schools. The forum was recorded and is available on Mathematica’s website.

Policy and program decisions typically involve selecting one choice from among a set of options, and research about the effects of those options can help inform that decision. As noted above, systematic reviews serve this process by identifying, assessing, and synthesizing the key evidence on policy or program effectiveness.

Stack offered several considerations on the use of systematic reviews. They provide a way to gauge how much researchers collectively know about a particular topic, they suggest a different way of thinking about the design of federal grant programs, and they force researchers to confront the question of what constitutes quality research.

Constantine discussed what a systematic review is and how it differs from a research summary. The key difference lies in what question the researcher wants answered. When designing a review, there are five key considerations to weigh within the constraint of available resources: defining the scope of the review, identifying the research, setting standards, training reliable reviewers, and tailoring the summary of findings to the audience. Constantine also shared three distinct tools needed for success: a protocol, documentation, and a reporting template.

The panel then discussed how systematic reviews worked within the context of their programs. Lesnick shared about the What Works Clearinghouse (WWC), a central and trusted source of scientific evidence for what works in education. The clearinghouse identifies and extracts relevant evidence and disseminates it through ready-to-use and custom reports while supplying tools and resources to support new analyses. The WWC engages stakeholders through tiered evidence grant-making, evidence-linked policy waivers, and facilitating access to technical assistance and federally managed evaluations.

Goldstein shared how the Administration for Children & Families (ACF) within HHS sets its evaluation policy. Critical to evaluation is the question "What works compared to what?" How the "what" is defined shapes the implementation method, while the definition of "works" indicates how broad the review's impact will be. Through evaluation, ACF and its partners learn systematically so that their services are as effective as possible. Because continual improvement requires systematic approaches to using information, ACF developed a policy that addresses the rigor, relevance, transparency, independence, and ethics of each evaluation it conducts.

Finally, Rath shared how the Mount Laurel Township school district in New Jersey uses systematic reviews. Many school districts rely on benchmark reviews and measures. Districts should be encouraged to use, rely on, and disseminate systematic review data, rather than be swayed by political, media, and traditional pressures.

Several questions were raised during the public discussion, including the use of meta-analysis in the systematic review process, the relationship between producers and developers of systematic reviews, and whether these reviews are useful for real-time assessment during an immediate crisis. At the conclusion of the forum, the panelists and presenters stressed that a systematic review is not the same as policy analysis, but rather a tool of the policy process. Policy analysis draws upon existing systematic reviews to help policymakers reach an informed decision. Through efforts to create and adhere to standards, systematic reviews can aid in the implementation of policy decisions.

Mathematica Policy Research holds public forums on a variety of research topics through its specialized research centers, including the Center for Studying Disability Policy (CSDP), the Center for Improving Research Evidence (CIRE), the Center on Health Care Effectiveness (CHCE), and the Center for International Policy and Research Evaluation (CIPRE). The next forum, Growing Pains: How Disability, Risky Behaviors, and Expectations During Youth Influence Early Adult Outcomes, will be hosted by the CSDP on February 21, 2013.


Association for Public Policy Analysis and Management (APPAM)

© 2018 Association for Public Policy Analysis & Management. All Rights Reserved.