APPAM - Association for Public Policy Analysis & Management

Peter H. Rossi Award

Introducing the 2018 Recipient: Mark Lipsey, Vanderbilt University

We are delighted to announce that Mark W. Lipsey of Vanderbilt University has been selected to receive the 2018 Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation.

Professor Lipsey recently stepped down as Director of the Peabody Research Institute at Vanderbilt University, a unit devoted to research on interventions for at-risk populations. After a more than forty-year career in program evaluation, he has transitioned to what he calls “semi-retirement” but maintains an appointment as Research Professor in the Peabody College Department of Human and Organizational Development.

As one nominator wrote, “In sum, Mark Lipsey embodies all the best characteristics that Peter Rossi himself embodied. He is thoughtful, thorough, and creative and uses his gifts to explore issues of great importance, the findings for which then yield highly applicable practice and policy implications.”

Lipsey will deliver a Super Session talk at the APPAM Fall Research Conference entitled “Roiling the Waters: Controversy Over the First Longitudinal Randomized Study of a State Pre-K Program.” The session will be held on Friday afternoon, November 9, at 1:30 pm, with reactions to follow from Eric A. Hanushek, Stanford University; Rebecca A. Maynard, University of Pennsylvania; and Larry L. Orr, Johns Hopkins University. He will then receive the award during the Presidential Address and APPAM Awards beginning at 5:00 pm. All conference attendees are welcome to attend both the Super Session and the Presidential Address, as well as the Presidential Reception at 6:30 pm.

Professor Lipsey’s research has been supported by major federal funding agencies and foundations and recognized by awards from Vanderbilt University and major professional organizations. His published works include textbooks on program evaluation, meta-analysis, and statistical power, as well as articles on applied methods and the effectiveness of school and community programs for youth.

Among these, he coauthored the 6th, 7th, and 8th editions of the Rossi et al. program evaluation textbook, Evaluation: A Systematic Approach, the most enduring and widely used text in the field. His other publications on program evaluation methods and concepts over the years touch on issues of program theory and experimental and quasi-experimental design among others. An especially salient theme has been ways to characterize the practical significance of the statistical effect size estimates generated by intervention research and the statistical power demands for detecting such practical effects in field-based evaluation studies. The earliest expression of this interest was the book, Design Sensitivity: Statistical Power for Experimental Research (1990), written for use in graduate seminars on experimental and quasi-experimental design to complement emphasis on internal validity as a key design consideration. The most recent expression is a monograph commissioned by IES on the representation of statistical effect sizes in practical terms, written with the help of a team of doctoral students (Lipsey et al., 2012: Translating the Statistical Representation of the Effects of Education Interventions into More Readily Interpretable Forms).

Professor Lipsey’s direct program evaluation research has focused on interventions for at-risk children and youth. The most recent example is the first randomized longitudinal study of a state-funded prekindergarten program, for which he is the principal investigator (most recent report is Lipsey, Farran, & Durkin, in press: Effects of the Tennessee Prekindergarten Program on Children’s Achievement and Behavior through Third Grade). The controversial findings of this study have generated considerable commentary and various published responses from the research team. Most of his program evaluation work, however, has involved application of meta-analysis to the results of the body of available controlled studies in selected intervention areas. This work (with colleagues) has examined substance abuse programs for adolescents, school dropout prevention, and school-based interventions for aggressive and disruptive behavior, among others.

The most extensive work of this sort has been done on treatment programs for juvenile offenders. This work began in the late 1980s, with the earliest comprehensive report appearing in 1992 (Lipsey: Juvenile Delinquency Treatment: A Meta-analytic Inquiry into the Variability of Effects), the most recent one in 2009 (The Primary Factors That Characterize Effective Interventions with Juvenile Offenders: A Meta-analytic Overview), with many related papers in between. This work has been recognized as one of the major influences in overturning the “nothing works” myth about rehabilitation of offenders established in the 1970s in criminology (see Cullen, 2005). As such, it follows in the footsteps of his highly cited paper on the extent to which meta-analysis was reversing the conclusions drawn from conventional research reviews about the effectiveness of psychosocial interventions (Lipsey & Wilson, 1993: The Efficacy of Psychological, Educational, and Behavioral Treatment: Confirmation from Meta-analysis).

The distinctive feature of this meta-analysis work is an emphasis on the variation in reoffense outcomes, specifically, an attempt to identify the program characteristics most strongly associated with favorable outcomes. The most recent phase of this work has focused on using those results to develop evidence-based practice guidelines and an instrument for assessing the expected effectiveness of local programs based on how well their profiles match those guidelines. This scheme is intended to better bridge effectiveness research and effective practice, and has been, or is currently being, implemented in more than ten state juvenile justice systems and one in Australia. The several associated publications have been intentionally directed toward outlets that reach juvenile justice practitioners; a recent and illustrative example is Howell, Lipsey, Wilson, & Howell, 2014 (A Practical Approach to Evidence-based Juvenile Justice Systems).


About the Award

The APPAM Policy Council approved the Peter H. Rossi Award, for Contributions to the Theory or Practice of Program Evaluation, on April 8, 2005. Funding for the award comes from an endowment managed by the University System of Maryland Foundation, Inc. This endowment welcomes additional donations. The University of Maryland School of Public Policy hosts a separate website for the Rossi Award, welfareacademy.org/rossi.

The Rossi Award honors the lifetime achievements of Peter Rossi (1921–2006) by recognizing important contributions to the theory or practice of program evaluation. The award may be given for a recent paper or publication, or for an entire body of work.

The Rossi Award is given out every other year. It was presented in 2016 to Rudolph Penner, Ph.D., Robert Reischauer, Ph.D., and Alice Rivlin, Ph.D., at that year's Fall Research Conference. The winners received a plaque, recognition at the conference, reimbursement for travel expenses to the meeting, and a cash award of $2,000.

Any recent paper, publication, or entire body of work will be considered. The selection committee may, from time to time, decide to establish time limits for what may be considered. When appropriate, joint awards will be made for co-authored works or joint products. The paper, publication, or body of work may involve any aspect of planning, conducting, or analyzing evaluations of social programs and may be directed to lay or professional audiences. The work should reflect the importance of precision and objectivity in setting the evaluation framework, design, execution, and reporting, as well as the value of evidence-based presentation or translation for varied audiences. Illustrative examples include works on the state of evaluation or knowledge in a particular substantive field, new approaches to program evaluation, and program evaluation and its role in the political decision-making process.

The selection committee is chaired by Douglas J. Besharov, University of Maryland, and includes two former presidents of APPAM and two past Rossi awardees, who serve staggered three-year terms.

Nominations may be made by any individual or organization. (Individuals may nominate their own work.) The letter of nomination (with the nominee’s current address, e-mail address, and phone number) should detail the work’s contributions to the field of evaluation and should include the paper or relevant parts of the body of work. Nominations should be e-mailed to Professor Besharov, Besharov@umd.edu.

Prior Recipients of the Rossi Award

  • 2016: Rudolph Penner, Robert Reischauer, and Alice Rivlin, Congressional Budget Office
  • 2014: Larry L. Orr, Johns Hopkins University
  • 2012: Thomas D. Cook, Northwestern University
  • 2010: Howard Bloom, MDRC
  • 2009: Rebecca Maynard, University of Pennsylvania
  • 2008: Judith Gueron, MDRC
  • 2007: Grover (Russ) Whitehurst, Institute of Education Sciences
  • 2006: Rob Hollister, Swarthmore College
  • 2005: Frederick Mosteller, Harvard University
© 2018 Association for Public Policy Analysis & Management. All Rights Reserved.