Session Recap: Survey Data Versus Administrative Data in Evaluation

November 7, 2014 03:12 PM

By Liu Yi, Harvard University

The choice of data source is an essential part of research design. As evaluation relies increasingly on evidence from research and analysis, whether administrative data and survey data yield significantly different results has become a heated topic of discussion among researchers and practitioners.

Burt Barnow of George Washington University used differences in impact estimates from previously evaluated RCTs to explore the trade-offs between survey and administrative data. His research confirms that the source of data makes a substantial difference in impact estimates. He also argued that more attention should be paid to how impacts can be accurately measured; it is unwise to ignore the large deviations in impact estimates across data sources.

Asking what the consequences of a change in data source might be, Reuben Ford of the Social Research and Demonstration Corporation used the Canadian Experiment as an example to formulate an answer. Switching from administrative plus survey data to administrative data alone, his team found evidence that the data source affects estimated program impacts in a relatively robust way. Moreover, internal validity and policy conclusions need not be affected, even when the levels of the outcomes of interest are.

Motivated by the observation that earnings impacts for survey respondents are greater than for full populations, Richard Dorsett of the National Institute of Economic and Social Research used his presentation to explain uncertainties in administrative data and to address respondent impacts. His team found that administrative data allowed the validity of survey-based estimates to be explored. Dorsett also pointed out that weighting on the basis of post-randomization outcomes helps, but the resulting estimates are no longer experimental.

Jacob Alex Klerman of Abt Associates, Inc. responded to each paper and shared his own thoughts on survey and administrative data. Many of the day's presentations, he said, focused on which of the two sources is right and which is wrong, a framing he believed was the wrong direction to take. Survey data has an apparent advantage in that researchers can ask exactly the questions they want, whereas administrative data are not collected for their specific research purposes. However, Klerman pointed out, using survey data does not guarantee true answers either. Rather than judging one source right and the other wrong, he suggested, researchers should evaluate the two types on different levels, according to the purpose at hand.

 
