Working Papers Series
Papers below are in PDF format.
This paper follows a cohort of initially high-performing Missouri students from grade 3 through grade 9 and examines whether attending a low-achieving school affects their subsequent standardized exam scores, as well as the grade in which they first take Algebra I. Two key findings emerge. First, attending a low-achieving school does not affect the standardized exam performance of initially high-performing students once school quality (as measured by value-added) is accounted for. Second, high-performing students who attend low-achieving schools are more likely to take Algebra I later than their counterparts who attend higher-achieving schools.
JEL Codes: I20
Keywords: high-performing students, school quality, student achievement, tracking
Forthcoming in Teachers College Record
There is increased policy interest in extending the test-based evaluation framework in K-12 education to include student achievement in high school. High school achievement is typically measured by performance on end-of-course exams (EOCs), which test course-specific standards in subjects including algebra, biology, English, geometry, and history, among others. However, unlike standardized tests in the early grades, students take EOCs at different points in their schooling careers. The timing of the test is a choice variable presumably determined by input from administrators, students, and parents. Recent research indicates that school and district policies that determine when students take particular courses can have important consequences for achievement and subsequent outcomes, such as advanced course taking. The contribution of the present study is to develop an approach for modeling EOC test performance that disentangles the influence of school and district policies regarding the timing of course taking from other factors. After the timing issue is separated out, better measures of the quality of instruction provided by districts, schools, and teachers can be obtained. Our approach also offers diagnostic value because it explicitly separates the influence of school and district course-taking policies from other factors that determine student achievement.
JEL Codes: I20
Keywords: value-added, end-of-course exam, end-of-course testing, course timing
The Icarus Syndrome: Why Do Some High Flyers Soar While Others Fall?
This paper follows a cohort of initially high-performing Missouri students from grade 3 through grade 9 and examines which school factors influence their academic success. Three key findings emerge. First, in terms of performance on standardized tests, schools that are effective in promoting academic growth among low-performing students are also generally effective with high-performing students. Second, high-performing students who attend disadvantaged schools are more likely to take Algebra I later than their counterparts who attend more advantaged schools. Third, somewhat surprisingly, increasing the number of high-performing students in a school negatively affects high-performing student outcomes.
JEL Codes: I20, I24, I28
Keywords: economics of education, high-performing students, No Child Left Behind, exam score performance
Substantially revised as WP1407
The specifics of how growth models should be constructed and used to evaluate schools and teachers are a topic of lively policy debate in states and school districts nationwide. In this paper we take up the question of model choice and examine three competing approaches. The first approach, reflected in the popular student growth percentiles (SGPs) framework, eschews all controls for student covariates and schooling environments. The second approach, typically associated with value-added models (VAMs), controls for student background characteristics and aims to identify the causal effects of schools and teachers. The third approach, also VAM-based, fully levels the playing field so that the correlation between school- and teacher-level growth measures and student demographics is essentially zero. We argue that the third approach is the most desirable for use in educational evaluation systems. Our case rests on personnel economics, incentive-design theory, and the potential role that growth measures can play in improving instruction in K-12 schools.
JEL Codes: I20
Keywords: Teacher evaluation, school evaluation, value-added models, value-added versus SGP
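The three approaches contrasted in this abstract can be illustrated with a small simulation. The sketch below is not the authors' estimator; it is a hypothetical data-generating process in which true school effects are worse in more disadvantaged schools (the disadvantage rate `frl_rate` and its student-level indicator `frl` are invented for illustration). It computes an SGP-style measure that conditions on prior achievement only, a VAM-style measure that also controls for the student demographic, and a fully leveled measure obtained by residualizing the school-level measures on school demographics, which drives their correlation with demographics to zero by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools, n_per = 50, 200
school = np.repeat(np.arange(n_schools), n_per)

# Hypothetical DGP: a school-level disadvantage rate, a correlated
# student-level indicator, and true school effects that are (by
# construction) lower in more disadvantaged schools.
frl_rate = rng.uniform(0.1, 0.9, n_schools)
frl = rng.binomial(1, frl_rate[school])
prior = rng.normal(-0.5 * frl, 1.0)
true_effect = rng.normal(0, 0.1, n_schools) - 0.5 * (frl_rate - 0.5)
score = 0.7 * prior + true_effect[school] - 0.2 * frl + rng.normal(0, 1.0, len(school))

def resid_on(y, *covs):
    """Residualize y on an intercept plus the given covariates (OLS)."""
    X = np.column_stack([np.ones(len(y))] + list(covs))
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

def school_means(x):
    return np.array([x[school == s].mean() for s in range(n_schools)])

# Approach 1 (SGP-style): condition on prior achievement only.
growth1 = school_means(resid_on(score, prior))
# Approach 2 (VAM-style): also control for the student demographic.
growth2 = school_means(resid_on(score, prior, frl))
# Approach 3 (fully leveled): residualize the school-level measures on
# school demographics, forcing a zero correlation by construction.
growth3 = resid_on(growth2, frl_rate)

print(np.corrcoef(growth1, frl_rate)[0, 1])  # strongly negative
print(np.corrcoef(growth3, frl_rate)[0, 1])  # zero up to rounding
```

The leveling step is just an OLS residualization at the school level, so the zero correlation in approach 3 is mechanical rather than an empirical finding, which is exactly the "level playing field" property the abstract describes.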
We compare teacher preparation programs in Missouri based on the effectiveness of their graduates in the classroom. The differences in effectiveness between teachers from different preparation programs are very small. In fact, virtually all of the variation in teacher effectiveness comes from within-program differences between teachers. Prior research has overstated differences in teacher performance across preparation programs for several reasons, most notably because some sampling variability in the data has been incorrectly attributed to the preparation programs.
JEL Codes: I20
Keywords: teacher training, value added, data clustering, teacher preparation, teacher preparation program effectiveness
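The last point — sampling variability misread as program differences — can be illustrated with a short simulation. This is a hypothetical sketch, not the paper's estimator: every program below is constructed to be identical, yet the naive between-program variance of mean teacher effectiveness is positive purely because each teacher's effect is estimated with noise; subtracting the sampling contribution recovers (approximately) the true value of zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n_programs, teachers_per, students_per = 20, 30, 25

# Assumption: all programs are identical, so the true between-program
# variance of teacher effectiveness is exactly zero.
teacher_true = rng.normal(0, 0.15, (n_programs, teachers_per))
noise_sd = 0.5 / np.sqrt(students_per)  # sampling error in each teacher's estimate
teacher_est = teacher_true + rng.normal(0, noise_sd, teacher_true.shape)

program_means = teacher_est.mean(axis=1)

# Naive: treat the variance of program means as real program differences.
naive_between = program_means.var(ddof=1)

# Adjusted: each program mean of noisy teacher estimates carries sampling
# variance of (within-program variance) / (teachers per program); remove it.
within_var = teacher_est.var(axis=1, ddof=1).mean()
adjusted_between = naive_between - within_var / teachers_per

print(naive_between, adjusted_between)
```

Even though no program is actually better than any other here, the naive between-program variance comes out positive, which is the kind of overstatement the abstract attributes to prior research.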
Test Measurement Error and Inference from Value-Added Models
Cory Koedel, Rebecca Leatherman & Eric Parsons
It is widely known that standardized tests are noisy measures of student learning, but value-added models (VAMs) rarely take direct account of measurement error in student test scores. We examine the extent to which modifying VAMs to include information about test measurement error (TME) can improve inference. Our analysis is divided into two parts: one based on simulated data and the other on administrative microdata from Missouri. In the simulations we control the data-generating process, which ensures that we obtain accurate TME metrics with which to modify our value-added models. In the real-data portion of our analysis we use estimates of TME provided by a major test publisher. We find that inference from VAMs is improved by making simple TME adjustments to the models. This is a notable result because the improvement can be had at zero cost.
JEL Codes: I20
Keywords: value-added models, value-added, teacher value-added, test measurement error, teacher evaluation
Published in The B.E. Journal of Economic Analysis & Policy, 2012
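One simple, textbook-style TME adjustment — not necessarily the one used in the paper — is a classical errors-in-variables correction: when the measurement-error variance of the prior-year test is known (for example, published by the test vendor), the attenuated coefficient on the noisy prior score can be rescaled by the reliability ratio. The numbers below (a persistence parameter of 0.8 and a TME variance of 0.25) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
beta = 0.8     # true persistence of prior achievement (assumed for this sketch)
tme_var = 0.25 # measurement-error variance, treated as known

true_prior = rng.normal(0, 1, n)
obs_prior = true_prior + rng.normal(0, np.sqrt(tme_var), n)  # noisy prior test
score = beta * true_prior + rng.normal(0, 0.5, n)

# Naive VAM-style step: regress the current score on the observed
# (noisy) prior score; measurement error attenuates the slope.
naive = np.cov(obs_prior, score)[0, 1] / np.var(obs_prior, ddof=1)

# TME adjustment: rescale by the reliability ratio
# var(true prior) / var(observed prior).
reliability = 1 - tme_var / np.var(obs_prior, ddof=1)
adjusted = naive / reliability

print(round(naive, 2), round(adjusted, 2))
```

The naive slope is biased toward zero, while the reliability-rescaled slope recovers the assumed persistence parameter; because the TME variance is supplied by the test publisher, the correction requires no additional data collection, consistent with the abstract's "zero cost" point.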