Standards, accountability, assessment, and curriculum
How do college non-completers list schooling on their resumes? The negative signal of not completing might outweigh the positive signal of attending but not persisting. If so, job-seekers might hide non-completed schooling on their resumes. To test this, we match resumes from an online jobs board to administrative educational records. We find that fully one in three job-seekers who attended college but did not earn a degree omit their only post-secondary schooling from their resumes. We further show that these are not casual omissions but strategic decisions systematically related to schooling characteristics, such as selectivity and years of enrollment. We also find evidence of lying, and show which degrees listed on resumes are most likely untrue. Lastly, we discuss implications. Our results show not only that a commonly held assumption, that employers perfectly observe schooling, does not hold, but also that resume choices reveal which college experiences students believe employers value most.
In multisite experiments, we can quantify treatment effect variation with the cross-site treatment effect variance. However, there is no standard method for estimating cross-site treatment effect variance in multisite regression discontinuity designs (RDDs). This research addresses this gap in the literature by systematically exploring and evaluating methods for estimating the cross-site treatment effect variance in multisite RDDs. Specifically, we formalize a fixed intercepts/random coefficients (FIRC) RDD model and develop a random effects meta-analysis (Meta) RDD model for estimating cross-site treatment effect variance. We find that a restricted FIRC model works best when the running variable's relationship to the outcome is stable across sites but can be biased otherwise. In those instances, we recommend using either the unrestricted FIRC model or the meta-analysis model, with the unrestricted FIRC model generally performing better when the average number of in-bandwidth observations is below 120 and the meta-analysis model performing better when it is above 120. We apply our models to a high school exit exam policy in Massachusetts that required students who passed the high school exit exam but were still determined to be nonproficient to complete an "Education Proficiency Plan" (EPP). We find the EPP policy had a positive local average treatment effect on whether students completed a math course their senior year on average across sites, but that the impact varied enough that a third of schools may have had a negative impact.
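The random effects meta-analysis approach described above can be sketched with a standard method-of-moments (DerSimonian-Laird) estimator: each site contributes an RD treatment effect estimate and standard error, and the cross-site variance is recovered from the excess heterogeneity beyond sampling error. This is a minimal illustration of the general technique, not the authors' exact model; the site estimates in the usage example are hypothetical.

```python
# Hedged sketch: DerSimonian-Laird random-effects meta-analysis over
# site-level RD treatment effect estimates. Inputs are per-site effect
# estimates and their standard errors (both hypothetical here).

def dl_meta(estimates, std_errors):
    """Return (pooled effect, cross-site variance tau^2)."""
    k = len(estimates)
    v = [se ** 2 for se in std_errors]        # sampling variances
    w = [1.0 / vi for vi in v]                # fixed-effect weights
    sw = sum(w)
    theta_fe = sum(wi * ti for wi, ti in zip(w, estimates)) / sw
    # Cochran's Q statistic: heterogeneity beyond sampling error
    q = sum(wi * (ti - theta_fe) ** 2 for wi, ti in zip(w, estimates))
    # Method-of-moments cross-site variance estimate, truncated at zero
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    # Random-effects pooled estimate re-weights by 1 / (v_i + tau^2)
    w_re = [1.0 / (vi + tau2) for vi in v]
    theta_re = sum(wi * ti for wi, ti in zip(w_re, estimates)) / sum(w_re)
    return theta_re, tau2

# Hypothetical site-level RD estimates (effect sizes and SEs)
pooled, tau2 = dl_meta([0.20, 0.10, -0.05, 0.30], [0.05, 0.05, 0.05, 0.05])
```

A nonzero `tau2` indicates genuine cross-site variation in treatment effects, which is what allows statements such as "a third of sites may have had a negative impact" once a distribution for the site effects is assumed.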
From 2010 onwards, most US states have aligned their education standards by adopting the Common Core State Standards (CCSS) for math and English Language Arts. The CCSS did not target other subjects such as science and social studies. We estimate spillovers of the CCSS on student achievement in non-targeted subjects in models with state and year fixed effects. Using student achievement data from the NAEP, we show that the CCSS had a negative effect on student achievement in non-targeted subjects. This negative effect is largest for underprivileged students, exacerbating racial and socioeconomic student achievement gaps. Using teacher surveys, we show that the CCSS caused a reduction in instructional focus on non-targeted subjects.
Numerous high-profile efforts have sought to “turn around” low-performing schools. Evidence on the effectiveness of school turnarounds, however, is mixed, and research offers little guidance on which models are more likely to succeed. We present a mixed-methods case study of turnaround efforts led by the Blueprint Schools Network in three schools in Boston. Using a difference-in-differences framework, we find that Blueprint raised student achievement in ELA by at least a quarter of a standard deviation, with suggestive evidence of comparably large effects in math. We document qualitatively how differential impacts across the three Blueprint schools relate to contextual and implementation factors. In particular, Blueprint’s role as a turnaround partner (in two schools) versus school operator (in one school) shaped its ability to implement its model. As a partner, Blueprint provided expertise and guidance but had limited ability to fully implement its model. In its role as an operator, Blueprint had full authority to implement its turnaround model, but was also responsible for managing the day-to-day operations of the school, a role for which it had limited prior experience.
After increasing in the 1970s and 1980s, time to bachelor’s degree has declined since the 1990s. We document this fact using data from three nationally representative surveys. We show that this pattern is occurring across school types and for all student types. Using administrative student records from 11 large universities, we confirm the finding and show that it is robust to alternative sample definitions. We discuss what might explain the decline in time to bachelor’s degree by considering trends in student preparation, state funding, student enrollment, study time, and student employment during college.
Despite calls for more evidence regarding the effectiveness of teacher education practices, causal research in the field remains rare. One reason is that we lack designs and measurement approaches that appropriately meet the challenges of causal inference in the context of teacher education programs. This article provides a framework for how to fill this gap. We first outline the difficulties of doing causal research in teacher education. We then describe a set of replicable practices for developing measures of key teaching outcomes, and propose causal research designs suited to the needs of the field. Finally, we identify community-wide initiatives that are necessary to advance effectiveness research in teacher education at scale.
Test-based accountability pressures have been shown to result in transferring less effective teachers into untested early grades and more effective teachers into tested grades. In this paper, we evaluate whether a state initiative to turn around its lowest-performing schools reproduced this pattern of teacher assignment and had unintended negative effects on the outcomes of younger students in untested grades. Using a sharp regression discontinuity design, we find consistent evidence of increased chronic absenteeism and grade retention in the first year. The findings also suggest negative effects on early literacy and reading comprehension in the first year of the reform that rebounded somewhat in the second year. Schools labeled low-performing reassigned less effective teachers from tested grades into untested early grades, though these assignment practices were no more prevalent in reform than in control schools. Our results suggest that accountability-driven school reform can yield negative consequences for younger students that may undermine the success and sustainability of school turnaround efforts.
This study investigates the influence of principal tenure on the retention rates of the teachers principals hire over time. We analyzed the hiring practices and teacher retention rates of 11,717 Texas principals from 1999 to 2017, employing both individual and year fixed effects. Main findings indicate that a principal who stays in the same school for at least three years begins to hire teachers who stay to both three- and five-year benchmarks at increasingly higher rates. However, the average Texas principal leaves a school after four years, and while we find small positive gains in the initial retention rates of teachers at the next school, the majority of principal improvement in teacher retention does not appear to be portable.
To evaluate how Advanced Placement courses affect college-going, we randomly assigned the offer of enrollment into an AP science course to over 1,800 students in 23 schools that had not previously offered the course. We find no substantial AP course effects on students’ plans to enroll in college or on their college entrance exam scores. Yet AP course-takers enroll in less selective colleges than their control group counterparts. Negative treatment effects on college selectivity appear to be driven more by low student preparation than teacher inexperience and by students’ matriculation decisions rather than institutional admissions decisions.
For nearly three decades, policy-makers and researchers in the United States have promoted more intellectually rigorous standards for mathematics teaching and learning. Yet, to date, we have limited descriptive evidence on the extent to which reform-oriented instruction has been enacted at scale.
The purpose of the study is to examine the prevalence of reform-aligned mathematics instructional practices in five U.S. school districts. We also seek to describe the range of instruction students experience by presenting case studies of teachers at high, medium and low levels of reform alignment.
We draw on 1,735 video-recorded lessons from 329 elementary teachers in these five U.S. urban districts.
We present descriptive analyses of lesson scores on a mathematics-focused classroom observation instrument. We also draw upon interviews with district personnel, rater-written lesson summaries, and lesson video in order to develop case studies of instructional practice.
We find that teachers in our sample do use reform-aligned instructional practices, but that they do so within the confines of traditional lesson formats. We also find that the implementation of these instructional practices varies in quality. Furthermore, the prevalence and strength of these practices corresponds to the coherence of district efforts at instructional reform.
Our findings suggest that, unlike in other studies in which reform-oriented instruction rarely occurred (e.g., Kane & Staiger, 2012), reform practices do appear to some degree in study classrooms. In addition, our analyses suggest that implementation of these reform practices corresponds to the strength and coherence of district efforts to change instruction.