Explaining The Consistently Inconsistent Results of Charter Schools

This is the second in a series of three posts about charter schools. Here is the first part, and here is the third.

As discussed in a previous post, there is a fairly well-developed body of evidence showing that individual charter and regular public schools vary widely in their impacts on achievement growth. On the whole, however, this research finds little difference between the two sectors, and where differences do appear, they tend to be very modest. In other words, there is nothing about “charterness” that leads to strong results.

It is, however, the exceptions that are often most instructive to policy. By taking a look at the handful of schools that are successful, we might finally start moving past the “horse race” incarnation of the charter debate, and start figuring out which specific policies and conditions are associated with success, at least in terms of test score improvement (which is the focus of this post).

Unfortunately, this question is also extremely difficult to answer – policies and conditions are not randomly assigned to schools, and it’s very tough to disentangle all the factors (many unmeasurable) that might affect achievement. But the available evidence at this point is sufficient to start drawing a few highly tentative conclusions about “what works.”

The first – and most obvious – way to see whether certain policies appear to “work” in the charter context is to look at direct tests of those associations – i.e., whether certain school policies and features are associated with higher (relative) performance among charter schools. There are only a handful of studies that have done so.

This experimental study of New York City charters found some level of support for a few measures, including a longer school day/year, time devoted to reading instruction, “small rewards/small punishment” discipline policies, a school mission statement emphasizing academic achievement, and teacher pay systems not based exclusively on experience and education. It bears mentioning, though, that many of the schools in this study (and in the others below) adopt these policies in bunches, which further complicates any effort to measure their individual associations with performance.

Mathematica’s lottery study of charter middle schools in 15 states also failed to find many particularly strong or consistent associations, but there was some evidence of higher achievement in schools using ability grouping, as well as those with smaller enrollments (there was an effect of school time in both subjects, but it didn’t persist once controls were added to the models). None of the “environmental” practices, including measures of innovation and accountability, were related to relative gains (also see here and here).

There was some additional support for school time in this study of Massachusetts charters, which also found a positive association between performance and self-reported adherence to a “no excuses” philosophy toward student behavior and achievement.*

Finally, a very recent Mathematica/CRPE analysis of charter management organizations found that CMOs using comprehensive behavior policies and intensive teacher coaching tend to get better results, and there was very limited evidence for school time as well.

These studies, in addition to being limited in number, scope and ability to directly test causality, don’t yield much in terms of consistent findings on “what works” (it’s worth noting that most include only a small number of schools). But, as a whole, they provide some multi-source support for extended time and, to a lesser extent, policies focused on discipline (and, perhaps, achievement).**

Besides these direct tests of association, we might also get clues as to why a few charters seem to produce results by taking a more observational approach – i.e., looking at high-performing charter chains to see whether they share certain features. KIPP schools are by far the most high-profile example of a successful chain, and a 2010 analysis of 22 KIPP schools found significant and substantial gains among students in both math and reading. These results could not, as is sometimes claimed, be chalked up to attrition of low-performing students (although it is possible that peer effects stemming from this attrition play a role, perhaps a significant one).

KIPP and other highly publicized chains, such as Achievement First and Aspire, are by all indications well-run schools in many respects, and that should not be discounted. But they also share a few key characteristics that may at least partially explain their success.

Namely, most get at least a fair amount of private funding, and virtually all of them provide massive amounts of additional school time (in KIPP’s case, up to 50-60 percent more, with substantial funding help from large donations). In fact, I cannot find a single major charter chain that doesn’t provide at least 15-20 percent more time, and some extend it by 40-50 percent or more. That’s the equivalent of a few extra regular public school months. The prevalence of extended time in high-performing charters squares with the above-mentioned evidence on the effects of school time.
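To make the “few extra months” equivalence concrete, here is a minimal back-of-envelope sketch. It assumes a typical regular public school year of about 180 days and roughly 20 school days per month; those are illustrative figures, not numbers taken from the studies above, and the percentages are the ones mentioned in this post.

# Rough translation of "percent more school time" into extra days and months.
# Assumes a ~180-day regular public school year and ~20 school days per month;
# both are typical figures used for illustration, not numbers from the studies cited here.
REGULAR_YEAR_DAYS = 180
SCHOOL_DAYS_PER_MONTH = 20

for pct_more in (0.15, 0.20, 0.40, 0.50, 0.60):
    extra_days = REGULAR_YEAR_DAYS * pct_more
    extra_months = extra_days / SCHOOL_DAYS_PER_MONTH
    print(f"{pct_more:.0%} more time: about {extra_days:.0f} extra days, or {extra_months:.1f} school months")

On those assumptions, even the low end (15-20 percent more time) works out to roughly one and a half to two extra school months, and the high end to four or more.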

It’s also very clear that intensive tutoring programs can boost test scores substantially, and these programs are found in some high-profile charters, such as the MATCH school in Boston. In addition, this evaluation of a pilot program creating “no excuses” charters in Houston found a very strong effect of math tutoring on achievement gains.

Finally, most of these schools have some form of stricter-than-average discipline policy (including parental contracts), which is also consistent with some of the direct evidence above. KIPP, for instance, requires parents and students to sign contracts agreeing to fulfill academic and behavioral expectations, while many KIPP schools maintain what seem to be exceedingly strict disciplinary standards, which, if violated, can lead to suspension or expulsion.

Overall, after 20 years of the charter school movement, it’s clear that the debate is at an impasse, and what’s needed is a greater understanding of why – not whether – a handful of charters get exceptional results. The evidence at this point, though scarce and limited, does provide a couple of hints.

An emphasis on discipline seems to have some support, both in direct tests of associations as well as in a surface review of practices in high-profile charters. This might be something to which regular public schools should pay more attention, as the importance of a safe, orderly learning environment is well-established (see here). Needless to say, regular public schools would probably approach the details of these policies in a different way.

The strongest evidence, however, is that for extended time and perhaps tutoring (as well as the funding that enables these practices).

This does not match up particularly well with the rhetoric of “innovation.” If there are any consistent lessons from the charter experiment, at least in terms of test-based effects, they seem to tell us what we already know – that performance can be improved with more resources, more time and more attention. These interventions are not cheap or new, and they’re certainly not “charter-specific” in any way.

Nevertheless, these school-level results on which policies and practices are associated with performance are not necessarily the complete picture. There will always be a few high-flying chains and schools that do well, but there is also fairly solid evidence that groups of charters, run by different organizations using a variety of approaches, achieve strong results, on average, while serving the same population. This seems to be the case, for example, in New York City and Boston. There is also variation in relative charter performance between states that might be instructive.

These issues will be discussed in a subsequent post.

- Matt Di Carlo

*****

* This finding was a little strange, since it seems to have been based on whether a school (or whoever completed the researchers’ survey) described itself as “no excuses,” even though this philosophy is often associated with other policies, such as extended time and discipline, that were also tested. In addition, the paper does not indicate how this question was worded or measured (if it wasn’t a dichotomous variable).

** It bears mentioning that school time has an “advantage” of sorts in this literature – unlike many other policies, it is easily measured, and may therefore appear in more analyses.

Permalink

Please bring New Orleans Public Schools into your discussion, as that is a large-scale effort to provide maximal choice to all parents to choose between charter and non-charter. That is one heck of a large-scale experiment. Charters there have done wonders, albeit starting from a low base, and now serve 71% of the entire student population (up from 56% four years ago), based on students voting with their feet. I am particularly interested in your analysis of the results as detailed and discussed here:

http://educationnext.org/new-schools-in-new-orleans/

The issue of money is important, but to discuss it you need to point out that what public school districts have done when given more money is simply hire more teachers to reduce class size rather than increase instructional time. If you conclude that public schools need more money to emulate charter schools, you need to examine why the vast disparities in per-pupil expenditures across states have essentially zero correlation with performance as measured by NAEP. Class size has zero correlation with NAEP performance as well, even when broken down by race and economic status.

In short, more money to traditional public schools won't matter, because it will be used to hire more teachers (giving unions more members), requiring more overpaid administrators and more school bonds to enrich local contractors, while doing absolutely zero for the kids.

Permalink

Thanks for your comment, Michael.

I'll be discussing a few whole districts in a subsequent post. I would include New Orleans, but I can't do that without a high-quality analysis that doesn't rely on changes in cross-sectional proficiency rates. I saw that CREDO had done something for NOLA in a newspaper article, but I was unable to find the full report. If you know where to find it, please post the link.

Thanks again,
MD

Permalink

The report cited indicates, "Combining public and private sources of revenue, KIPP received, on average, $18,491 per pupil in 2007-08. This is $6,500 more per pupil than what the local school districts received in revenue."

I am curious how infrastructure is calculated into these numbers. Does the additional $6,500 include expenses for buildings/equipment/supplies? The amount that "local school districts received in revenue" would not, and I'm curious whether this is a fair apples-to-apples comparison.

The "Expenditures" section of the report does not seem to address it. It shows "total current expenditures per pupil" as being only slighter (~$457).

In fact, the report simply says, "we cannot determine whether or how KIPP spends its private sources of revenues."

The fact that a rather large sum of money cannot be accounted for is a remarkably huge gap in the report, especially since the report is now being used to suggest that whatever level of achievement superiority charters enjoy is due in part to this "higher revenue".
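To put that gap in rough numbers, here is a minimal sketch using only the two per-pupil figures quoted above; treating the difference as unaccounted for is an inference from the comparison, not a figure the report itself states.

# Rough per-pupil gap implied by the two figures quoted from the report above.
revenue_gap = 6500       # KIPP's reported revenue advantage over local districts, per pupil
expenditure_gap = 457    # approximate difference in reported current expenditures, per pupil
unaccounted = revenue_gap - expenditure_gap
print(f"About ${unaccounted:,} per pupil does not show up in reported current expenditures.")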

Can you share any insight?

Permalink

It's a bad idea to cite anything Gary Miron writes about KIPP, given his fallacious measure of attrition and the problems Mike Reno raises.

Permalink

Mike,

Thanks for the comment. The sentence to which you refer asserts that KIPP receives private funding and provides additional time. The citations in parentheses are meant to support each claim individually but not necessarily to connect them. I agree, however, that the wording does not make this completely clear.

As for how KIPP and other charters spend money, this is an open question in many places (data specificity and availability vary), and I personally think this is a huge problem. But my speculation – that the few “standout” charters tend to get a lot of private money – remains.

Also see this report on NYC charter funding:
http://nepc.colorado.edu/files/NEPC-NYCharter-Baker-Ferris.pdf

And these data on KIPP schools in Texas:
http://schoolfinance101.wordpress.com/2011/11/06/mpr%e2%80%99s-unfortun…

Thanks again,
MD