Given our extreme reliance on test scores as measures of educational success and failure, I’m sorry I have to make this point: proficiency rates are not test scores, and changes in proficiency rates do not necessarily tell us much about changes in test scores.
Yet the Washington Post editorial about the latest test results from the District of Columbia Public Schools, for example, refers to proficiency rates (and changes in these rates) as “scores” at no fewer than seven points in a 450-word piece. And this is only one example of many.
So, what’s the problem? Let’s look at a very simple example. Say we have a hypothetical class of five students who take a test, and they are compared to the five students who take the same test the next year. The “proficient” cutoff score is 60.
In year one, the proficiency rate was 40 percent (two kids made it), and the average score was 56. In year two, the proficiency rate went up to 60 percent, which might be seen as a massive improvement. But three of five students scored lower than their predecessors, and the average test score actually declined.
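The arithmetic is easy to verify with a short script. The individual scores below are made-up numbers chosen only to match the statistics in the example (two of five students at or above the cutoff of 60 and an average of 56 in year one; three of five proficient but a lower average in year two); they are illustrative assumptions, not real data.

```python
# Hypothetical scores consistent with the example above.
# The "proficient" cutoff is 60; individual values are invented.
CUTOFF = 60

year1 = [65, 60, 55, 55, 45]  # two proficient, average 56
year2 = [60, 60, 60, 50, 40]  # three proficient, average 54

def proficiency_rate(scores, cutoff=CUTOFF):
    """Share of students scoring at or above the cutoff."""
    return sum(s >= cutoff for s in scores) / len(scores)

def average(scores):
    return sum(scores) / len(scores)

print(proficiency_rate(year1))  # 0.4 -> 40 percent proficient
print(proficiency_rate(year2))  # 0.6 -> 60 percent proficient
print(average(year1))           # 56.0
print(average(year2))           # 54.0 -- the average actually fell
```

Note that in this made-up data, three of the five year-two students also score lower than their year-one counterparts, exactly as in the example: the proficiency rate jumps 20 points while performance, by any student-level reading, gets worse.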
So, it is entirely possible that proficiency rates decreased in DCPS while the average score increased, and vice versa. It is also possible for proficiency rates to go up or down while the average score remains stable. The same goes for changes in the rates of students who fall into the other common categories, such as below basic, basic, and advanced.
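The reverse case is just as easy to construct. Here is another pair of invented score distributions (again assuming a cutoff of 60) in which the proficiency rate collapses even as the average rises:

```python
# Invented scores showing the proficiency rate and the average
# moving in opposite directions. Cutoff for "proficient" is 60.
CUTOFF = 60

def proficiency_rate(scores, cutoff=CUTOFF):
    """Share of students scoring at or above the cutoff."""
    return sum(s >= cutoff for s in scores) / len(scores)

def average(scores):
    return sum(scores) / len(scores)

# Year one: three students just clear the cutoff, two score very low.
year1 = [60, 60, 60, 20, 20]   # rate 60 percent, average 44
# Year two: every student lands just under the cutoff.
year2 = [59, 59, 59, 59, 59]   # rate 0 percent, average 59

print(proficiency_rate(year1), average(year1))  # 0.6 44.0
print(proficiency_rate(year2), average(year2))  # 0.0 59.0
```

The rate plunges from 60 percent to zero while the average climbs 15 points, because rates are sensitive only to movement across the cutoff, not to how far above or below it students fall.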
It’s bad enough that we don’t usually follow students over time when we assess progress, and that year-to-year changes in districts’ test scores are often little more than random fluctuation. But changes in the percentage of students scoring proficient (or basic, below basic, etc.) are a particularly poor measure of progress.
So, let’s not mistake proficiency or other rates for scores, especially when we are looking at year-to-year changes in these rates. They may look like the kind of 0-100 percent scores we all got in school, but rates are actually one big step removed. Calling them scores is misleading and inaccurate.
Average scores, on the other hand, while inadequate themselves, are probably more meaningful descriptive statistics when we’re assessing school performance. As with proficiency rates, they do not (usually) follow students over time, so they too are snapshots that don’t allow us to see how students progress (or fail to progress) as they move through school. But unlike proficiency rates, averages at least represent a summary statistic that speaks to the “typical” student, not just those above or below a cutoff score (one that is often arbitrary, by the way). In districts’ or schools’ “progress reports,” changes in the average score should accompany changes in proficiency rates.
But there is a good reason why nobody was able to discuss average DC student scores alongside the proficiency rates: DCPS does not seem to make average scores available (at least none that I could find). These data are certainly available to DCPS, as they are needed to construct the proficiency rates. Other districts provide them. They should be public information.