# The Ever-Changing NAEP Sample

The results of the latest National Assessment of Educational Progress long-term trend tests (NAEP-LTT) were released last week. The data compare the reading and math scores of 9-, 13-, and 17-year-olds at various points since the early 1970s, providing an important way to monitor how these age cohorts’ performance changes over the long term.

Overall, scores among 9- and 13-year-olds continue to improve, in reading and especially math, though the trend is inconsistent and gains have been somewhat slow in recent years. Scores for 17-year-olds, in contrast, are relatively flat.

These data, of course, are cross-sectional – i.e., they don’t follow students over time, but rather compare children in the three age groups with their predecessors from previous years. This means that changes in average scores might be driven by differences, observable or unobservable, between cohorts. One of the simple graphs in this report, which doesn’t present a single test score, illustrates that rather vividly.

The graph, reproduced below, compares 13-year-olds in 1978 with those in 2012 on three key characteristics, all of which make for pretty striking comparisons (data in an appendix show that the breakdown is similar for the other two age groups). Let’s quickly review all three.

First, 80 percent of test takers in 1978 were white. Today, the proportion is 56 percent, mirrored by a sharp increase in the proportion of Hispanic students. That is a pretty stark difference, even over three decades. Moreover, the appendix shows that the proportion of white students was 71 percent in 1999, so a chunk of the overall shift since 1978 actually occurred fairly recently.

Next, by grade level: In 1978, 28 percent of NAEP-LTT test takers were in 7th grade or lower, compared with 39 percent in 2012. Although standards and curricula are different today, it’s worth noting that the 13-year-old sample has changed in terms of where students are in the K-12 system.

Third, there is the difference in parental education. The proportion of the 2012 sample with parents who completed college (54 percent) is over twice as high as in 1978 (26 percent). Conversely, the percentage of 13-year-olds with parents who have a high school diploma or less is half its 1978 level. Again, some of this change is recent – for example, the proportion with a high school diploma or less was 27 percent in 1999, compared with 20 percent in 2012.

In short, the student population, and thus the NAEP sample, is changing over both the short and the long term. Any concurrent changes in test performance may just as easily be due to these and many other shifts in the characteristics of the test takers – including unobservable factors that cannot be gleaned from subgroup breakdowns – as to any change in school performance. This most certainly does not mean that schooling quality is unimportant, only that raw NAEP scores, by themselves, do not measure it very well – and they’re not supposed to.
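The composition effect described above can be sketched with a bit of arithmetic. The snippet below is a hypothetical illustration, not NAEP data: the subgroup scores are invented, and only the 80/56 percent shares loosely echo the white-student proportions cited in the post. It shows that an overall average can fall even when every subgroup’s average stays exactly the same, purely because the mix of subgroups changes.

```python
# Hypothetical illustration (invented scores, not NAEP data): an overall
# average can move purely because the subgroup mix changes, even when
# each subgroup's average is held constant.

def overall_mean(group_means, group_shares):
    """Weighted average of subgroup means, weighted by population shares."""
    assert abs(sum(group_shares) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(m * s for m, s in zip(group_means, group_shares))

# Two made-up subgroups with fixed average scores.
means = [260, 230]

# Earlier sample: 80/20 mix; later sample: 56/44 mix.
early = overall_mean(means, [0.80, 0.20])  # -> 254.0
late = overall_mean(means, [0.56, 0.44])   # -> 246.8

print(early, late)
```

Even though neither subgroup’s performance changed, the overall average dropped by about seven points – which is why raw trend comparisons across differently composed samples can mislead.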

- Matt Di Carlo

Great points, as usual. Another important point is that a greater percentage of 17-year-olds remain enrolled in school now than in past years, and those additional students are very likely drawn from the lower end of the performance distribution. This could very well explain the flat scores for 17-year-olds.

Is there an easy place to see a comparison that controls for this – i.e., one that matches similar students over time?