SOARING TEST SCORES....Here’s the headline in the Washington Post today:

Test Scores Soar After ‘No Child’

Now, this is a peculiar headline since the second paragraph of the accompanying story admits, “The study’s authors warned that it is difficult to say whether or how much the No Child Left Behind law is driving the achievement gains.” And indeed, the study from the Center on Education Policy (available here) takes considerable pains to emphasize that the trend they’re reporting started before NCLB was enacted. This, along with other factors, makes it very difficult to say whether, or how much, NCLB is responsible for the gains since 2002.

But put that aside for a moment. A better question is: even if state test scores are rising, does that indicate that student achievement is also increasing? Bob Somerby suspects that rising scores might actually be due to dumbed-down tests, and unfortunately, the study itself suggests he’s right. Here’s the paragraph that jumped out at me:

When the percentage of students scoring at the proficient level on state tests is compared with the percentage scoring at the basic level on the National Assessment of Educational Progress (NAEP), states show more positive results on their own tests than on NAEP. Moreover, the states with the greatest gains on their own tests were usually not the same states that had the greatest gains on NAEP.

Chapter 6 of the report goes into this in detail (see pp. 68-70), but the bottom line is that there is virtually no correlation at all between rising state test scores and rising NAEP scores — and like it or not, NAEP has long been considered the gold standard for consistent and reliable measurement of student achievement. In fact, the distribution of score changes is rather curious. As you can see in the chart on the right (for middle school reading results), there are some states where scores rose on both tests (top right) and some where they fell on both tests (bottom left). No problem there. It’s what you’d expect if both tests were doing a decent job of measuring performance.

However, there are virtually no states that improved on NAEP but fell on their own tests. A rising NAEP score really does seem to indicate better performance that shows up no matter what test you take. Conversely, there are loads of states that showed improvement on their own test even though results fell on NAEP. Peculiar, no? It’s almost as though the state tests aren’t really testing actual performance very well.
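To make the quadrant logic concrete, here is a minimal sketch in Python (the state names and gain figures below are invented for illustration, not taken from the CEP report) showing how you would bucket states by the sign of their gain on each test and compute the correlation between the two series. If both tests tracked the same underlying achievement, you would expect a strongly positive correlation and mostly empty off-diagonal buckets; the report instead found essentially no correlation, with one off-diagonal bucket crowded.

```python
# Hypothetical (state-test gain, NAEP gain) pairs, in percentage points.
# These numbers are made up for illustration; the real data are in
# Chapter 6 of the CEP report (pp. 68-70).
gains = {
    "State A": (5.0, 2.1),    # rose on both tests
    "State B": (4.2, -1.3),   # rose on state test, fell on NAEP
    "State C": (3.8, -0.4),   # rose on state test, fell on NAEP
    "State D": (-2.1, -1.8),  # fell on both tests
    "State E": (6.5, 0.2),    # rose on both tests
    "State F": (2.9, -2.0),   # rose on state test, fell on NAEP
}

def quadrant(state_gain: float, naep_gain: float) -> str:
    """Classify a state by the sign of its gain on each test."""
    if state_gain >= 0 and naep_gain >= 0:
        return "rose on both"
    if state_gain < 0 and naep_gain < 0:
        return "fell on both"
    if naep_gain >= 0:
        return "rose on NAEP only"    # the nearly empty quadrant
    return "rose on state test only"  # the crowded quadrant

def pearson(xs: list[float], ys: list[float]) -> float:
    """Plain Pearson correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

counts: dict[str, int] = {}
for sg, ng in gains.values():
    q = quadrant(sg, ng)
    counts[q] = counts.get(q, 0) + 1

state_gains = [sg for sg, _ in gains.values()]
naep_gains = [ng for _, ng in gains.values()]

print(counts)
print(f"correlation: {pearson(state_gains, naep_gains):.2f}")
```

With these made-up numbers the “rose on state test only” bucket dominates and the correlation comes out weak, which is the shape of the pattern the report describes.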

The report suggests several reasons why the results of state tests might not align with NAEP, and score inflation is one of them. More important, I suspect, is the first reason they list: alignment with state curriculum standards. State tests are built to align tightly with state curriculum standards, and teachers are expected to teach precisely to those standards. From the report:

For example, Jacob (2007) found that for Texas 4th graders, the difference in scores between NAEP and the state exam was largely explained by differences in the set of math skills (multiplication, division, fractions, decimals, etc.) covered by the two tests, and the format in which test items were presented (word problems, calculations without words, inclusion of a picture, etc.).

Obviously kids are going to do better on a test that perfectly replicates what they’re familiar with from class. Frankly, though, fourth graders are all taught basic arithmetic, and if merely making small changes in the format of the problems causes the NCLB gains to disappear, then NCLB isn’t doing much to genuinely improve basic skills. More data, please.
