Late last week, a big new report purporting to tell the public how honest state test score results are came out from two pro-standards nonprofits, Achieve and the Collaborative for Student Success.
Basically, the report compared state-produced achievement scores to national NAEP results. Some states, like New York, were ranked as top truth tellers. Others, like Tennessee, were said to be making dramatic progress in closing the so-called “honesty gap.” Still others, like Georgia, were said to have an honesty gap of 60 percentage points.
Fueled in part by a catchy approach to a mind-numbing issue and a press call in advance of the report’s release, the study generated a fair amount of news coverage and mainstream editorial-writing: the New York Daily News, ChalkbeatNY, ChalkbeatTN, The Tennessean, the Daily Caller, Ed Week, and the McClatchy News Service.
However, FairTest, the standardized testing watchdog organization, was quick to issue a press release condemning the new report, claiming that its comparison of NAEP scores, NAGB proficiency levels, and state proficiency levels was inappropriate. FairTest admits that some states have set low bars for proficiency, but argues that the National Assessment Governing Board’s proficiency levels “have been widely criticized as arbitrary and inaccurate by many independent analysts, including the National Academy of Science and the Government Accountability [Office],” according to FairTest’s Bob Schaefer. “All claims about what constitutes ‘proficiency’ are little more than value judgements, which are often based on political or ideological agendas.”
So what’s the scoop? In an email, Achieve’s Mike Cohen acknowledges that human judgment has always been part of the standards-setting process, for both state tests and national ones, but argues that the NAEP definition of proficient has the advantage of being the result of a stable, high-quality test. Cohen also points out that new Common Core testing projections “fall right in line with the NAEP levels, with respect to the % of kids who score proficient.”
What do NCES and NAGB have to say about the Achieve report, which relies on their data? Nobody at the two organizations seems in any great rush to disavow Achieve’s use of the data, but nobody seems in any great rush to endorse it, either. For reasons that weren’t entirely clear to me, representatives for NCES told me to talk to NAGB. NAGB representatives declined to comment and told me to talk to NCES. I’m told that NCES will soon release a study mapping state proficiency rates onto NAEP scores, which could confirm or contradict the Achieve findings.
As for the journalism involved: Many of the outlets that covered the story hedged their bets by making clear, in the headline or elsewhere, that the results were claims from a report. I’m not sure readers understand that a headline beginning with “Report: …” signals uncertainty, but I can’t claim the outlets weren’t trying to make this clear. It’s not apparent whether many of them called NCES or NAGB, much less FairTest or anyone else who might have been expected to disagree. Ed Week did a good job explaining the political context behind the report’s release, rather than just describing its results and quoting supporters.