This story was co-authored by Emmanuel Felton.
The results have started to come in from some of the new Common Core-aligned exams given this spring. And the news is good. Surprisingly good.
Two multi-state groups, the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers (PARCC), spent years making standardized tests to judge how well students have mastered the Common Core, a set of educational standards that detail what students should be able to do in math and English in each grade.
Students in 18 states took Smarter Balanced Assessment Consortium exams for the first time this spring. Missouri and West Virginia released their official statewide results this week, while Oregon and Washington reported preliminary information from the vast majority of their districts last month. All four states reported students exceeding expectations. (PARCC results are not expected until later this fall.)
In the more than 40 states that adopted Common Core, the switch to the standards has been inextricably tied to new tests. Both Smarter Balanced and PARCC promised their computer-based exams would be more difficult than what states had given previously. The standards themselves are also generally considered more demanding than what most states had in place before, and educators around the country have been bracing for a massive drop in pass rates.
That’s what happened in New York and Kentucky when those states switched to their own Common Core-aligned exams. The Smarter Balanced field test, a trial run of the exam given in 21 states in 2014, showed English proficiency rates in the low 40s and math proficiency rates in the 30s.
But all four of these states did better than that field test on the English exam, and all but West Virginia and Missouri's eighth graders improved on the math exam. (Missouri eighth graders taking Algebra, generally the higher performing students, did not take the Smarter Balanced exam.)
Smarter Balanced officials said that it was too early to hypothesize what could have caused these increases.
There could be many reasons, according to Joseph Martineau – senior associate at the National Center for the Improvement of Educational Assessment – including that students took this test more seriously, that these four states are among the higher performers, and that students made genuine gains between last year and this year. “I think it’s a combination of these things and we really can’t pull these things apart,” said Martineau, who served as the deputy superintendent in charge of testing at the Michigan Department of Education during the state’s transition to Smarter Balanced.
It’s not just that the states made significant improvements over the field test. The drops in their scores from old state exams were much smaller than the 30-plus percentage point declines in New York and Kentucky. And in some cases, scores actually improved from the 2014 state exams to the 2015 Smarter Balanced exams.
(Oregon is not included in this graphic because its state test score data was not disaggregated by grade.)
Both consortia looked at the National Assessment of Educational Progress, or NAEP, when determining their cut scores for proficiency – albeit in slightly different ways – leading to speculation that the results might look similar to the national exam. In June, Smarter Balanced told Hechinger that its results likely wouldn’t be “wildly different” from NAEP. Yet for these states, particularly in English, pass rates on Smarter Balanced far exceed those from the 2013 NAEP exam.
So, does this mismatch indicate a problem with the tests?
Again, Smarter Balanced officials declined to offer thoughts on the comparison, but did emphasize that the NAEP is only given to a sample of students in each state and that the most recent results of that test are now two years old.
Martineau said NAEP will likely always be the hardest test around – and that’s okay. “If you compare NAEP with Smarter Balanced and PARCC, you will see the expectations are similar, but it’s a matter of very high aspirations for students versus high but realistic aspirations for students.”
[Cross-posted at The Hechinger Report]