The Board of Regents agrees that annual changes to state tests render them incomparable to past years’ results, but there’s less accord over what that means.
When results of the spring math and English tests for third- through eighth-grade students were released last month, state and district officials highlighted statewide and district-level improvements over 2015 proficiency rates.
But changes to this year’s tests, including fewer questions and no time limits, brought a wave of skepticism about whether the results could be compared at all.
“Why do we keep saying the kids are improving or not improving when we created a new baseline?” Regent Judith Johnson asked at Monday’s Regents meeting. “To me, that is a contradiction.”
The board discussed how the annual comparisons invariably seep into public discussion and eventually into political debates over state education laws. Some board members worried the appearance of improving scores would cause apathy among lawmakers.
“Even with the caveats that we put out there, the public believes that they can be compared because the headline goes out that the test scores go up,” Regent Roger Tilles said.
Tilles went on to express concern that the annual tests label students as failures – even students with learning disabilities and English-language learners – and suggested he wished the board could “opt out” of them altogether.
Federal law requires the state to administer annual assessments of math and English proficiency for students in third through eighth grades.
The board, however, did not discuss specifics of how to better use the annual test results or communicate what they mean to the public.
Education Commissioner MaryEllen Elia has repeatedly said this year’s test results – because of the changes to the tests and procedures – are not an “apples to apples” comparison to last year’s test. She repeated that point at Monday’s meeting.
But on Monday, Elia still shared with the board a 55-page presentation that, in dozens of charts and graphs, showed how the test scores have changed over the past four years for different categories of students.
In defending those comparisons, Elia argued the tests were comparably rigorous each year.
She also argued that testing data – even if not useful for year-to-year comparisons – gives schools and districts the results of a single test given to all of their students. Those results can help districts identify areas of concern or excellence.
Even if board members agree it’s important to have a consistent test for year-to-year comparisons, the spring tests are unlikely to become the baseline moving forward.
As part of the Regents’ response to statewide concerns over the annual tests – best embodied by the over 20 percent of students who refused to take them each of the past two years – more changes are likely.
State officials are reviewing the state education standards. Those changes will be followed by curriculum and assessment changes over at least the next couple of school years.
“All measurement systems have inherent weaknesses,” Regent James Tallon said. “We made changes in response to considerable criticism and in elements of uncertainty.”
Reach Gazette reporter Zachary Matson at 395-3120, [email protected] or @zacharydmatson on Twitter.