With a decline in SAGE test scores among Utah public school students and a slight increase in ACT scores, educators can look at the situation as either a glass half full or a glass half empty. In either case, they will be seeing only half a glass, and that's certainly less desirable than steady year-to-year improvement in student achievement.
Scores on the Student Assessment of Growth and Excellence tests declined for the first time in the four years the exams have been administered, though the opt-out rate has risen steadily. The test is not required, and in some schools nearly half of the students choose not to take it. That certainly skews the numbers, but just how isn't clear. It's possible that high-achieving students are opting out at greater rates than less-engaged students, but it would take significant data mining to determine that. The ACT, which is encouraged among college-bound students, may be a better bellwether of progress among higher-achieving students. Last year's ACT scores rose slightly from the year before, which is an encouraging sign.
There are conflicting opinions about the value of SAGE tests and other annual examinations in public schools, and about how much they should influence curriculum development. But the fact remains that we need some credible means of measuring academic progress. Flawed as they may be in that context, the SAGE scores still offer educators a way to assess the effectiveness of curricula and teaching techniques.
The tests do reveal that some districts consistently outperform others, even with opt-out rates taken into account. The Canyons District, for example, showed increases in SAGE scores last year, with averages in all testing categories above the state average. The district attributes that to its commitment to research-based instructional practices. To the extent the tests are an appropriate yardstick of progress, what Canyons is doing should be of interest to administrators in other districts. State leaders say they will probe the heaps of data the SAGE scores provide and try to find answers as to why declines outpaced gains in the last year. That's precisely why examinations like the SAGE tests are meaningful, though they may be more valuable as mid-course guideposts than as a precise measuring stick of whether we're moving closer to the goal of excellence in education.
Schools should not discount the fact that the SAGE results show the largest decline in science proficiency, with math a close second. That should be of paramount concern given the critical need to turn out students well versed in these fields, which are vital to economic growth. Overall, Utah educators should be pleased that testing scores have generally improved incrementally over the past decade, but they should commit to examining the new data closely to make sure they are doing what is needed to keep that long-term trend heading in the right direction.