
Break the Glass and Pull the Alarm! The Signal From Spring 2021 State Testing Is Clear

A Call to Policymakers to Support Multi-Year Acceleration Efforts

I and many others were concerned that the uncertainty in interpreting Spring 2021 state summative test scores, given all the “noise” associated with the pandemic, would blur any signal we were trying to hear from the data. Unfortunately, that signal is so loud that we cannot miss it. In spite of any uncertainty, we have more than enough information to pull the policy alarm! One reason the results from Spring 2021 state testing are so valuable is that they should make crystal clear to policymakers and others that it will take many years to help students get back on track.

I was an early and vocal skeptic about requiring states to administer their state summative assessments this past year. I still maintain that states needing flexibility should have been granted more relief than they were, as long as they had another plan in place for collecting data on student well-being and learning. That said, test scores from states with adequate participation rates portray significant and widespread pandemic effects on student learning. Combined with the results from the interim assessment companies, they paint a worrisome picture of student learning and growth. Yes, we need to be much more careful than usual when analyzing the data and interpreting the results, especially comparisons involving individual students and groups of students, due largely to the different constellations of students who tested across locales and years. But like the one-handed economist Harry Truman wished for, we cannot waffle over the obvious results we are seeing from states across the country.

A Consistently Disturbing Pattern

A Center for Assessment team led by my colleagues Damian Betebenner, Nathan Dadey, and Leslie Keng has analyzed the data from more than a dozen states. The results summarized by Betebenner and colleagues provide compelling evidence that the disruptions to normal schooling were profound and widespread, even if there were differences across states, subject areas, and grade spans. In fact, they noted that the performance declines were two to three times greater than those observed for students following Hurricane Katrina. Put in a way that we psychometricians don’t like, we are seeing declines ranging from one-quarter to three-quarters of a year of learning, no matter how school was structured in 2020-2021. The test results for students who learned remotely for most of the year were especially devastating.

All of the analyses conducted by the Center team tried, to the extent possible, to address issues that result from changes in the tested population between 2019 and 2021, using the approaches outlined by Damian Betebenner and Rich Wenning, which include both achievement and growth analyses, and/or the approaches described by Andrew Ho in his Three Metrics paper. Of course, no analysis can fully account for the differential effects of the pandemic, for unmeasured variables (e.g., the choices parents make about having their child attend school in person or remotely when given the option), or for unmeasured students (i.e., students who did not participate in testing and the many students who did not enroll in public schools at all in 2020-2021). But the consistency of the pattern across so many states, using a wide variety of tests and analytic approaches, is convincing. In fact, based on our analyses of missing data, these results would have been even more severe had almost all students been enrolled and included in state testing.
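To make the population issue concrete, here is a minimal sketch, in Python with pandas, of two simple checks in the spirit of the match-rate and equity-check ideas referenced above. The data layout, column names, and function are illustrative assumptions of my own, not the Center team’s or Ho’s actual code; the analyses behind the state results are considerably more sophisticated.

```python
# A minimal, hypothetical sketch of two population checks: how many 2021 testers
# can be linked to a prior (2019) score, and how the linked group compared with
# all 2019 testers before the pandemic. Column names and layout are illustrative only.
import pandas as pd

def population_checks(scores_2019: pd.DataFrame, scores_2021: pd.DataFrame) -> dict:
    """Each frame has one row per student with columns 'student_id' and 'scale_score'."""
    linked = scores_2021.merge(
        scores_2019, on="student_id", how="left", suffixes=("_2021", "_2019")
    )

    # Match rate: share of 2021 testers who have a 2019 score to link back to.
    match_rate = linked["scale_score_2019"].notna().mean()

    # Equity-check-style comparison: did the linked students score differently in
    # 2019 than the full 2019 tested population? A gap signals that 2019-to-2021
    # comparisons rest partly on a changed population, not only on the pandemic.
    matched_2019_mean = linked["scale_score_2019"].mean()
    all_2019_mean = scores_2019["scale_score"].mean()

    return {
        "match_rate": round(float(match_rate), 3),
        "matched_2019_mean": round(float(matched_2019_mean), 1),
        "all_2019_mean": round(float(all_2019_mean), 1),
    }
```

The point of checks like these is simply to quantify how much the 2021 tested population differs from the pre-pandemic one before reading too much into 2019-to-2021 comparisons.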

Policymakers Must Step Up

Congress and the President authorized a massive influx of federal funding through the Elementary and Secondary School Emergency Relief (ESSER) Fund to pay for programs to address “learning loss.” This money has been a tremendous help in supporting states and districts in launching programs and other initiatives to address student learning needs. Almost all of this federal funding, however, will expire by September 30, 2023, with the possibility of some funding lasting until September 2024.

State legislators and governors must step up and continue funding schools well above typical levels so that acceleration efforts can be sustained until students are well along the path to recovery, an effort that will likely extend beyond September 2023. I disagree with many of the causal inferences in Mike Petrilli’s recent commentary, but he argues convincingly that student learning suffered for many years as a result of the funding drops following the Great Recession of 2008-2009. We should not make the same mistake here.

It Takes More Than Money

In addition to providing financial resources, state education leaders must provide guidance to the districts that are struggling to figure out how to help students move forward. Many states are already doing this, but the type of acceleration required is unprecedented. Therefore, new research-based initiatives and innovations should be employed to help educators meet these challenges.

Assessment, if well used, can provide evidence to support and monitor improvement efforts. State assessments are best suited to monitoring large-scale educational trends and can shed light on where interventions are working and where they are falling short so state leaders can direct support and guidance. However, teachers need assessments closer to the day-to-day curriculum and instruction to best support student learning. The effectiveness of formative assessment has been well-documented, but so has the need for professional development to ensure high-fidelity implementation. Fulfilling this need would be a worthwhile use of ESSER funds. Teachers also need assessments to evaluate student learning of specific units of instruction as regular progress checks. States can aid this work by supporting the development of assessments that can be coherently embedded in some of the most popular high-quality curriculum programs in the state. Many have suggested having commercial interim assessments serve this purpose. They can’t. Essentially all of these interim assessments are too far removed from the specific curriculum to provide useful information for monitoring these learning interventions. 

Districts Taking Action

Ultimately, this work has to take place in districts and schools. I recently had the privilege of attending a meeting with the phenomenal leaders of the Gwinnett County School District in Georgia. Gwinnett leaders and teachers are focused on looking forward rather than trying to remediate every skill students did not learn the previous year.

Among Gwinnett’s many initiatives, the Summer Enrichment + Acceleration (SEA) program, a three-week, in-person program offered this past June and July, focused on accelerating and extending student learning in the current grade and previewing what students will learn at the next grade level. Classes were small, with 15 or fewer students, and instruction was designed to be fun and engaging. Gwinnett’s leaders are continuing these acceleration efforts through the 2021-2022 school year with intensive tutoring for students needing more than small-group instruction. The SEA program is paid for largely with ESSER funds, and the district plans to offer it again next summer. Importantly, Gwinnett is offering this valuable opportunity to students in all grades, as well as to the young children who were set to enroll in the district in 2021-2022 (for an engaging description of SEA, watch this interview with Associate Superintendent Dr. Clay Hunter). Gwinnett’s leaders recognize that these efforts will have to continue for many years, likely well beyond when federal funding runs dry.

The Role of State Test Data

One of the reasons I questioned the value of test data in 2021 was that we could not wait for state test results to begin addressing learning issues. High-functioning school districts, like Gwinnett, started planning interventions early in the 2020-2021 school year, before they saw any state test results, especially once they knew they had access to significant federal funding. They knew enough from classroom assessments and opportunity-to-learn indicators to determine that students were struggling, something high-functioning districts are generally able to do every year.

That said, the Spring 2021 state test results provide a clear signal to policymakers about the scope and depth of the pandemic learning disruptions. We should look to state testing in Spring 2022 to confirm what we learned in 2021 and to evaluate the progress made during the 2021-2022 school year. Schools and districts cannot wait for the next cycle of state test results. They must immediately intervene for essentially all students and use curriculum-embedded assessments to closely monitor student learning.  
