Reflecting on 2023 and Looking Ahead to 2024
It is a privilege to reflect on another terrific year at the Center for Assessment and exciting to turn the page on a new year. We were active on many fronts in 2023. We continued to support our more than 40 state and district partners as their educational systems slowly climb back from the pandemic interruptions.
While we’re called the Center for Assessment, we focus as much on accountability as assessment in both our work and writing. Perusing our blogs and papers from last year shows a fairly even split between the two.
One thing that I love about being at the Center is that I get to work with a collection of brilliant thinkers, all committed to improving the education of all students. Unsurprisingly, we have diverse perspectives, which means that we rarely have a “Center position” on an issue.
Yes, we have shared values, but we often take different positions on particular topics. Heck, just ask about the role of consequences in validation in one of our staff meetings, and watch out! In 2023, some of these differences played out in our writing about through-year assessment and accountability system redesign.
Confronting Through-Year Assessment
Through-year assessment systems have dominated assessment discussions for the past several years, as reflected in our many blogs, papers, convenings, and webinars.
One of the two major papers the Center released last year, “Through-Year Assessment: Ten Key Considerations,” summarized the current thinking about these initiatives. The paper’s authors, Nathan Dadey, Carla Evans, and Will Lorié, outlined the issues states should consider if they are thinking about—or have already implemented—a through-year assessment system. This important paper consolidated and extended a lot of the Center’s writing on this topic and is serving as the go-to resource for through-year assessment.
It’s also true that our policy- and measurement-savvy associates vary in their optimism that through-year designs can solve the many problems their advocates claim they can. That’s what I mean about working with brilliant people: We differ in our views about important topics, and as a team, we benefit from those differences.
Our assessment contributions extended beyond through-year assessment. The Center continues to focus on balanced assessment systems, as we have for more than 20 years. In fact, several of us—Carla Evans, Erika Landl, Caroline Wylie, Brian Gong, and I—have contributed to a soon-to-be-published National Academy of Education volume called Reimagining Balanced Assessment Systems. The book is designed to recenter the notion of balance in a way that supports high-quality classroom instruction, learning, and assessment. We’ll be writing a lot more about this in 2024.
Considering both through-year assessment and balanced assessment systems highlights what many of us worry about when evaluating through-year assessment proposals. Balanced assessment systems are an attempt to create an organized set of assessments designed to serve multiple uses based on conceptually coherent designs, particularly in terms of the underlying model of learning the assessments are intended to support. Crucially, in a balanced assessment system, each assessment is included to serve one main purpose, such as instructional feedback or large-scale monitoring.
On the other hand, most of the through-year assessment designs we’ve seen attempt to serve multiple purposes with the same set of assessments. One of those purposes has to be to inform accountability determinations, since that’s required by federal law.
The second purpose is often related to some aspect of instructional improvement. It’s no secret that some of us at the Center are more skeptical than others of through-year assessment designs in the current context. We have these doubts because the way they’re conceptualized runs counter to the key ideas that underpin balanced assessment systems.
Further, we have not seen evidence that any single assessment—or even set of assessments—can serve such disparate purposes. Again, some Center professionals are more optimistic than others, but we are all united in the need for evidence to support strong claims.
Working Toward Better Accountability
We did a lot of thinking about how to improve state accountability systems to better serve school improvement and student learning. Capitalizing on our varied perspectives, our recommendations this year ranged from radically overhauling the current accountability approach to working within the current system to improve things now.
Given our pragmatic orientation, it is not surprising that more of our writing focused on the latter approach. We captured this viewpoint in our second major paper of 2023, “The Path Forward for School Accountability: Practical Ways to Improve School Accountability Systems Now.”
Chris Domaleski and four colleagues discussed practical strategies for improving accountability now, within or alongside ESSA’s requirements. The authors did not dispute the promise of broader reforms longer-term, but they provided insights into the many ways states can improve their accountability systems without waiting for a reauthorization of the federal law.
We also published several blog posts about the importance of better connecting school accountability with school improvement. We ended the year with a series of three posts by Juan D’Brot and Chris Brandt, who offered suggestions for improving the validity of accountability systems.
We dedicated our annual conference, the Reidy Interactive Lecture Series (RILS), last September to exploring potential improvements and reforms for state and district accountability enterprises. We learned from multiple experts and practitioners about how we might improve our thinking and practices to best serve students and educators. All of the materials are still available in our resource library.
School Accountability and Assessment in 2024
We will undoubtedly continue to make progress on the assessment and accountability issues I just recounted from 2023. But as I look ahead, two related topics are calling out for attention in 2024.
It’s hard to read the news or scroll through social media without seeing something about the latest developments in artificial intelligence. The prevalence of high-quality chatbots is changing life and learning all around us. Many assessment researchers are already deeply engaged in thinking about how we can best capitalize on these new technologies while doing our best to avoid potential pitfalls.
André Rupp and Will Lorié highlighted some of these issues as they considered the potential utility of artificial intelligence in 10 domains of assessment development and implementation. The Center plans to continue our learning journey in this exciting new area and share our thinking and recommendations throughout 2024 and beyond.
We hear many groups and individuals calling for assessment innovation. Other than getting rid of or scaling back the end-of-year state assessment, I’m not clear what they mean.
If we really unpack the meaning of innovation, as we did at our annual conference in 2021, we might find that we do not need assessment innovation, per se. We might simply need to attend to high-quality balanced assessment systems and/or rich tasks embedded in high-quality curricula to support student learning. We intend to dig into this notion of assessment innovation and hopefully provide advice on the types of high-leverage practices that we think would be fruitful for district and state leaders to pursue.